Course Overview

This 2-hour course transforms Marines from AI users to AI-assisted builders. Students will witness live demonstrations of tool creation, attempt their own builds, and leave with a concrete plan for their first project. Prerequisites: AI Fluency Fundamentals (Course 1).

Session Timing Breakdown

| Module   | Activity                                           | Duration |
|----------|----------------------------------------------------|----------|
| Module 1 | Introduction & Context: From User to Builder       | 15 min   |
| Module 2 | Live Build Demo: Equipment Tracker                 | 25 min   |
| Module 3 | Student Build + Peer Review: When Something Breaks | 40 min   |
| Module 4 | Decomposition Framework: Your Problem              | 20 min   |
| Module 5 | Wrap-Up & Assignment                               | 10 min   |
| —        | Buffer Time                                        | 10 min   |
| —        | Total                                              | 120 min  |

Contingency: Technical Failures

If Power Platform, AI chat, or internet access is unavailable:

  • Option 1 (Preferred): If AI tools are unavailable during the live build, use the step-by-step walkthrough below as a guided demonstration. The instructor reads each prompt aloud, describes the expected output, and leads discussion on what the AI produced.
  • Option 2: Conduct paper-based decomposition exercise using whiteboard scenarios. Provide 3 sample problems (equipment tracker, leave request form, inventory dashboard) and have students work through full decomposition in pairs.
  • Option 3: Use offline debugging exercise (see Module 2 fallback below). Review pre-built broken code on paper, identify errors as a group, discuss how to frame debugging prompts.
  • Do NOT cancel the course. The decomposition skills are platform-independent and can be taught without live tools.

Agenda

| Time        | Module                                   | 201 Skills Applied                                        |
|-------------|------------------------------------------|-----------------------------------------------------------|
| 0:00 – 0:15 | Module 1: From User to Builder           | Task Decomposition overview                               |
| 0:15 – 0:40 | Module 2: Live Build — Equipment Tracker | Task Decomposition, Context Assembly, Iterative Refinement |
| 0:40 – 0:50 | Break                                    | —                                                         |
| 0:50 – 1:30 | Module 3: Student Build + Peer Review    | All six skills (hands-on practice)                        |
| 1:30 – 1:50 | Module 4: Decomposition Framework        | Task Decomposition, Frontier Recognition                  |
| 1:50 – 2:00 | Module 5: Wrap-Up & Assignment           | Path forward, assessment criteria                         |

Module 1: From User to Builder

Duration: 15 minutes

Bridge from AI Fluency Fundamentals

You have learned to manage AI as a user. Building tools applies the same skills at higher complexity: instead of managing one conversation, you are managing an entire project through AI.

The Builder Mindset

Before you open any tool, decompose on the whiteboard. What are the components? What does the data look like? What does the user need to see? This is Task Decomposition applied to system design.

Instructor Note

Verify all students completed AI Fluency Fundamentals. Ask: “What are the six 201 skills?” If students cannot recall at least four, do a 2-minute refresher.

Module 2: Live Build #1 — Equipment Tracker

Duration: 25 minutes

The Problem

“I need a way to track equipment checkout for my section.”

Instructor Setup

Before class: Ensure you have access to GenAI.mil (preferred), or a commercial AI tool (ChatGPT or Gemini) for unclassified exercises, projected on screen. Have Power Apps portal open in another tab. Do NOT pre-build anything. The value is in the live narration of decisions.

Teaching approach: Think out loud constantly. Verbalize every decision: "I'm choosing to start simple because..." "I'm adding this detail to the prompt because..." "I'm ignoring this AI suggestion because..."

Preferred Tool Order

For live builds, prefer GenAI.mil or CamoGPT (Army-managed) if the classroom has CAC-enabled workstations on a DoD network. These tools allow you to use CUI-level unit data in prompts. Anonymize PII (real names, SSNs) unless a PIA authorizes it. Fall back to commercial tools (ChatGPT, Gemini) for unclassified-only exercises. Note: ChatGPT, Gemini, and Grok are also available on GenAI.mil. See the Approved Tools page.

Data Handling Boundaries

GenAI.mil / CamoGPT (Army-managed, IL5): Authorized for CUI. PII and PHI must be anonymized unless a PIA authorizes it. Classified data is never authorized (except CamoGPT on SIPR/IL6).
Commercial tools (ChatGPT, Gemini via web): Unclassified, non-sensitive data only. Treat every prompt as public disclosure.

Step 1: Decompose BEFORE Touching AI (5 min, whiteboard)

Write on whiteboard or share screen with notepad. Ask students to call out answers:

  • What data fields do we need? Item name, serial number, assigned to (name), date checked out, date due back, status (out/in/overdue)
  • What does the user need to do? Check out item, check in item, see what is currently checked out, see overdue items
  • What is the simplest useful version? A form to check out/check in, and a list showing current status
  • What data structure? A table/spreadsheet with one row per item

Teaching Point: Decomposition First

Emphasize: "We spent 5 minutes on paper before touching AI. This is the most important 5 minutes. Bad decomposition means you will spend 30 minutes building the wrong thing."

Step 2: Build with AI (20 min)

Open GenAI.mil in one browser tab and Power Apps in another. In GenAI.mil, describe your data structure and ask it to generate the Power Apps formula code. GenAI.mil produces the code as text; you copy it into Power Apps to run it.

Narrate each prompt before typing it. Follow this sequence:

When AI Output Doesn't Match

The expected outputs below show what a well-configured agent should produce. Your AI may use different variable names, different ordering, or include extra explanation. This is normal. Check for three things:

  1. Correct syntax patterns — .Value on Choice columns, {Value: "..."} for Patch
  2. Correct function names — Power Fx functions only, not JavaScript or DAX
  3. Delegation warnings on data formulas

If those three are right, the output is usable even if it doesn't match word-for-word. If the AI produces JavaScript syntax (forEach, =>), re-prompt: "Use Power Fx only, not JavaScript."
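For quick reference during the demo, the Choice-column patterns in item 1 look like this. This is a sketch using the EquipmentCheckout list from this build:

```powerfx
// WRONG — compares a Choice record to text, which raises a type error
Filter(EquipmentCheckout, Status = "Checked Out")

// RIGHT — read a Choice column's text with .Value
Filter(EquipmentCheckout, Status.Value = "Checked Out")

// WRONG — Patch cannot assign plain text to a Choice column
Patch(EquipmentCheckout, ThisItem, {Status: "Checked In"})

// RIGHT — Choice columns take a record with a Value field
Patch(EquipmentCheckout, ThisItem, {Status: {Value: "Checked In"}})
```

Keep these on a slide or handout; most of the bugs students hit in Module 3 are one of these two patterns.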

Prompt 1: Define the Problem (2 min)

Say: "I start with WHO uses this and WHAT platform we have. I ask for a plan, not code."

Prompt to AI:

I need an equipment checkout tracker for a Marine Corps rifle section (~30 people, ~50 items) in Microsoft Power Apps. It needs to: check out items (item name, serial, person, date out, date due), check in items, show all checked-out items, and highlight overdue ones. Give me a step-by-step plan before we build.

Expected: AI suggests a SharePoint list as data source + Canvas app with form and gallery.

Say: "It gave us a plan, not code. We verify the plan makes sense before building."

Prompt 2: Create the Data Structure (3 min)

Say: "Now I get the exact data structure. I specify field names and types because I know my domain."

Prompt to AI:

Give me the exact SharePoint list configuration: list name, all columns with data types, and any calculated columns for overdue status. Include a CSV template I can import into SharePoint and tell me which columns I need to add manually after import.

Expected AI Output:

List Name: EquipmentCheckout

| Display Name    | Internal Name  | Type               | Required | Notes                        |
|-----------------|----------------|--------------------|----------|------------------------------|
| Title           | Title          | Single line of text | Yes      | Item name (built-in column)  |
| Serial Number   | SerialNumber   | Single line of text | Yes      |                              |
| Assigned To     | AssignedTo     | Single line of text | Yes      |                              |
| Date Out        | DateOut        | Date and Time       | Yes      | Date only                    |
| Date Due        | DateDue        | Date and Time       | Yes      | Date only                    |
| Status          | Status         | Choice              | Yes      | Checked Out, Checked In, Overdue |
| Checked In Date | CheckedInDate  | Date and Time       | No       |                              |
| Is Overdue      | IsOverdue      | Calculated (Yes/No) | —       | Add after import             |

CSV Template (save as EquipmentCheckout.csv):
Title,SerialNumber,AssignedTo,DateOut,DateDue,Status,CheckedInDate
M4 Carbine,W123456,SSgt Morgan,2026-03-01,2026-03-15,Checked Out,

Import steps:
1. Save the CSV above in Notepad (UTF-8 encoding)
2. SharePoint → Site Contents → New → List → From CSV → upload
3. Convert Status column to Choice (add values: Checked Out, Checked In, Overdue)
4. Add Calculated column "IsOverdue":
   Formula: =IF(AND([Status]="Checked Out",[DateDue]<TODAY()),TRUE,FALSE)
   Return type: Yes/No
5. Click Integrate → Power Apps → Create an app

Action: Import this CSV into SharePoint now (or describe where you would). The AI gave us the structure, the CSV, and the post-import steps. After import, complete the manual steps: convert Status to a Choice column and add the IsOverdue calculated column. The CSV handles most columns; only Choice, Calculated, and Person columns need post-import setup.

Shortcut: Instant App from SharePoint

Once the SharePoint list exists, you do not need to build the app from scratch. Go to the list, click Integrate → Power Apps → Create an app. Power Apps generates a fully functional app connected to your list in seconds — forms, gallery, navigation, all wired up.

This is your starting point. From here, use AI to refine the auto-generated app to match your needs. The remaining prompts show how.

Prompt 3: Refine the Interface (5 min)

Say: "Power Apps gave us a working app. Now I use AI to customize it. I describe what I want changed in plain English."

Prompt to AI:

I used Integrate → Power Apps → Create an app from my EquipmentCheckout SharePoint list. The auto-generated app has BrowseGallery1 showing all items. I want to: filter the gallery to only show checked-out items, color overdue items in red, and add a check-in button next to each gallery row. Give me the formulas.

Expected AI Output:

// BrowseGallery1.Items
Filter(EquipmentCheckout, Status.Value = "Checked Out")

DELEGABLE — safe for large lists.

// lblDateDue.Color (label inside gallery)
If(IsBlank(ThisItem.DateDue), Black, If(ThisItem.DateDue < Today(), Red, Black))

// btnCheckIn.OnSelect (button inside gallery)
Patch(EquipmentCheckout, ThisItem, {Status: {Value: "Checked In"}, CheckedInDate: Now()})

Say: "I described what I wanted in plain English. AI translated it to Power Apps formulas. This is Context Assembly: I provided domain knowledge, AI provided technical knowledge."

Prompt 4: Iterate (3 min)

Say: "It works but needs polish. I iterate — I do not rebuild. This is Iterative Refinement."

Prompt to AI:

I added a Text Input control named SearchBox above BrowseGallery1. I also added a Warning icon named icnOverdue inside each gallery row. Now update the gallery to: search by item name or assigned person, sort overdue items to the top, and show the warning icon only for overdue items.

Expected AI Output:

// BrowseGallery1.Items
SortByColumns(
    Filter(EquipmentCheckout,
        Status.Value = "Checked Out" &&
        (SearchBox.Text in Title || SearchBox.Text in AssignedTo)),
    "DateDue", Ascending)

NOT DELEGABLE — the "in" operator is not delegable. Only the first 2,000
rows are searched. For small lists this is fine. For large lists, use
StartsWith(Title, SearchBox.Text) instead (delegable, but only matches
the beginning of the text).

Set SearchBox.DelayOutput to true (prevents a query on every keystroke).

// icnOverdue.Visible
!IsBlank(ThisItem.DateDue) && ThisItem.DateDue < Today()

// icnOverdue.Color
RGBA(204, 0, 0, 1)
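If the class asks what the delegable version would look like, here is one sketch. It trades substring search for prefix-only matching, as the delegation note above describes:

```powerfx
// Delegable alternative: StartsWith pushes the query to SharePoint,
// so it works on lists larger than 2,000 rows — but it matches only
// the beginning of each field, not text anywhere inside it
SortByColumns(
    Filter(EquipmentCheckout,
        Status.Value = "Checked Out" &&
        (StartsWith(Title, SearchBox.Text) ||
         StartsWith(AssignedTo, SearchBox.Text))),
    "DateDue", Ascending)
```

Use this as a discussion point: for a ~50-item tracker the non-delegable version is fine, and knowing when a limitation actually matters is part of Frontier Recognition.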

Say: "I did not rebuild from scratch. Incremental improvements. Functional first, then refine."

Prompt 5: Debug an Error (4 min)

Say: "Now I intentionally break something to show you debugging with AI."

Action: Introduce a small error (wrong column name or type mismatch).

Prompt to AI:

I am getting this error: "Invalid argument type. Expecting a Text value instead." Here is my formula:

Filter(EquipmentCheckout, Status = "Checked Out" && SearchBox.Text in SerialNumber)

What is wrong?

Expected:

Two issues: (1) The "in" operator expects text on both sides — SerialNumber may be a number. (2) Status is a Choice column — must use Status.Value. Fix:

Filter(EquipmentCheckout, Status.Value = "Checked Out" && SearchBox.Text in Text(SerialNumber))

Say: "Exact error message + exact code + what you were trying to do. AI diagnoses it immediately. Most people quit at the first error. Builders treat errors as normal."

Prompt 6: Final Polish (3 min)

Say: "Last step: make it look professional. Optional but improves adoption."

Prompt to AI:

Give me a professional color scheme for a Marine Corps app: background, buttons, header. Use red and gold accents on dark background.

Expected AI Output:

Background: RGBA(20, 20, 20, 1) - Dark gray
Header:     RGBA(140, 0, 0, 1) - Marine red
Buttons:    RGBA(200, 160, 0, 1) - Gold
Text:       White
Alt rows:   RGBA(40, 40, 40, 1) / RGBA(30, 30, 30, 1)

Say: "Even aesthetic choices can be AI-guided. I provided the constraint, it provided options."

Step 3: Review (5 min)

Show the final result (even if only partially built). Ask the class:

  • "What would you change for your section?" (Take 2-3 answers)
  • "What did you notice about the process?" (Emphasize: decompose first, iterate, debug is normal)
  • "How long do you think this would have taken before AI?" (Answer: days or weeks for a custom app)

Contingency: Debugging Exercise Fallback

If technical issues prevent the live build, use this pre-built debugging case study:

Scenario: A Marine built an equipment tracker but it has bugs. Present this broken code on screen:

Gallery Items Property:
Filter(EquipmentCheckout, Status = "CheckedOut")

Check In Button:
Patch(EquipmentCheckout, ThisItem, {Status: "Returned"})

Overdue Highlight:
If(DateDue < Today(), Red, Black)

Four bugs to diagnose as a class:

  1. Bug 1: Status filter uses "CheckedOut" (no space) but the Choice value is "Checked Out" (with space). Must match exactly.
  2. Bug 2: Status is a Choice column — must use Status.Value to compare, not bare Status.
  3. Bug 3: Status value "Returned" does not exist in the Choice column. Should be "Checked In". And Patch must use Choice record syntax: {Status: {Value: "Checked In"}}.
  4. Bug 4: Overdue formula errors when DateDue is blank (checked-in items). Needs null check: If(IsBlank(DateDue), Black, If(DateDue < Today(), Red, Black))
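After the class has identified all four bugs, show the corrected formulas so students see the fixed versions side by side. These mirror the formulas from the live build:

```powerfx
// Gallery Items — exact Choice value ("Checked Out", with space), compared via .Value
Filter(EquipmentCheckout, Status.Value = "Checked Out")

// Check In button — an existing Choice value, using Choice record syntax for Patch
Patch(EquipmentCheckout, ThisItem, {Status: {Value: "Checked In"}})

// Overdue highlight — null check before the date comparison, so checked-in
// items with a blank DateDue do not error
If(IsBlank(ThisItem.DateDue), Black, If(ThisItem.DateDue < Today(), Red, Black))
```

Walking through each fix next to its bug reinforces the pattern: exact values, correct column syntax, and blank handling.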

Class activity: Break into pairs. Give 5 minutes to identify bugs. Then discuss as group how to frame debugging prompts to AI. Example:

"My gallery is not showing any items even though I have data. Here is my filter formula: Filter(EquipmentCheckout, Status = "CheckedOut"). The Status column is a Choice field with values: Checked Out, Checked In, Overdue. What is wrong?"

Teaching point: Effective debugging prompts include: exact code, exact error or symptom, and relevant context (like column configuration).

Module 3: Student Build + Peer Review

Duration: 40 minutes

Purpose

Students attempt their own build and learn to debug when things break. Peer review reinforces quality judgment and iterative refinement.

Student Build Activity (25 min)

Each student selects one of these starter problems or uses their own:

  • Option 1: Leave request tracker (who, dates, type, approval status)
  • Option 2: Training attendance log (event, date, attendees, completion status)
  • Option 3: Vehicle inspection checklist (vehicle ID, date, checklist items, pass/fail)
  • Option 4: Their own section problem (must get instructor approval for scope)

Instructor Guidance During Build Time

Circulate the room constantly. Do NOT let students struggle silently for more than 3 minutes. Key interventions:

  • If stuck on decomposition: "What is the simplest version that would be useful?"
  • If stuck on a prompt: "Read me your prompt out loud." (Often they hear the problem themselves.)
  • If stuck on an error: "What does the error message say? Copy it exactly and ask AI to explain it."
  • If building something too complex: "Stop. What is the one core function this needs to do? Build only that first."

Do NOT fix their code. Ask questions that lead them to the answer. The debugging process is the lesson.

When It Breaks

Expect every student to hit at least one error. When they do:

  1. Read the error message out loud. Do not guess. Read exactly what the system says.
  2. Copy the exact error message from Power Apps. Include any error codes or line numbers.
  3. Switch to your GenAI.mil conversation. Paste the error along with what you were trying to do. Example prompt:

I am building a leave request tracker in Power Apps. I added this formula to my gallery:

Filter(LeaveRequests, Status = "Pending" && RequestDate > Today())

But I am getting this error: "The function 'Filter' has some invalid arguments."

The RequestDate column is a Date field in my SharePoint list. What is wrong and how do I fix it?
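If students ask what the fix to this example looks like, it mirrors the Choice-column bug from the demo. Assuming Status in LeaveRequests is a Choice column (a reasonable setup for an approval-status field), the corrected formula would be:

```powerfx
// Status is a Choice column, so compare its .Value;
// the RequestDate > Today() comparison on a Date column is already valid
Filter(LeaveRequests, Status.Value = "Pending" && RequestDate > Today())
```

Do not hand students this fix; let the AI diagnose it from their pasted error, as the narration below demonstrates.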

Instructor Narration

“I copied the error message from Power Apps. I switched to my GenAI.mil conversation. I pasted the error along with what I was trying to do. GenAI.mil diagnosed it and suggested a fix. I copy the corrected code back into Power Apps. This is the debugging loop: Power Apps shows the error, GenAI.mil diagnoses it, you apply the fix.”

Key Teaching Point

The first time something breaks, most people quit. That is the 80% who stop building. Debugging with AI is a skill. Errors are not failures; they are part of the process. Builders expect errors and know how to fix them.

Debugging Across Tools

Debugging techniques work the same regardless of which AI tool you use. If a student gets stuck on CamoGPT (Army-managed), they can paste the same error message into ChatGPT or Gemini for a second opinion — unclassified prompts only; never paste CUI into commercial tools. Using multiple tools is a valid strategy.

Peer Review Protocol (15 min)

Pair students up. Each student demonstrates their prototype (even if incomplete) to their partner.

Peer Review Criteria

Reviewer: check each of these items and provide feedback

  1. Clear Problem Statement: Can the builder explain in one sentence what problem this solves?
  2. Core Function Works: Does the most basic version do what it needs to do? (Does not need to be pretty or complete.)
  3. Evidence of Iteration: Did the builder refine at least once? (Example: added a feature, improved formatting, fixed a bug)
  4. Can Explain Decisions: Can the builder explain why they made specific design choices? (Example: "I used a dropdown for status because there are only 3 options.")

Facilitation Protocol for Peer Review

| Time      | Activity                                                                          | Instructor Role                                         |
|-----------|-----------------------------------------------------------------------------------|---------------------------------------------------------|
| 0-2 min   | Explain peer review criteria (show on screen)                                     | Brief the process, assign pairs                         |
| 2-8 min   | Partner A demonstrates their build; Partner B reviews using criteria              | Circulate, listen, ensure reviews are constructive      |
| 8-14 min  | Partner B demonstrates their build; Partner A reviews using criteria              | Continue circulating, note good examples to share       |
| 14-15 min | Debrief as full class: "What did you learn from seeing your partner's approach?" | Facilitate discussion, highlight 1-2 standout examples  |

Simple Peer Review Rubric

| Criteria                | Yes                                      | Partially                            | Not Yet                             |
|-------------------------|------------------------------------------|--------------------------------------|-------------------------------------|
| Clear problem statement | One-sentence explanation                 | Can explain but takes several tries  | Unclear what problem this solves    |
| Core function works     | Basic feature is functional              | Partially works or works with errors | Does not function yet               |
| Evidence of iteration   | At least one refinement visible          | Attempted refinement but incomplete  | First version only, no changes      |
| Can explain decisions   | Articulates reasoning for design choices | Can explain some choices but not all | Cannot explain why built this way   |

Instructor Note: Authentic Failures

Do not fake failures. If a student build goes smoothly, that is fine. But if something breaks (and it will for most students), treat it as a teaching moment for the whole class. Stop and demonstrate debugging live. Authentic debugging is far more instructive than a scripted error.

If time allows and most students finish early, challenge them to add one more feature that pushes them slightly out of their comfort zone. This will likely force at least one error, which is valuable practice.

Module 4: Decomposition Framework Applied to Your Problem

Duration: 20 minutes

Now that students have built something and debugged it, they apply the full decomposition framework to a real problem they will build later. This bridges today's orientation to the post-course assignment.

Individual Decomposition Exercise (10 min)

Each participant identifies a real problem from their section/unit and decomposes it on paper. Use the worksheet template:

Decomposition Worksheet

1. Problem Statement (one sentence):

2. Core Requirements (what must this do?):

3. Decompose into 4-6 subtasks:

| Subtask                        | Human Role            | AI Role             | Pattern |
|--------------------------------|-----------------------|---------------------|---------|
| Example: Define data structure | Specify fields needed | Suggest field types | Centaur |

4. Potential Frontier Issues (what might AI struggle with?):

  • Example: Integration with existing CAC-authenticated system

5. Simplest Useful Version (what is the MVP?):

Instructor Guidance During Decomposition

Walk the room actively. Quality of decomposition predicts quality of build. Common issues to watch for:

  • Problem too big: If subtasks list more than 8 items, the problem is too complex for a first build. Help them narrow scope.
  • Problem too vague: If the problem statement is more than two sentences or uses words like "better" or "improved," it is not specific enough. Push for concrete requirements.
  • No frontier recognition: If they do not list any potential AI limitations, they are over-confident. Prompt: "What part of this might be hard for AI?"
  • Skipping human role: If all subtasks say "AI builds," they are missing the point. Human provides domain knowledge, AI provides technical execution.

Pair Review (8 min)

Pair up and review each other's decompositions. Use these review questions:

  • Clarity Check: Is the problem statement clear? Could someone unfamiliar with the unit understand it?
  • Scope Check: Is any subtask too big? Should it be broken down further?
  • Frontier Check: Is there a frontier risk the builder missed? (Examples: authentication, data security, integration with legacy systems)
  • MVP Check: Is the simplest useful version actually simple? Could it be built in 1-2 hours?

Facilitation Tips for Pair Review

Time allocation: 4 minutes per person (4 min for Partner A's review, 4 min for Partner B's review). Use a visible timer.

Intervention signals: If you hear a pair stuck on clarifying the problem for more than 2 minutes, intervene. Ask the builder: "What does success look like for this tool? Describe the end state."

Debrief (2 min)

Ask the class: "Who changed their decomposition based on peer feedback?" (Show of hands.) "What did your partner catch that you missed?"

Take 1-2 examples. Emphasize: Decomposition is a skill. You will get better at it with practice. The first time you decompose, you will miss things. That is normal.

Success Indicator

By the end of this module, every student should have a one-page decomposition of a real problem they will attempt to build before the next course. If a student's worksheet is blank or vague, they are not ready for the assignment. Pull them aside during the break and help them scope appropriately.

Module 5: Wrap-Up & Assignment

Duration: 10 minutes

Assignment Before Platform Training

Your Task Before Next Session

  1. Build your decomposed solution using AI. Use the problem you decomposed in Module 4. Spend 1-3 hours on it.
  2. Document what worked and what did not. Keep notes on: prompts that worked well, errors you encountered, features you had to cut.
  3. Identify at least one failure case to share. We will debrief these at the start of Platform Training.
  4. Bring a working (or partially working) prototype. It does not need to be complete. Functional basics are enough.

Deadline: Before Platform Training (Course 3). Estimated time: 2-4 hours over the next week.

How You Will Be Assessed

Your prototype will be evaluated using this rubric during Platform Training:

| Criteria                    | Exceeds                                                                                     | Meets                                                       | Developing                                                      |
|-----------------------------|---------------------------------------------------------------------------------------------|-------------------------------------------------------------|-----------------------------------------------------------------|
| Functional Prototype        | Core feature works and at least one refinement implemented (e.g., search, filtering, validation) | Core feature works as intended, even if rough around the edges | Attempted build but core feature does not work yet              |
| Task Decomposition Applied  | Clear decomposition visible in build structure; can articulate how each subtask was approached | Evidence of decomposition; built in logical phases           | Build appears ad-hoc; no clear decomposition strategy           |
| Iterative Refinement Shown  | Multiple iterations visible; can explain what changed between versions and why               | At least one iteration beyond initial build                  | Single attempt only; no refinement                              |
| Can Explain Build Decisions | Articulates reasoning for all major design choices; can identify tradeoffs made              | Can explain why key features were included or excluded       | Cannot explain design rationale; unclear on decision-making process |

Grading Guidance

This is not a pass/fail assessment. The goal is to identify who needs additional support before Platform Training. Use this rubric during the Platform Training intake:

  • 4 "Exceeds": Student is ready for advanced challenges. Consider them for peer mentoring roles.
  • 3-4 "Meets": Student is on track. Standard Platform Training path.
  • 2+ "Developing": Student needs 1-on-1 coaching before Platform Training. Schedule 30-min session to troubleshoot blockers.
  • Did not attempt: Student may not be ready for builder track. Discuss motivation and barriers.

Path Forward

Today: Builder Orientation (2 hr) → This Week: Build your prototype (2-4 hr) → Next: Platform Training (4 hr) → Then: Advanced Workshop (4 hr) → Ongoing: Builder Community & Contribution

Support Resources

If you get stuck while building your prototype:

  • Office Hours: Wednesdays 1500-1600, Building 5, Room 201
  • Email: Send build questions to ai-builders@1stbn99thmar.mil
  • Teams Channel: #builder-orientation in the 1st Bn, 99th Marines AI Fluency workspace
  • Documentation: EDD Toolkit at \\battalion-share\AI-Fluency\EDD-Toolkit

Final Instructor Remarks

Closing the Session

Key messages to reinforce:

  • "You are not expected to be experts after 2 hours. You are expected to try."
  • "Every builder hits errors. The difference is that builders debug instead of quit."
  • "Decomposition is the skill that makes everything else work. If you master nothing else, master that."
  • "This is a cohort. Help each other. Share failures openly. That is how we all get better."

Logistics reminder: Confirm date/time of Platform Training. Remind students to bring their prototype (on laptop, thumb drive, or screenshots if system access is limited).