Last updated: March 26, 2026
MSBAi: Design Principles & Constraints
Purpose: This file defines the program design philosophy, guiding principles, and constraints for the MSBAi online degree. For specific course details, credit allocations, semester timelines, and instructor assignments, see program/curriculum.md.
Core Guiding Principles
1. AI-First, Not AI-Optional
- Every MSBAi course integrates practical AI tool usage
- AI is treated as a thinking partner and productivity accelerator — not a shortcut that bypasses learning
- Students graduate with hands-on LLM experience + prompt engineering skills + the judgment to know when AI helps vs. hinders
- “AI tools” (generic term for ChatGPT, Claude, Copilot, open-source models, etc.) are a standard part of the analytical workflow
- AI literacy → competency → expertise progression across all courses
- Cognitive friction by design: Students formulate their own hypotheses, analyze problems, or generate arguments independently before consulting AI tools. AI extends thinking; it doesn’t replace the productive struggle that builds understanding. Neuroscience grounds this choice: dopamine neurons respond to prediction errors (the gap between expected and actual outcomes), not to rewards themselves — when outcomes are fully predicted, the brain’s teaching signal goes flat (Schultz, Dayan & Montague, 1997). Removing struggle removes the neurological mechanism by which learning becomes meaningful. (Vendrell & Johnston, 2026; Furze, 2026; Machulla, 2026)
2. Critical Engagement with AI, Not Passive Consumption
- AI outputs are treated as provisional — students must evaluate, critique, and revise them, not accept them at face value
- Pre-AI → AI-mediated → Post-AI sequencing: Each week’s activities include phases where students work without AI (build independent reasoning), with AI (extend and challenge their thinking), and after AI (reflect on what AI added, missed, or got wrong). This is the “scenic route” — the destination is the same, but the brain has time to build predictions, invest attention, and experience the prediction error that makes the resolution register (Machulla, 2026)
- Epistemic integrity: Students learn to ask “whose knowledge is represented?” and “what perspectives are missing?” when working with AI-generated content
- Metacognitive regulation: Students maintain awareness of their own thinking processes through prompt logs, reflective journals, and structured self-assessment — preventing the “metacognitive laziness” that unstructured AI use can create
- Assessment rewards reasoning, not fluency: Rubrics explicitly value the quality of students’ analytical reasoning over the polish of AI-assisted outputs. Oral defense is the primary mechanism for this.
- Grounded in: Vendrell & Johnston (2026) — 8 design principles for scaffolding critical thinking with generative AI; Furze (2026) — 5 principles for rethinking assessment; Machulla (2026) — neuroscience of anticipation, effort, and the dopamine prediction error
3. Projects Are the Primary Learning Vehicle
- Assessment model: One major team project per 8-week course, scaffolded by weekly assignments (cases, labs, discussions, exercises) that build toward the project. Project milestones threaded throughout all 8 weeks, ramping up toward final deliverable + oral defense. 4-week courses: one individual project. This structure exploits the IKEA effect: people value what they build through their own effort, but only when labor leads to completion — build and disassemble produces no effect (Norton, Mochon & Ariely, 2012). Every course ends with a finished, defensible artifact.
- Live project sessions: Separate, branded tutorial sessions dedicated to:
- Hands-on project work (guided problem-solving)
- Tool showcasing (Power BI, SQL, AWS, etc.)
- AI-augmented analysis demos (how AI accelerates work)
- Q&A and debugging support
- Key differentiator from competitors
- Portfolio-driven: All projects contribute to GitHub portfolio
- Real data: Projects use public datasets, case studies, or simulated business problems
- Authentic assessment: No traditional exams; grading based on project quality
4. VS Code + Copilot + Colab as AI-First Development Environment
- Primary IDE: VS Code with GitHub Copilot (free for students via GitHub Education — full Copilot Pro for 1 year)
- Notebook workflow preserved: Google Colab extension for VS Code connects .ipynb notebooks to Colab cloud runtimes. Students who prefer browser-based Colab can use it directly — both workflows produce the same notebook files.
- Low floor, high ceiling:
- Floor: Google Colab in browser (zero install, Week 1 ready)
- Step up: VS Code + Colab extension (real IDE, Git integration, debugging)
- Accelerator: Copilot inline completions (ghost text while coding)
- Power user: Copilot Chat + Agent Mode (autonomous multi-step coding)
- Ceiling: Custom agent skills, MCP integrations, agentic workflows
- AI companion tools: Google Gemini Pro (free for students for 1 year via Google AI for Students — includes Deep Research, Workspace AI, NotebookLM, 2 TB storage)
- Career asset: Students graduate fluent in AI-augmented development — not just AI-assisted analysis. Portfolio of Jupyter notebooks + GitHub repos demonstrates both technical skill and AI tool proficiency.
- Python primary: Supplementary tools allowed (Power BI, AWS, R, etc.) but Python is the common language across all courses
- GitHub integration: All project repos hosted on GitHub (version control habit-building)
- See program/tools.md for the complete standard tools reference
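Because the same notebook files must run both on a Colab cloud runtime and in a local VS Code kernel, a common preamble idiom is to detect the runtime and adjust paths accordingly. A minimal sketch (the function name and data paths are illustrative, not part of the standard tools reference):

```python
# Hypothetical notebook preamble: detect whether this notebook is executing
# on a Colab cloud runtime or a local VS Code kernel, so one .ipynb file
# works in both workflows.
import sys

def running_in_colab() -> bool:
    """True when the notebook is executing on a Google Colab runtime."""
    return "google.colab" in sys.modules

# Choose a data location appropriate to the runtime (paths are illustrative).
DATA_DIR = "/content/data" if running_in_colab() else "./data"
print(f"Runtime: {'Colab' if running_in_colab() else 'local'}, data in {DATA_DIR}")
```

This keeps the "low floor, high ceiling" promise concrete: the same notebook runs unchanged whether a student opens it in the browser or in VS Code.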
5. Three-Layer Content Model (Content Delivery Standard)
- Layer 1: Conceptual videos — Traditional recorded lectures explaining concepts, frameworks, and theory. Hosted on Canvas/Mediaspace. Target: 60-90 min of recorded content per week (benchmark from iMBA Teaching & Learning). Courses with strong textbook coverage (e.g., BDI 513) may substitute overview/transition videos for full lectures.
- Layer 2: Code content — Jupyter books, notebooks, and screencasts for hands-on technical material. Delivered via Colab or VS Code. Not all code content needs to be video — interactive notebooks with explanations can substitute where appropriate.
- Layer 3: Studio sessions — Weekly 60-min live sessions for hands-on project work, tool demos, and Q&A. All studios are recorded for async access.
- YouTube and external content OK — Fewer copyright constraints than Coursera. YouTube videos can be embedded directly in Canvas pages. Tool setup tutorials, DataCamp courses, and third-party content count toward weekly content budget.
- Canvas is the homepage — All courses start on Canvas, directing students to other tools (Colab, GitHub, Power BI) as needed.
- Recording approach: Faculty work with Teaching & Learning team (Cheng Li / Eric French). Two approaches available: PowerPoint-based or EDL documents. Batch recording recommended (plan 3-4 weeks, then record 3-4 weeks).
6. Synchronous Live Sessions for High Engagement (Differentiator)
- Studio Sessions (weekly, 60 min): Hands-on project work (not lectures)
- Real-time demonstration of tools + AI-assisted workflows
- Student Q&A and live debugging
- Showcase best practices + common pitfalls
- Analytics Conversations (bi-weekly, 60 min): Case discussions, guest speakers, current events
- Bring in practitioners, guest speakers, industry insights
- Student-led discussions (builds communication skills)
- Office hours: 1-on-1 or small group mentoring
- Brand these distinctly: “Studio Sessions” (project-focused), “Analytics Conversations” (discussions), “Office Hours” (support)
- Async-first, sync-supplementary: All sessions recorded; attendance optional for flexibility
7. Modular, Flexible Design
- 8-week course format: Every course is self-contained within 8 weeks, enabling flexible scheduling
- No single “right path”: Students choose electives based on career goals
- Future stackable pathways: Certificate options (e.g., exit ramps and re-enrollment) are under development for future cohorts
- Coherent progression: Core → elective → capstone sequence has clear learning objectives and career outcomes
8. Affordability & Accessibility
- Target: 20% cheaper than peer programs (UT Austin, Penn State, Michigan)
- Flexible scheduling: 8-week courses allow working professionals to take 1-2 courses while employed
- Global accessibility: No time zone requirement; all async content available 24/7
- Career pivoters: Primary audience is professionals (age 25-40) transitioning from technical/STEM or non-analytics backgrounds into analytics roles. See program/target_profile.md
9. Applied Experiential Learning
- Real capstones: Research Park partnerships + corporate consulting projects (Fall 2027)
- Alternative capstone: Independent research with real business impact
- Portfolio + Project: Capstone split into portfolio curation (weeks 1-4) + new project (weeks 5-8)
- Public artifact: Final portfolio published on personal GitHub/website (career visibility)
10. Coherent Course Naming & Rubrics
- Naming convention: Use official university catalog names across all materials. Proposed MSBAi-friendly names are documented in program/course_names.md for future formal rename requests.
- Consistent learning outcome structure: All courses use L-C-E (Literacy → Competency → Expertise)
- Clear prerequisites: First core courses have no prerequisites; electives assume core completion
Course details: See program/curriculum.md
11. Version Control & Reproducibility
- GitHub as default: All student code lives on GitHub
- Best practice teaching: Students learn version control as part of course workflow
- Public portfolios: Final projects have public GitHub repos (career visibility)
- Reproducibility: README files, requirements.txt, documentation standards enforced
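The reproducibility standards above (README files, requirements.txt) lend themselves to automated checking before submission. A minimal sketch of such a check — the function name and the idea of running it as a pre-submission script are assumptions, not a prescribed program tool:

```python
# Hypothetical pre-submission check: verify a project repo carries the
# reproducibility artifacts the program requires before it is pushed.
from pathlib import Path

# File names taken from the documentation standards in this section.
REQUIRED_FILES = ["README.md", "requirements.txt"]

def missing_artifacts(repo_dir: str) -> list[str]:
    """Return the required files absent from repo_dir (empty list = compliant)."""
    repo = Path(repo_dir)
    return [name for name in REQUIRED_FILES if not (repo / name).is_file()]
```

A check like this could run locally or as a lightweight CI step on each GitHub repo, reinforcing the version-control habit the section describes.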
12. Alignment with Gies Campus AI Framework
- Four campus tracks: MSBAi courses contribute to official Campus AI curriculum
- AI Basics/Fundamentals ← Core courses
- AI & ML Technologies ← Electives
- Agentic Systems & Workflows ← Advanced electives + capstone
- Human-Centric AI ← Ethics woven throughout
- Credential pathway: MSBAi students can earn Campus AI certificates alongside MSBAi degree
13. Career Transition by Design
- MSBAi Cohort 1 is purpose-built for career pivoters entering analytics for the first time
- Every course builds toward a portfolio that demonstrates job-readiness to employers
- Pre-course bridge modules (Python, CLI, math, case method) ensure no prior analytics background is required
- Career services, networking, and capstone are designed for full-time job placement, not promotion-within-current-role
- Elective choices help students discover their analytics specialization (analyst, engineer, scientist)
- See program/target_profile.md for full student profile and admission criteria
Hard Constraints (Non-Negotiable)
Curricular Constraints
- Total program: 36 credits (fixed)
- 8-week format: All courses run 8 weeks (enables flexible, modular scheduling)
- Fall 2026 launch: First courses must be ready Aug 2026
- VS Code + Copilot + Colab primary: All courses use this as the AI-first development environment (see program/tools.md)
- Project-based assessment: No traditional final exams; all courses graded on projects + participation
- Statistics pre-requisites: Coursera Exploring and Producing Data for Business Decision Making + Inferential and Predictive Statistics for Business (both University of Illinois) required before FIN 550
- Team projects required: Every 8-week course must include at least one team project; 4-week courses are individual only
- Team size: 3 students per team (standard). Teams of 2 or 4 permitted in exceptional circumstances (odd cohort size, scheduling conflicts, etc.)
- Oral defense required: Every course includes an oral defense (20-30% of the grade for 8-week courses; 25-35% suggested for the capstone, with a hard minimum of 20%)
- AIAS levels per assignment: Every assessment component specifies its AI Assessment Scale level (0-4), adapted from Perkins et al. (2024). Level 0 = no AI; Level 4 = AI as subject of analysis. See individual course syllabi for per-assignment levels.
- Low-stakes iteration with peer review: At least one project per 8-week course must include a draft → peer feedback → revision cycle. Peer review is structured (rubric-based) and trained in the first studio session. See ASSESSMENT_STRATEGY.md for implementation details.
Course details: See program/curriculum.md
Technology Constraints
- IDE: VS Code + GitHub Copilot Pro (free for students, 1 year). Google Colab in browser as fallback.
- AI tools: Copilot (code), Gemini Pro (research/writing), Claude/ChatGPT (general). No single-vendor lock-in.
- Python version: 3.10+ (modern, stable)
- Notebook environment: Google Colab (free-tier GPU) via browser or VS Code extension
- Cloud infrastructure: AWS Free Tier + student credits (reimbursable, optional)
- LMS: Canvas (institutional standard)
- GitHub: Required for all coursework (public portfolios)
- Standard tools reference: See program/tools.md for complete setup guide
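The Python 3.10+ constraint can be enforced at the top of shared notebooks and scripts so version mismatches fail fast with a clear message. A sketch, assuming a fail-fast guard is desired (the function name is illustrative):

```python
# Illustrative guard for the program-wide Python 3.10+ constraint: raise a
# clear error instead of letting older interpreters fail obscurely later.
import sys

def check_python(min_version: tuple[int, int] = (3, 10)) -> bool:
    """Raise RuntimeError if the running interpreter is older than min_version."""
    if sys.version_info[:2] < min_version:
        raise RuntimeError(
            f"Python {min_version[0]}.{min_version[1]}+ required, "
            f"found {sys.version_info.major}.{sys.version_info.minor}"
        )
    return True
```

Calling `check_python()` in a notebook's first cell surfaces the constraint immediately, whether the student is on Colab or a local VS Code kernel.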
Capstone Constraints (Fall 2027)
- Split structure:
- Part 1 (Weeks 1-4): Polish 4 projects from prior courses + portfolio pitch
- Part 2 (Weeks 5-8): New applied project (faculty decides format: team/individual, client/independent)
- Portfolio component:
- Polish 4 projects (each drawn from a different course, for minimum diversity)
- Professional GitHub repos, READMEs, reflection narratives
- Portfolio pitch presentation
- Applied project component:
- Faculty determines format (client project, independent research, team or individual)
- Must demonstrate skills from at least 3 of 6 core competency areas
- Client NDAs honored; portfolio versions may be anonymized
- AI assist allowed (documented)
- Assessment: Faculty allocate weights within suggested ranges (no component >40%, oral defense min 20%)
Soft Constraints (Flexible)
- Synchronous session timing: Flexible, accommodate global time zones (record all)
- Capstone project type: Either real client OR independent, based on student preference
- Elective flexibility: Industry tracks can evolve based on student demand
- Tool choices: Supplementary tools flexible (R, Scala, different cloud platforms) as long as Python primary
Design Constraints from Context
From IBC Market Research
- Employer demand: AI/GenAI, applied learning, communication skills
- Student drivers: Flexibility, affordability, ROI, career outcomes
- Market gap: Cost, industry integration, emerging skills
- Differentiation needed: Not just “another MSBA” (MSBAi AI-first positioning)
From Undergrad AI Strategy
- L-C-E progression: All learning outcomes follow this model
- Campus AI alignment: Contribute to four campus tracks
- Responsible AI: Ethics + governance woven throughout
- AI literacy for all: Every student graduates AI-ready
From On-Campus MSBA Analysis
- Working professional focus: Adapted for asynchronous, 8-week format
- Hands-on labs: Preserved in cloud-native way
- Real data: Finance, business datasets maintained
- Applied assessment: Shifted from exams to weekly assignments + major project with oral defense