Universities optimize online courses for speed and consistency. Instructional designers work within fixed templates to scale development across hundreds of courses. The result is efficiency—but also predictability: static slides, long readings, recorded lectures, and reference lists learners rarely touch.
With small, scalable interventions, I redesigned courses to feel more interactive, personal, and motivating without abandoning standardized structures. This approach led to:
- **Higher learner engagement**, reflected in increased interaction with content and references
- **Clearer learner navigation**, reducing confusion and frustration
- **Widespread instructor adoption**, including over 200 instructors using AI-assisted feedback tools
- **A 5% reduction in drop-out rates** across redesigned courses
### The Problem
Template-driven courses scale well, but interaction is often reduced to static prompts or optional add-ons. When interaction is superficial, learners disengage. Instructors, meanwhile, lack the time to provide individualized support at scale.
**How do you introduce meaningful interaction without slowing development, increasing instructor workload, or breaking standard course templates?**
### Constraints
- Courses had to be built quickly and consistently
- Core templates could not be replaced
- Instructor time was limited, especially in large-enrollment courses
- Any solution needed to scale across hundreds of sections
These constraints ruled out bespoke design, synchronous support, or instructor-heavy interventions. Rather than adding complexity, I focused on embedding interaction directly into existing structures and making engagement immediate, contextual, and low-friction.
### Design Decisions
#### 1. Providing immediate support through AI-powered course assistants
Learners frequently had time-sensitive questions about readings, assignments, or interpretation. Instructors could not respond fast enough at scale.
I developed AI course assistants trained on course content and logistics. Learners could ask questions at any time and receive contextual responses, including optional reading suggestions. Unanswered questions were logged, revealing friction points in the course design.
The same system supported instructors. By aligning the assistant with assignment rubrics, instructors could generate structured, rubric-based feedback, review it, and deliver it in their own voice. This shifted instructor time away from repetitive clarification and toward higher-value conversations.
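The rubric-alignment step can be sketched in miniature. This is a hypothetical illustration, not the production system: it assumes a simple rubric data structure and shows how rubric rows might be assembled into a structured prompt for an LLM-backed assistant, with the actual model call elided. All names (`Criterion`, `build_feedback_prompt`) are invented for the example.

```python
# Hypothetical sketch of rubric-based feedback prompting.
# The real assistant's implementation details are not shown here;
# this only illustrates keying generated feedback to rubric rows
# so instructors can review it and deliver it in their own voice.

from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    description: str
    max_points: int

def build_feedback_prompt(criteria: list[Criterion], submission: str) -> str:
    """Assemble a prompt asking for one feedback note per rubric criterion."""
    lines = [
        "Draft feedback for the submission below, one note per rubric criterion.",
        "Rubric:",
    ]
    for c in criteria:
        lines.append(f"- {c.name} ({c.max_points} pts): {c.description}")
    lines.append("Submission:")
    lines.append(submission)
    # In a real system, this prompt would be sent to an LLM and the draft
    # returned to the instructor for review before delivery.
    return "\n".join(lines)

rubric = [
    Criterion("Thesis", "States a clear, arguable claim", 10),
    Criterion("Evidence", "Supports the claim with course readings", 15),
]
prompt = build_feedback_prompt(rubric, "Draft essay text...")
```

Keeping the rubric as structured data, rather than free text, is what makes the generated feedback reviewable line by line against the same criteria instructors already grade with.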
#### 2. Embedding interaction into presentations and videos
Learners frequently multitask, letting static slides and recorded lectures run in the background.
I replaced traditional slide decks with interactive Storyline presentations that allowed learners to manipulate variables and see outcomes change in real time. Instructors recorded lectures within these presentations, modeling the learning process rather than narrating static content.
To reduce passive viewing, I embedded low-stakes questions directly into videos using Panopto and PlayPosit, making interaction part of the content itself.
#### 3. Reframing references as discovery tools
Optional resources are often ignored because they feel disconnected and overwhelming.
I redesigned references as interactive flipcards. Instead of scrolling through citations, learners encountered prompts framed as questions, turning references from a checklist into an invitation to explore.
Learners engaged with supporting material more frequently, and discussion responses increasingly drew on ideas beyond the required readings.
#### 4. Making progress visible to sustain motivation
In long, self-paced courses, learners lose momentum when progress feels invisible.
I added lightweight progress indicators within lessons, giving learners a clear sense of completion and control. This required no instructor involvement and fit cleanly within existing templates.
Learners reported less frustration and were more likely to complete lessons rather than abandoning them midway.
None of these changes was large on its own, but together they altered how learners experienced the course. Interaction did not come from new platforms or additional instructor presence. It emerged from **small, embedded design decisions** that respected scale, templates, and time constraints.