It is common to assume that online education is guided by an abundance of data. But platforms like Canvas provide only surface-level analytics, and when you’re managing programs with thousands of learners, knowing how many times a page was visited does little to explain learner behavior.
How do you get useful data when there isn’t any? I often found myself in situations where the existing analytics weren’t enough. Here are four examples where creating my own data improved outcomes and, in some cases, led to decisions I never expected.
---
##### 1. Find ways to create your own data
**⚠️ The problem:** Canvas doesn’t track whether learners visit external resources in a course. I wanted to understand whether these resources were being used, but no reliable data existed. Assignments rarely referenced the optional resources, which hinted at low usage, but we wanted concrete data to be sure.
💡 **What I did:** After identifying this problem, I partnered with the tech team to build a server that logged clicks before routing learners to external links. The data confirmed what we suspected: hardly anyone was clicking optional resources, and many weren't even visiting the required ones.
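To make the idea concrete, here is a minimal sketch of what such a click-logging redirect service can look like, assuming Python and Flask. The route name, allowed hosts, and log format here are all hypothetical; the real service was built by the tech team and logged to their own infrastructure.

```python
# Hypothetical sketch of a click-logging redirect service.
# Route name, allow-list, and log format are illustrative only.
from datetime import datetime, timezone
from urllib.parse import urlparse

from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Hosts we are willing to forward to, so the endpoint
# can't be abused as an open redirect.
ALLOWED_HOSTS = {"publisher.example.com", "library.example.org"}

@app.route("/go")
def log_and_redirect():
    target = request.args.get("url", "")
    if urlparse(target).netloc not in ALLOWED_HOSTS:
        abort(400)
    # One line per click; a production service would write to a database.
    with open("clicks.log", "a") as log:
        log.write(f"{datetime.now(timezone.utc).isoformat()}\t{target}\n")
    return redirect(target, code=302)

if __name__ == "__main__":
    app.run(port=8000)
```

Course links then point at `/go?url=<resource>` instead of the resource itself, and tallying the log by URL yields per-resource click counts.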
To address this, I redesigned these resources using **Articulate Storyline**, transforming static links into interactive flipcards. Each card posed a question or challenge, requiring learners to click to reveal supporting material.
- **More on this project:** [[Small Changes Can Make Learning More Interactive#3. Made interactive references|Making references interactive]]
- **See also:** [[Small Changes Can Make Learning More Interactive#1. Developed AI-powered course assistants|Using AI logs to improve courses]]
🎯 **The result:** In the following semester, click-through rates rose, and instructors noticed learners referencing optional resources more often. A small design change, powered by self-created data, turned passive materials into active engagement points.
---
##### 2. Delay until you have data
**⚠️ The problem:** When building a new course, one of the hardest challenges is that you often don’t know who your learners are. I design with learners at the center, but at the start I have to work without the very information that matters most.
💡 **What I did:** I approach these situations by separating course content into two categories: what can remain as-is (the concepts, skills, and outcomes) and what can be personalized (the examples, applications, and projects).
- I work with instructors and subject matter experts to help them see that **delays are OK**—not everything needs to be locked in at the start.
- Wherever possible, I **delay finalizing personalizable content** until data about our learners becomes available.
- If delays aren’t possible, I design with the most likely audience in mind, based on enrollment patterns for comparable programs.
- To maximize flexibility, I build **banks of examples** and design activities that learners can link to their personal interests, backgrounds, or professional contexts.
This ensures that courses launch with strong core content but can still be adapted quickly once we know more about our learners.
🎯 **The result:** Learners connect more deeply with the course because the richer learning environment means examples are never “one size fits all.” Instructors are also better equipped to engage a global audience, and this personalization strengthens learner-instructor relationships.
---
##### 3. Get data directly from your learners
**⚠️ The problem:** In one of my large-enrollment programs, sign-ups for some courses were strong but completion rates were lagging. Standard analytics from Canvas and external tools couldn’t tell us whether the problem was content difficulty, tool fatigue, or something else entirely.
💡 **What I did:** To pinpoint the issue, I created my own dropout survey to supplement the university’s standard one. Learner responses revealed three themes:
1. The textbook discussion tool we used frustrated learners with its outdated interface; it frustrated instructors too, since it couldn’t differentiate between posts and comments.
2. We used too many different tools across courses, creating cognitive overload: learners were expected to master a new tool every few classes.
3. Many of these tools took learners out of Canvas, adding friction.
With this data, I researched alternatives and piloted a replacement that integrated fully into Canvas and consolidated several tools into a single, modern interface. The new tool was more expensive, but because it let us retire the others, the switch was financially feasible.
🎯 **The result:** Dropout rates decreased and learner satisfaction increased because learners could focus on the subject instead of on learning how to use tools. Instructors became less frustrated, and grading became more transparent.
---
##### 4. Get data from previous learners
**⚠️ The problem:** Feedback is usually gathered right after learners complete a course. While useful, this feedback is limited—it tells you how learners felt in the moment, not what actually mattered later. To improve long-term program design for another large-enrollment program, I wanted to understand what learners valued most _after the course was behind them_.
💡 **What I did:** I reached out to alumni to learn what they had found truly valuable in the long run. It turned out that learners who earned external certificates stood out in the job market.
> Certificates were valued because they highlighted expertise in a precise area. A degree might cover the same knowledge, but the certificate made it more _visible_ on a resume, giving graduates an edge in demonstrating targeted skills to employers. This was something LMS data could never have revealed.
Acting on this, I identified which certificates added the most value and built partnerships to integrate them into our courses. I negotiated discounts for learners and collaborated with these partners to design mini-courses, offered both to our learners and to learners on the partners’ platforms.
> [!Tip] An unexpected twist
> Many times, the content of these external courses overlapped with our content. Instead of treating this as duplication, we were able to partner to offer our learners support in **multiple languages**—a need we hadn’t anticipated at the start.
🎯 **The result:** This work directly increased the employability of our learners while building partnerships and expanding the reach of our programs through multilingual support. Feedback from alumni helped us redesign our approach to career relevance.