It is common to assume online education is guided by an abundance of data. In reality, the opposite is often true. Platforms like Canvas provide only surface-level analytics, and when you’re managing programs with thousands of learners, knowing how many times a page was visited isn’t enough to explain learner behavior.
How do you get useful data when there isn’t any? I often found myself in situations where existing analytics weren’t enough. Here are four examples where creating my own data improved outcomes and, in some cases, led to decisions I never expected.
---
##### 1. Find ways to create your own data
**⚠️ The problem:** Canvas doesn’t track whether learners visit external resources in the course. I wanted to understand if these resources were being used, but no reliable data existed. Assignments rarely referenced optional resources, which hinted at low usage, but we wanted concrete data to be sure.
💡 **What I did:** Partnering with the tech team, we built a server that logged clicks before routing learners to external links. The data confirmed what we suspected: hardly anyone was clicking on optional resources, and many students weren't visiting even the required ones.
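For the curious, here’s a minimal sketch of the kind of click-logging redirect service we built. Flask, the `/go/` route, and the CSV log are illustrative assumptions, not our production setup:

```python
# Minimal sketch of a click-logging redirect service.
# Flask, the route name, and the log fields are illustrative assumptions.
import csv
import time

from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Map short IDs to the external resources placed in the course.
# Restricting redirects to a known list avoids an open-redirect hole.
RESOURCES = {
    "optional-reading": "https://example.com/optional-reading",  # hypothetical URL
    "required-dataset": "https://example.com/required-dataset",  # hypothetical URL
}

@app.route("/go/<resource_id>")
def track_and_redirect(resource_id):
    target = RESOURCES.get(resource_id)
    if target is None:
        abort(404)
    # Append one row per click; a real deployment would use a database.
    with open("clicks.csv", "a", newline="") as f:
        csv.writer(f).writerow([time.time(), resource_id, request.referrer or ""])
    return redirect(target, code=302)

if __name__ == "__main__":
    app.run()
```

Course pages then link to `/go/<resource_id>` instead of the raw URL, so every visit leaves a countable row.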
To address this, I redesigned these resources using **Articulate Storyline**, transforming static links into [interactive flipcards](https://d36d0xoi3o4xl7.cloudfront.net/Interactives/LE/CE/CELH/CE0191_IntroToCELH_Flipcards/story.html). Each card posed a question or challenge, requiring learners to click to reveal supporting material.
- **More on this project:** [[Small Changes Can Make Learning More Interactive#3. Made interactive references|Making references interactive]]
- **See also:** [[Small Changes Can Make Learning More Interactive#1. Developed AI-powered course assistants|Using AI logs to improve courses]]
🎯 **The result:** In the following semester, click-through rates rose, and instructors noticed learners integrating these references more often into assignments. A small design change, powered by self-created data, turned passive materials into active engagement points.
---
##### 2. Delay until you have data
**⚠️ The problem:** One of the hardest challenges in building new courses is that you often don’t know who your learners are. I keep learners at the center of my designs, but at the beginning you’re working without the very information that matters most: their backgrounds, their prior knowledge, and their interests.
💡 **What I did:** I approach these situations by separating course content into two categories: what can remain as-is (the concepts, skills, and outcomes) and what can be personalized (the examples, applications, and projects).
- I work with instructors and subject matter experts to help them see that **delays are OK**—not everything needs to be locked in at the start.
- Wherever possible, I **delay finalizing personalizable content** until data about our learners becomes available.
- If delays aren’t possible, I design with the most likely audience in mind, based on enrollment patterns for comparable programs.
- To maximize flexibility, I build **banks of examples** and design activities where learners can link the material to their personal interests, backgrounds, or professional contexts.
This ensures that courses launch with strong core content but can still be adapted quickly once we understand more about our learners.
🎯 **The result:** Learners connect more deeply with the material because it is personalized to them. By inviting learners to bring in their own contexts, I create a richer learning environment where examples are never “one size fits all.” Instructors also become better equipped to engage a global audience, and this personalization strengthens learner-instructor relationships.
---
##### 3. Get data directly from your learners
**⚠️ The problem:** In one of my large-enrollment programs, sign-ups for certain courses were strong but completion rates were lagging. Standard analytics from Canvas and external tools couldn’t tell us whether the problem was content difficulty, tool fatigue, or something else entirely.
💡 **What I did:** To pinpoint the issue, I created my own dropout survey to supplement the university’s. Learner responses revealed three themes:
1. Perusall, a textbook discussion tool, frustrated learners with its outdated interface. Its grading logic frustrated instructors too, because it couldn’t differentiate between posts and comments.
2. Too many tools across courses created cognitive overload: learners had to pick up a new tool every few classes.
3. Several tools required learners to leave Canvas, adding friction.
With this data, I researched alternatives and piloted **FeedbackFruits**, which integrated fully into Canvas and replaced multiple tools (Perusall, Panopto, VoiceThread, PlayPosit, Harmonize) with a single, modern interface. Even though FeedbackFruits was expensive, its capabilities allowed us to retire several other tools, which made the replacement feasible.
🎯 **The result:** Dropout rates decreased and learner satisfaction increased because learners no longer had to juggle multiple tools and could focus on the subject itself. Instructors were less frustrated, since students could no longer game the system, and grading became more transparent.
---
##### 4. Get data from previous learners
**⚠️ The problem:** Feedback is usually gathered right after learners complete a course. While useful, this feedback is limited—it tells you how learners felt in the moment, not what actually mattered later. To improve long-term program design for another large-enrollment program, I wanted to understand what learners valued most _after the course was behind them_.
💡 **What I did:** I reached out to alumni to understand what had proven most valuable in the long run. I learned that while the knowledge gained in our program mattered, earning external certificates was what made graduates stand out in the job market.
> Certificates were valued because they highlighted expertise in a precise area. A degree might also cover that same knowledge but the certificate made it more _visible_ on a resume, giving graduates an edge in demonstrating targeted skills to employers. This was something LMS data could never have revealed.
Acting on this, I analyzed which certificates added the most value and built partnerships to integrate them into our courses. I negotiated discounts for learners and collaborated on mini-courses offered both to our learners and to those on partner platforms.
> [!Tip] An unexpected twist
> Many times, the content of these external courses overlapped with our content. Instead of treating this as duplication, we were able to partner to offer our learners support in **multiple languages**—a need we hadn’t anticipated at the start.
🎯 **The result:** We directly increased our learners’ employability while building partnerships and expanding the reach of our programs through multilingual support. Alumni feedback helped us redesign our approach to career relevance.