At ASU, I inherited a small but costly operational problem inside Canvas: pending invitations broke SpeedGrader for large online courses, and the team had been manually fixing them for more than five years. The fix was straightforward, but at scale it turned into a recurring burden that consumed roughly **250 to 300 hours per year**.
I built a lightweight browser-based automation that reproduced the exact manual steps a designer would take to clear those pending tags. It turned a repetitive support task into a faster, more reliable workflow without requiring changes from Canvas, the university, instructors, or students.
This approach led to:
- **Hundreds of hours saved annually** on a recurring deployment task
- **Fewer grading disruptions** for instructors and TAs using SpeedGrader
- **More team time available for design work**
- **A practical fix** that worked within Canvas and university constraints
### The Problem
MLFC’s course model means that large numbers of students are supported by multiple lead instructors and co-instructors in a single Canvas course. To make this structure work in Canvas, designers assigned students to manually created groups and sections tied to specific instructors, inadvertently creating a failure mode.
The university enrolled students in the courses automatically, but invitations to the manually created sections needed to be accepted. Until they were, a student's enrollment showed a blue `pending` tag in Canvas, which didn't affect the student experience but disrupted TAs' and instructors' view in SpeedGrader. In courses with thousands of students, that introduced grading delays at exactly the point when instructors and TAs needed the workflow to be most reliable.
The university's form-based process for resolving pending tags was too slow and too rigid for courses whose groups changed during delivery, and using it would have forced our team to depend on that process hundreds of times each year.
Our workaround required designers to:
- go through each course with groups or sections
- find each student with a `pending` tag
- open the section editor
- remove the student from the instructor section
- re-add them to the same section
That cleared the pending tag, but it took around **45 to 60 seconds per student**, adding up to hundreds of hours each year. The team had been relying on this workaround for more than five years, and the issue never became important enough at the institutional or vendor level to justify dedicated product or engineering attention.
### Constraints
- The solution had to work within Canvas and the university’s existing constraints
- The solution had to reduce manual work performed inside Canvas
- The fix could not depend on student action
- Student data had to be handled carefully and stay protected
### Design Decisions
#### 1. Automate the exact human workflow instead of redesigning the system
The task was repetitive, not conceptually difficult. A designer was effectively clicking through the same sequence for each affected student, which made it a good candidate for browser automation.
I inspected the relevant Canvas page, reviewed the underlying JavaScript, and used AI to build a script that reproduced the same steps a designer would take:
- identify students with pending tags
- remove them from instructor sections
- re-add them to those same sections
This was not a new integration or product. It was a focused operational tool that eliminated a recurring manual course-launch task.
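The production script isn't reproduced here, but as a minimal sketch, the three steps could look like the following. It assumes the Canvas REST Enrollments API (the endpoint paths are Canvas's public API routes) and an authenticated browser session; the helper names are illustrative, and using API calls rather than driving the page's own UI is an assumption about implementation detail:

```javascript
// Step 1: find enrollments still in the "invited" (pending) state.
function pendingEnrollments(enrollments) {
  return enrollments.filter((e) => e.enrollment_state === "invited");
}

// Steps 2 and 3: remove the enrollment, then re-add the student to the
// same section, which clears the pending tag. When pasted into the
// browser console, fetch reuses the Canvas session cookie (a real run
// would also need to pass Canvas's CSRF token header on writes).
async function cycleEnrollment(courseId, sectionId, enrollment) {
  await fetch(
    `/api/v1/courses/${courseId}/enrollments/${enrollment.id}?task=delete`,
    { method: "DELETE" }
  );
  await fetch(`/api/v1/sections/${sectionId}/enrollments`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      enrollment: {
        user_id: enrollment.user_id,
        type: "StudentEnrollment",
        enrollment_state: "active",
      },
    }),
  });
}
```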
#### 2. Build around the naming convention already in use
The script depended on an existing section naming pattern:
- course sections began with numbers
- instructor-specific sections began with letters
This pattern allowed the automation to distinguish which sections to temporarily remove and restore. Instead of trying to generalize across every possible Canvas structure, I optimized for the standard workflow the team actually used.
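Given that convention, the check reduces to a test on a section name's first character. A sketch (the function name is illustrative, not from the original script):

```javascript
// Instructor-specific sections begin with a letter; regular course
// sections begin with a number, per the team's naming convention.
function isInstructorSection(name) {
  return /^[A-Za-z]/.test(name.trim());
}
```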
#### 3. Use AI as a development aid without exposing student data
I used AI to help speed up development, but only within strict boundaries. Any student-specific information was redacted before sharing code or page structure, and the script was designed to run locally in the browser without sending data anywhere else. That meant no student data was exposed to AI tools, and the automation did only the exact work needed inside Canvas.
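To illustrate the kind of redaction involved, a sketch of a helper that strips identifying details from a page snippet before it is shared; the specific patterns and placeholders are assumptions for illustration, not the team's actual process:

```javascript
// Replace identifying details with placeholders before sharing a
// snippet of page markup with an AI tool. Patterns shown are examples:
// email addresses and Canvas-style numeric user ids.
function redactStudentData(text) {
  return text
    .replace(/\b[\w.+-]+@[\w-]+\.[\w.]+\b/g, "[email]")
    .replace(/\buser_(\d+)\b/g, "user_[id]");
}
```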
#### 4. Keep the tool lightweight and easy to run
Rather than having the team install a separate application, I made the solution usable directly from the browser console on the Canvas `People` page. Designers could open a course roster, paste the script, and let it work through the list while keeping the page open. That decision kept adoption friction low. The team did not need training, setup, or a new system to trust. They could use the tool inside the workflow they already knew, which made it practical during the busiest part of course launch.
The result was a focused operational improvement: a recurring course-launch task that had consumed hundreds of hours each year became a lightweight self-serve workflow, reducing grading disruption and freeing the team to spend more time on real design work.