Usability Course Testing & Reflection

Introduction

For my usability testing process, I invited five participants (three students and two colleagues) to explore the Start Here/Overview section of my Canvas course and complete a short activity from Module 1. All participants were already familiar with Canvas, which allowed them to focus on course structure and clarity rather than on the platform itself. This approach aligns with Krug’s usability guidance, which emphasizes observing real users as they naturally navigate a design to uncover what works and what causes confusion (Krug, 2010, 2013). The insights they provided helped me understand how effectively my course supports learners from the moment they begin.

Navigation of the Start Here Section

One of the strongest results from the test was how easily participants were able to navigate the Start Here/Overview section. Every participant reported that this section was very easy to find and that the instructions were clear or mostly clear. They appreciated the navigation guide, the layout of the learning objectives, and the technology expectations. Their comments confirmed that the Start Here section is functioning as an accessible, welcoming entry point to the course.

Transition into Module 1 Activity

Participants had no difficulty moving from the Start Here section into Module 1. They completed activities such as Quizlet vocabulary practice, Savvas exercises, and a drag-and-drop scramble with ease. The smooth transition demonstrates effective course structure, aligning with Bates’s emphasis on organizing digital learning experiences in ways that reduce cognitive load and support clear, intuitive progression for students (Bates, 2019). Their feedback confirmed that the flow between modules helps learners stay oriented and confident.

Length of the Usability Test

The usability test took most participants between five and ten minutes to complete. Several finished the task in under five minutes, while one took a bit longer due to connection issues. Krug states that short, focused usability tests are not only efficient but often reveal the most actionable insights (Krug, 2010). This quick format worked well, allowing participants to genuinely engage with the course without feeling overwhelmed.

Criteria for the Activity

The Module 1 activity helped me evaluate whether participants could locate assignments, interpret instructions, and complete tasks with ease. It also allowed me to check how well the activities aligned with learning outcomes. Participants agreed that the instructions were clear and that the activity aligned well with module goals. One participant reported feeling briefly overwhelmed by visually dense content, which connects to Bates’s discussion of minimizing cognitive overload by presenting materials in organized, digestible sections (Bates, 2019). This feedback guided my decisions to improve spacing, pacing, and visual grouping within the module.

Reporting Methods

Participants provided feedback through surveys, informal conversations, and direct observation. This flexible approach reflects Schön’s concept of reflective practice, in which designers learn by interpreting authentic user experiences and adjusting their decisions accordingly (Schön, 1983). Their diverse reports helped me see not only where the design was strong but also where clarity or support needed to be improved.

Participants and Alternate Plans

The mix of student and teacher participants gave me a balanced view of how different users experience the course. For future usability tests, I would also like to include a participant who is new to Canvas, since Krug emphasizes that testing with beginners reveals hidden usability challenges that experienced users may not notice (Krug, 2013). If needed, I will recruit additional students or a graduate-program colleague as an alternate.

Course Improvements Based on Feedback

The usability test led to several meaningful course updates. I improved organization by correcting mislabeled activities, spreading out content more intentionally, and considering a weekly layout to reduce visual clutter. I also fixed broken links and updated instructions for external tools, supporting the principle that usability improves when barriers are minimized (Krug, 2010). Additionally, I enhanced the visual design by adding icons, color coding, and cleaner formatting. Finally, I plan to add more student support, including troubleshooting guides and clearer instructions for copying materials.

Final Reflection

Reflecting on this process strengthened my course significantly. Usability testing helped me improve clarity, organization, accessibility, and alignment between learning outcomes and activities. The experience also reinforced Schön’s argument that reflective practice is central to improving professional work (Schön, 1983). Overall, the updates I made ensure that the course is not only student-centered but also more intuitive, visually clear, and supportive for all learners.

Important Links

Video Script

Google Form Feedback


References

Bates, A. W. (2019). Teaching in a digital age: Guidelines for designing teaching and learning. BCcampus. https://opentextbc.ca/teachinginadigitalage/

Krug, S. (2010). Rocket surgery made easy: The do-it-yourself guide to finding and fixing usability problems. New Riders.

Krug, S. (2013). Don’t make me think, revisited: A common-sense approach to web usability. New Riders.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. Basic Books.
