According to user experience research, a person will attempt to complete a task in an unfamiliar environment only until frustration reaches a critical level. Frustrated online learners may abandon assignments or drop courses, which is why it is important to understand how students experience a course. One way to do so is through basic user-experience assessment, which requires no extensive background in Web design, only a willingness to take learners' perspectives into consideration when designing course elements and assignments.
User-experience assessment can help identify frustration points for online learners; direct observation of a test user working in an online course can illuminate troublesome areas and allow you to optimize the course's layout and structure.
Integrating learner feedback or complaints during a course is an organic form of ongoing user assessment. A more direct assessment is a user test, a brief and structured session in which the test facilitator (the course instructor or designer) issues tasks to a mock student (a test user) to complete in an online course.
Firsthand observation of course use allows the facilitator to identify potential pain points or confusing content. For example, if a test user struggles to identify the help section of an online course when it is labeled “Further Instructions” on a navigation menu, it may warrant a new title. Or perhaps a test user struggles with a particular test question format and states a preference for a static list of answers with check boxes rather than a drop-down option set.
Test users: one is better than none
The Nielsen Norman Group, a premier usability assessment organization, suggests observing a minimum of three test users during a website assessment, but even one user test can yield valuable observations and lead to impactful changes.
An optimal test user is a student from within the institution, as he or she is more likely than others to have prior experience with the learning management system (LMS) and expectations based on comparable online courses. A more practical test user might be a family member or friend with some interest in online learning. Any person with adequate computer skills and a willingness to participate will suffice as a test user.
Plan to provide your test user with student-level access to the online course during the test, or at least an approximation of the student experience. Depending on your LMS, you may be able to log in to the course as an administrator and toggle the course content to the “student view” setting before sitting the test user down at a computer. Alternatively, you may temporarily add your test user to the course as a student for the duration of the test. If you plan to have your test user access content governed by time-release restrictions, remove those restrictions prior to the test session.
Develop sample tasks and directive statements
Develop a script of four to six task statements to read aloud to the test user during the session. Brainstorm sample actions that a student would complete at the start of a course or accomplish in different areas of the course, and transform these steps into directive statements. A task might include directions to locate a document, review a specific course module, or write and publish a blog post assignment.
Resist the urge to coach your test user with instructive task statements such as “find the syllabus PDF under the ‘Class Schedule’ link in the navigation menu.” Instead, give your test user brief statements in plain language, such as the directive “locate the syllabus.” Allow ample time for each task in your test script so as not to exhaust your test user or run out of time.
Consider issuing a few short tasks at the start of the test to build your test user's confidence, and reserve one or two elaborate tasks for the end of the session. Remember that you are the expert user of your online course; it may take you 30 seconds to open the syllabus, but it may take your test user two or three minutes to locate the document.
Establish a low-pressure, think-aloud environment
A small amount of preparation will ensure that the test session runs smoothly for you and your test user. Secure a computer with a reliable Internet connection for testing purposes, preferably in a quiet room with minimal distractions. Schedule your test session to last about an hour, allowing time to greet the test user and log him or her in to the course environment. Explain that you are testing the online course, not the user's computer skills, performance, or knowledge of content. Encourage your user to “think aloud” by narrating his or her thoughts while working through the sample tasks. An Internet-savvy test user may speed through tasks, but the think-aloud process forces a user to slow down and helps you understand his or her experience more completely.
Facilitate without interference
A smooth user test will move swiftly and may end early; be prepared to multitask in order to make the most of your time. As test facilitator, you will provide technical support, issue task directions, and encourage the test user to think aloud. But as course designer, your primary goal is direct observation of the user's experience and detailed notation of his or her interaction with course elements. Document troublesome points in detail, including the page location, the tools or content in use, and the task with which the user struggles.
Many user test facilitators choose to record sessions for future reference (with the consent of the test user). Consider using an audio recording application on the test computer or a smartphone to capture the user's think-aloud comments. Screen-recording tools can also capture screenshots or video of the user's activity on the monitor during the test.
You may be tempted to offer instruction to the user to facilitate task completion during the test. Avoid this coaching instinct, as assistance will interfere with the results. In an authentic-use case, a student at home would not have your help. Exercise patience with user confusion or false starts during a task. Expect small delays or detours, but plan to move to the next task if the test user becomes excessively frustrated, stuck, or over-challenged. Better to allow the user to begin a fresh task than to cause him or her to become discouraged and fatigued.
Integrate findings
The first user test will likely reveal a few quick usability fixes such as renaming content, adding just-in-time instructive text, or reorganizing content into more intuitive groupings. A single user test may yield obvious opportunities, but a series of two or three tests may distinguish content that consistently frustrates users from content that poses an occasional challenge.
Make small improvements with confidence, but an extensive overhaul of course layout or design may merit an additional user test to evaluate your adjustments. Also consider issuing a short survey to students a few weeks into the semester to surface any remaining issues. Usability assessment is an iterative process of evaluation and change that may span several semesters.
Reference
Nielsen, Jakob. Usability Articles. Nielsen Norman Group, 2014. Web. October 2014.
Lauren Linn Sears is a graduate of the Master of Science in Information Studies program at the University of Texas at Austin and is presently employed at Sam Houston State University. She is enthusiastic about the presentation of information on the Web, particularly as it relates to academia.