Using Analytics to Improve Student Engagement in Online Courses

Credit: Wongnatthakan

Studies have demonstrated not only a strong correlation between student time devoted to academics and success but also that students spend less time studying than ever before (Brint & Cantwell, 2010). This is a problem in both face-to-face and online classes, but online courses offer an opportunity to measure student study time through learning management system (LMS) analytics and use that information to influence student behavior.

In looking through the data from one of my own classes, I found that while many students do make frequent and extensive use of the provided materials, at least as many do not. Quite a few appear to “click through” the content items to mark them as completed in the system while spending as little as a single second of their time on the content itself.

I decided to apply two well-established principles from psychology to improve student engagement and, as a result, student achievement. The first is the observer effect (or Hawthorne effect; see Roethlisberger & Dickson, 1939), which holds that behavior can be changed by making people aware they are being observed.

The second principle is that becoming aware of how other people are behaving can provide us with context to judge our own behavior. For example, students will perceive an exam score of 85 percent as good when the course average for that exam is 60 percent and bad when the average is 95 percent. That context can provide us with a sense of discrepancy—at least insofar as we tend to think of ourselves as being at least average—which can motivate a change in behavior (Campion & Lord, 1982).

With these two phenomena in mind, I decided that each week I would examine the analytics data from my LMS to determine how many minutes each of my students spent logged in to the course. My presumption was that time spent logged in would be a reasonable—though far from precise—proxy for the time students spent engaged with course content. I would summarize that class-level information, including both written descriptions of means and standard deviations along with a histogram offering a visual depiction of the overall distribution of students, in an announcement to the entire class. I would then look up each student’s individual minutes tally, email them that figure along with the distribution data, and briefly state how they compared to the class overall (Figure 1). This combination of class announcement and individualized email thus alerted students to the fact that I was observing how much time they devoted to the class, along with the implicit message that this was something I cared about. It also allowed them to see how their engagement compared to their peers’. My hope was that this information would increase time spent engaged with course materials and, in turn, improve their final grades.
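The weekly routine described above—pull per-student minutes, compute class summary statistics, bin students into histogram bars, and draft an individualized comparison—can be automated. The following is a minimal sketch, not the author's actual workflow: it assumes the minutes have been exported from the LMS into a simple dictionary, and the student labels and 150-minute bin width (matching the 601–750 bar in Figure 1) are illustrative.

```python
# Hypothetical sketch of the weekly summary-and-feedback step.
# Assumes minutes per student exported from the LMS; all names
# and values here are made up for illustration.
from statistics import mean, stdev

minutes = {"A": 120, "B": 310, "C": 475, "D": 520, "E": 677, "F": 890}

avg = mean(minutes.values())
sd = stdev(minutes.values())

BIN_WIDTH = 150  # e.g., the 601-750 bar in Figure 1


def bin_label(m, width=BIN_WIDTH):
    """Return the inclusive histogram bin a minutes total falls in,
    using 1-150, 151-300, ... style bins."""
    lo = (m - 1) // width * width + 1
    return f"{lo}-{lo + width - 1}"


def feedback(student):
    """Draft the individualized comparison for one student."""
    m = minutes[student]
    below = sum(1 for v in minutes.values() if v < m)
    return (f"You have logged {m} minutes (class mean: {avg:.0f}, "
            f"SD: {sd:.0f}). That places you in the {bin_label(m)} bin, "
            f"with {below} of {len(minutes)} classmates below you.")


print(feedback("E"))
```

From here, the per-student messages could be pasted into (or scripted through) whatever email or LMS messaging tool the instructor already uses; the class-wide histogram itself could come from the LMS or any plotting library.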

Histogram titled "Minutes Engaged with Course Content," with accompanying text as a note to the student:

"I'm writing to follow up on the News items I've been posting about the time that students have been spending in class. Here is a histogram showing the overall distribution of students and their time spent on class activities (which does not include the time spent on textbook readings). [histogram here] As of today you have logged a total of 677 minutes engaged with course content. That puts you in the top half of the class and a good bit above the mean (which is 475). That places you in the bar labeled 601–750, which you share with three others, and there are three students in bins above you. There are 15 students in bins below you—your class engagement is great! Keep up the great effort, and please contact me with any questions about how to interpret this data. Best, Dr. Burton"
Figure 1. An example of individualized feedback regarding time spent in the class

I implemented this intervention over a few terms of my eight-week, online research methods courses (what I’ll call the feedback group), and then compared the average time spent and grades of those students with those of the students I had taught previously without this intervention (the control group). Course materials, grading rubrics, and instructor remained constant throughout the project, and it worked! There was a statistically significant difference between the instruction groups’ final grades. The mean grades were 75.1 percent for the control students and 81.6 percent for the feedback group. There was a similar trend toward a significant effect for the total time spent on course materials, with the control group spending an average of 532 minutes engaged with course materials and the feedback group spending an average of 659 minutes. (These figures do not include textbook reading time, time spent on quizzes, or any work done outside of the LMS.)

This relatively straightforward intervention produced significant improvements in final grades and a similar improvement in overall course engagement, despite the ever-present variation in student circumstances that has such an enormous impact on online student success. My hypothesis that students will modify their engagement behavior once made aware of their instructor’s observations of their engagement with the course content and of how they compare to their peers was, happily, supported.

This relatively low-effort intervention appears to have some utility for those of us seeking to improve student engagement with our online course material, resulting in some modest but significant improvements in overall grades. While far from a panacea for student engagement ailments, it may prove a useful strategy for motivating at least some of our students. Moreover, the effectiveness of this relatively nonspecific intervention raises the possibility that more focused or targeted strategies could prove as helpful, if not more so. For example, if the LMS offers the analytics, could feedback on the time spent in or number of posts made in the discussion forums have a similar effect on discussion engagement? Might sharing similar class-wide data on grades from specific activities provide students with better context for evaluating their own efforts or improve their willingness to seek guidance from their instructors? Might there be a way for instructors to drill down into the LMS data to identify engagement with specific class materials that are most predictive of student success? There would seem to be many possibilities for using LMS analytics to improve not only student engagement but perhaps also the course itself.


Brint, S., & Cantwell, K. (2010). Undergraduate time use and academic outcomes: Results from the University of California Undergraduate Experience Survey 2006. Teachers College Record, 112(9), 2441–2470.

Campion, M. A., & Lord, R. G. (1982). A control systems conceptualization of the goal-setting and changing process. Organizational Behavior and Human Performance, 30(2), 265–287.

Roethlisberger, F. J., & Dickson, W. J. (1939). Management and the worker. Harvard University Press.

Keith Burton, PhD, is an assistant professor of psychology and associate chair in the department of social sciences at Saint Leo University. He teaches exclusively online in their undergraduate and graduate psychology programs.
