A Method to Prevent Cheating on Online Exams


Face-to-face instructors who give in-class exams face a challenge when moving their courses online: how can they ensure that students do not cheat on exams by collaborating? Different methods have been developed to address the problem. For instance, institutions can send each student a 360-degree video camera to watch them as they take the exam, but few institutions are willing to purchase a camera for every online student, let alone manage the process of mailing them back and forth. Plus, who will watch all that video?

But there is a way to help ensure academic integrity in online exams through underused features of the learning management system (LMS). These features allow instructors to send students different versions of the same question. Students who try to collaborate will receive different versions, and if they do not notice this, they will introduce essentially random errors into their answers. Random errors are a better indication of cheating than two students providing the same answers. If two students get the same question right, they might simply have both known the answer. Similarly, if two students get a question wrong, they might have both not known the answer. But if two students make the same random error, such as transposing digits when carrying a number from one page of an exam to another, that is evidence of a common source for the answer.
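
The underlying comparison is simple enough to sketch in a few lines of code. The example below is purely illustrative: the students, question IDs, versions, and answers are all invented, and the idea is just to flag pairs of students who gave the same incorrect answer to different versions of a question.

```python
from itertools import combinations

# Made-up response records: (student, question_id, version, answer, correct_answer)
responses = [
    ("alice", "Q1", "v1", "42", "42"),   # correct
    ("bob",   "Q1", "v2", "57", "58"),   # wrong
    ("carol", "Q1", "v3", "57", "61"),   # same wrong answer on a different version
]

def shared_wrong_answers(records):
    """Flag pairs of students who gave the same incorrect answer
    to different versions of the same question."""
    flags = []
    for a, b in combinations(records, 2):
        same_question = a[1] == b[1]
        different_version = a[2] != b[2]
        both_wrong = a[3] != a[4] and b[3] != b[4]
        same_answer = a[3] == b[3]
        if same_question and different_version and both_wrong and same_answer:
            flags.append((a[0], b[0], a[1]))
    return flags

print(shared_wrong_answers(responses))  # [('bob', 'carol', 'Q1')]
```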

Randomizing questions in an LMS

LMSs generally have a method for creating a bank of questions and setting an exam to give each student a different random sample of those questions. This is helpful, but it is not quite what we are trying to do. We want students to get different versions of the same question. Not only does that make cheating easier to detect, but it also makes the scoring fairer. If an instructor creates 20 different questions per quiz and sends each student 10 at random, then students will receive different questions, and some exams might be harder than others because they drew harder questions. Plus, instructors generally want to test students on specific topics, and a random draw of questions might not cover all of those topics evenly.

The secret to using different versions of the same question is to use two different functions common to the LMS: the “question group” and “question bank” functions in Canvas or the similar “question set” and “question pool” functions in Blackboard. I will explain how to set up quizzes with different versions of the same question in Canvas, but the process is essentially the same for Blackboard and many other LMSs.

First, many instructors create their quiz questions right inside of their quiz, but I don’t recommend this. It is better to create your questions within a question bank, which is merely a bucket of questions that can be used in multiple quizzes, and then pick the questions you want from one or more banks when making a quiz. As we will see, this makes it easier to reuse questions across quizzes or even classes.

Question banks are generally distinguished by topic. In a history of philosophy class, I might set up a bank for each thinker we study: Socrates, Plato, Aristotle, and so on. Then, if I wanted to have an exam on the three ancient philosophers, I would set up a quiz in Canvas, and instead of creating the questions, I would click “find questions,” which brings up my question banks, and pick the questions that I want for the quiz.

But if I want different students to get different questions, then, instead of picking questions from the question banks, I would set up three question groups within the quiz, one for each of the question banks. Question groups are quiz-specific and are used to pull questions randomly from a question bank. Imagine that each of the three question banks has 10 questions in it, and I want students to get five questions picked at random from each bank. I would set up a question group within the quiz for each question bank, link the group to the corresponding question bank, and set each question group to pull five questions. Each of the three question groups will pull five questions at random from its associated question bank, making a quiz with 15 questions total.
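
For instructors comfortable with a little scripting, Canvas also exposes a REST API for quiz question groups that can do the same setup programmatically. The sketch below assumes the quiz and the three question banks already exist; the instance URL, access token, and every ID are placeholders, not real values.

```python
import requests

BASE = "https://yourschool.instructure.com/api/v1"   # placeholder Canvas instance
TOKEN = "YOUR_ACCESS_TOKEN"                          # placeholder API token
COURSE_ID, QUIZ_ID = 1234, 5678                      # placeholder course and quiz IDs

# Placeholder question-bank IDs, one bank per philosopher
banks = {"Socrates": 111, "Plato": 222, "Aristotle": 333}

for name, bank_id in banks.items():
    resp = requests.post(
        f"{BASE}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/groups",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "quiz_groups[][name]": name,
            "quiz_groups[][pick_count]": 5,        # draw five questions at random
            "quiz_groups[][question_points]": 1,   # points per drawn question
            "quiz_groups[][assessment_question_bank_id]": bank_id,
        },
    )
    resp.raise_for_status()
```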

If, however, I want each student to receive different versions of the same question, then I will need to make one question bank per question, with different versions of that question within each bank. There are different ways to vary the versions. Perhaps the wording of the question is the same for each version, but the possible answers to choose from are different. In a quantitative class, the question structure might be the same, with different values in different versions. In my case, I would need to set up 10 question banks per philosopher, one for each question, and might create three to four versions of each question within each bank.
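
In a quantitative course, the versions within one bank can even be generated rather than written by hand. Here is a minimal sketch, with an invented interest-rate question standing in for real course content:

```python
import random

def make_versions(n=4, seed=1):
    """Generate n versions of one question: same structure, different values."""
    rng = random.Random(seed)
    versions = []
    for _ in range(n):
        principal = rng.choice([1000, 2000, 5000])
        rate = rng.choice([3, 4, 5])
        versions.append({
            "text": f"What is the simple interest on ${principal} at {rate}% for one year?",
            "answer": round(principal * rate / 100, 2),
        })
    return versions

for v in make_versions():
    print(v["text"], "->", v["answer"])
```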

Labeling conventions are important here, and most people do not handle them well. If I label each bank with just the question title, then I will have a hard time scanning through my list of question banks to find what I want. It is best to always label files, folders, and the like from the most general category to the most specific so that the titles sort in a way that is easy to scan. If the main category that distinguishes questions is the philosopher, I would start with the philosopher’s name in the title. Then, if there are multiple questions on a topic for a philosopher, I might put the topic, such as “politics.” Then I might end with the specific question, such as “duty to the state.” Thus, a question bank title might be “Socrates_Politics_Duty to the State.” An alphabetic sorting of the list will first group all the Socrates question banks together, then all the political questions, and then the individual questions. This makes it easy to scan down to what I want.
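
A quick illustration of the payoff (every title except the one from the example above is invented):

```python
# General-to-specific titles mean a plain alphabetical sort groups banks
# by philosopher, then by topic, then by individual question.
bank_titles = [
    "Socrates_Politics_Duty to the State",
    "Plato_Metaphysics_Theory of Forms",
    "Socrates_Ethics_The Unexamined Life",
    "Plato_Politics_Philosopher Kings",
]
for title in sorted(bank_titles):
    print(title)
# Plato_Metaphysics_Theory of Forms
# Plato_Politics_Philosopher Kings
# Socrates_Ethics_The Unexamined Life
# Socrates_Politics_Duty to the State
```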

Finally, I create the quiz and set up 15 question groups within it. Each group gets linked to the question bank associated with that question and is set to pull one version of the question at random from that bank. Now each student gets the same 15 questions in different versions.
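
The earlier API sketch adapts directly to this layout for anyone who prefers to script it: one group per question bank, each set to pick a single version. Again, every URL and ID below is a placeholder.

```python
import requests

BASE = "https://yourschool.instructure.com/api/v1"   # placeholder Canvas instance
TOKEN = "YOUR_ACCESS_TOKEN"                          # placeholder API token
COURSE_ID, QUIZ_ID = 1234, 9012                      # placeholder course and quiz IDs

# Placeholder bank IDs: one bank per question, 15 banks for this exam
question_banks = {f"ExamQuestion{i:02d}": 2000 + i for i in range(1, 16)}

for name, bank_id in question_banks.items():
    requests.post(
        f"{BASE}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/groups",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "quiz_groups[][name]": name,
            "quiz_groups[][pick_count]": 1,   # one version of this question per student
            "quiz_groups[][question_points]": 2,
            "quiz_groups[][assessment_question_bank_id]": bank_id,
        },
    ).raise_for_status()
```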

Notice how I set up 10 question banks per philosopher but only used five of them in this quiz. I could now set up a final exam using question groups linked to the other five questions I created for each philosopher. This demonstrates the flexibility that comes with creating questions within banks. I can reuse and remix questions in future quizzes and even in different classes, as the question banks are connected to the user’s profile.

Take a look at this tutorial on how to randomize questions in Canvas. The setup takes some time, but it is effective at addressing cheating problems in online exams.