Contract Cheating—and What to Do about It

Credit: iStock.com/Chinnapong

Contract cheating is a relatively new phenomenon that is gaining attention in higher education because it is particularly difficult to detect. Rather than purchasing a pre-written paper from an outside source, a student who contract cheats hires someone to create new work specifically for them. The problem is that traditional plagiarism detection tools scan databases of academic articles and prior student work for matches, and because contract cheating produces entirely new work, it will not show up in a database search.

To combat the problem, new tools are starting to emerge that look for signs within the work itself that the student did not produce it. In the past, faculty did much the same by looking for a radical departure in writing style from past examples of a student's work. By contrast, new systems compare a student's work to general characteristics of commercially produced work that distinguish it from student work, and thus they do not need the specific student's past work for comparison.

Olumide Popoola, who has investigated contract cheating, found eight characteristics of commercially produced essays (Lee, 2021). Compared to student essays, commercial essays

  1. use a more sophisticated vocabulary;
  2. use longer sentences;
  3. are less likely to use in-text citations;
  4. use more prepositions;
  5. use more abstract words;
  6. are less organized and clear (which probably comes as a surprise to many faculty);
  7. do not use the same additive words, such as "in addition" or "moreover"; and
  8. are more descriptive and less analytical.
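For readers curious how a tool might operationalize markers like these, the sketch below computes a few of them (sentence length, word length as a crude vocabulary proxy, preposition rate, additive connectives, in-text citations) from raw essay text. This is purely illustrative and is not Popoola's method or any vendor's implementation; real stylometric systems use far richer features and trained models, and the word lists here are small hypothetical samples.

```python
import re

# Illustrative word lists; real systems use much larger, curated feature sets.
PREPOSITIONS = {"of", "in", "to", "for", "with", "on", "at", "by", "from"}
ADDITIVES = {"in addition", "moreover", "furthermore", "additionally"}

def essay_features(text):
    """Compute a few crude stylometric signals from an essay."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    total = len(words) or 1
    return {
        # Longer sentences and longer words loosely track "sophistication"
        "avg_sentence_len": len(words) / (len(sentences) or 1),
        "avg_word_len": sum(len(w) for w in words) / total,
        # A higher preposition rate was one of the eight markers
        "preposition_rate": sum(w in PREPOSITIONS for w in words) / total,
        # Commercial essays used fewer additive words and fewer citations
        "additive_count": sum(text.lower().count(a) for a in ADDITIVES),
        "citation_count": len(re.findall(r"\([A-Z][A-Za-z]+,\s*\d{4}\)", text)),
    }

sample = "Moreover, the results were clear (Smith, 2020). In addition, they held."
feats = essay_features(sample)
```

A detector would compare such feature values for a submitted essay against typical ranges for genuine student writing; any single feature proves nothing on its own.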

These characteristics suggest an underlying theme: that the contracted writer does not have much, if any, expertise in the subject matter of the work and compensates with linguistic tricks that attempt to cover up the lack of substance. After all, the outside writer is not in the course and so has not done the readings or sat in the lectures that would produce the requisite knowledge. This is essentially a new take on the old student claim that essays allow them to con the instructor with flowery language. The strategy might work with an instructor who reads essays for keywords, but not one who tries to follow the line of reasoning.

Plagiarism detection tools are starting to incorporate these analytics into their originality reports. Turnitin, for instance, is now able to check a student's work against their past work for differences in writing style. Interestingly, these tools are codifying the principles faculty used to decide that work "smelled fishy" before the internet and plagiarism detection tools existed; they are just doing it in a more objective way. The problem then, which remains an issue today, was that without an outside source the student's work clearly copied, the instructor could not prove that the student cheated. The instructor could only confront the student with the evidence and hope for a confession.

Recently, Sikanai, a tech startup, added a new tool for plagiarism detection. Instructors send the company a student work, and the Auth+ system analyzes that work to come up with questions about it. It then sends the student a six-question quiz that the company asserts will determine whether the student wrote the work. That quiz is proctored, and the questions are even timed in relation to their difficulty to determine whether the student has trouble with easy questions.

Sikanai also offers Slate Desktop, a citation and reference management desktop app that integrates with Microsoft Word, like RefWorks and other such software. The system comes with a built-in originality checker that alerts students when their text matches outside sources. While some may worry that this feature will only help students hide their cheating, many student plagiarism problems result from poor paraphrasing skills, and this type of system can help students learn to paraphrase properly in their own words rather than from examples given in tutorials.

While no system is perfect, the mere knowledge that an instructor is using these tools may deter student cheating. Along with instruction on how to avoid plagiarism, these tools give instructors more ways to teach students how to synthesize and summarize an author’s ideas while giving them proper credit for the work, which is the end goal of all academic honesty efforts.

Reference

Lee, C. (2021, April 6). International Center for Academic Integrity Conference 2021 panel recaps part 1 of 2. Turnitin. https://www.turnitin.com/blog/international-center-for-academic-integrity-conference-2021-panel-recaps-part-1-of-2

