
International Center for Academic Integrity Conference 2021 panel recaps part 1 of 2

Contract Cheating and Proctoring

Christine Lee
Content Manager

In March 2021, the International Center for Academic Integrity held its annual conference online. There were many valuable panels from which to choose, with subject matter experts presenting on topics pertaining to the student experience and emerging trends in misconduct, alongside strategies to mitigate it.

In this post, we offer recaps of two panels that discussed emerging trends (contract cheating and proctoring solutions) and the ways in which misconduct can be detected and addressed.

First Panel: CONTRACT CHEATING

Olumide (Olu) Popoola, Academic Skills Developer at Queen Mary, University of London, shared their insights in Detecting Contract Cheating Using Investigative Linguistics. This panel was a technical deep dive into forensic linguistics, full of insights. How would we collect evidence and detail transgressions? Would there be metaphorical gloves and microscopes?

The panel began with an exercise. As participants, we were asked to view a number of writing samples and determine whether they were written by a student or by a commercial (essay mill) writer. Presented with the evidence, we assessed the scene. It was an illuminating experience--the results were split, showing us how difficult it is for a human to determine whether an essay was written by a third party, and underscoring the importance of this very panel.

“Detection is not easy without training or decision support,” said Popoola. They said that software reports can help detect ghostwriting to some extent, and then offered up analyzing writing styles within a cohort of students to detect outliers, with the responsibility falling on “instructors having to assess and see if a paper is external to the peer group.” Still, they said, this isn’t a complete solution.

So, what training or decision support is needed to detect contract cheating? Popoola focused on investigative corpus linguistics, or “describing and analyzing text that could be usefully applied to the contract cheating problem.”

Popoola identified 32 linguistic features and found 8 to be significant markers of commercially written essays, describing their methodology in detail along the way. The defining components are as follows (a rough sketch of how a few such features might be computed appears after the list):

1. Lexical sophistication
Commercial essays had a more sophisticated vocabulary.

2. Sentence length
Because long sentences “limit the number of points one can make in an essay and make it easier to put in irrelevant points,” longer sentence length is a commercial writing feature.

3. Citation
Student writing, according to the research, was more likely to use parenthetical citations in a particular citation style (MLA, APA, Chicago, etc.).

4. Sparsity
Commercial writing contained more prepositional phrases and wordier constructions, resulting in content-sparse prose.

5. Lexical concreteness
Student writing demonstrated more concrete words and fewer abstract words.

6. Relative clauses/subordination
Student writing showed higher-quality use of relative clauses and subordination, prioritizing information in a clear manner.

7. Additives
Additives (e.g., “in addition,” “also,” “as well as,” “moreover,” etc.) were characteristic of student writing, whereas commercial writing more frequently used the ambiguous “or” (reflecting a lack of knowledge).

8. Reasoning
Student writing contained more reasoning, which includes analysis, cause and effect, and categorization. Commercial writing was more vague and descriptive than analytical.
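
To make the cohort-comparison idea concrete, here is a minimal Python sketch. It is not Popoola's actual methodology (the three features and the z-score cutoff of 2.0 are illustrative assumptions), but it shows that once features are defined, flagging stylistic outliers within a peer group is mechanically straightforward:

```python
import re
import statistics

# Illustrative stand-ins for a few of the feature families above; the real
# analysis used 32 features, most of which are not reproduced in this recap.
ADDITIVES = {"also", "moreover", "furthermore", "additionally"}

def essay_features(text):
    """Compute a few crude stylometric features for one essay."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    n_words = max(len(words), 1)
    return {
        # Sentence length: longer sentences leaned commercial.
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        # Lexical sophistication, crudely proxied by mean word length.
        "avg_word_len": sum(map(len, words)) / n_words,
        # Additive connectives were more characteristic of student writing.
        "additive_rate": sum(w in ADDITIVES for w in words) / n_words,
    }

def flag_outliers(essays, z_cutoff=2.0):
    """Flag essays whose style deviates from the cohort on any feature."""
    feats = {name: essay_features(text) for name, text in essays.items()}
    if len(feats) < 2:
        return set()
    flagged = set()
    for key in ("avg_sentence_len", "avg_word_len", "additive_rate"):
        values = [f[key] for f in feats.values()]
        mean, spread = statistics.mean(values), statistics.pstdev(values)
        if spread == 0:
            continue
        for name, f in feats.items():
            if abs(f[key] - mean) / spread > z_cutoff:
                flagged.add(name)
    return flagged
```

In practice, forensic linguists rely on far richer, validated feature sets; the mechanical outlier check is the easy part, which is why Popoola stressed training and decision support.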

In conclusion, Popoola presented a “Contract Cheater Writer Profile,” which they described as writing with “padding” and “lazy thinking” beneath an exterior of lexical sophistication. (Offender profiling! Forensic psychology! I kept thinking of B.D. Wong in Law & Order.) In sum, contract cheaters submit writing that reads as sophisticated, with complex syntax, but lacks analysis and original thinking.

As a result of these findings, Popoola suggested reducing essay word counts and advised instructors to “reward compression and mark padding as a negative,” given that commercial essays tend to pad and write for length rather than content.

We concluded by revisiting the opening exercise, reading more writing samples to determine whether they were written by a third party. My score this time was lower (50% correct) than in the initial round (80% correct). Can a human even do this? Clearly, I needed more than this panel to uplevel my forensic skills.

Bottom line: assessment design is a way to mitigate contract cheating and enable later detection. Design your essay assignments so that they focus on content rather than word length to prevent contract cheating in person or via remote learning.

Second Panel: PROCTORING

Walking the Line Between Academic Integrity and Privacy in Online Exams featured Jennifer Lawrence, the Program Director of Digital Education at the University of New England, Australia. In this talk, Lawrence largely advocated for normalizing proctoring software and focused on navigating the fear, uncertainty, and doubt associated with online exams and academic integrity. I hoped Lawrence was wearing their armor. It was tense from the get-go; proctoring is on everyone’s minds right now and elicits strong feelings from students, parents, and teachers alike.

The audience was highly engaged, and ultimately the presentation steered responsibility for integrity away from proctoring and towards assessment design. Particularly intriguing were the participant questions in the chat window, which ranged from accessibility for disabled students to ethical questions about normalizing what could be an invasion of privacy; these questions informed a large part of the panel discussion. If you wanted a temperature check on the controversies surrounding proctoring, that chat window was illuminating.

“We are encouraging students to see the supervised exam as the normal, as the default,” responded Lawrence. “So we feel that a big part of it is about trust and normalizing the technology and giving people a chance to have a go at it, while we still cater to people who do have legitimate privacy and technical issues.” Lawrence concluded by saying that they “have almost no students asking for the alternative.”

Lawrence clarified that the students who did ask for alternatives to online proctoring went on record saying not that they had privacy concerns, but that “they preferred a different format of assessment.”

And then came the major inflection point of the presentation: exams are imperfect, and thus any ensuing solution is also imperfect. Frequent, low-stakes assessments beforehand, and the design of those assessments, matter most to student learning and experience. “So there's a lot of moving parts here that we need to think about--particularly people's comfort levels--and, perhaps as a footnote slightly less on the note of academic integrity, we've definitely taken a perspective as educators ourselves that an exam, from an educational perspective, is not usually the ideal assessment format,” said Lawrence. They then encouraged instructors to instead seek out “a lot of formats that are more authentic assessment for learning instead of just assessment of learning.”

While true, the call for more formative assessments was a well-timed pivot away from online summative exams and proctoring.

Lawrence brought up further elements of friction and burden that drive the choice of summative assessments and exams, stating that “an exam is set, not only because of the integrity need to confirm identity and control environment but also because it creates a different workload pressure for the academic staff in marking,” deflecting responsibility away from proctoring solutions and towards the larger teaching burden and ensuing practical concerns.

Ultimately, Lawrence returned to assessment design to conclude: “One of the overarching things that we found is it really comes back to assessment design; you can design your assessments in such a way that the issues around privacy respond to their concerns. If you have a whole bunch of students who are really stressed about a particular assessment format, that's probably a red flashing light.”

While both panels focused on different pieces of assessment and academic integrity, they did intersect on the importance of assessment design.

Assessment design is an important part of student learning--and offering different assessment formats offsets the high-stakes nature of summative exams. By providing support to students along the educational journey and making students feel seen via feedback loops and scaffolding, educators can mitigate academic misconduct and offer assessments with integrity.

Attending these two panels made it very clear that assessment design is crucial for effective instruction and student learning outcomes. And that forensic detection is best left to experts.