Crime in the Classroom: Conclusions after 27 Years

David N. Harpp*
Department of Chemistry, McGill University, Montreal, Quebec H3A 0B8, Canada

ABSTRACT: For the past 27 years, McGill University has required that questions on multiple-choice exams be scrambled into multiple versions. Consequently, copying during exams has been virtually eliminated, suggesting a cultural shift in the prevalence of cheating. In a follow-up experiment to determine whether students might still try to copy from adjacent exams today, a single-version exam, labeled “Version 1” in its header, was administered to a class of over 1500 students. Using an algorithm to detect common wrong answers, no prosecutable cases were found.

KEYWORDS: Ethics, Testing/Assessment
■ INTRODUCTION

In 1991, the McGill University Senate passed regulations requiring answers on all multiple-choice exams to be scrambled.1 Subsequently, two sensitive indices were established, with considerable experimental input, to ensure that honest but similar exam answers from a given pair of students would not be unfairly prosecuted. The first index flags a pair whose answer similarity lies more than 5 standard deviations from the mean expected by chance (i.e., the odds that the answers agree by coincidence are vanishingly small); such pairs should be investigated.2,3 The second index is the ratio of the number of exact errors in common for the pair to the total number of differences between their exams; values of ∼1.0 or more indicate likely collusion and should be examined further.2,3 When a pair of students exceeds both indices, their seating arrangement can be investigated, given that McGill also requires sign-in seating. As in our recent publication on the topic of cheating on multiple-choice exams,4 we conducted an experiment to determine whether the university’s policy1 of requiring multiple exam versions for the previous 26 years had successfully discouraged attempts at copying adjacent optical (bubble) answer sheets on multiple-choice exams campus-wide. In that experiment, we gave over 1400 students the same version of an exam, but with different labels (Versions 1−4). No instances of copying were detected in our analysis of student responses. Although the lack of any cheating attempts suggests a shift in campus culture away from cheating by copying, it is nevertheless possible that labeling the exams as different versions simply discouraged committed copiers. Here, we report the results of a follow-up experiment designed to determine more precisely why cheating by copying is absent at McGill: whether it reflects a wholesale cultural reduction in cheating or can be attributed to students believing that exam questions and answers are scrambled into multiple versions.
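These two indices can be illustrated with a short calculation. The sketch below is only a rough stand-in for the published procedure;2,3,5 the function name, the binomial chance model, and the parameter p_match are assumptions introduced here for illustration.

```python
# Illustrative sketch only: the indices actually used (refs 2, 3, 5) are more
# sophisticated; the chance model and parameter values here are assumptions.
from math import sqrt

def pair_indices(answers_a, answers_b, key, p_match=0.2):
    """Return two rough similarity indices for one pair of answer sheets.

    answers_a, answers_b : lists of chosen options (e.g., 'A'-'E')
    key                  : list of correct options
    p_match              : assumed chance that two independent wrong answers
                           coincide (an assumption, not a published value)
    """
    # Exact errors in common (EEIC): both students wrong AND same wrong option.
    eeic = sum(1 for a, b, k in zip(answers_a, answers_b, key)
               if a != k and b != k and a == b)
    # Total number of questions on which the two sheets differ.
    diffs = sum(1 for a, b in zip(answers_a, answers_b) if a != b)
    # Index 1: EEIC divided by the differences; values of ~1.0 or more
    # were treated as suspicious (refs 2, 3).
    ratio = eeic / diffs if diffs else float('inf')
    # Index 2: a crude standard-deviations-from-the-mean score for the number
    # of shared wrong answers, using a simple binomial chance model.
    both_wrong = sum(1 for a, b, k in zip(answers_a, answers_b, key)
                     if a != k and b != k)
    mean = both_wrong * p_match
    sd = sqrt(both_wrong * p_match * (1 - p_match)) or 1.0
    z = (eeic - mean) / sd
    return ratio, z
```

A pair would merit a closer look only if both values exceed their thresholds (roughly 1.0 for the ratio and 5 for the standard-deviation score), after which the sign-in seating record is consulted.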
■ THE EXPERIMENT

In the most recent iteration of our study, we administered an exam to 1,544 students. The exam was supplied as a single version, with each copy clearly marked as Version 1 in the top corner, lacking the different labels of the previous study.4 Seating during the exam was partially controlled: the first 15 students were counted and led to their seats, and students 16−30 were taken to the adjacent row, so that students could not intentionally sit next to one another. Cheating within a row by copying from back to front was still possible, however, and was shown to occur in exams given before the exam-scrambling regulations were approved by the Senate of McGill in 1991.1 The hypothesis tested in this experiment is that campus culture at McGill has been so significantly altered over the past 27 years that students do not even consider copying, owing to the policy of requiring scrambled versions of multiple-choice exams. Prior to 1991, our statistical analysis revealed that cheating rates of 4−6% were commonplace on multiple-choice exams across campus.1

The results of this experiment are unequivocal. Comparison of the 1,544 students who took the identically labeled multiple-choice exam (1,192,296 unique pairs)1,5 revealed only three possible collaborations. Two pairs did not quite reach the threshold for prosecution.2,3 In one of these pairs, the two students were seated directly adjacent to each other; even though the similarity of their responses did not warrant further investigation, it is likely that some collaboration took place. The second pair sat eight seats and two rows apart, minimizing the opportunity to view each other’s exam responses. The third pair showed similarities in answers sufficient to warrant further investigation into potential cheating.2,3 However, because the two students were seated 7 rows apart in the gymnasium and at different ends of their respective rows, it is unlikely that there was any collusion. The similarity of the third pair’s responses could be the result of communication by cell phone or the surreptitious planting of answers in, say, a washroom during the exam, but we have no evidence for these events. We posit that, given that there were over a million pairs of answers to compare, the similarities of these students’ answers are likely a coincidence arising from the nature of the specific questions: of their 11 errors in common, 7 were among the least well-answered questions on the exam, and in 8 of the 11 cases the pair chose the most popular wrong answer.
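To convey the scale of this screening, the following sketch, which assumes the hypothetical pair_indices helper from the earlier sketch and illustrative threshold values, compares every unique pair of answer sheets; it is not the actual software used at McGill.

```python
# Illustrative sketch: thresholds, data layout, and the pair_indices helper
# (from the earlier sketch) are assumptions, not the actual McGill procedure.
from itertools import combinations

def screen_class(sheets, key, z_cut=5.0, ratio_cut=1.0):
    """Flag student pairs whose answer sheets exceed both similarity indices.

    sheets : dict mapping student ID -> list of chosen options
    key    : list of correct options
    """
    flagged = []
    # 1,544 students yield 1,544 * 1,543 / 2, roughly 1.19 million unique pairs.
    for (id_a, ans_a), (id_b, ans_b) in combinations(sheets.items(), 2):
        ratio, z = pair_indices(ans_a, ans_b, key)
        # Only pairs exceeding BOTH indices are examined further, and even
        # then only in light of the sign-in seating chart.
        if z > z_cut and ratio >= ratio_cut:
            flagged.append((id_a, id_b, ratio, z))
    return flagged
```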
Had the regulations at McGill not been implemented, and had the usual pre-1991 exam procedures (single-version exams with permissive seating) continued, roughly 70−100 individuals would likely have cheated on this examination.
■ CONCLUSIONS

The lack of any significant evidence for pervasive cheating on an unscrambled, clearly labeled single version of an exam administered to over 1,500 students at McGill clearly supports the hypothesis that student attitudes toward academic dishonesty can be shifted campus-wide. With appropriate regulations in place that prohibit identical exams and that control where students sit during exams, thereby minimizing opportunities to copy, students receive a clear message that the university cares about academic honesty and is willing to make the effort to protect students’ academic work. We strongly suggest that it is the institution-wide adoption of anticheating exam protocols1,6 that has resulted in the positive change in student behavior.
■ AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected]

ORCID
David N. Harpp: 0000-0001-8228-6038

Notes
The author declares no competing financial interest.
■ ACKNOWLEDGMENTS

The author thanks McGill University for its continued vigilance in and support of academic integrity, with special mention to Ms. Anik Vranken, Ms. Milena Taibi, and Mr. Stanley Whyte of Enrolment Services at McGill. The author also thanks Professor K. S. Harpp, Colgate University, for helpful discussions.
■ REFERENCES

(1) Harpp, D. N.; Hogan, J. J. Crime in the Classroom. Detection and Prevention of Cheating on Multiple-Choice Exams. J. Chem. Educ. 1993, 70, 306−311.
(2) Harpp, D. N.; Hogan, J. J.; Jennings, J. S. Crime in the Classroom. Part II. An Update. J. Chem. Educ. 1996, 73, 349−351.
(3) Harpp, D. N. Crime in the Classroom. Part IV. Conclusions. J. Chem. Educ. 2008, 85, 805−806.
(4) Harpp, D. N. Crime in the Classroom: Analysis Over Twenty-Six Years. J. Chem. Educ. 2018, 95, 338−339.
(5) Wesolowsky, G. O. Detecting Excessive Similarity in Answers on Multiple Choice Exams. J. Appl. Stat. 2000, 27, 909−921.
(6) To Stop Exam Cheats, Economists Say, Try Assigning Seats. http://www.chronicle.com/article/To-Stop-Exam-Cheats/233741 (accessed May 2018).