Undergraduate Oral Examinations in a University Organic Chemistry Curriculum

Andrew P. Dicks,* Mark Lautens,* Katherine J. Koroluk, and Stanislaw Skonieczny

Department of Chemistry, University of Toronto, Toronto, Ontario, Canada M5S 3H6
ABSTRACT: This article describes the successful implementation of an oral examination format in the organic chemistry curriculum at the University of Toronto. Oral examinations are used to replace traditional written midterm examinations in several courses. In an introductory organic class, each student is allotted 15 min to individually discuss one pre-selected “named” reaction with the course instructors. To stimulate further learning, students must choose a particular reaction that has not been presented during lecture or performed in the laboratory. In upper-level courses, students review selected literature syntheses of either a drug molecule or natural product. The oral examination format provides a dynamic assessment approach that can be individualized to facilitate an in-depth analysis of student comprehension.

KEYWORDS: Second-Year Undergraduate, Upper-Level Undergraduate, Chemical Education Research, Organic Chemistry, Communication, Inquiry-Based/Discovery Learning, Testing/Assessment, Applications of Chemistry, Reactions, Student-Centered Learning
INTRODUCTION: THE CASE FOR ORAL EXAMINATIONS
Undergraduate chemistry students are commonly appraised on their learning and understanding via written examinations, quizzes, practical work, and post-laboratory reports. These assessment methods can encourage rote memorization of material without an in-depth appreciation of chemical concepts. Students attending smaller institutions may experience evaluation via oral or poster presentations,1−4 but these approaches are often impractical for large classes. Even less common is the use of oral examinations as a testing mechanism, with few documented examples in the chemical education literature.5,6 Oral examinations are not a new approach to student assessment, but their incorporation into courses with more than a handful of students is unusual. Indeed, the first time many students face a formal oral evaluation of any kind is during a graduate school, professional school, or job interview.

Written examinations often constitute a large proportion of the final grade for undergraduate chemistry classes, being viewed as an effective way to assess the cumulative learning of a large number of students. However, such evaluations are often able to assess only a limited portion of student knowledge, and generally cannot probe in-depth understanding or rectify misunderstandings of chemical principles.7−10 Moreover, academic dishonesty during written examinations is sometimes an important problem for educators to address.11−15 The format of an oral examination permits students to practice presentation and oral communication skills together (thus encouraging cooperative learning), but avoids academic integrity issues, as inappropriate collaboration is not possible during the examination.16 In addition, students cannot be successful by relying solely on memorization as an examination preparation technique.5,6 Oral examinations provide an opportunity to rigorously probe understanding, as leading questions can be asked that facilitate student−faculty discussion. Questions can be tailored to each individual depending on their academic ability, making it a dynamic assessment method. Students are often better prepared for oral discussions, with a face-to-face examination motivating them to process material at an advanced level so that they are better able to answer questions.17

Faculty-run office hours are often poorly attended at some institutions, with many students never having significant interactions with professors during their university life.18 Consequently, an oral examination provides an occasion for face-to-face contact where it might not otherwise occur. The National Survey of Student Engagement (NSSE) includes “student−faculty interactions” as a benchmark of effective educational practice, citing that such interaction allows for firsthand observation of expert problem solving and provides students with mentors and role models for continuous learning.19 The benefits of student−faculty interaction are numerous, including objective increases in grade point averages and continuation to post-graduate studies.18 NSSE results indicate that students with higher grade point averages report
higher rates of interaction with faculty, with the rate of meaningful student−faculty interaction being twice as high among students achieving mostly A grades compared to those with mostly C grades.20 Improved cognitive development and satisfaction in their education, as well as more developed self-confidence and leadership skills, further corroborate the improved educational outcomes associated with increased student−faculty interaction. It has also been noted that students are often unaware of the benefits of in-depth interactions with faculty,18 so utilizing an oral examination as a compulsory interaction early in the university experience promotes initial exposure to these effects. At a large institution such as the University of Toronto, student−faculty interaction is invaluable, as it provides a personal experience in what can otherwise seem an impersonal environment.
“NAMED ORGANIC REACTIONS” AS A FOCUS FOR INTRODUCTORY ORAL EXAMINATIONS
In recent years, oral examinations have been implemented in several organic chemistry courses at our institution. Two cohorts (approximately 100 students) have been evaluated via an oral assessment in a single-semester, second-year introductory organic chemistry course.21 This offering typically ranges in size from 40 to 70 students and is intended for undergraduates enrolled in a chemistry program of study. During the oral examination, students were required to personally select and discuss a “named reaction” for 15 min. The current organic course evaluation scheme is shown in Table 1.
Table 1. Organic Chemistry Course Evaluation Scheme

Assessment                  Fraction of Total Grade (%)
Two midterm examinations    10 each
Oral examination            10
Laboratory                  35
Final examination           35

The overarching goals of implementing an oral examination in the organic course were threefold:
• to promote direct, individual interaction of each student with the course instructor(s)
• for students to learn new chemistry concepts apart from the formal course lecture and laboratory curriculum
• to provide students with an examination experience that was different from anything they had previously encountered

Examination Preparation and Setup

Students were instructed to access the Organic Chemistry Portal Web site22 and to navigate to the sidebar tab entitled “Organic Reactions”. Each student then picked one of the 384 transformations listed under the “Name Reactions” link. It is important to note that not all of the reactions listed are true “named reactions” (e.g., the benzoin condensation). Therefore, it was made clear to students that they had to choose a reaction named after the discovering chemist(s) that was not included in the lecture and laboratory curricula. Reactions were approved on a first-come, first-served basis, with instructional staff verifying that each student studied a reaction not covered in the course syllabus. The structure of the oral examination was left fairly open-ended, with few concrete details provided. This allowed for more creativity and self-directed learning on the part of students. However, each student was expected to discuss four topics:
• a brief history of the reaction discovery
• a detailed reaction mechanism, as well as relevant stereochemistry and regioselectivity where applicable
• a use of the reaction in a synthetic application
• advancements and applications of the reaction with reference to recent chemical literature (published during the previous five years)
The remainder of the evaluation content differed depending on the reaction selected and the specific material presented. Each 15-min examination was arranged during the final two weeks of the academic semester using Doodle, a free online scheduling tool.23 Students were provided with a handout that outlined the framework and organization of the examination. In addition, a course instructor explained the structure during one of the lectures. It was emphasized to students that the oral examination was to function more as a discussion of the named reaction with the evaluators, rather than as a formal presentation.

Examination Format and Evaluation

Each oral examination began with the course instructors informally asking the student why they selected their particular named reaction. This helped to promote dialogue and calm any nerves. Students provided a wide range of answers to this question, including similarities to reactions studied during the course (e.g., the Darzens condensation); sharing a country of birth with the discoverer (the Favorskii reaction); choosing a reaction named after a chemist with the same surname (the Corey−Seebach reaction); or a completely random choice. General questions were subsequently asked by the instructors throughout the examination while maintaining an informal tone. If necessary, students were prompted to guide them toward answering a specific question appropriately. Some reactions were easier to discuss than others (e.g., those with a more straightforward operative mechanism), so this was taken into account when assigning grades. The depth of knowledge students were expected to display regarding the mechanism varied between reactions, with simpler mechanisms warranting more detailed questioning. Students were given flexibility regarding the medium through which they presented their material: some elected to use digital slides during the examination, while others used a chalkboard or pen and paper to facilitate discussion of the reaction. Some students provided printed handouts, although this was not a requirement. Developing a rigid rubric for an oral examination was not feasible when each student individually determined the content. This required the evaluation criteria to remain broadly defined so that all students could be evaluated, no matter which reaction they chose. The rubric shown in Table 2 was used to guide the evaluators when grading.
Table 2. Oral Examination Evaluation Scheme

Criteria                       Proportion of Oral Examination Grade (%)
Presentation skills            30
Presentation content           40
Ability to answer questions    30
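The article does not prescribe an explicit formula for combining these criteria; purely as an illustrative sketch of how the Table 2 weights (and the 10% course weight from Table 1) might be applied, the hypothetical Python snippet below computes a weighted oral examination mark. The criterion scores and function name are invented for illustration.

```python
# Illustrative only: the article does not specify how examiners combine the
# rubric criteria; this sketch simply applies the Table 2 weights.

ORAL_RUBRIC_WEIGHTS = {           # Table 2
    "presentation_skills": 0.30,
    "presentation_content": 0.40,
    "ability_to_answer_questions": 0.30,
}

def oral_exam_mark(criterion_scores):
    """Weighted average (in %) of the three rubric criteria."""
    return sum(ORAL_RUBRIC_WEIGHTS[c] * s for c, s in criterion_scores.items())

# Hypothetical student scoring 80/85/70 on the three criteria:
mark = oral_exam_mark({
    "presentation_skills": 80,
    "presentation_content": 85,
    "ability_to_answer_questions": 70,
})
print(mark)                      # 79.0
print(round(0.10 * mark, 1))     # 7.9, its contribution to the course grade (Table 1)
```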
Figure 1. Student response to statements characterizing their experience using a numerical Likert scale.
Presentation skills included oral and nonverbal communication (such as writing a reaction mechanism) and delivering information in an organized manner. Student knowledge about their chosen subject was evaluated partly through reviewing the content they had prepared for the session and whether they presented the required information. The ability of each student to answer questions was used to assess depth of knowledge and the extent of the research undertaken, and to gauge how capably a student could synthesize information quickly, problem-solve, and think critically. For example, one student mentioned a recent innovation of their selected reaction being undertaken via solid-phase synthesis; their knowledge of solid-phase techniques was subsequently assessed through further questioning to determine the depth of their understanding. To limit inconsistencies and subjectivity, which are reported problems in oral testing,24,25 two of the course instructional staff were present for each examination. This approach may not be possible at other colleges or universities where a single instructor teaches both the lecture and laboratory components of an introductory course; in such instances, an upper-level undergraduate or graduate student could be selected as an additional examiner. The evaluators were typically in close agreement with respect to the overall mark a student should receive. Grades for the oral examination component have typically ranged from 60% to 95%. When analyzing the grades of students in the 2009 and 2010 offerings of the course, it became clear that the oral examination served to boost final grades, especially among weaker students. The majority of students attaining final grades below the course average achieved oral examination grades that were 10−20% higher than their final course grades. This observation supports the notion that students prepare more rigorously for an oral examination due to the social pressure of answering direct questions.17 It also bolsters the related findings of Roecker, published in this Journal, regarding inorganic chemistry oral examinations.6 The self-correcting nature of oral examinations, the process of “thinking out loud”, and the relatively small amount of material tested are cited as possible explanations for these results.

Reasonable modifications can be made to the oral examination format to accommodate undergraduates with physical or learning disabilities. For example, students with particular learning disabilities may be given more time to complete the assessment. Students who are hearing- or speech-impaired may be offered a related alternative written assessment. Using large-type print and enlarged images can accommodate visually impaired students.26 However, it has been found that students with disabilities fare better when mathematics tests are administered orally rather than as regular written exams, thereby “leveling the playing field” with respect to their nondisabled peers.27 Correspondingly, the use of oral examinations may in fact benefit these students.
ORAL EXAMINATIONS IN UPPER-LEVEL ORGANIC COURSES
Oral evaluations have additionally been incorporated into two upper-level organic courses at the University of Toronto. The setup, administration, and operation of these examinations were identical to those outlined for the introductory organic course. Over three years in a third-year undergraduate course,28 approximately 150 students were asked to individually select a medicinally important drug compound that targeted a specific disease state. They were subsequently examined on their ability to orally compare and contrast two synthetic routes toward their chosen substance. To focus their selection, students were provided with broad compound-classification categories such as “neuraminidase inhibitors”, “histamine H3 receptor antagonists”, and “Hepatitis C virus protease inhibitors”.

A fourth-year undergraduate course cross-listed as part of the departmental graduate program29 has also implemented an oral examination over a three-year period. In this instance, over 50 students were charged with personally selecting a natural product from the primary literature, a choice that was vetted by the course instructor. Upon approval, the students researched two distinctly different synthetic approaches to the compound and presented them with the aid of a written or typed handout. Graduate students were additionally expected to outline a new potential route to the molecule they had chosen and to highlight the key new step or steps in a retrosynthetic disconnection. As an example, one undergraduate selected the cytotoxic natural product (−)-FR182877 as a target compound, which has a potency similar to that of Taxol toward certain cancers.30 This student discussed and was examined on separate synthetic approaches published by Evans and Starr31 and by Tanaka et al.30
STUDENT FEEDBACK
A survey was distributed following the completion of the oral examination in the introductory organic course to gauge student response to the testing approach. Listed below are some written comments received from students at the end of the course in 2010.
• The oral exam was very useful for exploring one reaction in depth. I liked the chance to meet with the profs.
• The oral exam was a very different university assignment as it was an opportunity for students to present what they are interested in yet still show what they have learned from lecture. Also, it was nice to have some individualism in such a large university.
• I wish they would explain it to us beforehand that it is more like a discussion instead of a presentation.
• Perhaps clearer outlines for the oral exam can be given. Overall, a good experience!
• The oral exam was an incredibly valuable experience, and greatly increased my interest in organic chemistry. It added another dimension to the class.
• The oral exam was intimidating. But despite that and not being able to answer most of the questions Prof. Lautens/Dicks threw at me, I took time to think about it as a whole after the fact, and truly appreciated the experience. It was a great component of the course. Wish all my courses had an oral exam.

The students were also asked to respond to different statements regarding their experience using a numerical Likert scale32 (Figure 1). It is clear from this feedback that the vast majority of students believed the named reaction oral examination to be a valuable assessment method that provided an opportunity to learn a great deal. Almost 65% of respondents considered that they had learned more than a moderate amount from the exercise, and over 90% believed it to be at least moderately important as an assessment tool (Figure 1). However, answers to other questions revealed that students found the oral examination stressful, with half of them preparing material for 10 h or more and an average preparation time of greater than 8 h. Most students had never encountered such an assessment format, so they were naturally anxious and did not know what to expect. Students were also concerned about the types of questions that might be asked during the oral examination. The stress levels of undergraduates undergoing written examinations and oral presentations33 as well as oral examinations34,35 have been measured: students had significantly elevated levels of salivary stress hormones leading up to all evaluations, but levels were highest prior to oral examinations. The pressures of oral testing and apprehension regarding an unfamiliar evaluation approach made for a stressful student experience.6 Nonetheless, while conducting oral examinations over several years in three different courses, no student has been incapacitated by anxiety such that it prevented the satisfactory completion of the assessment. Importantly, the presence of the laboratory instructor as one of the evaluators helped to alleviate anxiety, as they knew each student personally by name. Each examination could also be conducted in student pairs, which may reduce stress levels as well as decrease the total amount of instructor time required for grading. It has been suggested that the oral examination format be used more than once in a course, so that students know more of what to expect after the first evaluation.6,17,23 In another post-course survey, one upper-level graduate student29 (who completed undergraduate studies at another institution) specifically commented on the oral examination format. This student stated that “it is stressful because I made [sic] only one oral examination in five years. If I had to do one every year, it would be easy.” If the format is used in consecutive years of study, students can be better prepared for the experience and can build on previously developed skills.

A common objection to including oral examinations in chemistry curricula is the amount of time educators must devote to organizing and administering them. In the introductory organic course, each student was allotted a 20-min time slot, with 15 min devoted to discussion and questions and 5 min used for grading after the examination. This typically amounted to 10−15 h of assessment time, which can be reduced for larger classes by arranging for students to undergo the assessment in pairs. Although this seems like a significant commitment, it is spread over two weeks and replaces the hours spent grading one midterm examination. Also, instructors need not prepare questions as they would for a written examination, as each student selects the discussion topic. Therefore, oral examinations simply use faculty resources differently than other forms of evaluation.6
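As a quick sanity check of the arithmetic above (a sketch only: the 20-min slot comes from the text, while the cohort sizes and the assumption that a paired examination still fits a single slot are illustrative), the following Python snippet estimates total examiner contact time.

```python
# Back-of-the-envelope estimate of total oral-examination contact time,
# using the 20-min slot (15 min discussion + 5 min grading) quoted above.
# Cohort sizes and the one-slot-per-pair assumption are illustrative.

SLOT_MINUTES = 20

def assessment_hours(n_students, paired=False):
    """Total hours of examination slots for a cohort."""
    slots = n_students / 2 if paired else n_students
    return slots * SLOT_MINUTES / 60

for n in (30, 45, 70):
    print(f"{n} students: {assessment_hours(n):.1f} h individually, "
          f"{assessment_hours(n, paired=True):.1f} h in pairs")
# 30 students: 10.0 h individually, 5.0 h in pairs
# 45 students: 15.0 h individually, 7.5 h in pairs
# 70 students: 23.3 h individually, 11.7 h in pairs
```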
PEDAGOGICAL IMPORTANCE OF ORAL EXAMINATIONS
Universities not only provide knowledge to their students but also afford them the opportunity to develop other desirable attributes that produce well-rounded graduates. However, many traditional testing techniques are not able to assess more than pure knowledge. Oral testing provides an alternative form of assessment that can evaluate students on skill sets beyond general chemistry knowledge. Oral examinations often cannot probe the breadth of understanding that written examinations can, owing to time limitations. Rather, they aid assessment of depth of understanding, effective oral communication, critical thinking, presentation skills, and problem-solving abilities that are not easily appraised by other methods.16,24 One upper-level student29 noted “...a student will have to learn a topic more thoroughly when he/she has to explain it to someone else.” Although some modes of written examination are able to evaluate problem solving and critical thinking, only oral testing allows instructors to gauge these abilities in real time. Examiners can directly observe how a student reacts to introductory questions and can adapt further questioning to suit the abilities of the student, making it a rigorous and dynamic form of evaluation.36 Some educators have developed methods, such as scratch cards, to provide the opportunity for students to gain help on examinations, where use of hints results in a small grade deduction.37 The oral examination format likewise allows aid to be given to students who are struggling to formulate an acceptable answer to a question, with any hints provided being taken into account when grading.

A common issue in science education is that students use rote memorization and test-taking savvy as a method to succeed, but “learning how to take chemistry tests is not the same as learning chemistry”.38 For example, a student may be able to memorize how to draw a particular reaction mechanism for a written examination, but an oral examination can better evaluate whether the basis of mechanisms and arrow pushing is understood. Oral examinations can help identify these key conceptual misunderstandings, whereas correct answers on a written examination may mask them.6,39 The oral examination design in all three courses stipulates that the student selects the topic and content of the examination, thereby promoting self-directed learning. Making students directly involved in their own education gives them the opportunity and responsibility to
engage in independent learning, which stretches beyond pedagogy to a “basic human competence”.40
CONCLUSION
Oral examinations have been successfully integrated into the organic curriculum at the University of Toronto, with substantial positive student feedback. The spotlight on a self-selected named organic reaction at the introductory level facilitated an active evaluation process easily adapted to individual students. The open-ended format promoted creativity, and a one-on-one interaction with teaching staff proved to be a highly memorable experience for students. The oral examination approach can be easily incorporated into curricula by using it to replace an existing written assessment. The named reaction focus has additionally been modified at our institution for upper-level undergraduates to critically evaluate two synthetic routes toward a medicinal agent or natural product. This provides students with exposure to oral testing conditions throughout their undergraduate career.
ASSOCIATED CONTENT
Supporting Information
Course handouts describing the oral examination format in each course; typical named reactions and compounds selected; two sample student presentations; and additional student feedback. This material is available via the Internet at http://pubs.acs.org.
AUTHOR INFORMATION
Corresponding Author
*E-mail: (A.P.D.) [email protected]; (M.L.) [email protected].

Notes
The authors declare no competing financial interest.
ACKNOWLEDGMENTS
We are grateful to the Faculty of Arts and Science, University of Toronto, for a President’s Teaching Award (A.P.D.). We also thank Landon Edgar and Andrei Hent for providing the sample student presentations available in the Supporting Information.
REFERENCES
(1) Sisak, M. E. J. Chem. Educ. 1997, 74, 1065−1066.
(2) Huddle, P. A. J. Chem. Educ. 2000, 77, 1154−1157.
(3) Mills, P. A.; Sweeney, W. V. J. Chem. Educ. 2000, 77, 1158−1161.
(4) Brown, L. R. Chem. Educator 2008, 13, 54−58.
(5) Muldoon, J. A. J. Chem. Educ. 1926, 3, 773−776.
(6) Roecker, L. J. Chem. Educ. 2007, 84, 1663−1666.
(7) Bergquist, W.; Heikkinen, H. J. Chem. Educ. 1990, 67, 1000−1003.
(8) Ellis, A. B. J. Chem. Educ. 1993, 70, 768−770.
(9) Toby, S.; Plano, R. J. J. Chem. Educ. 2004, 81, 180−181.
(10) Smith, K. C.; Nakhleh, M. B.; Bretz, S. L. Chem. Educ. Res. Pract. 2010, 11, 147−153.
(11) Harpp, D. N.; Hogan, J. J. J. Chem. Educ. 1993, 70, 306−311.
(12) Harpp, D. N.; Hogan, J. J. J. Chem. Educ. 1996, 73, 349−351.
(13) Harpp, D. N.; Hogan, J. J. J. Chem. Educ. 1998, 75, 482−483.
(14) Harpp, D. N. J. Chem. Educ. 2008, 85, 805−806.
(15) Christensen Hughes, J. M.; McCabe, D. L. Can. J. Higher Educ. 2006, 36, 1−21.
(16) Kehm, B. M. Assess. Educ. Princ. Pol. Pract. 2001, 8, 25−31.
(17) Hambrecht, G. Delta Kappa Gamma Bulletin 2003, 69, 31−32.
(18) Cotten, S. R.; Wilson, B. Higher Educ. 2006, 51, 487−519.
(19) National Survey of Student Engagement (NSSE) Benchmarks of Effective Educational Practice. http://nsse.iub.edu/pdf/nsse_benchmarks.pdf (accessed Oct 2012).
(20) NSSE Report Builder. http://bl-educ-cprtest.ads.iu.edu/SAS/rb_nsse.html (accessed Oct 2012).
(21) University of Toronto Chemistry Undergraduate Courses: CHM 249H (Organic Chemistry). www.chem.utoronto.ca/undergrad/viewcourse.php?code=CHM249H (accessed Oct 2012).
(22) Organic Chemistry Portal. www.organic-chemistry.org (accessed Oct 2012).
(23) Doodle Event Scheduling. www.doodle.com (accessed Oct 2012).
(24) Pearce, G.; Lee, G. J. Marketing Educ. 2009, 31, 120−130.
(25) Rangachari, P. K. Adv. Physiol. Educ. 2004, 28, 213−214.
(26) Supalo, C. J. Chem. Educ. 2005, 82, 1513−1518.
(27) Huynh, H.; Meyer, J. P.; Gallant, D. J. Appl. Meas. Educ. 2004, 17, 39−57.
(28) University of Toronto Chemistry Undergraduate Courses: CHM 342H (Modern Organic Synthesis). www.chem.utoronto.ca/undergrad/viewcourse.php?code=CHM342H (accessed Oct 2012).
(29) University of Toronto Chemistry Undergraduate Courses: CHM 440H (Synthesis of Modern Pharmaceutical Agents). www.chem.utoronto.ca/undergrad/viewcourse.php?code=CHM440H (accessed Oct 2012).
(30) Tanaka, N.; Suzuki, T.; Matsumura, T.; Hosoya, Y.; Nakada, M. Angew. Chem., Int. Ed. 2009, 48, 2580−2583.
(31) Evans, D. A.; Starr, J. T. J. Am. Chem. Soc. 2003, 125, 13531−13540.
(32) Likert, R. Arch. Psychol. 1932, 140, 1−55.
(33) Preuß, D.; Schoofs, D.; Schlotz, W.; Wolf, O. T. Stress 2010, 13, 221−229.
(34) Harl, B.; Weisshuhn, S.; Kerschbaum, H. H. Neuroendocrinol. Lett. 2006, 27, 669−674.
(35) Schoofs, D.; Hartmann, R.; Wolf, O. T. Stress 2008, 11, 52−61.
(36) Davis, M. H.; Karunathilake, I. Med. Teach. 2005, 27, 294−297.
(37) Ellis, A. B. J. Chem. Educ. 1993, 70, 768−770.
(38) Moore, J. W. J. Chem. Educ. 2001, 78, 855.
(39) Bergquist, W.; Heikkinen, H. J. Chem. Educ. 1990, 67, 1000−1003.
(40) Garrison, D. R. Adult Educ. Quart. 1997, 48, 18−33.