
Classroom Response Systems for Implementing Interactive Inquiry in Large Organic Chemistry Classes

Richard W. Morrison,* Joel A. Caughran, and Angela L. Sauers

Department of Chemistry, The University of Georgia, Athens, Georgia 30602, United States

ABSTRACT: The authors have developed “sequence response applications” for classroom response systems (CRSs) that allow instructors to engage and actively involve students in the learning process, probe for common misconceptions regarding lecture material, and increase interaction between instructors and students. “Guided inquiry” and “discovery-based learning” are based on the premise that the best learning occurs when students are actively engaged in developing hypotheses and arriving at conclusions for themselves, rather than learning in a passive lecture format. In this regard, we use CRSs to actively engage large lectures of 300+ students, where the traditional interaction between students and instructors is commonly limited to the first several rows of the lecture hall. Moreover, series response applications allow a nearly free response format for questioning students, as opposed to the traditional multiple-choice format commonly used with CRSs. As such, we have observed that students are more engaged and actively involved in answering questions. This paper provides several examples illustrating the depth of insight into student understanding, even of multistep thought processes, that this stepwise analysis affords.

KEYWORDS: First-Year Undergraduate/General, Second-Year Undergraduate, Organic Chemistry, Collaborative/Cooperative Learning, Problem Solving/Decision Making, Testing/Assessment, Reactions, Student-Centered Learning, Synthesis



INTRODUCTION

Guided inquiry and discovery-based learning are remarkably powerful teaching methods1 with potential applications in large-lecture settings. We have taken from “guided inquiry” and “discovery-based learning” the fundamental precept that the best learning occurs when students are actively engaged in developing hypotheses and arriving at conclusions for themselves, as opposed to learning them in a passive lecture format. To accomplish this level of student involvement, we use Classroom Response Systems (CRSs)2 to interactively engage students in large lectures (300+ students), where the traditional “give-and-take” between student and instructor is most commonly limited to the first several rows of the lecture hall.3 In 2006, when we initially considered using CRSs in this way, discussions of their use within the chemical education community were primarily focused on classroom participation credit. We proposed that awarding bonus credit for CRS questions to elicit student responses was unnecessary if the questions themselves were written to be engaging and informative.

Traditionally, CRSs are implemented using multiple choice questions (mcqs).4−7 Several limitations are encountered when implementing such mcqs as an assessment tool for “live” feedback. First, with only four choices, a significant component of the response distribution will reflect random guessing; simply put, there is no distinction between an intentional answer and a random guess. Further, organic chemistry often requires students to devise problem-solving strategies involving logic strings. To construct a logical organic synthesis or mechanism, students must follow a sequence of if/then decisions that leads to the right conclusion. Students who possess a rudimentary understanding of a given concept but are unable to arrive at the correct solution are often unable to identify the point where their logic becomes flawed. Because mcqs are an “all or nothing” evaluation, they give no indication of where the student faltered in the multistep problem-solving strategy. Consequently, it is difficult to extract useful information regarding the specific misconceptions held by the student, only that the student ultimately arrived at the wrong answer.

With this in mind, we developed CRS questions that go beyond the traditional multiple-choice format, requiring responses we described7 as “sequence response applications for CRSs”. “Sequence-response” questions allow the instructor to embed more than one variable in a question, resulting in many more potential answer choices. As such, the instructor can have more confidence that student responses reflect reasoned answers rather than random guessing. Additionally, with more than four or five answer choices, students are less successful at narrowing them down by process of elimination. In this manner, the sequence response format approximates free response questions and differs significantly from the more limited mcq format in how students derive their answers. Furthermore, in-class analysis of the number sequences provides immediate feedback and pinpoints the specific steps of the problem-solving strategy that are problematic for those students who answered incorrectly. Sequence response applications provide a nearly free response format for questioning students. Our initial examples of this question type for CRSs were presented at the National ACS Meeting in Spring 20078 and have since been adapted by others. For example, Ruder and Straumanis, citing our previous work, describe using this approach to assess

student use of curved-arrow notation using CRSs.9 Flynn describes using this approach to develop problem-solving skills through retrosynthetic analysis.13 Since 2007, other specific uses of CRSs have also been described in this Journal.10−17 A comprehensive presentation of sequence response questions across the spectrum of organic chemistry topics is beyond the scope of a single article. In this paper, we present additional examples of how the use of Classroom Response Systems and sequence response questions allows students and instructors to collaboratively analyze a problem or concept, providing immediate clarification of misunderstandings.7 We describe this real-time analysis of student misconceptions using CRSs as Interactive Inquiry, and we use it here to demonstrate the depth of insight into student understanding of additional topics in organic chemistry, even of multistep thought processes, afforded through stepwise analysis.
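To make the in-class analysis concrete, the short Python sketch below shows one way exported sequence responses could be tallied so that the most common answers can be projected and discussed. It is an illustration only; the export format and the response strings are assumptions, not data from this study.

```python
from collections import Counter

# Hypothetical clicker export: one submitted sequence per student.
# The response strings below are illustrative, not the article's actual data.
responses = ["45321", "34521", "25341", "45321", "351", "45321", "342", "25341"]

counts = Counter(responses)

# Display the top five sequences, mirroring the in-class histogram discussion.
for sequence, n in counts.most_common(5):
    print(f"{sequence:>6}  {n} students")
```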

Example 1: Reaction Coordinate Diagrams

A typical multiple choice question (mcq) for reaction coordinate diagrams is shown in Figure 1.

Figure 1. Representative organic chemistry multiple choice question for CRSs.

In our experience, students easily recognize and select the correct profile from the possibilities provided in Figure 1. Because the question statement indicates a two-step reaction, students look for two “humps”. And, because the reaction is stated to be endothermic, students look for a profile that ends at higher energy than it begins. However, there are other, more challenging aspects of reaction profiles that are not easily asked in the mcq format, and these additional aspects are more revealing of student comprehension. For example, can the student generate a reaction profile for a given reaction de novo? In so doing, can the student correctly predict the relative energy levels of the transition states for multistep profiles? Thus, the more probing and insightful questions are those that require students to consider a given reaction and draw its reaction profile. Unfortunately, a freehand drawing is not possible in the mcq format.

To illustrate the sequence response technique using CRSs, we return to the reaction profile question described in Figure 1. Rather than choosing among reaction profiles provided in the body of the question, students are asked to generate one using the provided template. Energy levels 1 through 5 loosely approximate energy maxima and minima for the reaction. Figure 2 illustrates how the question was posed as a CRS question and shows the five most common student responses.

Figure 2. Reaction profile template and student response data.

The electrophilic addition of HBr to 1-pentene is an exergonic reaction. To describe an exergonic reaction using the template provided in Figure 2, the first number of the sequence must be greater than the last number. Thus, the correct description of an exergonic reaction in student responses is quickly identified by comparing the first and last numbers. Each of the five most common student responses shown in Figure 2 has the first number greater than the last number, demonstrating that these students understand that the reaction is indeed exergonic. The electrophilic addition of HBr to 1-pentene is a two-step reaction that proceeds through a carbocation intermediate. This important reaction characteristic is described using the above template by a five-digit sequence. Sixteen students responded “351” and 13 students responded “342”. These responses describe one-step reactions that differ in ΔG⧧ and ΔG°rxn. These student errors are best examined in the discussion of the correct five-number sequence. At this point of the analysis, in-class discussion involves comparing the three number sequences “25341”, “34521”, and “45321”.

Figure 3. Energy profile for the electrophilic addition of HBr to 1-pentene.

Figure 3 illustrates that the sequence “25341” describes a two-step process because the fourth digit is greater than the third digit, thereby describing the activation energy associated with the second step of the reaction. Additional classroom discussion focuses on the relative heights of ΔG⧧1 and ΔG⧧2 by comparing the second and fourth digits of the sequence, emphasizing that the ionization in step 1 of the reaction has the greatest activation barrier. In summary, using CRSs in this manner to ask a question about reaction energetics requires students to generate de novo reaction profiles. Analysis of the number sequences is vastly more revealing of student comprehension of reaction energetics than responses to analogous mcqs.
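The stepwise reading described above lends itself to a simple programmatic summary. The following Python sketch is a hypothetical illustration (not part of the authors' materials): it applies the same three checks to each digit sequence, comparing the first and last digits for exergonicity, the third and fourth digits for a second activation barrier, and the second and fourth digits for the relative transition-state energies.

```python
def analyze_profile(seq: str) -> list[str]:
    """Interpret a reaction-profile response such as '25341' (template levels 1-5)."""
    d = [int(c) for c in seq]
    notes = ["exergonic" if d[0] > d[-1] else "not exergonic"]
    if len(d) == 5:
        # A second maximum (two-step profile) requires the fourth digit to exceed the third.
        if d[3] > d[2]:
            notes.append("two-step (carbocation intermediate encoded)")
            # Relative barrier heights: compare the two maxima (second and fourth digits).
            notes.append("step 1 barrier highest" if d[1] > d[3] else "step 2 barrier highest or equal")
        else:
            notes.append("no second barrier encoded")
    else:
        notes.append("one-step profile")
    return notes

for seq in ["25341", "34521", "45321", "351", "342"]:
    print(seq, "->", "; ".join(analyze_profile(seq)))
```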

Example 2: Nomenclature

CRS devices that accept alphanumeric responses are particularly useful for organic nomenclature. A typical mcq nomenclature problem asks students to select the correct name for a given compound from a list of five possibilities. The five choices, A through E, have certain embedded cues. The first cue for students is that all answer choices identify the parent chain as pentane. The next cue is found as students compare the answer choices and discover that A, D, and E name pentane chains with two branches whereas B and C name parent chains containing only one branch, thus cueing students to think about maximizing the number of branches when naming carbon chains. The next cue is discovered as students compare the three answer choices with two branches. Choices A and D have numbered the carbon atoms of the parent chain in the same way, giving locants 2 and 3. Choice E has numbered the carbon atoms differently, giving locants 3 and 4. By comparing these choices, students are cued to number the parent chain so as to minimize the number sequence for locants. The final cue is found as students compare choices A and D and discover that they differ only in the alphabetical order of substituents. Thus, students can arrive at the correct answer through a process of elimination that is prompted by the embedded cues.

To set up a series response nomenclature problem, a problem key is developed. The parent chain or ring and its substituents are represented by letters. Numbers are used to locate the substituents. For example, students are provided with the following key and instructed to enter into their CRS device the alphanumeric sequence that correctly names the given compound (Figure 4).

Figure 4. A “series response” nomenclature question for CRSs.

Students correctly name this simple alkane, 3-ethyl-2-methylpentane, by referring to the key and entering the following alphanumeric sequence: 3h2ga. Nomenclature is sequential by design. Each character in the answer corresponds to a step in the systematic process of developing the compound’s name, so by analyzing student responses in a stepwise fashion, the instructor can clearly see which step was problematic for the students. The top five CRS responses for this question are shown in Figure 5.

Figure 5. Student CRS response data for nomenclature problem from Figure 4 showing most common responses.

Seventy-four of the 118 respondents entered the appropriate sequence of alphanumeric characters to correctly name this simple alkane. The most common incorrect entry is 2g3ha, which corresponds to 2-methyl-3-ethylpentane. In addressing the student responses that interchange characters in the alphanumeric sequence (3h2ga vs 2g3ha), the instructor can emphasize correct alphabetization of branches. In addressing student responses that have the correct letters in the right order but the wrong numbers preceding them (3h4ga), the instructor can explain how the carbon atoms of the parent chain should be numbered. If desired, the instructor can display all responses. Less frequent incorrect responses commonly contain combinations of errors, such as identifying branches without numbers and placing branches in the incorrect alphabetical order. These

combinations of errors can be individually addressed by the instructor in the class discussion of responses containing the most frequent single errors. For responses that are markedly different, the last letter is most likely incorrect as well; this is commonly the result of incorrectly identifying the parent chain, and consequently most or all of the substituents are incorrectly identified too. In this nomenclature example, where the majority of students selected the correct sequence, only minor comments from the instructor are required to correct student misconceptions. Nomenclature questions for more structurally elaborate compounds can be asked in two parts if the CRS limits the number of characters in an alphanumeric response. For example, students are first asked to identify the parent chain in the compound shown in Figure 6, using the provided legend. Once the parent chain is identified, students are then asked to correctly identify and sequence the branches.
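Because each response is simply a string written against a published key, decoding it for class discussion can be automated. The sketch below is hypothetical; only a minimal subset of a Figure 4-style key is assumed (g = methyl, h = ethyl, a = pentane, digits as locants). It expands each submitted sequence into a written name so that numbering, alphabetization, and substituent-identity errors stand out step by step.

```python
# Assumed subset of a Figure 4-style key: letters name fragments, digits are locants.
KEY = {"g": "methyl", "h": "ethyl", "a": "pentane"}

def decode(sequence: str) -> str:
    """Expand an alphanumeric response such as '3h2ga' into a readable name."""
    pieces, locant = [], ""
    for ch in sequence:
        if ch.isdigit():
            locant = ch                          # a digit sets the locant for the next fragment
        else:
            fragment = KEY.get(ch, f"?{ch}?")    # flag characters that are not in the key
            pieces.append(f"{locant}-{fragment}" if locant else fragment)
            locant = ""
    # Substituents are hyphenated; the final (parent) fragment is appended directly.
    return ("-".join(pieces[:-1]) + pieces[-1]) if len(pieces) > 1 else pieces[0]

for response in ["3h2ga", "2g3ha", "3h4ga"]:
    print(response, "->", decode(response))
# 3h2ga -> 3-ethyl-2-methylpentane   (correct)
# 2g3ha -> 2-methyl-3-ethylpentane   (alphabetization error)
# 3h4ga -> 3-ethyl-4-methylpentane   (numbering error)
```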

Figure 6. Nomenclature legend for 4,6-diethyl-2,3,8-trimethyl-5-(1-methylethyl)nonane.

Figure 7. 4,6-Diethyl-2,3,8-trimethyl-5-(1-methylethyl)nonane with highlighted and numbered parent chain.

Most students correctly identify the parent chain as a nonane (Figure 7). Student responses identifying and locating substituents are shown in Figure 8. Decoding the most common alphanumeric sequence, 46OH238PG5I, gives 4,6-diethyl-2,3,8-trimethyl-5-propyl for the branch names. The second most common alphanumeric sequence, 46OH238PG5J, gives 4,6-diethyl-2,3,8-trimethyl-5-(1-methylethyl) for the branch names. The instructor can note with the class that the only difference between these two is the name for the branch at C5 and then rehearse with the class the complex naming protocols.

Figure 8. Response data for 4,6-diethyl-2,3,8-trimethyl-5-(1-methylethyl)nonane.

The third most common alphanumeric sequence, 46H238G5IE, decodes to 4,6-ethyl-2,3,8-methyl-5-propylnonane. This response is an example of a combination of errors, the error type that constitutes the majority of responses found in the “other” category. For this particular response, the Greek prefixes di and tri are absent, and the substituent at C5 is incorrectly identified. If the instructor has already discussed complex naming protocols in conjunction with the first response, this classroom discussion focuses on the need for, and correct use of, Greek prefixes to name multiples of the same branch. In-class analysis of CRS responses to nomenclature questions more clearly identifies student misconceptions, thus allowing the instructor to make the most profitable use of valuable lecture time. Finally, it should be noted that when nomenclature questions are phrased as series response questions, they effectively become free response questions. In contrast, nomenclature problems formatted as standard mcqs involve selecting the correct answer from four or five naming choices. This answer-recognition limitation of mcqs precludes providing comprehensive nomenclature practice for students. And, equally important, when the mcq format is used, instructors are not able to determine with precision where students need additional practice and remediation.

Example 3: Synthesis Road Map

This example illustrates the utility of the series response CRS technique in conjunction with organic synthesis. To contrast our series response application with the traditional mcq technique, a synthesis question (Figure 9) is first presented in the traditional mcq format, patterned after similar questions contained in the 2002 National ACS Standardized Organic Chemistry Exam.5 The answer choices reflect the type of choices (including distracters) found on the ACS exam. There are several limitations to using the mcq format for synthesis problems. First is the aforementioned “guessing factor”. With only four answer choices, students have a 25% chance of stumbling upon the correct answer by random guessing. Second is the “cueing effect”,5 where the answer choices themselves are embedded with cues for how to answer the question. For example, one of the important concepts being tested in this synthesis is the choice of oxidizing agent (PCC vs CrO3). Answers (A) and (B) differ only in choice of oxidizing agent, which prompts students to remember that PCC and CrO3 react differently with primary alcohols. This is a subtlety

that students might not have remembered if the clue were not embedded in the answer choices. While generating distracters, the instructor must avoid these “cues”. Answer choices (A) and (B) are embedded with an important clue, but they are also essential choices because they test the students’ ability to oxidize a primary alcohol to a carbonyl compound appropriate for a Grignard reaction. Distracter (C) is the most common pathway students initially pursue to solve this problem. Having applied the Grignard reaction to accomplish the first C−C bond formation, students reproduce the Grignard reaction for the second C−C bond-forming reaction, selecting the substituted benzene as the electrophilic carbonyl reaction partner. This leaves distracter (D) as the first and only option that begins with something other than an oxidation. However, because this is the only answer choice that does not oxidize the alcohol in the first step, it stands as a clear outlier, and students are prompted to begin the synthesis by oxidizing benzyl alcohol. To circumvent this, distracter (D) can also begin with an oxidation step, but then every answer choice would begin by oxidizing the benzyl alcohol and students would be given the first step of the answer. Furthermore, distracter (D) shows the only other probable pathway that seems reasonable to students, but it unfortunately reveals the synthetic “twist” that distinguishes the correct answer (A) from the most probable incorrect answer (C). Prompting students to consider that the alcohol may be converted to the Grignard reagent partner instead of the carbonyl partner is the single cue that discriminates between the correct answer (A) and the most common incorrect answer (C), and this clue is given in answer choice (D). Therefore, even creatively written mcqs become exercises in “answer recognition” as students systematically eliminate answer choices using the embedded clues, as opposed to generating an answer by relying only on knowledge and critical thinking.

Figure 9. A synthetic roadmap problem phrased as an mcq.

To illustrate how series response applications circumvent these issues, the same synthesis roadmap question is rephrased and presented in Figure 10. Students are presented with a starting material, a synthetic target, and a reagent table. Reagents in the table are identified by number. Using the CRS device in numeric mode, students enter a sequence of numbers corresponding to the appropriate order of reagents that will accomplish the desired synthetic transformation. In class, stepwise analysis of the top two or three most common number sequences provides immediate feedback, targeting the step(s) where students erred in the logic sequence. Selecting from the nine listed reagents, this synthesis is accomplished in five steps. If students are allowed to use a reagent only once, there are 15,120 possible numerical combinations. Additionally, students are instructed that they may use a reagent in more than one step of the overall synthetic transformation. This greatly increases the number of possible combinations and effectively reduces to zero the likelihood of guessing the correct answer. In our experience, the top three student responses represent thoughtful and reasoned student analyses, which serve as a basis for in-lecture discussion and remediation.

Figure 10. A “series response” organic synthesis question for CRSs involving a multistep transformation.
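The combination counts quoted above are easy to verify. The snippet below is an arithmetic check only, assuming an answer of exactly five steps: without reuse, the count is the number of ordered selections of 5 of the 9 reagents; allowing reuse at every step raises it to 9^5.

```python
from math import perm

print(perm(9, 5))  # 9 * 8 * 7 * 6 * 5 = 15120 five-step sequences with no reagent reused
print(9 ** 5)      # 59049 five-step sequences if any reagent may be reused at each step
```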

Figure 11. A “series response” organic synthesis question for CRSs, starting with acetylene.

To further illustrate the value of this problem format, another “series response” synthesis problem is shown in Figure 11. Student responses are shown in Figure 12.

Figure 12. Student CRS response data for synthesis problem from Figure 11 showing most common responses.

The correct answer sequence is 818372 or 838172, corresponding to deprotonation, alkylation, deprotonation, alkylation, partial syn reduction, and epoxidation. The most common student response for this exercise is 818372. It should be noted that seven responses included in the “other” category contain the answer 838172, which is also correct and differs from the most common response only in the relative order of the alkylations. In all, 26 of the 174 student respondents (15%) answered correctly. The most common incorrect response, 81372, omitted the second deprotonation step. The instructor may take this opportunity to emphasize that the alkylation process is sequential because formation of the acetylide dianion is highly improbable. Responses 8372 and 8172 neglected to perform the second alkylation, and response 1372 omitted forming the carbon nucleophiles through deprotonation. As previously stated, the “other” category of responses is highly populated by assorted combinations of errors that are addressed individually by in-class discussion of the most common responses.
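One way the stepwise comparison described above might be automated is sketched below. This is a hypothetical illustration, not the authors' software; it assumes the two accepted sequences discussed for Figure 12 and reports where each response first departs from the better-matching accepted answer and how many steps are missing.

```python
ACCEPTED = ("818372", "838172")  # the two accepted reagent orders discussed for Figure 12

def first_mismatch(a: str, b: str) -> int:
    """Index of the first position at which two sequences differ."""
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return i
    return min(len(a), len(b))

def diagnose(response: str) -> str:
    """Report where a reagent-sequence response first departs from an accepted answer."""
    if response in ACCEPTED:
        return "correct"
    # Compare against whichever accepted sequence shares the longest prefix with the response.
    best = max(ACCEPTED, key=lambda answer: first_mismatch(answer, response))
    step = first_mismatch(best, response) + 1
    note = f"first departs from {best} at step {step}"
    if len(response) < len(best):
        note += f" ({len(best) - len(response)} step(s) omitted)"
    return note

for resp in ["818372", "81372", "8372", "8172", "1372"]:
    print(resp, "->", diagnose(resp))
```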



CONCLUSIONS

The authors believe that effective teaching occurs when instructors pinpoint sources of student confusion in real time and then provide immediate corrective feedback. Small class sizes are well suited to this kind of student involvement; large classes make it much more challenging. Organic chemistry presents special challenges because students are required to conceptualize and successfully navigate logic strings. CRSs effectively address many of the challenges accompanying large lecture courses.18 However, the use of multiple choice questions in conjunction with CRSs has several inherent limitations. Multiple choice questions often “cue” students to the correct answer. Additionally, random guessing by unengaged students can cause instructors to devote valuable lecture time to clarifying random answers from ill-prepared students who respond only to receive class participation credit. Series response applications using CRSs permit instructors to investigate with students each branch of the “decision tree” involved in a multistep problem-solving strategy. They allow instructors to walk stepwise with students through the problem, precisely pinpointing misconceptions at each specific step. In-class analysis of the four or five most common student responses represented in the histogram allows instructors to devote valuable lecture time to clarifying “real” student misconceptions. Finally, series response problems are conceptually more challenging. Consequently, instructors do not forfeit or oversimplify content or rigor when using CRSs. The multifaceted nature of the series response question format allows several concepts to be discussed concurrently. Because the time allotted to a series response question usually addresses several important concepts rather than a single topic, it is, in our experience, time well spent. It is our observation that series response CRS exercises actively engage students in large lecture settings; in fact, our students often remain in the classroom after class time to continue working more of these problems. Initial data show that CRS series response questions improve student performance on ACS exam questions that cover similar concepts. Hence, creative development and implementation of CRSs to engage and actively involve students in the learning process successfully transforms the large lecture classroom into a smaller, more intimate, and responsive learning environment.

AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected].

Notes
The authors declare no competing financial interest.

REFERENCES

(1) Lewis, S. E.; Lewis, J. E. Departing from Lectures: An Evaluation of a Peer-Led Guided Inquiry Alternative. J. Chem. Educ. 2005, 82 (1), 135−139.
(2) Fies, C.; Marshall, J. The C-3 Framework: Evaluating Classroom Response System Interactions in University Classrooms. J. Sci. Educ. Technol. 2008, 17 (5), 483−499.

(3) Bunce, D. M.; VandenPlas, J. R.; Havanki, K. L. Comparing the Effectiveness on Student Achievement of a Student Response System versus Online WebCT Quizzes. J. Chem. Educ. 2006, 83 (3), 488−493.
(4) Holme, T. Using Interactive Anonymous Quizzes in Large General Chemistry Lecture Courses. J. Chem. Educ. 1998, 75 (5), 574−576.
(5) Schuwirth, L. W. T.; van der Vleuten, C. P. M. Different Written Assessment Methods: What Can Be Said about Their Strengths and Weaknesses? Med. Educ. 2004, 38 (9), 974−979.
(6) Wimpfheimer, T. Chemistry ConcepTests: Considerations for Small Class Size. J. Chem. Educ. 2002, 79 (5), 592.
(7) Woelk, K. Optimizing the Use of Personal Response Devices (Clickers) in Large-Enrollment Introductory Courses. J. Chem. Educ. 2008, 85 (10), 1400−1405.
(8) Sauers, A. L.; Morrison, R. W. In-Lecture Guided Inquiry for Large Organic Chemistry Classes. Abstracts of Papers; American Chemical Society: Washington, DC, 2007; Vol. 233, p 268.
(9) Ruder, S. M.; Straumanis, A. R. A Method for Writing Open-Ended Curved Arrow Notation Questions for Multiple-Choice Exams and Electronic-Response Systems. J. Chem. Educ. 2009, 86 (12), 1392−1396.
(10) Benedict, L.; Pence, H. E. Teaching Chemistry Using Student-Created Videos and Photo Blogs Accessed with Smartphones and Two-Dimensional Barcodes. J. Chem. Educ. 2012, 89 (4), 492−496.
(11) Cooper, M. M.; Underwood, S. M.; Hilley, C. Z.; Klymkowsky, M. W. Development and Assessment of a Molecular Structure and Properties Learning Progression. J. Chem. Educ. 2012, 89 (11), 1351−1357.
(12) Flynn, A. B. Development of an Online, Postclass Question Method and Its Integration with Teaching Strategies. J. Chem. Educ. 2012, 89 (4), 456−464.
(13) Flynn, A. B. Developing Problem-Solving Skills through Retrosynthetic Analysis and Clickers in Organic Chemistry. J. Chem. Educ. 2011, 88 (11), 1496−1500.
(14) Mc Goldrick, N. B.; Marzec, B.; Scully, P. N.; Draper, S. M. Implementing a Multidisciplinary Program for Developing Learning, Communication, and Team-Working Skills in Second-Year Undergraduate Chemistry Students. J. Chem. Educ. 2013, 90 (3), 338−344.
(15) Milner-Bolotin, M. Increasing Interactivity and Authenticity of Chemistry Instruction through Data Acquisition Systems and Other Technologies. J. Chem. Educ. 2012, 89 (4), 477−481.
(16) Murphy, K. Using a Personal Response System To Map Cognitive Efficiency and Gain Insight into a Proposed Learning Progression in Preparatory Chemistry. J. Chem. Educ. 2012, 89 (10), 1229−1235.
(17) Muthyala, R. S.; Wei, W. Does Space Matter? Impact of Classroom Space on Student Learning in an Organic-First Curriculum. J. Chem. Educ. 2012, 90 (1), 45−50.
(18) Caldwell, J. E. Clickers in the Large Classroom: Current Research and Best-Practice Tips. CBE Life Sci. Educ. 2007, 6 (1), 9−20.
