Article pubs.acs.org/jchemeduc

Development of an Online, Postclass Question Method and Its Integration with Teaching Strategies

Alison B. Flynn*
Department of Chemistry, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada

ABSTRACT: A unique method was devised that integrated online postclass questions, clickers, a tablet, and active learning strategies in each class. This method connected in- and out-of-class learning; provided prompt, regular, and relevant feedback to students and to the instructor; encouraged students to spend time on task; and enabled class time to be focused on topics that students found particularly challenging. The effects on student learning and experience in large organic chemistry courses, as determined from students' assignment, midterm, and exam scores as well as student surveys, are discussed. Four years of data identified statistically significant improvements in students' scores at the question level in classes that used the postclass question method as compared to classes that did not.

KEYWORDS: First-Year Undergraduate/General, Second-Year Undergraduate, Organic Chemistry, Collaborative/Cooperative Learning, Inquiry-Based/Discovery Learning, Internet/Web-Based Learning, Multimedia-Based Learning, Problem Solving/Decision Making, Student-Centered Learning



INTRODUCTION

Chemistry has traditionally been seen as a complex and difficult subject for students to master. In a large chemistry course with an enrollment exceeding 100 students, a few additional factors can be deterrents to student learning, including a passive lecture style of instruction; impersonal classes in which students can feel isolated; a diverse student population; and a wide variation in students' abilities and learning styles.1−4 There are also the additional difficulties of providing individualized feedback and of connecting in- and out-of-class learning. Much research has been devoted to improving student learning in large classes; some strategies that can be used effectively in large classes are described below.5−14

Many techniques used to promote learning in large chemistry classes are straightforward to implement and are "low-tech". In the think−pair−share technique, students are asked to answer a question individually first, then to discuss it with their peers, and finally to share their conclusions with the class.15 Another simple technique, predict−observe−explain (POE), asks students to predict the outcome of some event, justify their predictions, describe what they see happen based on presented or acquired results, and then reconcile any conflicts between what they predicted and what they observed.16,17 Peer instruction is a method that involves students more in their own learning than a standard lecture does.18 In process-oriented, guided-inquiry learning (POGIL), students work on specially designed guided-inquiry materials in small, self-managed groups.19−24 Peer-led team learning (PLTL) is another method that promotes active learning, in which a peer-led workshop replaces either a lecture or a tutorial session.25−27

© 2012 American Chemical Society and Division of Chemical Education, Inc.

Outside the classroom, reading assignments, problem sets, group study sessions, and professor office hours promote student learning and group work, encourage time on task, respect the diversity of the student population and of learning styles, and provide opportunities for faculty−student contact. The use of technology is being explored extensively as a means of improving student learning, particularly in large classes,1,4,28−30 and of connecting with a technologically adept student population. Common technologies used in large chemistry classes include an overhead projector, pen-enabled technologies (e.g., tablets),31−37 personal response devices (clickers), student computers, and presentation software such as PowerPoint.38−42

Personal response devices, or clickers, have emerged as highly effective instructional instruments, and their use generally receives positive feedback from students.1−4,29,30,43−57 The instructor poses a question, and students either give a numerical answer or select an answer from multiple choices. Student responses can be collected anonymously, if desired, and can be graded on the basis of either participation or correctness. After a given time, student answers are displayed as a histogram, giving both the instructor and the students immediate feedback. By collecting responses with clickers, the instructor sees a truly representative overview of responses and not simply those of the vocal minority (who might correctly or incorrectly portray understanding or misunderstanding of a topic).52 Many possible uses of these data are salient: for example, if an instructor sees that the majority of the class has responded correctly, there is no need to dwell on that topic.

Published: February 8, 2012

dx.doi.org/10.1021/ed101132q | J. Chem. Educ. 2012, 89, 456−464


Alternatively, if an instructor sees numerous incorrect responses, some options include explaining the topic in further detail, leading a discussion among students, asking students to discuss among themselves, retesting, and so on. The use of clickers helps to create a safe environment that enables all students, even the shyest, to participate. In organic chemistry, the types of questions that can be created are expanding beyond the standard multiple-choice options to questions that address syntheses53 and mechanisms.54 These question types still rely on predrawn structures, which preclude students from drawing and submitting compounds for evaluation and feedback.

Many instructors are concerned that the degree of coverage in their classroom will suffer if they implement active learning strategies. While this is often true, options exist to address the issue of coverage.1,3,4,49,52 One option is not to cover everything listed on the syllabus in class; the instructor can require that students learn some material independently. Appropriate concepts that could be assigned for independent learning include introductory material, background information, definitions, or straightforward concepts.58 If desired, the information could be given in a handout, or students could be given a reference to a textbook or other source.4 Depending on their backgrounds, students might expect to be responsible only for material covered explicitly in class. In that case, it would be important to clearly communicate expectations to students and to explain why they are being asked to learn some material independently. Another option is to eliminate topics from the course completely. The content that remains can then be explored more deeply, students can develop higher levels of thinking, and, if students are taught relevant research skills, they can research additional information independently.1,3,4,49,52

The methods discussed thus far have addressed primarily in-class learning. Because "time on task" is critical to learning (i.e., learning takes time), it is essential that learning continue beyond the classroom.59 Typically, homework, projects, and readings are assigned as the out-of-class component of a course, although it is difficult to give students timely feedback for this type of work, particularly in large classes. In organic chemistry, it is particularly important that students be able to draw chemical structures, and until recently it has been very difficult to evaluate this type of work and give feedback on it in large classes. Online organic chemistry homework is emerging as a method of quickly evaluating student work and tracking results. Online organic chemistry homework programs include Synthesis Explorer,60 ACE Organic,61 and OWL.62 Instructors first create an assignment in the program, using questions from a database or questions of their own. Students draw structures in response to the questions posed, and the program gives them immediate feedback, telling them whether they are correct, with the option of giving hints for incorrect responses. Instructors can also see the students' final answers in the gradebook. The advantage of having students draw their own structures is significant, because they really have to understand how compounds are constructed. Students are often able to recognize a correct answer but do not understand the fundamentals well enough to draw compounds or reagents properly. By looking at the students' answers in the gradebook, common errors are revealed to the instructor. In this paper, a new strategy is described that integrates online homework with each class and that incorporates feedback derived from students' answers into the course.

Additionally, while online homework programs are used in a number of disciplines and aim to improve student learning, only a few studies have measured their impact.63−69 Most reports describe student satisfaction survey results, which are usually very positive. Smyth additionally described usage statistics for a nomenclature tutorial program.63 Penn et al. reported that higher scores were achieved after implementation of an online homework program.64 Other researchers have compared the grades of students who used a homework program with the grades of students who chose not to do so.60,65 In general chemistry courses, statistical analysis of student grades has shown an improvement when online homework was employed.66,67 Only one report to date has documented the effect of online organic chemistry homework on student achievement, in which a slight correlation was demonstrated between students' homework scores and exam scores.68

A method was devised that integrated postclass questions, in which students drew their answers using an online homework program,61 with clickers, a tablet, and active learning strategies such as the think−pair−share technique. This successful method, which provided prompt, regular, and relevant feedback to students, helped them learn to spend time on task, focused class time on topics that students found particularly challenging, and connected in- and out-of-class learning, is described below. The effects on student learning in four large organic chemistry courses were analyzed using students' assignment, midterm, and exam scores as well as student surveys, and are also discussed.



THE COURSES

The Organic Chemistry I course at the University of Ottawa is given in the winter of the students' first year and consists of four to five sections (one or two in French; three in English) of 120−420 students each, for a total of approximately 1300 students. The Organic Chemistry II course is given in the fall of the students' second year and comprises three large sections (one in French; two in English) of 250−420 students each, for a total of approximately 1050 students. Every class is 80 min in length. The components used to calculate course grades in the sections taught by the author are summarized in Table 1.

Table 1. Summary of Dates of Instruction, Class Sizes, and Evaluation Components for Organic Chemistry I and II

| Course | Year | Semester | Class Size | ACE Quizzes | ACE Postclass Questions | Clickers | Midterm 1 | Midterm 2 | Laboratory | Final Exam |
|---|---|---|---|---|---|---|---|---|---|---|
| Organic Chemistry II | 2008 | Fall | 705^a | 5% | Bonus 1%^b | Not used | 10−20%^e | 20−40%^e | N/A | 35−65%^e |
| Organic Chemistry I | 2009 | Winter | 610 | 4% | Bonus 1%^b | 4%^d | 10−20%^e | 10−20%^e | 15% | 37−57%^e |
| Organic Chemistry II | 2010 | Fall | 415 | 8% | 2%^c | 5%^d | 10−20%^e | 10−20%^e | N/A | 45−65%^e |
| Organic Chemistry I | 2011 | Winter | 390 | 8% | 2%^c | 5%^d | 10−20%^e | 10−20%^e | 15% | 30−50%^e |

^a Two sections (280, 420). ^b Marks were accorded for participation. ^c The marking on each question was as follows: 100% for a correct answer; 90% for an incorrect answer; 0% for no attempt. ^d Marks were accorded for participation. ^e The weighting was individually determined such that the student received the highest possible final grade.
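Footnote e describes a per-student weighting chosen, within the stated ranges, to maximize each student's final grade. The paper does not say how this was computed; for a linear grade formula it reduces to a small optimization with a greedy solution: start every component at its minimum weight, then assign the remaining weight to the student's best-scoring components first. A sketch under that assumption (the component names, the student's scores, and the helper `best_weights` are all hypothetical):

```python
def best_weights(scores, bounds):
    """Given component scores (0-1) and (min, max) weight bounds in percent,
    choose weights summing to 100 that maximize the weighted final grade.
    Greedy: start at the minimums, then give the remaining weight to the
    highest-scoring components first (optimal for a linear objective)."""
    weights = {k: lo for k, (lo, hi) in bounds.items()}
    leftover = 100 - sum(weights.values())
    assert leftover >= 0, "minimum weights exceed 100%"
    for k in sorted(scores, key=scores.get, reverse=True):
        lo, hi = bounds[k]
        add = min(hi - lo, leftover)
        weights[k] += add
        leftover -= add
    assert leftover == 0, "maximum weights cannot reach 100%"
    grade = sum(weights[k] * scores[k] for k in scores)
    return weights, grade / 100

# Hypothetical student in the 2010 Organic Chemistry II section;
# fixed components from Table 1, flexible ranges for midterms and final.
scores = {"quizzes": 0.90, "postclass": 1.00, "clickers": 1.00,
          "midterm1": 0.55, "midterm2": 0.70, "final": 0.80}
bounds = {"quizzes": (8, 8), "postclass": (2, 2), "clickers": (5, 5),
          "midterm1": (10, 20), "midterm2": (10, 20), "final": (45, 65)}
w, g = best_weights(scores, bounds)
```

For this hypothetical student, all of the leftover weight goes to the final exam, the strongest of the flexible components, so the final is weighted at 65%.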


ACE online homework assignments were given weekly, and one or two ACE online "postclass questions" were asked after every class.61 The postclass questions, described more fully in the next section, were introduced in the second or third week of classes, once the students were familiar with the course setup and the ACE program. Students were also regularly given optional assignments that were not worth credit and for which the answers were posted on the course Web site. Clickers were used in an organic chemistry course for the first time at the University of Ottawa in 2009.33 The marks for clicker questions were given on the basis of participation, not correctness, particularly so that students would feel confident submitting their own answers when asked to answer individually without discussing with their colleagues. PowerPoint was used to project the clicker questions and responses, and a USB RF receiver collected students' answers. Between four and six clicker questions were asked per class.70 The lectures were conducted using a tablet31,32 to record notes and points of discussion. The learning opportunities and technologies employed in these courses are summarized in Table 2.

Figure 1. An example of an online postclass question assigned via ACE Organic. Reproduced with permission from Pearson Education, Inc.

Table 2. Summary of the Active Learning Methods and Technologies Employed

| In-Class Active Learning Methods | Out-of-Class Activities | Technology Employed |
|---|---|---|
| Think−Pair−Share^a | Laboratory (3 h/week)^b | Online homework program (ACE Organic) |
| Minute Paper | Group tutorials | Clickers^c |
| Question Box | Online discussion forum | Tablet |

^a Asked using clickers. ^b The laboratory component of Organic Chemistry II is a separate course, although the majority of students take both the lecture and laboratory components. ^c From 2009 onward.

Figure 2. The display in ACE’s gradebook showing one student’s attempt. Reproduced with permission from Pearson Education, Inc.

The think−pair−share technique was used approximately twice per class, using clickers as of 2009. Minute papers and question boxes were used every few weeks.71 Specifically, question boxes were used one week before each midterm and exam, and the submissions were used to design a review session for students. The students also attended weekly laboratory sessions61 and optional group tutorial sessions.

DISCUSSION

Online postclass questions were developed to encourage students to review and apply their newfound knowledge soon after each class, and to enable the instructor to gauge students' understanding. The postclass question, which was related to the concepts learned in a given class, was made available online in ACE Organic61 after that class. Each question was due 1 h before the start of the following class, and the students had one or two attempts to answer it. An example of an ACE postclass question is shown in Figure 1. The ACE Organic program provided feedback directly to students, telling them whether their answers were correct or incorrect and giving them hints. The program's gradebook reported the percentage of correct responses, the total number of responses, and, in the cases when multiple attempts were allowed, the average number of tries. Additionally, the last answer drawn by each student was recorded in the gradebook (Figure 2), and the instructor could review these answers. While ACE cannot currently produce a histogram of results, it took only a few minutes to review the incorrect answers and thereby identify the most common errors.

In the year that the postclass question method was first developed (2008), the answer to each postclass question was discussed in the following class, along with the most common incorrect answers. The students had the opportunity to explain why certain answers were incorrect, and this process gave the instructor the opportunity to address many common errors, misunderstandings, and misconceptions. For example, in the question shown in Figure 1, some students had not taken note of the stereochemical implications of the question, some had used the solvent as the nucleophile (and left the oxygen protonated in the answer), and still others had assumed that hydroxide was the nucleophile (a common error). While the students' common errors could sometimes be predicted based on past exam results, using postclass questions and clickers in this way uncovered unanticipated issues that could be addressed as soon as they arose.

On the basis of the percentage of students who completed each postclass question (66−75%) and on the information gathered from the students' answers, important benefits accrued from asking at least one postclass question per class using ACE. The postclass questions prompted students to review the last class's material, to reflect on what they had learned, and to apply their newfound knowledge; additionally, they were encouraged to work together on problems. For the instructor, reviewing the students' answers drawn online was more informative than seeing the distribution of answers from multiple-choice questions or hearing from only a small subset of student answers during class. Asking students to draw their answers also addressed the learning styles of the more kinesthetic learners; for example, questions pertaining to stereochemistry, for which students could use molecular models (e.g., draw the following compound in its reactive conformation), were particularly suitable.

In the following semester (2009), with the Organic Chemistry I class, postclass questions were used and clickers were also incorporated. For example, as an online postclass question, students were asked to draw the organic substitution product of the reaction shown in Figure 3 (the answer choices were not given to the students on ACE; they had a blank page upon which to draw their answers).

Figure 3. A clicker question that was designed based on common student answers on ACE to the question: "Draw the major organic substitution product for the reaction shown."

The instructor reviewed the students' answers prior to the next class. Initially, only 53% of students drew the correct answer; this low result demonstrated that additional time should be dedicated to learning about this reaction. At the beginning of the following class, the multiple-choice question in Figure 3, which showed the correct answer along with three commonly submitted incorrect answers, was given as a clicker question. Students were asked to answer after having 1 min for discussion; this time, 83% of the class obtained the correct answer. Although recognizing a correct answer requires a lower level of thinking than drawing a correct answer, according to Bloom's taxonomy,13 this was nevertheless a substantial improvement: t(316) = 6.797, p < 0.0001.

Occasionally, when only 30−70% of students obtained the correct answer on a postclass question that was asked again in class as a follow-up clicker question, an important opportunity for peer learning presented itself, using, for example, the think−pair−share technique. When students could not even recognize a correct answer, an underlying misconception or misunderstanding was often the cause. Students were asked to pair up and to convince each other of the correct answer; the clicker question was subsequently repeated. The majority of the class typically identified the correct answer when the question was asked a second time.72 In general, students who know the correct answer are better able to explain their answer and defend their arguments, thereby convincing their peers. This finding is consistent with Mazur's observations using peer instruction.18 If less than 30% of the class answered correctly, there were not enough students capable of teaching their peers. In these instances, further exploration or analysis of the question was directed by the instructor before moving to peer-led teaching. For example, a review of the key features of a general reaction might be undertaken, or the students might be asked to build a model of the compounds being studied. If more than 70% of the class answered correctly, many students became disinterested in the peer-learning process (i.e., there were not enough people remaining to be taught).18 In that latter scenario, options include explaining the correct answer for the benefit of those who had not understood, or giving a "challenge question" to those who had understood (asking a few students to post their answers on a discussion board after class) while asking the students who had not understood to discuss further.

Regardless of the type of technology used, it was important to ask "good" questions. Effective questions asked the students to apply newly acquired knowledge, to connect knowledge acquired in different parts of the course, to use their knowledge in a new way, or to address common misunderstandings (e.g., that numbered steps indicate separate reactions and that the reagents are not all combined at the outset of the reaction). Ineffective questions, either too easy (such as a simple variation of an R group) or too difficult, resulted in a loss of student attention and were not a productive use of class time.

The progression from postclass questions, to in-class clicker questions, to small-group discussions and the subsequent reuse of clickers, is described in the following example from Organic Chemistry I (2009). Students were asked to draw the first organic intermediate obtained when 1-chloro-1-methylcyclohexane was dissolved in methanol. The four most common responses that were drawn online by students in ACE are shown in Table 3. Only 27% of students drew the correct answer, C, on ACE, with the other most common answers being A, B, or D.

Table 3. Distribution of Results in Response to This Question in Organic Chemistry I, 2009

a Data from two sections. bData from one section. cThe question used a different substrate and solvent than the ones shown in the question above, but the reaction also proceeded through an SN1 mechanism. dOrganic Chemistry II (data from two sections).


When the question was repeated in the following class and students were asked to respond individually using their clickers, 55% of the class obtained the correct answer: t(766) = 8.441, p < 0.0001. A discussion was initiated in which students were asked to state the factors involved in their decision; these factors were suggested and listed without bias using a tablet: tertiary α carbon; SN2; nucleophile; good leaving group; and SN1. Students were then asked to pair up and discuss this problem, and after this period of peer discussion, the clicker question was repeated. This time, almost 90% of the class identified the correct answer. Because the marks were allotted on the basis of participation and not correctness, it is unlikely that students were simply copying their “smarter” peers. Additionally, it has been observed both in this course73 and elsewhere18 that students can usually identify the correct answer after peer discussion even when it is not the predominant response initially. The observed improvement is notable (t(652) = 10.826, p < 0.0001), particularly because there was no involvement on the part of the instructor, and is consistent with Mazur’s findings with peer instruction.18 Importantly, the students who participated in the subsequent class discussion were able to explain why they had originally obtained the incorrect answer. Some had misread the question and thought that they should draw the final answer or that water was the solvent instead of methanol. Others had incorrectly decided that this reaction proceeded via an SN2 mechanism. By doing this exercise, the importance of reading the question carefully and of analyzing the components of any question was emphasized. In all these situations, a tablet was used to facilitate the discussion. 
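The before/after percentages in passages like this one (27%, then 55%, then nearly 90% correct) are comparisons of two proportions of correct answers; the paper reports t statistics without detailing the computation. A standard pooled two-proportion z-test over such data can be sketched as follows, with hypothetical sample sizes (only the percentages and degrees of freedom appear in the paper):

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test: did the fraction of correct
    answers change significantly between two occasions?"""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    return (p2 - p1) / se

# Hypothetical sample sizes: 159 respondents on each occasion,
# comparing 53% correct before class with 83% correct after discussion.
z = two_proportion_z(0.53, 159, 0.83, 159)
```

With roughly 160 respondents per round, a jump from 53% to 83% correct gives z near 5.7, far beyond conventional significance thresholds.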
This approach also strove to address the skills and needs of students with different learning styles, particularly kinesthetic and experiential learners, who might otherwise be disadvantaged in a lecture-style classroom that is most suitable for auditory and visual learners.74 Through questioning, students could become more actively involved in their learning. Many questions were three-dimensional in nature, and most students needed to build a model in order to answer them: a highly tactile process.74

A comparison of the final exam results between 2008 and 2009 for SN1 questions that were similar to the one shown in Table 3 suggested an improvement in learning outcomes. While the SN1 question shown in Table 3 was given as a postclass question in the Organic Chemistry I class in 2009, a question of this type was not asked as a postclass question in the Organic Chemistry II class in 2008. An approximately equal amount of time was spent in class on the topic of the SN1 reaction in both courses.75 In 2009, 80% of students were able to correctly draw the mechanism, including the carbocation intermediate, for a different substrate than the one they had seen in the postclass question. This was in contrast to the 67% of students in 2008 who were able to correctly draw the first intermediate: t(1281) = 5.356, p < 0.0001.

Asking postclass questions enabled the instructor to provide students in a large class with relevant and timely feedback; this advantage is highlighted through the following example. On the Organic Chemistry II final exam in 2008, one question asked students to identify an unknown (Figure 4) using the molecular formula of the compound and the NMR spectrum, which were both provided. The students were accorded one point for each fragment or piece of information (clue) that they identified using the data provided (e.g., degrees of unsaturation, presence of functional groups such as an alkyne, etc.).
The final structure students proposed was scored 0, 1, or 2 out of 2 points possible,

Figure 4. Structure to be identified on an NMR question on the Organic Chemistry II final exam, 2008.

based on the fragments that the students had identified in the question (Table 4).

Table 4. Marking Scheme for an NMR Structure-Determination Question

| Component of Question | Mark Allocation | Example |
|---|---|---|
| Fragment Identification (Part A) | 1 point per fragment or piece of information | Degrees of unsaturation, an isopropyl group, etc. |
| Final Structure (Part B) | 2/2: correct structure based on the fragments the student identified; 1/2: one error; 0/2: multiple errors | 1/2: n-butyl acetate versus ethyl n-butyrate; 0/2: n-butyl acetate versus ethyl propionate |

An analysis of student answers on the preceding NMR structure-determination question revealed an overall average mark of 69.2% and a median mark of 66.7% for that question (Figure 5). The average mark for part A, the section of the question dedicated to fragment identification, was 74.0%, with a median mark of 80.0% (Figure 5). Furthermore, analysis of the marks for the final structure (part B of the question) revealed an average mark of 45% and a median mark of 0 (Figure 5). The distribution of marks showed that the majority of students either drew the correct (accepted) answer (41.2%) or drew a structure that very poorly matched the fragments that they themselves had identified (47.0%); almost 4% of students did not even attempt to draw a final structure. These results suggested that students were able to identify the fragments in a structure but had difficulty putting the fragments together in order to identify the final compound. This information was helpful to the instructor in designing lessons for future students, even though the students who had taken the final exam for the course described above could not benefit from the improved lesson design.

Figure 5. Summary of student marks for a structure-determination NMR question on the final exam in Organic Chemistry II, 2008 (N = 619).
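Question-level analyses like the one above reduce to simple summary statistics over a gradebook export. A minimal sketch (the marks below are invented for illustration; the real 2008 data set had N = 619):

```python
import statistics

def summarize(percent_marks):
    """Mean and median of per-student percentage marks for one question."""
    return {"mean": statistics.mean(percent_marks),
            "median": statistics.median(percent_marks)}

def part_b_distribution(part_b_scores):
    """Fraction of students at each 0/1/2 score for the final structure."""
    n = len(part_b_scores)
    return {s: part_b_scores.count(s) / n for s in (0, 1, 2)}

# Invented marks for four students
overall = summarize([66.7, 50.0, 83.3, 75.0])
dist = part_b_distribution([0, 2, 2, 1])
```

Comparing the part-A summary against the part-B distribution is what surfaced the gap reported above: high fragment-identification marks alongside a median of zero on the final structure.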


To probe the students' understanding during the NMR unit of the course in 2010, an online postclass question, similar to the structure-determination NMR question described above, was assigned to students. While 79% of students attempted this question, only 25% of those students drew the correct or accepted structure (Figure 6); the majority of students drew structures possessing incorrect fragments or structures in which the fragments had been connected incorrectly. In the following class, the students were given time to work together to solve this problem, and the lesson concluded with the students and the instructor describing additional strategies for solving problems such as the one above.

Figure 6. Accepted answers for an NMR structure-determination ACE postclass question, 2010.

A structure-determination NMR question on the final exam in 2010 provided a means of comparison to the 2008 group, which did not have an NMR postclass question. The structure-determination NMR question asked was different from questions that the students had previously seen (answer shown in Figure 7). The distribution of the marks from the two classes for this question is shown in Figure 8.

Figure 7. Structure to be identified in an NMR question on the Organic Chemistry II final exam, 2010.

Figure 8. Comparison of the distribution of marks for an NMR structure-determination exam question in Organic Chemistry II between 2008 and 2010.

Despite the 2010 final exam question being more difficult than the 2008 final exam question,76 there was not a significant difference between the 2008 average of 69.2% and the 2010 average of 69.6%, t(739) = 0.13, p = 0.4481; one would have expected a decrease in student performance because of the more difficult question. The median score was higher in the 2010 group, both for the full question (75.0% in 2010 versus 66.7% in 2008) and for the mark for the final structure (50% in 2010 versus 0% in 2008). Additionally, almost all students (99.4%) made an attempt at a response in 2010, compared to only 96.2% in 2008: t(887) = 3.703, p = 0.0001. Improvement was also observed in the distribution of marks for the final structure identification, that is, part B of the question (Figure 9). Fewer students scored a zero for the final structure: 47.0% in 2008 versus 38.0% in 2010, t(824) = 2.819, p = 0.0025; 61.5% of students obtained 1−2 points for the final structure in 2010, compared to only 49.3% of students in 2008, t(823) = 3.813, p = 0.0001.77

Figure 9. Distribution of marks for the final structure identification on an NMR structure-determination question in Organic Chemistry II.

Overall, student performance improved at the question level for the examples described above after the integration of postclass questions with in-class active learning strategies. Similar results were obtained for other postclass questions throughout each semester. However, a comparison of overall final exam marks, final course grades, and student GPAs did not reveal statistically significant differences between the classes.78 This is not surprising, given that the Organic Chemistry I and II courses cover different topics and have different evaluation requirements (e.g., there is a laboratory component in Organic Chemistry I), as well as the many other factors that can influence final exam and course grades, even between different sections of the same course. In addition, participation rates and postclass question marks improved following a modification of the weighting of the postclass questions in the students' final grades, when comparing the same courses between years (Table 5). In 2010, the weighting of the postclass questions was increased to 2% of the students' final grade from being worth bonus marks only.

STUDENT FEEDBACK

The student feedback provided on optional, anonymous, online surveys was extremely positive, as seen from the summary of survey responses of the Organic Chemistry II section in 2010 shown in Figure 10 (response rate of 80.3%). The students overwhelmingly responded that the online postclass questions helped them learn, that they reviewed their class notes regularly because of the postclass questions, and that they gave a reasonable effort to answer the questions. Furthermore, students' answers to survey questions revealed that they were

dx.doi.org/10.1021/ed101132q | J. Chem. Educ. 2012, 89, 456−464

Journal of Chemical Education

Article

than one course, selling the clicker to another student after the course, and using a rebate coupon offered by many publishers are ways for students to minimize costs. In addition, an allocation of a larger percentage of the student’s grades to clickers and ACE could more greatly reflect the importance of these learning activities. It is also important to emphasize that the students are not paying for a percentage of their mark, but rather are buying a tool that will support their learning. Other comments made by students related to the use of the tablet, as in this example: The method by which the notes were taken in class really helped in the learning process. I think it would be really effective if say in physics or math they used the same method as opposed to the PowerPoint presentation or the blackboard. Overall, the extremely positive student comments and opinions support the continued use of this method.

Table 5. Participation Rates and Average Scores on Postclass Questions Average Participation Rate, Postclass Questions, %

Average Score, Postclass Questions, %

I

61.4a

63.6b

I

68.5a

70.6b

II

65.8c

68.0d

II

75.1c

78.1d

Course (N) Organic Chem. 2009 (604) Organic Chem. 2011 (390) Organic Chem. 2008 (705) Organic Chem. 2010 (413) a

t(20) = 1.84, p = 0.0399. bt(816) = 3.31, p = 0.0005. ct(31) = 2.45, p = 0.0101. dt(938) = 5.60, p < 0.0001.



CONCLUSIONS The development and integration of postclass questions asked via an online homework programwith clickers, tablet technology, and active in-class learning methods was described. The integration of online postclass questions with clickers and active learning strategies was successful in a variety of chemistry contexts, including mechanisms, spectroscopy, and synthesis. Preliminary results at the question level of comparison suggested that students better understand organic chemistry concepts. Student opinions about the method were extremely positive and the overwhelming student recommendation was to use these methods in future courses. The online postclass questions connected in- and out-of-class learning and acted as a thread to link each class together. Notably, this method allowed class time to focus directly on topics that proved challenging for students; encouraged students to regularly review their notes from each class; emphasized the importance of time on task (i.e., that learning takes time); gave specific, regularly, relevant, and timely feedback to the students and to the instructor; addressed the needs of students with different learning styles; and involved students regularly in their own learning.

Figure 10. Student survey results from the Organic Chemistry II course of 2010 (N = 310).

more actively learning in class because of the clickers. The students’ general feedback about clickers and the online homework program (not shown) was also quite positive and was similar to feedback received in other courses that used these technologies.4,30,46,48,68 The positive survey responses pertaining to the postclass questions, clickers, and active learning strategies were also supported by many additional positive comments made by students, such as these examples: Postclass questions really forced me to review the stuf f I learned in class, so I think it was a great idea. I’m usually too shy to raise my hand in class, so definitely having clickers is a great way for everyone who doesn’t like raising their [sic] hands in class, but still enjoy participation. Also going over the [questions] that most people had trouble with in class was also good, because then we could see exactly where we went wrong. The postclass questions were great to encourage at least a little “daily” reviewing, and again, I think it’s great that we were rewarded just for trying. Negative comments were few; mostly they related to the increased cost of the course because of the need to buy and register the clicker, as well as pay for an account to access the online homework program. Using these technologies for more

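The between-year comparisons above are reported as two-sample t tests on group means. As an illustration only (the article does not describe its analysis software, and the per-group standard deviations are not reported, so the numbers below are invented), a pooled two-sample t statistic can be computed from summary statistics like so:

```python
import math

def two_sample_t(mean1, sd1, n1, mean2, sd2, n2):
    """Pooled (Student's) two-sample t statistic and its degrees of
    freedom, computed from group summary statistics."""
    df = n1 + n2 - 2
    # Pooled variance: weighted average of the two group variances
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (mean1 - mean2) / se, df

# Hypothetical example: two class averages (in %) with invented SDs
t, df = two_sample_t(78.1, 18.0, 413, 68.0, 20.0, 705)
print(f"t({df}) = {t:.2f}")
```

In practice, a library routine such as scipy.stats.ttest_ind_from_stats would also return the associated p-value.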


AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].



ACKNOWLEDGMENTS The author thanks Carolyn Hoessler and the reviewers for their insightful suggestions in the preparation of this manuscript.



REFERENCES

(1) Cutts, Q.; Kennedy, G.; Mitchell, C.; Draper, S. Maximising Dialogue in Lectures Using Group Response Systems. In 7th IASTED International Conference on Computers and Advanced Technology in Education, Hawaii, 2004.
(2) Wood, W. B. Dev. Cell 2004, 7, 796.
(3) Knight, J. K.; Wood, W. B. Cell. Biol. Educ. 2005, 4, 298.
(4) Caldwell, J. E. CBE Life Sci. Educ. 2007, 6, 9.
(5) Allison, J. J. Chem. Educ. 2001, 78, 965.
(6) Bowen, C. W. J. Chem. Educ. 1992, 69, 479.
(7) Bunce, D. M. J. Chem. Educ. 2009, 86, 674.
(8) Clouston, L. L.; Kleinman, M. H. J. Chem. Educ. 1999, 76, 60.
(9) Lyon, D. C.; Lagowski, J. J. J. Chem. Educ. 2008, 85, 1571.
(10) Oliver, M. E.; Butler, L. G.; Cordes, A. W. J. Chem. Educ. 1995, 72, 610.
(11) Paulson, D. R. J. Chem. Educ. 1999, 76, 1136.
(12) Chemists' Guide to Effective Teaching; Pienta, N. J., Cooper, M. M., Greenbowe, T. J., Eds.; Prentice Hall: Upper Saddle River, NJ, 2009; Vol. II.
(13) Bloom, B. S. Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain; David McKay Co., Inc.: New York, 1956.
(14) Harpp, D. N. J. Chem. Educ. 1994, 71, 629.
(15) Allen, D.; Tanner, K. Cell Biol. Educ. 2002, 1, 3.
(16) Bodner, G. M.; Hunter, W. J. F.; Lamba, R. S. Chem. Educator 1998, 3, 1.
(17) Gunstone, R. F.; Champagne, A. B. Promoting Conceptual Change in the Laboratory. In The Student Laboratory and the Science Curriculum; Routledge: London and New York, 1990.
(18) Mazur, E. Peer Instruction: A User's Manual; Prentice Hall: Upper Saddle River, NJ, 1997.
(19) Moog, R. S.; Creegan, F. J.; Hanson, D. M.; Spencer, J. N.; Straumanis, A.; Bunce, D. M. POGIL: Process-Oriented Guided-Inquiry Learning. In Chemists' Guide to Effective Teaching; Pienta, N. J., Cooper, M. M., Greenbowe, T. J., Eds.; Pearson Prentice Hall: Upper Saddle River, NJ, 2009; Vol. II, p 90.
(20) Farrell, J. J.; Moog, R. S.; Spencer, J. N. J. Chem. Educ. 1999, 76, 570.
(21) Spencer, J. N. J. Chem. Educ. 2006, 83, 528.
(22) Schroeder, J. D.; Greenbowe, T. Chem. Educ. Res. Pract. 2008, 9, 149.
(23) Lewis, S. E.; Lewis, J. E. J. Chem. Educ. 2005, 82, 1408.
(24) Straumanis, A. Process-Oriented, Guided-Inquiry Learning; TEDxSanMigueldeAllende. http://www.youtube.com/watch?v=XFYVmJYGJe8 (accessed Jan 2012).
(25) Gosser, D. K.; Cracolice, M. S.; Kampmeier, J. A.; Roth, V.; Strozak, V. S.; Varma-Nelson, P. Peer-Led Team Learning: A Guidebook; Pearson Prentice Hall: Upper Saddle River, NJ, 2001.
(26) Gosser, D. K. Peer-Led Team Learning: Scientific Learning and Discovery. In Chemists' Guide to Effective Teaching; Pienta, N. J., Cooper, M. M., Greenbowe, T. J., Eds.; Pearson Prentice Hall: Upper Saddle River, NJ, 2009; Vol. II.
(27) Kampmeier, J. A.; Varma-Nelson, P. Peer-Led Team Learning: Organic Chemistry. In Chemists' Guide to Effective Teaching; Pienta, N. J., Cooper, M. M., Greenbowe, T. J., Eds.; Pearson Prentice Hall: Upper Saddle River, NJ, 2009; Vol. II, p 122.
(28) Harley, D.; Maher, M.; Henke, J.; Lawrence, S. Educause Quarterly 2003, 3, 26.
(29) Boehmler, D.; Smith, A. C. Engaging Students and Promoting Discussion by Using Clickers in Large Science Lectures. In Proceedings of the College of Chemical and Life Sciences Teaching with Technology Conference; University of Maryland: College Park, MD, 2006.
(30) Crossgrove, K.; Curran, K. L. CBE Life Sci. Educ. 2008, 7, 146.
(31) Wacom Home Page. http://www.wacom.com/ (accessed Jan 2012).
(32) Dell Home Page. http://www.dell.com/ (accessed Jan 2012).
(33) eInstruction Home Page. http://www.einstruction.com/ (accessed Jan 2012).
(34) Pargas, R.; Cooper, M.; Williams, C.; Bryfczynski, S. OrganicPad: A Tablet PC-Based Interactivity Tool for Organic Chemistry. In Proceedings of the First International Workshop on Pen-Based Learning Technologies (PLT 2007), Catania, Italy, 2007.
(35) Hulls, C. C. W. Using a Tablet PC for Classroom Instruction. In 35th ASEE/IEEE Frontiers in Education Conference, Indianapolis, IN, 2005.
(36) Tofan, D. C. J. Chem. Educ. 2009, 87, 47.
(37) Derting, T. L.; Cox, J. R. J. Chem. Educ. 2008, 85, 1638.
(38) Harpp, D. N.; Fenster, A. E.; Schwarcz, J. A.; Zorychta, E.; Goodyer, N.; Hsiao, W.; Parente, J. J. Chem. Educ. 2004, 81, 688.
(39) Harris, M. J. Chem. Educ. 2006, 83, 1435.
(40) Meyer, G. M. J. Chem. Educ. 2003, 80, 1174.
(41) Niece, B. K. J. Chem. Educ. 2006, 83, 508.
(42) Johnson, A. E. J. Chem. Educ. 2008, 85, 655.
(43) Broida, J. Classroom Use of a Classroom Response System: What Clickers Can Do for Your Students; Pearson Prentice Hall: Upper Saddle River, NJ, 2007.
(44) POGIL: Process Oriented Guided Inquiry Learning. http://www.pogil.org/ (accessed Jan 2012).
(45) Anderson, D. i>clicker Pedagogy Case Study; University of Colorado: Colorado Springs, CO, no date. http://www.iclicker.com/uploadedFiles/Anderson%20case%20study%20final.pdf (accessed Jan 2012).
(46) Anderson, D. Improving Learning through Clicking: Using an Electronic Audience Response System in General Chemistry. 235th ACS National Meeting, New Orleans, LA, April 6−10, 2008.
(47) Anderson, D. Chem 103 Concept Test and Clicker Survey; University of Colorado: Colorado Springs, CO, no date. http://www.uccs.edu/~faculty/danderso/edtech_clickers.html (select clicker survey; see slide 5) (accessed Jan 2012).
(48) Cummings, R. G.; Hsu, M. J. Coll. Teach. Learn. 2007, 4, 21.
(49) Draper, S. W. Ensuring Effective Use of PRS: Results of the Evaluation of the Use of PRS in Glasgow University, October 2001−June 2002. http://www.psy.gla.ac.uk/~steve/ilig/papers/eval.pdf (accessed Jan 2012).
(50) Duncan, D. Clickers in the Classroom: How To Enhance Science Teaching Using Classroom Response Systems; Pearson: San Francisco, CA, 2005.
(51) Lantz, M. E. Comput. Hum. Behav. 2010, 26, 556.
(52) Simpson, V.; Oliver, M. Using Electronic Voting Systems in Lectures. http://www.tlcentre.net/resource_files/resources/386/ElectronicVotingSystemsin_lectures.pdf (accessed Jan 2012).
(53) Sauers, A. L.; Morrison, R. W. Abstracts of Papers, 233rd National Meeting of the American Chemical Society, Chicago, IL, Mar 25−29, 2007; American Chemical Society: Chicago, IL, 2007; p 838.
(54) Ruder, S. M.; Straumanis, A. R. J. Chem. Educ. 2009, 86, 1392.
(55) Woelk, K. J. Chem. Educ. 2008, 85, 1400.
(56) Boehmler, D. J.; Smith, A. C. Engaging Students and Promoting Discussions by Using Clickers in Large Science Lectures. Teaching with Technology Conference, University of Maryland, College Park, MD, 2006.
(57) Iriarte-Gross, J.; Boehmler, D.; Havanki, K.; Jones, M. M.; Bunce, D. Effect of ConcepTests and Use of Student Response Systems on Student Understanding and Achievement in General Chemistry. 19th Biennial Conference on Chemical Education, West Lafayette, IN, 2006.
(58) It would be important to gauge students' understanding of these concepts and material, for example, through assignments or in-class questions.
(59) Chickering, A. W.; Gamson, Z. F. Am. Assoc. Higher Educ. Bull. 1987, 39, 3.
(60) Chen, J. H.; Baldi, P. J. Chem. Educ. 2008, 85, 1699.
(61) ACE Organic Home Page. http://www.aceorganic.com/ (accessed Jan 2012).
(62) OWL Online Web Learning Home Page. http://www.cengage.com/owl (accessed Jan 2012).
(63) Smyth, T. J. Comput. Assisted Learn. 1987, 3, 99.
(64) Penn, J. H.; Nedeff, V. M.; Gozdzik, G. J. Chem. Educ. 2000, 77, 227.
(65) Dillard-Eggers, J.; Wooten, T.; Childs, B.; Coker, J. Coll. Teach. Methods Styles J. 2008, 4, 9.
(66) Chambers, K. A.; Blake, B. J. Chem. Educ. 2008, 85, 1395.
(67) Richards-Babb, M.; Drelick, J.; Henry, Z.; Robertson-Honecker, J. J. Coll. Sci. Teach. 2011, 40, 81.
(68) Chamala, R. R.; Ciochina, R.; Grossman, R. B.; Finkel, R. A.; Kannan, S.; Ramachandran, P. J. Chem. Educ. 2006, 83, 164.
(69) Brewer, D. S.; Becker, K. J. Comput. Math. Sci. Teach. 2010, 29, 351.
(70) Except in the class of 2008, in which clickers were not used at all.
(71) Angelo, T. A.; Cross, K. P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed.; Jossey-Bass Publishers: San Francisco, CA, 1993.
(72) This was the case even when the correct answer was not the predominant one when the question was first asked.
(73) Flynn, A. B. University of Ottawa, Ottawa, Ontario, Canada. Unpublished work, 2008.
(74) Based on student comments during informal discussions either before or in class.
(75) Also, the Organic Chemistry II class in 2008 (fall) was seeing the SN1 reaction for the second time, having learned it originally in Organic Chemistry I (2008, winter).
(76) Once the students had determined the correct fragments from the data, they had to connect the fragments. In 2008, there were 4 molecular fragments and 6 connection points (incomplete valences). In 2010, there were 5 fragments and 8 connection points, in addition to having to determine the degree and pattern of substitution of the aromatic ring, making the 2010 question more difficult.
(77) A lower proportion of students obtained 2/2; however, the question in 2010 was more difficult and had more potential for errors than that of 2008.
(78) Grade-point average data were provided to the author, stripped of all student identifying information, by the Institutional Research and Planning Office at the University of Ottawa.


dx.doi.org/10.1021/ed101132q | J. Chem. Educ. 2012, 89, 456−464