Using Clickers To Identify the Muddiest Points in Large Chemistry Classes


Daniel B. King*

Department of Chemistry, Drexel University, Philadelphia, Pennsylvania 19104, United States

ABSTRACT: One of the biggest challenges for instruction in large-enrollment introductory courses is identifying points of student confusion. One technique used to address this problem is the muddiest-point card. However, this technique is logistically difficult to implement in large classes. Personal response devices (or clickers) can be used to facilitate this technique. Instead of being given a blank card, students are asked to identify their "muddiest point" from an instructor-provided list of topics covered in that day's class. The use of clickers allows students to maintain the anonymity normally associated with the cards used in the traditional implementation. Incorporation of muddiest-point questions into the first and second terms of a general chemistry course is described. About three-quarters of the students who answered other clicker questions during class also answered the muddiest-point clicker question at the end of class. While approximately 75% of the muddiest-point topics were conceptual in nature, quantitative topics were more likely to be chosen as the muddiest point.

KEYWORDS: First-Year Undergraduate/General, Communication/Writing, Misconceptions/Discrepant Events, Student-Centered Learning

One of the most common challenges for an instructor in the classroom is how to obtain feedback from students. This is especially difficult in large classes, where enrollments above 100 students often prevent the instructor from even learning student names. Many different pedagogical techniques have been employed to determine what students understand in real time, such as minute papers,1–4 muddiest-point cards,5–7 and clickers (e.g., refs 8–11).

The minute paper is a technique used to obtain quick feedback from each student at the end of class. A piece of paper is distributed to each student, and students are instructed to spend "one minute" identifying something they learned or did not understand during class. If the instructor wishes to give credit for these, students put their names on the sheet; however, the papers can be completed anonymously if the instructor is looking for general information about what was particularly clear or unclear that day.

Muddiest point is a similar technique in which students identify the one point least understood from that day's class. These "muddiest points" are usually written on a notecard distributed and collected at the end of class. The cards are generally filled out anonymously in an attempt to get the most honest feedback from the students. Battles5 describes a variation of this technique used with large introductory classes, in which notecards are collected once per week; on each card, students identify a troublesome topic or a potential exam question, and each card is worth extra points.

Clickers (or personal response devices) are electronic devices that allow students to submit answers to questions posed by the instructor during class. These questions are usually in multiple-choice format, although recent advances in technology have enabled the submission of numeric or text answers.

Questions can be asked at the start of class to determine what the students know or after a concept has been discussed to assess student understanding. This feedback is valuable for both the instructor and the students. The instructor can choose to associate each answer with a particular student or can let the answers remain anonymous. Either way, the students never see individual responses from each other, only the pooled results from all students.

■ IMPLEMENTATION

For instructors who teach large lecture classes (e.g., more than 100 students), minute papers and muddiest-point cards often represent a steep logistical hurdle. Distributing, collecting, and processing that many pieces of paper requires a large investment of classroom time and of the instructor's time and energy. Also, the feedback would likely be so diverse that only a very small percentage of topics could be addressed in the subsequent class. Consequently, despite their value, these pedagogical techniques are generally relegated to small classes (e.g., fewer than 30 students).

This article describes the use of clickers to collect "muddiest-point" information during two terms of a general chemistry lecture class, CHEM 101 and CHEM 102, with an enrollment of approximately 230 students each 10-week term. CHEM 101 meets for two lectures per week, and CHEM 102 meets for three lectures per week.




Figure 1. An example of a muddiest-point clicker question before (A) and after (B) student voting.

Lectures were prepared in PowerPoint and delivered in class with a Tablet PC, enabling real-time annotation of the slides. There were generally two to four clicker questions incorporated at various times throughout each 50-min lecture. In this course, each student is assigned a clicker to use for the duration of the term. The clickers are stored on two carts; as students arrive for class, they take their assigned clicker and then return it as they leave. Student responses to the clicker questions were recorded, although student use of these devices was not incorporated into the course grade.

At the end of most lectures (12 of 19 in CHEM 101 and 24 of 29 in CHEM 102), a "muddiest-point" question was displayed on the screen. The question included a list of the most important topics or concepts (usually four to six) covered in that day's lecture, as determined by the instructor (Figure 1). The number of topics included in the question varied from class to class, as a function of the number of topics discussed in that day's lecture. This list was created objectively, with minimal attempt to predict which topics were likely to be confusing. Each day's list could not identify all potential sources of confusion, but it did help identify which of the main topics required additional explanation. For the instructor, creating this list also helps to organize the lecture, because the topics are listed in the order in which they are presented.

The final item on each list allowed students to indicate that they understood everything that day. Students were not given the option to identify other sources of confusion for two reasons: (i) the clickers used in this course did not allow text input, and (ii) selection of "other" would not have provided information that the instructor could have used in the subsequent class. An anonymous setting was used for this slide, so a student's identity could not be matched to his or her muddiest-point selection. This was done to encourage students to respond to the question and to be honest in their responses.

When implementing a new pedagogical technique, particularly one unfamiliar to students, it is important for the instructor to explain his or her motivation and demonstrate how the results will be used. To accomplish this, the purpose of the muddiest-point question was explained in lecture the first two or three times it was used.
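In practice, the end-of-class tallies can be processed in a few lines of code. The sketch below is only illustrative: it assumes the clicker software can export the anonymous responses to a CSV file with a "response" column holding the chosen letter. The file name, export format, and column name are hypothetical, not those of the system used in this course.

```python
# Minimal sketch of tallying an anonymous muddiest-point question from a CSV export.
# The "response" column name and file layout are assumptions, not a specific vendor API.
import csv
from collections import Counter

def tally_muddiest_points(csv_path, topics):
    """Map the day's topic list onto letter choices and tally the anonymous votes."""
    labels = [chr(ord("A") + i) for i in range(len(topics))]   # "A", "B", "C", ...
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["response"].strip().upper()] += 1       # one vote per clicker
    tallies = {topic: counts.get(label, 0) for label, topic in zip(labels, topics)}
    muddiest = max(tallies, key=tallies.get)                   # topic to review next class
    return muddiest, tallies

# Example topic list in the style of Figure 1 (final option = "understood everything"):
topics = ["Rate laws", "Reaction order", "Integrated rate law",
          "Half-life", "I understood everything today"]
# muddiest, tallies = tally_muddiest_points("muddiest_responses.csv", topics)
```

A tally like this takes seconds to generate after class, which is the main logistical advantage over sorting a stack of handwritten cards.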

■ RESULTS AND DISCUSSION

The information obtained from the muddiest-point question was used as a source for a 5-min review at the start of the following lecture.

Usually, the muddiest-point slide from the previous lecture was shown to remind the students which topic had been most confusing. This was followed either by a brief discussion of the topic or by a clicker question about it. The clicker question generally took one of two formats. The most common format was a general question about the topic with the most votes from the previous class (Figure 2). The responses to this follow-up question indicated whether the topic was still a source of confusion (sometimes the time between lectures enabled students to answer their own questions about the topic). If confusion remained (i.e., a large number of students did not answer the question correctly), the review time was spent on the solution to the question. In the second format, the clicker question was phrased to identify which aspect of the topic was confusing to the students (Figure 3). Once the source of confusion was identified, the review addressed that specific concept.

The clicker questions shown in Figure 3 followed the identification of "integrated rate law" as the muddiest point. The question at the start of the subsequent class (Figure 3A) asked the students to identify which equations would be needed to answer an integrated rate law question. While most students correctly identified which equations to use, they were not as successful performing the actual calculation (Figure 3B). This confirmed that the confusion was associated primarily with equation rearrangement and manipulation.

While all students were encouraged to answer the muddiest-point questions, it was clear that some students did not participate in this activity. Because daily attendance was not taken, the percentage of students in class who provided feedback on a given day could not be determined. On average, 84% (first term) and 73% (second term) of the students who answered other clicker questions during class also responded to the muddiest-point question at the end of the same class period. The students who did not respond to the muddiest-point clicker question might not have filled out a muddiest-point card either, so this technology probably did not reduce the number of students providing this feedback relative to the number who would have filled out cards. It is possible that a greater percentage of students provided the electronic feedback than would have provided written feedback, given the minimal effort associated with answering a clicker question.




Figure 2. Muddiest-point clicker results from the previous class (A) and follow-up clicker question in the subsequent class (B).

Figure 3. Two follow-up clicker questions used to identify which aspect of the muddiest point was the source of confusion.
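The kind of manipulation required by the Figure 3B calculation can be illustrated with the first-order integrated rate law; the reaction order and the numerical values used in the actual clicker question are not reproduced in this article, so the following is only a representative example. For a first-order reaction,

\[
\ln[\mathrm{A}]_t = \ln[\mathrm{A}]_0 - kt
\qquad\Longrightarrow\qquad
t = \frac{1}{k}\,\ln\!\left(\frac{[\mathrm{A}]_0}{[\mathrm{A}]_t}\right),
\]

so answering the question requires isolating the logarithmic term and rearranging before substituting values, rather than simply recalling which equation applies.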

In courses with lower enrollments (fewer than 50 students) taught by the same instructor as in this study, where muddiest-point cards have been used, response rates below 50% are common. The low participation is believed to be related to the fact that no points are associated with the activity.

The biggest disadvantage of this technique is that the student responses are limited to the instructor-identified topics listed in the question. In paper implementations of the muddiest point, students are free to identify sources of confusion that the instructor did not consider. Students also ask questions that are related to, or extensions of, a particular topic, which provides the instructor an opportunity to address an interesting application of the topic. There are a couple of benefits, however, to the instructor-created list of topics included in the muddiest-point question. First, it eliminates some of the off-topic comments that often appear on muddiest-point cards when students are free to write anything they want. Second, the list of topics provides a summary of the day's lecture that reinforces the key concepts. Reinforcing important points at the end of each lecture has been promoted as a beneficial pedagogical practice (e.g., refs 12 and 13).

General chemistry instruction involves the presentation and discussion of fundamental concepts and corresponding calculations. Bruck and Towns14 analyzed student performance on clicker questions and found no difference as a function of question type (e.g., definition, algorithmic, and conceptual). However, many other studies have found that students perform better on algorithmic questions than on corresponding conceptual questions (e.g., refs 15–17). The instructor-chosen muddiest-point topics were categorized as either qualitative (e.g., definitions or basic concepts) or quantitative (referring to equations or calculations) to determine whether topic type had an effect on which topics received the most votes. Qualitative topics comprised about 70–80% of the total muddiest-point topics over the course of each term (CHEM 101 and CHEM 102); the remaining topics were quantitative in nature. During both terms, quantitative topics were more likely to be identified by the students as the muddiest point. In CHEM 101, 4 of the 12 (33%) quantitative topics were chosen as the muddiest point by more than 20% of the students, whereas only 13 of the 50 (26%) qualitative topics were chosen as the muddiest point by more than 20% of the students. In CHEM 102, 14 of the 36 (39%) quantitative topics and 20 of the 99 (20%) qualitative topics were identified as the muddiest point by more than 20% of the students.


Only the differences in CHEM 102 are statistically significant at the 95% confidence level (z-test, two-tailed). Further investigation is needed to determine whether quantitative topics were more likely to be chosen because more qualitative topics were included in the lists (which would spread the qualitative votes across more options) or because the students had more difficulty understanding the quantitative topics during lecture.

During the first term, on average, 32% (±11%, 1σ) of the students responded that they had no muddiest point that day (i.e., they understood everything). During the second term, that average decreased to 24% (±7%, 1σ). This decrease is not surprising, as the content in the second term is usually more challenging for the students and more quantitative in nature; as described above, the quantitative topics appeared to be a bigger source of confusion.
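For readers who wish to reproduce the comparison, the sketch below applies a standard pooled two-proportion z-test (two-tailed) to the counts reported above. The article does not specify the exact implementation of the test, so the pooled form is an assumption.

```python
# Two-proportion z-test (pooled, two-tailed) applied to the topic counts reported above.
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Return (z, two-tailed p-value) for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # pooled SE under H0
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))                 # 2 * P(Z > |z|)
    return z, p_value

# CHEM 101: 4 of 12 quantitative vs. 13 of 50 qualitative topics exceeded the 20% threshold
print(two_proportion_z_test(4, 12, 13, 50))   # z ~ 0.5, p ~ 0.6: not significant
# CHEM 102: 14 of 36 quantitative vs. 20 of 99 qualitative topics
print(two_proportion_z_test(14, 36, 20, 99))  # z ~ 2.2, p ~ 0.03: significant at 95%
```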



■ SUMMARY

Clickers were used to facilitate adoption of the muddiest-point technique in a large-enrollment general chemistry course. This technology enabled collection of a large amount of information that would have been logistically difficult to gather using the traditional implementation of handing out blank notecards to all students. With this clicker-based technique, students quickly and anonymously identified the topic from that day's class that was most confusing. The instructor used this feedback to identify the topic or topics that were still unclear to students. At the start of the subsequent class, these topics were reviewed briefly (often with a clicker question) to ensure that the students were comfortable with the previous material before moving forward. While the student feedback was more limited than one would receive from student-written comments, the relatively high participation rate (about 75%) and the ease of data collection and processing suggest that using clickers to identify muddiest points could be a valuable technique in large classes in which the distribution and collection of notecards is not practical.

■ AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

■ REFERENCES

(1) Angelo, T. A.; Cross, K. P. Classroom Assessment Techniques, 2nd ed.; Jossey-Bass: San Francisco, CA, 1993; pp 148–158.
(2) Harwood, W. S. J. Chem. Educ. 1996, 73 (3), 229–230.
(3) Weaver, R. L.; Cotrell, H. W. Innovative Higher Educ. 1985, 10, 23–31.
(4) Wilson, R. C. J. Higher Educ. 1986, 57 (2), 196–211.
(5) Battles, D. A. J. Geosci. Educ. 2000, 48, 30–32.
(6) Cottell, P. G., Jr. In Classroom Research: Early Lessons from Success; Angelo, T. A., Ed.; Jossey-Bass: San Francisco, CA, 1991.
(7) Mosteller, F. On Teaching and Learning: The Journal of the Harvard Danforth Center 1989, 3, 10–21.
(8) Ebert-May, D.; Brewer, C.; Allred, S. BioScience 1997, 47, 601–607.
(9) MacArthur, J. R.; Jones, L. L. Chem. Educ. Res. Pract. 2008, 9, 187–195; DOI: 10.1039/b812407h.
(10) Reay, N. W.; Li, P.; Bao, L. Am. J. Phys. 2008, 76, 171–178; DOI: 10.1119/1.2820392.
(11) Simpson, V.; Oliver, M. Austr. J. Educ. Technol. 2007, 23, 187–208.
(12) Bligh, D. What's the Use of Lectures?; Jossey-Bass: San Francisco, CA, 2000.
(13) McKeachie, W. J. Teaching Tips, 11th ed.; Houghton Mifflin Co.: Boston, MA, 2002.
(14) Bruck, A. D.; Towns, M. H. Chem. Educ. Res. Pract. 2009, 10, 291–295.
(15) Cracolice, M. S.; Deming, J. C.; Ehlert, B. J. Chem. Educ. 2008, 85 (6), 873–878.
(16) Nurrenbern, S. C.; Pickering, M. J. Chem. Educ. 1987, 64, 508–510.
(17) Sawrey, B. A. J. Chem. Educ. 1990, 67, 253–254.
