Research: Science and Education

Student Active Learning Methods in General Chemistry

Jeffrey Kovac
Department of Chemistry, University of Tennessee, Knoxville, TN 37996-1600

Introduction

A major focus of current reform efforts in chemical education is the development of pedagogical methods to facilitate active learning, particularly in large lower-division lecture courses (1). Inspired primarily by the work of Arthur B. Ellis (2) and John C. Wright (3) at the University of Wisconsin–Madison and David M. Hanson at SUNY Stony Brook (4), I initiated the use of student active learning (SAL) methods in off-sequence sections of Chemistry 120 and 130, the two semesters of the mainstream general chemistry course at the University of Tennessee, Knoxville.

The course used a standard textbook, General Chemistry, by Darrell Ebbing (5). (The 4th edition was used for Chemistry 130 during the fall semester of 1996; the 5th edition was used for Chemistry 120 during the spring semester of 1997.) Chemistry 120 covers the bulk of the first 12 chapters of Ebbing, while Chemistry 130 selectively covers the remaining chapters. The enrollment at the beginning of the semester was about 140 students for Chemistry 130 and about 160 students for Chemistry 120. Both courses met twice a week for lectures and were divided into sections of 24 or fewer students that met once a week for a 50-minute discussion and once a week for a three-hour laboratory. The discussion and laboratory sessions were conducted by graduate teaching assistants (GTAs).

Following Wright (3), I introduced the following SAL features to the course:

1. An absolute grading scale introduced at the beginning of the semester.
2. Use of ConcepTests, an informal cooperative learning technique, in the lectures.
3. Biweekly cooperative learning workshops modeled on the successful program developed by David M. Hanson at SUNY Stony Brook (4).
4. Cooperative take-home examinations (three were given in Chemistry 130; one was given in Chemistry 120).
5. A student advisory committee to discuss the progress of the course regularly with the instructor.

As part of a project to develop materials for the teaching of writing in the chemistry curriculum, I assigned four one-page essays to the students in Chemistry 120 during the spring semester (6). In the spring-semester Chemistry 120 course I also encouraged the students to form out-of-class study groups and provided a suggested study group assignment for each chapter of the textbook. These techniques were explained to the students at the beginning of the course, both orally and in a detailed course syllabus.

Student Active-Learning Methods

The absolute grading scale was announced at the beginning of the course in an attempt to reduce competition. We often hear the complaint from students that another student "broke the curve" by earning a very high score on an exam.

With a preannounced grading scale, there should be no reason not to cooperate with other students in the learning process; the success of other students in the class does not diminish your own. I also emphasized from the first day that we had a mutual goal, for all students to earn an A in the course, and that we should work together to achieve it.

ConcepTests modeled on those developed by Ellis and coworkers (2) were used in nearly every lecture in both courses. The number used depended on the subject matter and the pace of the lecture. As many as four appeared in some lectures. In a few lectures, particularly those in which descriptive chemistry was being covered, no ConcepTests were used. The ConcepTests were prepared on overhead transparencies and projected at the appropriate times during the lecture. Students were asked to answer the question, a poll of responses was taken by show of hands, and, in most cases, students were asked to discuss the answer with a neighbor. The discussion was usually initiated by a statement like, "Turn to your neighbor and convince him or her that your answer is correct." After a second poll of answers, I provided closure by discussing the reasoning behind the correct answer and the possible reasons for an incorrect response. Occasionally I would ask a student to provide the explanation for the correct answer. The ConcepTests developed for this course have been contributed to the Chemistry ConcepTest Web site at the University of Wisconsin (http://www.chem.wisc.edu/~concept).

There are several advantages to the ConcepTest methodology. First, the lecturer receives immediate feedback about student learning. If most of the students answer the question correctly, it is safe to assume that they understand the material you have just covered. If most get the answer wrong, you probably need to review. Second, the discussions, which are a form of informal cooperative learning, allow the students to teach each other.
Often students can help each other to understand better than the instructor can. Third, the in-class discussions break up the monotony of the lecture and engage students in active participation in the class. Finally, use of ConcepTests communicates the message that the university is a community where both students and faculty should share the responsibility for learning.

The disadvantage of ConcepTests, of course, is that they take precious lecture time. Using them does require that less material be covered in the lecture. Those more experienced in the use of this technique maintain that even though less is covered, there is better understanding. This is a question that needs to be further explored. Writing good ConcepTests is not easy, but an increasing number are available on the Chemistry ConcepTest Web site. A final issue is that the ConcepTests require the lecturer to adopt a more interactive and flexible teaching style, which, for some, may be unfamiliar.

Approximately every other week, the discussion period was devoted to a cooperative learning workshop. I prepared problem sheets drawing heavily on the work of David M. Hanson (7) and Richard S. Moog and John J. Farrell (8).

Journal of Chemical Education • Vol. 76 No. 1 January 1999 • JChemEd.chem.wisc.edu


Students were divided into groups of four and assigned specific roles: manager, spokesperson, recorder, and strategy analyst. The roles were rotated from week to week. To alleviate complaints from students that they were "stuck in a bad group for the entire semester", the groups were changed after each of the two in-class exams. Absences also caused groups to be reconstituted on an ad hoc basis.

To force cooperation, each group received only one copy of the problem sheet. (In an earlier attempt to use the workshop method, I discovered that providing each student with a problem sheet often resulted in a "parallel play" situation in which each student would try to work the problems independently and then either ask for help or confirmation.) Each group was expected to work together to provide a single answer to the problems or questions. Near the end of the session, a simultaneous reporting technique was used to review the correct answers. Each group was asked to put the answer to one problem on the blackboard. The GTA then quickly reviewed the public answers to make sure there was understanding.

In a 50-minute period, it was not always possible for the groups to work through all the problems or questions, nor was there always sufficient time for the reporting procedure. Since the students were graded on process rather than on the number of problems solved correctly, the lack of time did not result in complaints about grading, but it did diminish the educational value of some workshops. Copies of the workshop problem sheets were made available to students who wished to work the unfinished problems out of class. Recently, SUNY Stony Brook has increased the class time for workshops (at student request!) from 50 to 80 minutes, which helps alleviate these pressures (Hanson, D. M., private communication).

For each workshop, the groups were asked to reflect on their group process skills by answering some process-related questions. They were also asked to provide a self-evaluation grade of 3, 4, or 5. An example of the form used for these evaluations is given in Table 1. If the grade was judged to be accurate by the GTA, it was doubled. If not, it remained the same. The GTA was also encouraged to award up to 5 bonus points to groups that worked well together. The total grade for the session was 0–15 points and was based entirely on the process, not the number of correct answers to the problems.

Table 1. Self-Assessment Form (adapted from ref 4)
Chemistry 120, Spring 1997, Workshop #1

GROUP MEMBERS
We verify that we all agree and understand the solutions to these problems.

MANAGER: _______________  Keeps the group on task; distributes work and responsibilities.
SPOKESPERSON: _______________  Presents the results of the group to the rest of the class.
RECORDER: _______________  Records efforts and prepares written results.
STRATEGY ANALYST: _______________  Assures that all group members participate and understand; identifies problem-solving strategies and methods; identifies what the group is doing well and what needs improvement.

ASSESSMENT
Provide your self-assessed grade for the session (3, 4, or 5): _______
(NOTE: The instructor's grade will depend on whether the self-assessed grade is reasonable.)
What is the rationale for your self-assessed grade?
Instructor's grade: _______   Total grade: _______   Instructor's comments: _______

On the back of this sheet, answer the following assessment questions.
1. List two strengths in the function of your group.
2. List one improvement that could be made in the function of your group.

REPORT OF THE STRATEGY ANALYST (use a separate sheet to answer the following questions in consultation with your group):
1. Identify how dimensional analysis was used to solve one of the problems.
2. What should you do when you encounter a problem that you cannot solve at first sight?

REPORT OF THE RECORDER (complete in consultation with your group; use a separate sheet):
Please provide a report of the method of solution for each problem and a clear, concise, written answer to all discussion questions. The written report must be agreed upon by all members of the group.

In Chemistry 130 there were eight workshops during the semester, which were divided into two sets of four. In each set, the best three grades were taken, summed, and normalized to 100 points. The two aggregate workshop grades were weighted the same as the five discussion quizzes given during the semester. In Chemistry 120, there were seven workshops. The two lowest grades were dropped and the remaining grades, normalized to 100 points, were weighted the same as two discussion quizzes.

The cooperative learning workshops require the GTA to adopt a new teaching style. In the workshop the instructor acts as a facilitator, encouraging and asking leading questions rather than providing information. The GTAs adapted well to the change and seemed to enjoy this new mode of teaching (9). I did meet with them before each workshop to discuss potential problems. The process grading of the workshops is also more subjective, and it was necessary to have several discussions before the group came to agreement about how to assign the grades.
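The workshop grading scheme described above is easy to state arithmetically. The sketch below is mine, not from the course materials: the function names are invented, and I assume "normalized to 100 points" means dividing the sum of the best three session grades by the 45-point maximum.

```python
def session_grade(self_assessed: int, judged_accurate: bool, bonus: int = 0) -> int:
    """Grade for one workshop session, on a 0-15 scale.

    self_assessed: the group's own grade (3, 4, or 5).
    judged_accurate: whether the GTA considered that grade reasonable;
                     if so, the self-assessed grade is doubled.
    bonus: up to 5 extra points for groups that worked well together.
    """
    if self_assessed not in (3, 4, 5):
        raise ValueError("self-assessed grade must be 3, 4, or 5")
    if not 0 <= bonus <= 5:
        raise ValueError("bonus is at most 5 points")
    base = 2 * self_assessed if judged_accurate else self_assessed
    return base + bonus


def aggregate_set(grades: list[int]) -> float:
    """Chemistry 130 aggregation for one set of four workshops:
    the best three session grades are summed and normalized to
    100 points (assuming 45, i.e. three perfect 15s, is the maximum)."""
    best_three = sorted(grades, reverse=True)[:3]
    return 100 * sum(best_three) / 45


# A group that self-assessed 5, was judged accurate, and earned 4 bonus points:
print(session_grade(5, True, 4))                  # 14
print(round(aggregate_set([14, 12, 9, 13]), 1))   # 86.7
```

Dropping the weakest session of each set mirrors the stated policy of grading on process: one off week does not pull down the aggregate.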
The major concern was establishing consistency in the awarding of bonus points so that students in different sections felt they were being treated fairly. Eventually we adopted a system in which 3, 4, or 5 bonus points were awarded for average, above-average, and superior performance in the workshop. A grade of 0 was reserved for those groups that were off task and needed to be reminded of their purpose. Writing good workshop materials is time-consuming. Fortunately, the books by Hanson (7) and Moog and Farrell (8) are available to provide guidance, at least in general chemistry.

To build on the workshop experience, the students in Chemistry 120 were encouraged to form out-of-class study groups. A mechanism was devised (a notebook filled with course schedules) to help students find others who wanted to form study groups. I prepared a suggested study group assignment for each chapter. The end-of-semester survey revealed that about 60 students participated in these informal study groups and almost all found them valuable.

In Chemistry 130, a cooperative take-home exam was distributed one to two weeks before each of the three major exams in the course: two hour exams and the final. Students were encouraged to work together on these exams, but each student was asked to submit a personal set of answers. These exams were designed to help students prepare for the upcoming in-class exam, but included more challenging questions. My goal was to stimulate students to think deeply about the




course material. These exams were graded and the scores weighted the same as an in-class quiz. Based on the experience of the fall semester, and because I was also assigning the one-page essays, I gave only one take-home exam during Chemistry 120. It was distributed about halfway through the semester, and students were given three weeks to complete the problems. The take-home exam was given the same weight as each of the two in-class exams.

With take-home exams there is a danger of cheating. Since the take-home exams constituted a small percentage of the total grade, I chose not to worry about academic dishonesty, assuming that those who did not do their own work would probably do poorly on the in-class exams. In cases where we suspected that students did not do their own work on a take-home exam, the scores on the in-class exams reflected their lack of independent study.

In Chemistry 120 I assigned four one-page "microthemes" during the semester. Students were given one week to write the papers. The assignments were designed to help the students think through some of the important concepts in the course. I graded the papers myself using a holistic method (10): I read all the papers quickly, then read them a second time more carefully. During the second reading I classified them into six groups with scores of 6 (high) to 1 (low), based both on content and on the clarity of the prose. I did not make detailed comments on the papers. To help students learn what was expected, the best papers were posted (with the names removed) on the course bulletin board.

Finally, I invited students to participate in a "student advisory committee" that would meet weekly with me to discuss the progress of the course. The goal was to receive regular feedback about the pedagogical experiments and how the course was progressing. In the fall semester, the regular meeting was scheduled for the hour after the weekly laboratory. Unfortunately, because students finish their lab work at different times and are eager to leave, the meetings were poorly attended and eventually abandoned. I did receive useful comments from the students who came to the meetings. In the spring semester I scheduled the meeting for the half hour immediately following the weekly discussion, but the students did not attend meetings at this time either. John Wright's (3) experience is that the student advisory committee or board of directors is very valuable, so I plan to experiment further with this idea.

Evaluation

At the end of the two courses I asked the students to evaluate the SAL methods using the questionnaires in Table 2. The results, displayed in Tables 3a and 3b, show that the pedagogical innovations were, on the whole, quite successful. The students viewed the ConcepTests and the cooperative learning workshops as valuable: in both classes more than 64% of the respondents agreed or strongly agreed that these two SAL methods were helpful in learning the course material, while less than 13% disagreed. The absolute grading scale was also viewed as a positive innovation. From individual discussions with students and from their written free responses, it is clear that both the workshops and the ConcepTests helped them learn chemistry and made the course less intimidating.

In Chemistry 130, however, the cooperative take-home exams were less successful. Students did not view them as

Table 2. Evaluation Forms

Chemistry 130, Fall 1996: Evaluation of Instructional Methods

In this course I have used several new (for me) instructional methods that I would like you to evaluate. These include (1) use of an absolute grading scale; (2) "ConcepTests", questions and problems put on the overhead during lecture with opportunities for students to discuss the questions with their neighbors; (3) cooperative learning workshops during discussion periods; and (4) cooperative take-home examinations.

Please respond to the following statements on a scale of 1 to 5.
1 = strongly agree; 2 = agree; 3 = neutral; 4 = disagree; 5 = strongly disagree

___ 1. Having an absolute grading scale announced at the beginning of the semester was helpful.
___ 2. The "ConcepTests" used in the lecture helped me learn the course material.
___ 3. The cooperative learning workshops were valuable learning experiences.
___ 4. The cooperative take-home exams were valuable learning experiences.
___ 5. The overall format of the course helped me learn chemistry.
___ 6. I have a more positive image of chemistry than I did when I began the course.

Please write any comments, positive or negative, that you have about the course in the space below. You can continue on the back if necessary. Suggestions for improvement will be appreciated.

Chemistry 120, Spring 1997: Evaluation of Instructional Methods

In this course I have used several new (for me) instructional methods that I would like you to evaluate. Please respond to the following statements on a scale of 1 to 5.
1 = strongly agree; 2 = agree; 3 = neutral; 4 = disagree; 5 = strongly disagree

___ 1. Having an absolute grading scale announced at the beginning of the semester was helpful.
___ 2. The "ConcepTests" used in the lecture helped me learn the course material.
___ 3. The cooperative learning workshops were valuable learning experiences.
___ 4. The cooperative take-home exam was a valuable learning experience.
___ 5. The writing assignments were valuable learning experiences.
___ 6. The overall format of the course helped me learn chemistry.
___ 7. I have a more positive image of chemistry than I did when I began the course.

The following questions concern the optional study groups.
8. Did you participate in a study group? Yes ___ No ___
___ 9. If you participated in a study group, please rate the study group experience from 1 (not useful) to 5 (valuable).
10. Did you use the suggested study group assignments? Yes ___ No ___
___ 11. If you used the suggested study group assignments, please rate them on a scale of 1 (not useful) to 5 (valuable).

Please write any comments, positive or negative, that you have about the course on the back of this sheet. Suggestions for improvement will be appreciated. Thanks for your cooperation.

helpful. In their free responses the complaint was that the cooperative exams were “too difficult” or took “too much time”. I thought that the challenging problems would stimulate the students to review the material in depth and that understanding would translate into success on the in-class exams. With this class, my hypothesis was incorrect, at least for a large fraction of the students. This caused me to rethink


Table 3a. Chemistry 130 Evaluation Results
(Number of responses a / Percentage of total)

Question No.     Score 1    Score 2    Score 3    Score 4    Score 5    Mean Score
1                19/16      45/38      30/25      18/15      6/5        2.6
2                27/23      52/44      28/24      8/7        3/3        2.1
3                30/25      51/43      21/18      10/8       5/4        2.2
4                5/4        25/21      22/19      36/31      30/25      3.5
5                14/12      52/44      37/31      11/9       4/3        2.5
6                15/13      27/23      39/33      20/17      17/14      3.0

a Sample size = 118. Row total may be less than 118.
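The mean scores reported in Tables 3a and 3b are response-count-weighted averages of the 1-to-5 scale. As a quick check, a short sketch (the code and function name are mine, not part of the original study) reproduces the reported means from the tabulated counts:

```python
def mean_score(counts: list[int]) -> float:
    """Weighted mean of a 1-5 survey item.

    counts: number of respondents choosing each score,
    in order from score 1 through score 5.
    """
    n = sum(counts)
    weighted = sum(score * c for score, c in zip(range(1, 6), counts))
    return weighted / n


# Question 1 of Table 3a: 19, 45, 30, 18, 6 responses for scores 1-5
print(round(mean_score([19, 45, 30, 18, 6]), 1))   # 2.6
# Question 4 of Table 3a: 5, 25, 22, 36, 30
print(round(mean_score([5, 25, 22, 36, 30]), 1))   # 3.5
```

Rounded to one decimal place, these reproduce the tabulated means of 2.6 and 3.5 for questions 1 and 4 of Table 3a.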

Table 3b. Chemistry 120 Evaluation Results
(Number of responses a / Percentage of total)

Question No.     Score 1    Score 2    Score 3    Score 4    Score 5    Mean Score
1                57/50      30/26      18/16      5/4        5/4        1.8
2                25/23      45/41      32/29      7/6        1/1        2.2
3                54/47      41/36      16/14      1/1        2/2        1.7
4                24/21      36/31      23/20      24/21      8/7        2.6
5                6/5        17/15      18/16      35/31      38/33      3.7
6                16/14      59/51      33/29      4/3        3/3        2.3
7                25/22      32/28      36/32      12/11      8/7        2.5
8                Yes 55 / No 58
9                2/4        3/5        13/24      21/38      16/29      3.8
10               Yes 36
11               0/0        2/6        13/36      11/31      9/25       3.8

a Sample size = 115. Row total may be less than 115.

Table 4. Comparison of Overall Course Grade with Mean Writing Grades in Chemistry 120

Course Grade    No. of Students    Writing Grade,         Writing Grade,
                                   Assignments 1 and 2    Assignments 3 and 4
A               24                 80.0                   73.3
B+              4                  72.5                   67.5
B               26                 72.1                   62.8
C+              5                  64.0                   60.0
C               40                 66.4                   50.1
D               24                 63.1                   59.3
F               39                 48.2                   19.4

my approach to the take-home exams. Context-rich problems can be a valuable learning tool, but not for students who are algorithmic learners. Algorithmic learners need more straightforward exercises that directly apply to exam problems. The cooperative take-home exams need to include a mix of problems and a grading scheme that rewards the best students who can take advantage of the context-rich problems while not penalizing the otherwise good student who is still in the algorithmic learning stage.

The one take-home exam I gave in Chemistry 120 was much more successful. The mix of problems was more diverse, and the exam was given more weight in the final grade, so students seemed to take it more seriously. It was also decoupled from the in-class exams. The student response was much more favorable: a significant fraction of the students found the exam to be a valuable learning experience.

The writing assignments, however, were not viewed as a success, at least by most students. The numerical ratings were poor and the students' comments unfavorable. Several students commented that they felt the writing assignments were just "busy work" or "worthless". On the other hand, a few students judged the writing to be valuable and wrote positive comments. At the end of the term I examined the correlation between the grades that students received on the writing assignments and their overall course grade. The numerical results given in Table 4 show a strong relationship between the writing grade and the course grade. Further study is required before any conclusion can be drawn concerning a causal relationship between the two grades, if any exists.

On reflection, I saw several deficiencies in the way I used the writing assignments. First, there were too many writing assignments, and the students felt that they did not have enough time to do a good job. Second, the students did not understand the holistic grading; they are accustomed to receiving detailed comments on their papers, not just a numerical score. Posting the best papers seemed a good way to show them what I expected, but I learned that the conceptual task of using a good paper as a model for learning is very difficult. In the future I need to provide a better explanation of the evaluation scheme, perhaps even taking the time to make comments on the first assignment of the term. Last, I need to do a better job of connecting the writing tasks with the rest of the course.
The summary ratings of the format of the two courses are slightly less positive than the ratings of the ConcepTests and cooperative learning workshops, but they support the experiences of Wright, who found that the SAL methods improved learning in his courses (11). Individual discussions with students have reinforced the survey results. Unfortunately, it is very difficult to determine whether actual student performance was better, because there was no control group.

The end-of-semester grades for Chemistry 130 were comparable to my previous courses in general chemistry. The percentage of A grades (6.8%) was a bit lower than in other years, where it typically has been approximately 10%, but so was the percentage of failures (12.8%), which generally is in the range of 15–20%. The percentage of F grades includes all students on the active class roll, including some who stopped coming to class early in the semester but did not officially withdraw. Very few students "gave up" and stopped coming to classes and exams: of the 135 students registered at the end of the semester, 127 (94.1%) took the final exam. This suggests that the atmosphere of the course was sufficiently positive that almost all students felt they had a chance to succeed.

In the Chemistry 120 section, it appears that the students were more successful than usual. The percentage of A grades was quite high (14.4%) compared to the usual 10%, as was the combined percentage of B+ and B grades (17.4%); the usual percentage of B+ and B grades is 10–15%. The percentage of failures was high (23.4%), but about half of those




were students who stopped attending class and taking quizzes and exams early in the semester, so the percentage of "real" failing grades is much smaller (about 10%). Of the 162 students still registered at the end of the semester, 99 received a C or better in the course, which, for an off-sequence section, is excellent.

I am encouraged by the success of these experiments and plan to incorporate all of these innovations in future courses. This past year has been a learning experience for me as well. Using ConcepTests in the lecture, writing good workshop problem sheets, maintaining a cooperative atmosphere, and listening to students are new skills for me. The challenge is to create a successful learning environment for the diverse population of students in our courses.

Acknowledgments

I am grateful to Arthur Ellis, John Wright, David Hanson, and Donna Sherwood for useful discussions and encouragement. Discussions with Brian Coppola have helped me clarify my thinking about the role of active learning in the broader context of a university education.

Literature Cited

1. For a broad perspective on this issue, see Kovac, J.; Coppola, B. P. Universities as Moral Communities; Proceedings of the 1997 Conference on Values in Higher Education; available URL: http://web.utk.edu/~unistudy/values (accessed September 1998).


2. Ellis, A. B.; Cappellari, A.; Lorenz, J. K.; Moore, D. E.; Lisensky, G. C. Experiences with Chemistry ConcepTests; http://www.chem.wisc.edu/~concept (accessed August 1998).
3. Wright, J. C. J. Chem. Educ. 1996, 73, 827.
4. Hanson, D. M. An Instructor's Guide to Process Workshops; Department of Chemistry, SUNY Stony Brook, 1996. Hanson, D. M. Instructor's Guide to Discovering Chemistry; Houghton Mifflin: Boston, 1997.
5. Ebbing, D. D. General Chemistry; Houghton Mifflin: Boston, 1993 (4th ed.), 1996 (5th ed.).
6. Kovac, J.; Sherwood, D. W. Writing Across the Chemistry Curriculum; Proceedings of the 44th Annual Conference of the Society for Technical Communication, May 1997; pp 11–13. Kovac, J.; Sherwood, D. W. Writing Across the Chemistry Curriculum: A Faculty Handbook; University of Tennessee: Knoxville, 1998 (available from the author).
7. Hanson, D. M. Foundations of Chemistry; Pacific Crest Software: Corvallis, OR, 1996. Hanson, D. M. Discovering Chemistry: A Collaborative Learning Activity Book; Houghton Mifflin: Boston, 1997.
8. Moog, R. S.; Farrell, J. J. Chemistry: A Guided Inquiry; Wiley: New York, 1996.
9. For a discussion of the role of the facilitator of group work and suggestions about training, see Gosser, D.; Roth, V.; Gafney, L.; Kampmeier, J.; Strozak, V.; Varma-Nelson, P.; Radel, S.; Weiner, M. Chem. Ed. 1996, 1(1); S1430-4171(96)01002-3; URL: http://journals.springer-ny.com/sam-bin/swilma/cla.827385307.html.
10. White, E. M. Teaching and Assessing Writing; Jossey-Bass: San Francisco, 1988.
11. Wright, J. C.; Millar, S. B.; Kosciuk, S. A.; Penberthy, D. L.; Williams, P. H.; Wampold, B. E. J. Chem. Educ. 1998, 75, 986–992.
