Chemical Education Today

NSF Highlights: Projects Supported by the NSF Division of Undergraduate Education

edited by
Susan H. Hixson, National Science Foundation, Arlington, VA 22230
Richard F. Jones, Sinclair Community College, Dayton, OH 45402-1460

A Placement Examination and Mathematics Tutorial for General Chemistry

by Norbert J. Pienta

Universities generally offer multiple introductory chemistry sequences and must ensure that enrolled students start at levels appropriate to their background, aptitude, and program requirements. This paper describes a 30-question assessment tool that has been developed, tested, and delivered online. A second Internet-based feature, a set of tutorials on mathematics and calculator skills and on approaches to solving word problems, has also been implemented.

The introductory chemistry courses at the University of Iowa include a one-semester preparatory course (taken by about 1000 students each calendar year); a traditional two-semester sequence, Principles in Chemistry (taken by about 1350 students in the first semester and 800 in the second); and a two-semester sequence for chemical sciences majors (about 60 chemistry, biochemistry, and chemical and biomedical engineering students). This report focuses on the Preparatory and Principles courses. Until now, advising students about specific chemistry courses has been based entirely on a set of university-wide mathematics placement exams. The use of chemistry examinations as placement or diagnostic instruments has been described in the literature (1–3) and is one of the purposes of the exams produced by the ACS DivCHED Examinations Institute (Note 1).

Faced with providing a chemistry evaluation (both the instruments and the results) at 15 or more different orientation sessions, we sought a more asynchronous approach. The WebCT course management system offers authentication and security, allows timed delivery and question sets, produces statistics, and supports several question formats (Note 2). We created the first exam for delivery via WebCT in the fall semester of 2001 and have used the system with revised exams in the three subsequent semesters. A student's score is simply the number of correct answers on the 30-question multiple-choice test.
To test the questions and establish the exam's validity as an instrument providing useful information to advisees, the exam was administered during the first two weeks of classes to students in the preparatory course and in both semesters of the Principles course. In fall 2001, 1100 of the 1950 students eligible to participate (slightly more than 50%) elected to take the exam. In spring 2002 participation dropped to about 40%, since a substantial group from the fall preparatory course entered the spring Principles course. Up to that point, participation was voluntary; students' motivation was to find out how their starting knowledge compared with that of their classmates and to gain the benefit, in subsequent semesters, of having selected the appropriate starting course. In fall 2002, 1400 out of 1950 (more than 70%) took part after being told that the placement exam was "required but not binding"; in other words, they would not forfeit a place in the course if their score was insufficient. In addition, the instructor of the second-semester Principles course offered a small point premium for completing the exam as the first homework assignment, which appeared to produce a number of additional placement exams (about 25) completed at remarkable speed (under 5 minutes each). Unknown to the students, WebCT times and reports the duration of each exam attempt. We did not include exams of such short duration in the statistics; the exams of the next-shortest duration (10–12 minutes), however, included scores above the median and, in the absence of any information to the contrary, were included. For fall 2003 the exam will have placement consequences, and there will be no need to motivate students in the Principles II course.

The content of the exam focuses on material from the Principles I course. The WebCT system is set to allow 60 minutes, and students are cautioned that the short time precludes the use of notes or a textbook; a calculator is allowed. For each attempt, WebCT randomly selects one question from each of 30 question sets; the total database now comprises about 100 questions. Statistical data from fall 2002 are shown in Table 1 and Figure 1. The table includes the percentage participation and the median, mean, and standard deviation of the scores by course.
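The exam-assembly scheme described above, in which one question is drawn at random from each of 30 question sets, can be sketched as follows. This is an illustration only: the pool structure and question labels are hypothetical stand-ins, not WebCT's actual data model.

```python
import random

# Hypothetical question pools: 30 topics, each with 3 interchangeable versions
# (the real database holds about 100 questions across the 30 sets).
question_pools = [
    [f"Topic {t + 1}, version {v + 1}" for v in range(3)]
    for t in range(30)
]

def assemble_exam(pools, rng=None):
    """Draw one question at random from each pool, giving a 30-question exam."""
    rng = rng or random.Random()
    return [rng.choice(pool) for pool in pools]

exam = assemble_exam(question_pools, random.Random(2002))
print(len(exam))  # prints 30: one question per topic
```

Because each student's attempt draws independently from the pools, two students sitting side by side are unlikely to see identical exams, which is one practical attraction of the question-set approach.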
A 23-student section from nearby Kirkwood Community College scored the same average (18.1 ± 2.7) as the Principles I group. Figure 1 shows the score distributions for the students in the first two courses, reported as a percentage of total course enrollment.

Table 1. Students Completing the 30-Question Chemistry Exam, Fall 2002, by Course

    Course          N     Participation (%)   Mean Score   Std. Dev.   Median Score
    Preparatory     384   53                  13.4         4.1         13
    Principles I    756   92                  18.3         4.6         18
    Principles II   267   86                  19.2         5.5         20

[Figure 1. Percent distribution of each score by course (Preparatory, gray circles; Principles I, open squares). Horizontal axis: placement exam score, 0–30; vertical axis: fraction of students at each score, 0–14%.]

Journal of Chemical Education • Vol. 80 No. 11 November 2003 • JChemEd.chem.wisc.edu

Those taking the Principles II course in fall 2002 are likely to have taken the first part in spring 2002 and to exhibit the "mind dump" that occurs between semesters or over the summer.

The percentage of students in each course who successfully answered each question was also analyzed. Independent of question difficulty, the Principles group outperformed the Preparatory group on every question. The exam was intended to pair a conceptual and an algorithmic question on each topic. The algorithmic group contains questions that require a calculation (12 questions) or recall of facts (2 questions). In the first two versions of the exam, the calculation items used a WebCT question format that allows the designer to define an equation, assign a range of values to its variables, and cast these into a word problem. That format was abandoned in favor of multiple-choice questions because the management system does not provide item-level statistics for the calculated questions. Changes made after the first two offerings (fall 2001, spring 2002), based on the difficulty and discrimination of each question, produced a greater number of questions on which Preparatory students scored significantly below Principles students.

The outcomes on this exam show some correlation with the final letter grades achieved in each course (data not shown). One would not expect a high level of correlation, because student learning depends not only on prior knowledge at the start of the course but also on student effort, attitude, and motivation, in addition to the quality of instruction during the course. Nonetheless, the data suggest that students who score 15 or higher will successfully complete Principles I with a grade of C or better.

Table 2 examines the relationship between conceptual and algorithmic questions (calculations and factual recall) as a function of course. There are 16 conceptual and 14 algorithmic questions, and in some cases the distinction between the two categories may not be unequivocal. In both categories the Principles I students score about 2.5 points higher than the Preparatory students. Students in Principles II score higher only on the algorithmic portion, by an amount that accounts for the entire difference between the two groups. The ratio of correct conceptual answers to correct algorithmic answers is very similar for the Preparatory course and both Principles courses; however, the spread of that ratio narrows across the series Preparatory, Principles I, Principles II, as shown by its standard deviations (0.8, 0.5, and 0.3, respectively). These data suggest that students entering the preparatory course do not show a disproportionate gap between conceptual understanding and mathematical application; rather, Preparatory students simply answer fewer questions of both kinds correctly than their Principles counterparts. The small score differences that should reflect learning and retention in Principles I are somewhat disappointing, and the Principles I gains appear to be entirely algorithmic.

In summary, the exam scores appear to be a reasonable predictor of performance in the Preparatory and Principles I courses. Administering the exam via the Internet was successful and relatively easy to implement, and large numbers of students participated without difficulty.
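The course-level quantities reported in Tables 1 and 2 (mean, median, standard deviation, and the average per-student conceptual-to-algorithmic ratio) are straightforward to compute from raw responses. A minimal sketch follows; the score lists here are made-up illustrations, not the actual Iowa data.

```python
import statistics

# Hypothetical total scores on the 30-question exam (not the actual data).
scores = [12, 15, 9, 18, 13, 14, 11, 16, 13, 10]

mean = statistics.mean(scores)      # 13.1
median = statistics.median(scores)  # 13.0
stdev = statistics.stdev(scores)    # sample standard deviation, as in Table 1

# Per-student (conceptual correct of 16, algorithmic correct of 14) pairs,
# again hypothetical; Table 2 averages the per-student ratios.
splits = [(7, 6), (9, 8), (5, 4), (10, 8)]
ratios = [c / a for c, a in splits]
avg_ratio = statistics.mean(ratios)
```

Averaging per-student ratios (rather than taking the ratio of averages) is what makes the standard deviation of the ratio in Table 2 a measure of how consistent the conceptual/algorithmic balance is across students.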
Table 2. Correct Answers to Conceptual versus Algorithmic Questions, by Course (mean ± standard deviation)

    Course          Conceptual Correct   Algorithmic Correct   Ratio (C/A)   Difference (C − A)
    Preparatory     7.2 ± 2.5            6.3 ± 2.3             1.3 ± 0.8     0.9 ± 2.6
    Principles I    9.8 ± 2.6            8.5 ± 2.6             1.2 ± 0.5     1.2 ± 2.4
    Principles II   9.7 ± 3.0            9.5 ± 3.0             1.1 ± 0.3     0.2 ± 2.4

The WebCT site is authenticated, so we know that the proper students logged in, but we have no guarantee that they completed the exam themselves. Furthermore, we have no way of judging whether participants put forth their best effort; however, the same is true of paper-and-pencil tests.

Apparent shortcomings in students' mathematics and calculator skills in introductory chemistry fall into two cases: a skill that was never learned, or mathematics skills needed in chemistry that have gone unpracticed for one or two years. Most instructors do not want to spend chemistry lecture time reviewing mathematics. This is particularly true in a large-enrollment course, in which an intervention aimed at students at the bottom level would be boring and unproductive for the others in the class. This scenario lends itself to asynchronous instruction.

We have created a mathematics and calculator skills tutorial Web site for the Preparatory and Principles students at the University of Iowa at http://genchem.chem.uiowa.edu/chemrev/ (accessed Sep 2003), modeled after a successful site we reported previously (4). The site is organized into these topics:

• Mathematics
    Numbers and Their Properties
    Numbers in Science
    Ratios and Proportions
    Units, Dimensions, and Conversions
    Percents
    Logarithms

• Basic Concepts (of Chemistry)
    Chemical Nomenclature
    Atomic Structure
    Stoichiometry
    Acid–Base Chemistry

• Calculator Skills
    Basic Operations
    Additional Operations

• Further Resources

Instructors generally recommend that students use our Web site early in the semester and to review topics before the hour exams. In some cases, students visit the site during the first meeting of their discussion sections, which meet in a room outfitted with laptops and wireless Internet access.

An additional set of materials was designed to help students approach word problems. For each topic, a presentation has been prepared using one of several software products in which a set of images (often PowerPoint slides) is annotated or accompanied by audio (Note 3). In this way, the approach to a problem or a set of rules can be explained using both video and audio; the images change automatically and track the voice providing the explanation (Note 4). These tutorials (Note 5) have been promoted as "virtual office hours" in which the instructor can provide an explanation at any time.

Notes

1. ACS DivCHED Examinations Institute, University of Wisconsin–Milwaukee, Chemistry Department, P.O. Box 413, Milwaukee, WI 53201-3029; http://www.uwm.edu/Dept/chemexams/ (accessed Sep 2003).
2. For information on WebCT, see http://www.webct.com/ (accessed Sep 2003).
3. A graphic or image with accompanying audio can be prepared using the software programs RealProducer, QuickTime, and Macromedia Flash.
4. Through the use of audio and image compression, these programs can create files of 1.5–2.5 MB that contain 30–50 "slides" and 10–15 minutes of accompanying commentary.
5. The tutorials cover an introduction to scientific notation, mathematics involving scientific notation, an introduction to significant figures, mathematics involving significant figures, understanding and interpreting word problems in chemistry, drawing Lewis structures, and an introduction to equilibrium.

Literature Cited

1. Hovey, Nelson W.; Krohn, Albertine. J. Chem. Educ. 1963, 40, 370–372.
2. Russell, Arlene A. J. Chem. Educ. 1994, 71, 314–317.
3. McFate, Craig; Olmsted, John A., III. J. Chem. Educ. 1999, 76, 562–565.
4. Pienta, N. J.; Thorp, H. H.; Panoff, R. M.; Gotwals, R. R., Jr.; Hirst, H. P. Chem. Educator 2001, 6 (5), 365–369.

Norbert J. Pienta is in the Department of Chemistry, University of Iowa, Iowa City, IA 52242-1294; [email protected]
