
How To Recognize Success and Failure: Practical Assessment of an Evolving, First-Semester Laboratory Program Using Simple, Outcome-Based Tools

Liz U. Gron,* Shelly B. Bradley, Jennifer R. McKenzie, Sara E. Shinn, and M. Warfield Teague
Department of Chemistry, Hendrix College, Conway, Arkansas 72032, United States

ABSTRACT: This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and beyond the major. Among these stakeholders there was a low-level yet widespread apprehension that changes to the introductory-level content would weaken student preparation. Because of this unease, it was important to show faculty and students alike that the new laboratory program could help students reach the new laboratory learning goals without sacrificing technical content. A set of simple assessment tools, consisting of student precision and accuracy data from experiments during the semester together with an end-of-semester laboratory practical and student survey, was used for both formative and summative program assessment. This article establishes the power of these simple, course-embedded tools to yield insights into program strengths and weaknesses and, ultimately, to demonstrate to faculty and students the effectiveness of this first-semester program.

KEYWORDS: First-Year Undergraduate/General, Analytical Chemistry, Chemical Education Research, Curriculum, Testing/Assessment, Green Chemistry, Quantitative Analysis

FEATURE: Chemical Education Research



BACKGROUND

In the last nine years, Hendrix College has worked to redesign the introductory chemistry majors' laboratory experience. The new laboratory had three design criteria: the experiments should (i) both apply and teach green chemistry principles, (ii) train students in basic analytical techniques, and (iii) use environmental samples. For us, as for many other programs, the laboratory program is less firmly entrenched than the classroom material, which creates important opportunities to alter the curricular focus. However, this laboratory program was a significant departure from our previous proof-of-concept laboratories, giving rise to unease among faculty and students and, with it, a clear need for assessment. Constraints on the new program required that it fit within the traditional classroom and laboratory model without changing the lecture portion of the course, and that it limit content to the knowledge base typical of the introductory course. The resulting laboratory has been given a unique title to emphasize program continuity and goals; it is known locally as Green−Soil and Water Analysis at Toad Suck, or Green−SWAT.1

Early in the design process, the first semester of this yearlong program was assessed informally through faculty discussions and evaluation of student work, because the experiments and supporting materials were very fluid. As the experiments matured, however, it became evident that more standardized evaluation was necessary. Assessment became critical owing to conflicting faculty and student conceptions. Initially, faculty in the introductory chemistry program presumed that the quantitative goals and environmental connections were too ambitious for introductory students, potentially increasing student stress and dampening enthusiasm for science majors; however, as the program took shape, students reported increased enjoyment of the Green−SWAT experiments. This gave rise to a contradictory concern from faculty teaching upper-level science courses: that reduced program rigor would leave students less prepared for advanced work. For students, initial concerns centered on their ability to attain high grades and on how the new tools would be used to score their work. Formal program evaluation was necessary to respond to these stakeholder concerns, to measure program progress, and to identify areas still needing development.

Assessment is the activity necessary to answer the question, "Are students learning what we think we are teaching?" However, we, like many professional chemical educators, had little assessment experience beyond the traditional written exam, and strongly suspected the assessment experience would be difficult and unpleasant. The chemical education literature has relatively few examples of simple, well-done assessments that can be easily adapted or adopted.




We found 101 assessment articles spanning the whole of the undergraduate curriculum, from the general chemistry laboratory through research seminars, and from the use of demonstrations to safety training. Of those, 20 articles were not applicable, and in most of the articles that described a new laboratory (38), assessment was effectively absent,2 leaving 43 articles broadly applicable to the assessment of laboratories. In 23 of the 43 articles, student surveys were the primary assessment tool, with a variety of modifications. Most of these articles (15 of the 23) used a straight student attitudinal survey,3−17 another five18−22 combined the survey with pre−post testing and some statistical analysis, two articles presented surveys focused on peer assessment,23,24 and one article discussed the creation of a nuanced survey, the self-concept inventory.25 Beyond the student surveys, a wide variety of tools, some simple and some quite sophisticated, were used for laboratory or program assessment. Outcome-based assessment tools were used in 11 of the 43 articles and included evaluation of student data across sections or against control groups,26−28 computer tracking of student calculation attempts,29,30 rubrics for presentations31 or written work,32−34 portfolio assessment with a reflective log,35 and sophisticated coded observations and interviews.36 A few articles (7 of the 43) evaluated student cognitive development (e.g., questioning and problem-solving skills) using concept mapping,37 a laboratory practical paired with a test,38 coded observations or interviews,27,39−42 and neural-net clustering.43 It should be noted that one article27 used both outcome-based tools and coded observations and interviews, and so is double-counted. Finally, 3 of the 43 assessment articles focused on faculty objectives: the first presented coded interviews assessing faculty members' perceptions of the undergraduate laboratory,44 and the other two used interviews to discuss the development of learning objectives and assessment plans.44,45

Our paper makes an important addition to this pool by describing the efficacy of a combination of simple, outcome-based assessment tools in the design and evaluation of a first-semester introductory laboratory program. The multiple assessments require ∼1.5 laboratory days of student time, and evaluating the data requires about 4 h of faculty time once the data are collected. Although the initial setup of this project was time-consuming, the ongoing work was hardly onerous, and the data generated proved critical to the program developers and crucial in converting program skeptics into believers.


ASSESSMENT DESIGN

The assessment design was similar to that described in a number of basic assessment manuals.47,48 The first assessment challenge, and in many ways the most difficult hurdle, was to define a limited list of student learning goals for the program. The second challenge was to distill out a subset of learning goals to serve as the focus of our first assessments. Because of the campus concerns about rigor, technical laboratory skills were chosen as the primary focus for this set of assessments. Once the assessment focus was finalized, we looked for evaluative tools that would reflect student competence with these skills. We tried to use data that was already being generated but had not previously been collected systematically. The initial design started with student precision data from two spectroscopic projects, analysis of iron in water by ultraviolet−visible (UV−Vis) spectroscopy and analysis of iron by flame atomic absorption spectroscopy (FAAS), with additional accuracy data from the FAAS experiment. Although the specifics change slightly from year to year, these two projects typically represent 38% of the semester's laboratory work and 25% of the final laboratory grade. In 2005, we collected data from an end-of-semester laboratory practical to answer specific questions about student learning, and in 2006 we added a summative student survey to insert a student voice into the evaluation proceedings. We were interested in the students' appraisal of their learning gains as well as their assessment of the strengths and weaknesses of the program. These four data sets were used as both formative and summative project assessments. A matrix of the student learning goals versus the assessment tools is presented in Table 1. The experimental skills are consistent with exposures in many first-year programs, though the precision−accuracy and calculation skills overlap more closely with the expectations of a quantitative analysis course.

Table 1. Matrix of Student Learning Goals and Assessment Tools

Assessment tools (table columns): Precision, Accuracy, Lab Practical, Student Survey

Learning Goals
Experimental Skills
  Quantitatively transfer solids and liquids
  Create precise solutions: volumetric flasks, pipets, and burets
  Use a spectrometer (with instructions)
Calculation Skills
  Use Excel: manage data, make graphs, and perform linear regression
  Use calibration curves
  Use basic statistics
  Understand spectroscopy
Environmental Concepts
  Understand the environmental importance and sources of simple ions
Green Principles
  Define and explain green chemistry

(The marks in the original table indicating which assessment tools address each learning goal could not be recovered in this copy.)



PROGRAM OVERVIEW

The Green−SWAT laboratory program at Hendrix College has been designed to teach green analytical chemistry to introductory students within the strictures of the traditional lecture and laboratory model. The experimental material covered is generally consistent with the expectations of a typical introductory laboratory program,46 but the focus on quantitative skills and green concepts as part of the specific student learning goals at this level is unusual. The students met for one 3-h laboratory period per week, and the laboratory grade was incorporated into the classroom evaluation. Many of the experiments are multiweek projects, to give students time to collect and assess data. The laboratory program has involved more than 10 faculty members over the nine years it has existed, with 120−200 students per semester in laboratory sections of 25−45 students. The assessment data discussed herein were collected in the 2004, 2005, and 2006 academic years.



PROJECT ASSESSMENT AND EVALUATION

Student precision and accuracy data were collected from experiments that quantified iron using either UV−Vis spectroscopy or FAAS. Both experiments required students to make a standard series from a provided stock solution; however, the UV−Vis experiment started with an aqueous unknown and required complicated solution preparation, whereas the FAAS experiment began with a solid unknown and much simpler solution preparation. Precision data were available from both experiments; accuracy was evaluated only in the second experiment, FAAS.

Student Precision

Table 2 lists the precision data collected from the iron by UV−Vis and FAAS experiments in the 2004−2006 academic years. Precision was measured as the relative error of the slope (%) in student-generated calibration curves.49 Inspection of the UV−Vis data indicates that the majority of students in all years were successful in creating high-quality calibration curves.
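To make the precision metric concrete, the short Python sketch below (our illustration, not the authors' code) fits a calibration line to a hypothetical standard series and reports the relative error of the slope; all concentrations and signals in it are invented.

```python
# A minimal sketch (not the authors' code) of the precision metric used for
# Table 2: the relative error of the calibration-curve slope (em/m %) for one
# student group's standard series. The concentrations and absorbances are
# hypothetical values chosen only to illustrate the calculation.
from scipy import stats

conc = [0.0, 1.0, 2.0, 3.0, 4.0]                # Fe standards, mg/L (hypothetical)
signal = [0.002, 0.110, 0.215, 0.331, 0.430]    # measured absorbances (hypothetical)

fit = stats.linregress(conc, signal)            # least-squares calibration line
rel_slope_err = 100 * fit.stderr / fit.slope    # em/m %: std error of slope / slope

print(f"slope = {fit.slope:.4f}, relative error of slope = {rel_slope_err:.1f}%")
```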

Table 2. Student Precision Trends for the UV−Vis and FAAS Experiments

                       UV−Vis, % Student Groups      FAAS, % Student Groups
Precision^a            2004    2005    2006          2004    2005    2006
(range 1)                6      44      20             6       5      41
(range 2)               47      21      17            31      16      44
(range 3)               24      21      23            63      71      15
(range 4)               12      12      22             0       5       0
(range 5)                6       2      13             0       5       0
>15%                     6       0       5             0       0       0
Student groups, N^b     17      43      64            16      44      32^c

^a Measured as percentage of relative error of the slope (em/m %); the labels of the first five precision ranges are not legible in this copy, so they are shown here only as range 1 through range 5. ^b Student groups were usually pairs. ^c The number of student groups was reduced owing to a weather-related cancellation of one laboratory day.

Student Accuracy

In addition to the precision data, accuracy data from the FAAS experiment assessed the students' ability to get the "right" answer. Accuracy was based on the student quantification of a reference sample. Students had to mass the unknown, transfer it to a volumetric flask, acidify, and dilute twice using volumetric pipettes. Our expectations of accuracy for novice chemists are limited to the goal of having the majority of the students attain errors of less than 15%.52 The FAAS accuracy results collected in Table 3 show that a majority of students in all years had less than 15% error.

Table 3. Student Accuracy Trends for the FAAS Experiment

                    Student Groups, %
FAAS Accuracy^a     2004 (N = 16)^b    2005 (N = 44)^b    2006 (N = 32)^b,c
<10%                      38                 39                 69
10−15%                    31                 21                  9
15−20%                    13                  7                  9
>20%                      19                 34                 13

^a Measured as percentage of relative error between the known value and the student answer. ^b Student groups were usually pairs. ^c The number of student groups was reduced owing to a weather-related cancellation of one laboratory day.
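The percentages in Table 3 come from binning each group's relative error against the reference value. The sketch below shows one way to produce such a tabulation; the reference concentration and the student results are hypothetical, not data from the study.

```python
# A minimal sketch (not the authors' code): percent relative error against a
# reference value, binned into the Table 3 ranges. The reference concentration
# and the student results below are hypothetical.
import numpy as np

reference = 25.0                                           # accepted Fe value, mg/L (hypothetical)
results = np.array([24.1, 26.9, 31.2, 22.0, 25.3, 19.8])   # student group answers (hypothetical)

rel_err = 100 * np.abs(results - reference) / reference    # % relative error per group

edges = [0, 10, 15, 20, np.inf]                            # <10, 10-15, 15-20, >20 %
labels = ["<10%", "10-15%", "15-20%", ">20%"]
counts, _ = np.histogram(rel_err, bins=edges)

for label, n in zip(labels, counts):
    print(f"{label}: {100 * n / len(results):.0f}% of groups")
```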

However, an analysis of these data sets indicates a significant decrease in accuracy from 2004 to 2005: χ² (3 df) = 7.86, p < 0.05. The decrease was driven primarily by the increase in student groups with error >20%. This was a puzzling result, as these same students obtained excellent precision in the same experiment. Discussions within the teaching staff suggested that the problem could lie in the students' ability to manipulate a solid sample. A laboratory practical was therefore designed specifically to evaluate the students' ability to create an analytical solution from a solid, mimicking the skills needed to prepare the reference sample used to assess accuracy. Between 2005 and 2006, there was an obvious improvement in the number of student groups attaining less than 10% relative error in accuracy (39 vs 69%). We attribute this improvement to programmatic changes instituted as a result of the laboratory practical, which is discussed in the next section.
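For readers who want to reproduce the year-to-year comparison, the sketch below shows one plausible form of the reported test: a χ² goodness-of-fit comparison of the 2005 accuracy distribution against expected counts derived from the 2004 distribution. The group counts are reconstructed from the rounded percentages in Table 3, so the statistic only approximates the published value.

```python
# A minimal sketch (not the authors' code): chi-square comparison of the 2005
# FAAS accuracy distribution against the 2004 distribution. Group counts are
# reconstructed from the rounded percentages in Table 3 (N = 16 in 2004,
# N = 44 in 2005), so the result only approximates the published 7.86 (3 df).
import numpy as np
from scipy import stats

pct_2004 = np.array([38.0, 31.0, 13.0, 19.0])  # % of 2004 groups in <10, 10-15, 15-20, >20 % error
obs_2005 = np.array([17, 9, 3, 15])            # approx. 2005 counts (39, 21, 7, 34 % of 44 groups)

# Expected 2005 counts if the 2004 proportions still held, rescaled to the 2005 total
expected = pct_2004 / pct_2004.sum() * obs_2005.sum()

chi2, p = stats.chisquare(f_obs=obs_2005, f_exp=expected)
print(f"chi2 = {chi2:.2f} (3 df), p = {p:.3f}")
```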

Laboratory Practical

For the end-of-semester laboratory practical, students were given brief written instructions for creating an analytical copper nitrate solution from the solid and measuring the resulting absorbance on a spectrophotometer. Students worked at designated stations where faculty and staff monitored the actions of the student groups (usually pairs) using a simple rubric that broke down the activities of interest. Table 4 lists the skills and the percentage of the total points acquired by the students, as well as the average accuracy53 attained by the students in 2005. Because of outside factors, laboratory practical data were not collected in 2006. The practical results indicate that the students could do a number of things extremely well; overall, the students acquired 94% (±11%) of the available points on average. Consistent with our laboratory results, though, only 70% of students attained the accuracy goal of less than 2% error51 for this single solution. The individual graded steps of the laboratory practical provided insight into this problem: the students were markedly less competent in two specific skills, (i) subtracting the final weight of the weighing paper and (ii) quantitatively transferring the dissolved solid into the volumetric flask.
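As an illustration of how a checklist-style rubric of this kind rolls up into the reported summaries (percent of available points earned and the share of groups meeting the 2% accuracy goal), here is a short sketch with entirely hypothetical point values, scores, and errors.

```python
# A minimal sketch (not the authors' rubric): summarizing checklist-style rubric
# scores into percent-of-available-points and the share of groups under the 2%
# accuracy goal. Point weights, scores, and errors are all hypothetical.
import numpy as np

max_points = np.array([2, 2, 2, 2, 2, 2, 2, 2, 3])   # hypothetical weight per rubric item

scores = np.array([[2, 2, 2, 2, 2, 2, 2, 2, 3],      # hypothetical scores, one row per group
                   [2, 0, 2, 2, 1, 2, 2, 2, 3],
                   [2, 2, 2, 2, 2, 2, 1, 2, 2]])
accuracy_err = np.array([1.2, 4.8, 1.9])             # hypothetical % error per group

pct_points = 100 * scores.sum(axis=1) / max_points.sum()
print(f"points earned: {pct_points.mean():.0f}% (+/- {pct_points.std(ddof=1):.0f}%)")
print(f"groups under 2% error: {100 * np.mean(accuracy_err < 2.0):.0f}%")
```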


Table 4. Laboratory Practical Skills and Student Success

Laboratory Skills (points acquired by students, %^a,b)
Balance Use
  Tare weighing paper
  Subtract final weight of paper
Material Transfer Skills
  Transfer massed solid directly to mixing container
  Dissolve solid completely
  Quantitatively transfer solution into volumetric flask
Volumetric glassware use
  Fill to mark
  Invert 15−20 times
Spectrometer Use
  Record measurements
Calculation
Accuracy^c
  Error

Student Survey

The end-of-semester survey asked students to rate elements of the program, which included interactions with personnel, written materials, and activities (prelaboratory lecture, laboratory tasks, and discussions). The skills section asked students to rate their confidence in competently performing a range of laboratory tasks or explaining laboratory topics. The final section asked students to rate the challenge provided by the laboratory and their enjoyment of it. Students chose answers on a scale of 1−5, with one point denoting "strongly disagree" and five points denoting "strongly agree" with the given statement. The categories of questions and the general results are given in Table 5.
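Responses on a 1−5 scale like this are typically summarized as a mean rating and a percent agreement (ratings of 4 or 5) per item or category. The sketch below shows such a roll-up; the item wording and responses are invented, not the actual survey data.

```python
# A minimal sketch (not the authors' analysis): summarizing 1-5 Likert responses
# as a mean rating and percent agreement (ratings of 4 or 5) per survey item.
# The item wording and responses are hypothetical.
import numpy as np

responses = {
    "I can confidently use a volumetric pipet": [5, 4, 4, 3, 5, 4],
    "I can explain what green chemistry means": [4, 4, 3, 5, 4, 2],
    "The laboratory challenged me":             [5, 5, 4, 4, 3, 5],
}

for item, ratings in responses.items():
    ratings = np.asarray(ratings)
    print(f"{item}: mean = {ratings.mean():.1f}, agree = {100 * np.mean(ratings >= 4):.0f}%")
```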