
Assessment of Process Skills in Analytical Chemistry Student Responses to Open-Ended Exam Questions

Jennifer A. Schmidt-McCormack,† Caryl Fish,‡ Anne Falke,§ Juliette Lantz,∥ and Renée S. Cole*,†

†Department of Chemistry, University of Iowa, Iowa City, Iowa 52242, United States
‡Department of Chemistry, Saint Vincent College, Latrobe, Pennsylvania 15650-2690, United States
§Department of Chemistry, Worcester State University, Worcester, Massachusetts 01602, United States
∥Department of Chemistry, Drew University, Madison, New Jersey 07940, United States


ABSTRACT: Assessment, including course exams, clearly indicates to students what learning goals they are expected to master in a certain course. However, most of these assessments tend to focus on generating a correct answer rather than on the type of reasoning or skills used to arrive at the answer. If educators value skills in addition to the correctness of an answer, it is important that they assess them. As part of the ANA-POGIL (analytical process oriented guided inquiry learning) project, the ANA-POGIL team developed a set of process-rich or guided-inquiry-type assessment questions to be used on exams. These questions were designed to mirror the structure of the POGIL activities, where students were provided data in the form of a table, graph, or set of information with the intention of eliciting evidence of process skills such as information processing, problem solving, and critical thinking in the students' written responses. This study presents an analysis of student responses gathered from multiple institutions over several semesters to determine characteristics of questions that are likely to elicit evidence of process skills. Results of this project can provide some insight and recommendations to instructors about how to construct questions to elicit evidence of desired skills.

KEYWORDS: Upper-Division Undergraduate, Analytical Chemistry, Chemical Education Research, Testing/Assessment

FEATURE: Chemical Education Research

Received: October 25, 2018. Revised: June 2, 2019.




INTRODUCTION

There is ample evidence that active learning environments are effective at helping students learn.1,2 However, there is often a gap between the classroom actions and behaviors called for by various pedagogies and the questions used on assessments. This gap matters because assessment is a clear way to reveal to students the objectives they are expected to master as they complete the course.3−5 Students value assessments because grades serve as a motivating factor and signal what instructors consider important.6−8 Crooks9 recommended that the quickest way to influence what students learn is to change the assessment system. Boud6 also discussed how assessments need to complement the other activities of the course. Ideally, assessments should provide feedback for both the instructor and the student as to how well the learning objectives are being met.8 Most course assessments tend to focus on a student's competency at arriving at a correct solution, which means the assessment of other valuable skills is often overlooked.4,5,10−12 The most comprehensive analysis of assessment items for their potential to elicit evidence of science practices in addition to content knowledge has been through application of the three-dimensional learning assessment protocol (3D-LAP).4,5,11,12 These analyses show that while instructors and programs state that they value practices such as "developing and using models" or "engaging in argument from evidence", most assessments at the introductory level focus on recall, procedural knowledge, and algorithmic problem solving.

The alignment between assessment and the objectives of learning is called constructive alignment, which Biggs3 defined as the matching of assessments with learning activities and desired outcomes. Biggs3 states that if the activities do not match the assessment, then the educational setting is a "poor system" in which the components are not working together constructively to support the desired learning outcomes. This situation is not uncommon, as illustrated by the Sanabria-Ríos and Bretz13 study, which found that chemistry instructors' exam questions did not align with their stated learning objectives.

The first step in creating a constructive learning environment is to define the learning objectives. Fuentealba14 and Suskie15



state that the most effective assessments are ones constructed with a specific purpose and target student population. Boud and Falchikov7 and Fuentealba14 state that long-term assessments should be as important as short-term ones and that assessments of students should align with curricular goals and educational objectives. Boud and Falchikov7 go on to claim that the two main objectives of assessment are to "provide certification" and to "facilitate learning". Once the learning objectives have been identified, the construction of the assessment can begin. Suskie15 proposes that there are four main steps to constructing assessments:
1. Writing down clear learning objectives.
2. Creating learning environments that support the students' development of the identified learning objectives.
3. Implementing ways to measure these student outcomes.
4. Using the results obtained from the assessment to inform instructor practice.

Process Skills

Process skills, sometimes called professional skills, transferable skills, or soft skills, are important skills for undergraduate students to develop.2,12,16−18 A key aspect of the POGIL (process oriented guided inquiry learning) pedagogy is the incorporation of process skills (oral and written communication, teamwork, problem solving, critical thinking, management, information processing, and self-assessment) as explicit components of the course.18−21 POGIL instructional materials (and their implementation) are designed to support students in developing process skills along with content knowledge. The POGIL community spans a wide range of disciplines in secondary and postsecondary education, and a number of groups have assembled to create materials for particular courses. One such group, the ANA-POGIL (analytical POGIL) Project, focused on developing guided inquiry materials for analytical chemistry. The faculty consortium of the ANA-POGIL project valued the development of process skills alongside the traditional content learning outcomes in the classroom and believed the course exams should reflect both types of learning outcomes. Adhering to the principles of aligning the assessments with the curriculum, in addition to developing classroom activities, the ANA-POGIL faculty developed a set of process-rich or guided-inquiry-type assessment questions to be used on exams. These questions were designed to mirror the structure of the POGIL activities, where students were provided data in the form of a table, graph, or set of information with the intention of eliciting evidence of process skills in the students' written responses. The process skills targeted in exam questions were information processing, problem solving, and critical thinking. For this study, the operationalized definitions for the three process skills were taken directly from the POGIL project18 and can be found in Table 1. A more complete description of how the definitions were constructed is described by Cole et al.19

Table 1. Operationalized Definitions of the Process Skills Used for Analysis^a

Process Skill | Definition
Information processing | Evaluating, interpreting, manipulating, or transforming information
Problem solving | Identifying, planning, and executing a strategy that goes beyond routine action to find a solution to a situation or question
Critical thinking | Analyzing, evaluating, or synthesizing relevant information to form an argument or reach a conclusion supported with evidence

^a Adapted with permission from the POGIL project.

Assessment of Process Skills

While there have been a number of studies that have characterized the nature of exam questions and/or analyzed student performance on them, the focus has not been on the assessment of the evidence of the skills students used in their responses. For example, studies have characterized question prompts in terms of the nature of student thinking required to address the question. These characterizations have generally focused on classifications such as algorithmic, conceptual, or recall,22,23 or algorithmic, lower-order cognitive skills, and higher-order cognitive skills.24,25 The 3D-LAP is an exception in that it coded assessment items in terms of their potential to elicit evidence of science practices, cross-cutting concepts, or core disciplinary ideas.4 A comparison of student performance relative to the nature of the question has shown that performance on questions requiring lower-order cognitive skills is not correlated with performance on questions requiring higher-order cognitive skills, suggesting that the higher-order questions are measuring different skills, not just content knowledge.25

The wording and structure of questions have been shown to influence student responses. Small changes in wording have been shown to impact student performance,26,27 although changes to graphical elements were less significant.28 Students are typically more successful in meeting expectations when questions are more scaffolded29 or when they are similar to what students have seen in previous assignments or exams.30 When studies analyze student responses, the focus has typically been on correctness or overall performance scores rather than on particular characteristics of student reasoning. Studies that provided a deeper analysis of student responses have focused on how students incorporate concepts and content into their responses31 or distinguished between student success in applying content knowledge and describing scientific processes.32 To quote Crisp et al.,33 "The power of questions resides in their ability to elicit a response, i.e. they implicate a respondent into returning information to a teacher (or an assessor) from which a decision can be made about the next steps for learning."

With the widespread adoption of POGIL and other strategies that focus on process skills as part of course outcomes, the authors have often been asked to provide advice on how to write exam questions that are more likely to elicit evidence of process skills. Since we had access to a large number of student responses to questions intended to elicit evidence of process skills from the ANA-POGIL project, we undertook a retrospective analysis of these data to determine characteristics of questions that were likely to elicit evidence of process skills. While these questions were designed with the assessment of process skills in mind, they were not designed as formal assessment items with the type of piloting and testing that research instruments would undergo.

RESEARCH QUESTIONS

There are three main research questions related to the analysis of the exam questions that are addressed in this paper.
1. To what extent did the written, open-ended exam questions elicit evidence of process skills in students' written responses?
2. To what extent were the intended process skills aligned with evidence of process skills in the student responses?
3. What features of the written exam questions elicited evidence of process skills in the students' written responses?





Figure 1. Portion of the spectroscopy (SP) exam question. The graph of the two dyes served as the model in the question. The question parts SP-2, SP-3, and SP-4 are shown.





METHODS

Participants and Setting

The participants in this study were primarily upper-level undergraduate students enrolled in analytical chemistry courses taught by instructors using the ANA-POGIL curriculum. There were instructors from 11 institutions, which ranged from small liberal arts colleges to large research-intensive universities. Data were collected from 2008 to 2011 in both quantitative analysis and instrumental analysis courses. The sequence in which students took the analytical chemistry courses varied depending on the individual school curriculum, although all students had completed two semesters of general chemistry. Institutional Review Board (IRB) approval for the study was obtained at each participating institution, and consent was obtained from all participants whose work was included in the study.

Exam Questions

Common exam questions were developed by consortium members in order to allow faculty to test the transferability of the questions across contexts, provide a model for assessment items aligned with POGIL activities, and compare evidence of student performance across multiple years and institutions. A multipart question that would require students to use process skills was developed for each of the following topic areas, which corresponded to the broad topic areas covered in the analytical curriculum: analytical tools (AT), chromatography (CH), electrochemistry (EC), multiple equilibria (ME), spectroscopy (SP), and statistics (ST). Targeted process skills were identified by the consortium for each exam question with the exception of the electrochemistry question. The exam questions were then incorporated into course exams by consortium members if the corresponding topic and activity were part of the class. A portion of a sample question is shown in Figure 1. In this question, students were given a model requiring interpretation of spectra of two different compounds. They were also provided the molar absorptivity of the two compounds and the cuvette path length, and they were told that the two compounds are in solutions of the same concentration. From this information, students were asked a series of questions that required them to apply Beer's law in a variety of different ways and combine interpretations about graphs and textual information in their explanations. For example, they were expected to use both the spectra and the information about molar absorptivity to determine which spectrum belonged to which compound. To calculate the concentration of one of the dyes using Beer's law, the students had to use information from both the spectra and the text.
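To make the intended Beer's law reasoning concrete, the sketch below works through the two tasks just described: matching spectra to dyes by molar absorptivity and solving A = εbc for concentration. It is a minimal illustration in Python; the molar absorptivities, path length, and absorbance value are placeholders rather than the numbers given in the actual exam prompt.

```python
# Minimal sketch of the Beer's law (A = epsilon * b * c) reasoning targeted by the
# spectroscopy question. All numerical values are illustrative placeholders, not the
# values given in the actual exam prompt.

# Hypothetical molar absorptivities (L mol^-1 cm^-1) at the analysis wavelength.
epsilons = {"Bromothymol Blue": 4.2e4, "Methyl Red": 1.5e4}
path_length_cm = 1.0          # cuvette path length, b
concentration_M = 1.0e-5      # both dyes prepared at the same concentration

# Equal concentrations, so the dye with the larger molar absorptivity gives the
# larger absorbance; this is how the two spectra can be matched to the two dyes.
predicted_A = {name: eps * path_length_cm * concentration_M
               for name, eps in epsilons.items()}
higher_dye = max(predicted_A, key=predicted_A.get)
print(f"The spectrum with the higher absorbance belongs to {higher_dye}")

# Rearranging Beer's law gives the concentration from a measured absorbance.
measured_A = 0.35             # absorbance read from a spectrum (placeholder value)
c = measured_A / (epsilons["Bromothymol Blue"] * path_length_cm)
print(f"c = {c:.2e} M")
```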

Data Collection

Instructors from multiple institutions collected student exam responses over multiple years and submitted photocopies with



all of the student identifiers removed to the researchers. A full list of exam questions for which student responses were collected is presented in the Supporting Information. A summary of the semesters and exam questions for which student responses were collected is provided in Table 2. The number of participating instructors varied over time based on teaching assignments, sabbaticals, and transitions of participants to administrative roles.

Table 2. Summary of Semesters and Exam Questions in Which Student Test Responses Were Collected
Number of Institutions in Which Data Were Collected, by Semester and Year

Exam Question Topic | Fall 2008 | Spring 2009 | Fall 2009 | Spring 2010 | Fall 2010 | Spring 2011
Analytical tools | 4 | 0 | 6 | 0 | 1 | 2
Chromatography | 1 | 2 | 2 | 1 | 0 | 0
Electrochemistry | 0 | 0 | 1 | 0 | 0 | 0
Multiple equilibria | 3 | 0 | 4 | 1 | 0 | 0
Spectroscopy | 3 | 2 | 2 | 2 | 1 | 1
Statistics | 4 | 0 | 4 | 0 | 1 | 1

A complete table that lists the total number of exam responses collected for each part of each question can be found in the Supporting Information in Table S1. The number of exam responses varied for each semester because instructors only used exam questions that aligned with their assessment goals, because some instructors chose to use a subset of the question parts for a given exam question rather than the complete set, and because illegible and blank exams were not coded and were excluded from the exam count totals.

Data Analysis

Development of the Coding Scheme. A qualitative coding method was used to analyze student responses. The research team consisted of faculty involved with the ANA-POGIL project (R.S.C., C.F., A.F., J.L.) and the first author (J.A.S.-M.). The first author (J.A.S.-M.) used the process skill descriptions identified in the ANA-POGIL Instructors' Guide as a starting point for developing the coding scheme. All of the faculty members that were involved in the ANA-POGIL project contributed to articulating the characteristics of each process skill. An excerpt from a Statistics activity is shown in Figure 2. The content goals for the activity are shown in the first column,

Figure 2. Excerpt from the ANA-POGIL Instructor's Guide for a Statistics Activity. Used with permission from the POGIL Project.


Box 1. List of Codes Used for Analysis of Student Responses to Exam Questions

and the process skills that the activity was targeting in the student responses are listed in the second column. The last four columns of the instructors' guide show the courses for which the activity is best suited. As shown in Figure 2, one aspect of information processing required by the activity was to "develop generalizations from tabulated data". The initial coding scheme was developed using these characteristics of the process skills identified in the ANA-POGIL materials. After the initial coding scheme was generated, all members of the research team coded samples of student work. After coding an initial subset of students' work, the research team determined that there was evidence of process skills in the students' work that were not clearly articulated in the initial list. The coding scheme was modified by the research team as needed during the development process on the basis of the student responses until the coding scheme adequately characterized evidence of process skills and could be applied consistently by all members of the research team. In the final coding scheme, shown in Box 1, each process skill has multiple codes to represent different aspects of that skill. Full definitions of all of the codes in the scheme are provided in the Supporting Information.

Once the coding scheme was finalized, the first author coded all of the remaining student exam responses. Two other members of the research team coded 25% of the student exam responses to establish reliability in applying the coding scheme. The 25% sample was a stratified random sample chosen from multiple schools, semesters, and exam questions from the total pool to ensure an even distribution. An inter-rater agreement value of 90% was calculated by comparing the coding of each question. If 0−29% of the student responses showed evidence of a particular process skill, then that question was characterized as eliciting low evidence of that particular process skill. If a question elicited evidence of process skills in 30−69% of the student responses, then the question was characterized as eliciting moderate evidence of that particular process skill. If at least 70% of the student responses showed evidence of a particular process skill, the question was characterized as eliciting consistent evidence for that particular skill.

The categories of process skills identified in the student responses to the exam questions were compared to the faculty-identified process skills for each question. Since the faculty did not break down the process skills by the separate characteristics of each skill (for example, labels graph/figure and sketches graph/figure for information processing), the coding of the question was based on the aspect of the skill with the highest percentage found in student responses. To gain insight into the nature of exam questions that were successful at eliciting specific process skills, the exam question prompts that elicited evidence of a particular process skill were analyzed and compared to one another to identify themes.
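The analysis logic described above can be summarized in a short sketch: a partial listing of the codes named in this article (Box 1, an image in the original, gives the full scheme), the evidence-level cutoffs, and a simple percent-agreement check between two coders. The function and variable names are illustrative and are not taken from the authors' own analysis tools.

```python
# Sketch of the analysis logic described in the Methods. The code lists below are a
# partial reconstruction based only on codes mentioned in the text; Box 1 gives the
# full scheme. Function names are illustrative, not the authors' actual tooling.

PROCESS_SKILL_CODES = {
    "information processing": [
        "interprets data in model", "interprets data in text",
        "uses provided data in answer", "labels graph/figure",
        "sketches graph/figure", "symbolic representation",
    ],
    "problem solving": [
        "decides on process/procedure to use", "lists equation",
        "substitutes numbers in the equation", "organized/clear solution steps",
        "clearly arrives at an answer", "provides explanation for solution",
    ],
    "critical thinking": [
        "makes conclusions", "provides justification for claim",
        "explains trends", "states assumptions",
        "compares and contrasts solutions",
        "evaluates the reasonableness of an answer",
        "provides alternative solution/explanation",
    ],
}

def evidence_level(percent_showing_evidence: float) -> str:
    """Categorize a question part by the percent of responses showing a skill."""
    if percent_showing_evidence >= 70:
        return "consistent"
    if percent_showing_evidence >= 30:
        return "moderate"
    return "low"

def percent_agreement(coder_a: list, coder_b: list) -> float:
    """Simple percent agreement between two coders' question-level codes."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

print(evidence_level(91))   # 'consistent', e.g., 91% of students sketched a curve
print(evidence_level(46))   # 'moderate', e.g., 46% showed use of the provided data
```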

Figure 3. Example of a student's response to three question parts from the spectroscopy exam question. Note that the student's hand-drawn curve is shaded in light orange.


Application of Codes

To illustrate how the coding scheme was applied to student responses, an analysis of a single student’s exam response, shown in Figure 3, is summarized in Table 3. As mentioned above, not all instructors used all parts of a question; in this case, the student work begins with the second part of the spectroscopy question.



RESULTS AND DISCUSSION

The analysis of the student responses was based on the presence or absence of evidence of the process skills and not on the quality of the performance. For example, as long as a student showed evidence of the steps of a calculation and listed the equation they used to achieve their final answer, their response was coded for evidence of problem solving, regardless of the accuracy of their final answer. The findings are organized first by process skill, followed by analysis of broader themes. In this discussion, when we refer to a specific code, it is italicized for clarity. A summary of the process skills observed in student responses for each question part is shown in Table 4.

Information Processing

As seen in Table 4, at least one question part for each of the exam questions elicited consistent (>70%) evidence of information processing in the students' exam responses, although evidence of all of the aspects of information processing was not elicited for any one question. The complete analysis of information processing in question parts is available in Table S2 in the Supporting Information. All of the question parts that elicited consistent evidence of information processing in the student responses elicited evidence of either interprets data in model or interprets data in text. Questions that asked students to draw or label graphs or figures generally resulted in a majority of students providing some evidence of labels graph/figure or sketches graph/figure. However, if the question prompt also asked students to identify the information they used to draw the spectrum, additional insight into students' information processing could be obtained. This was observed in question part SP-4, shown in Figure 3, where students were prompted to draw the absorbance curve that would result when the concentration of the analyzed solution was doubled and to provide the information they used to determine how to draw the curve. In the example of student work shown in Figure 3, the student correctly stated that the absorbance should double. However, the curve that was generated doubles the first point and then maintains a consistent spacing, one box height, between the generated and original curve. They should have doubled the value of each point, which would have resulted in the curve having a value of zero for the points past 550 nm. While 91% of students drew a curve, only 46% provided evidence that they used the provided data to do so. Without the prompt for students to describe what information they used to generate the curve, an instructor might assume students did not understand the Beer's law relationship that absorbance is directly proportional to concentration. Asking students to provide the information they used to solve the problem may reveal evidence of a disconnect between content knowledge and skills such as the ability to transform data. In the example discussed here, analysis of student responses may motivate instructors to provide instruction in the appropriate transformation of data as well as more opportunities for student practice and feedback.

Using symbolic representations of chemical phenomena is another aspect of information processing important to chemistry. Prompts that elicited consistent evidence of symbolic representation generally included chemical formulas in the text of the question. Three of these question parts (ME-1, ME-5, and ME-6) are shown in Figure 4. All three of these contain chemical formulas, which may have prompted students to include symbolic representation in their answers even though ME-5 is the only one that explicitly prompted students to use chemical equations to support their answer. An example of a question part that elicited a moderate amount of symbolic representation was EC-5B, which is shown in Figure 5. In this question, students were asked to calculate the half-cell potential of the Ag|AgCl half-cell. While students were not explicitly prompted to include symbolic representation in their responses to successfully answer the question, almost half of the students included some type of chemical equation in their response.
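For readers unfamiliar with EC-5B's task, the following sketch shows the kind of Nernst-equation calculation it asks for. The standard potential used is a commonly tabulated value for the AgCl/Ag couple, and the chloride activity is an assumed placeholder, since the actual values in the exam prompt are not reproduced here.

```python
import math

# Sketch of the Ag|AgCl half-cell potential calculation that EC-5B asks for.
# AgCl(s) + e- -> Ag(s) + Cl-(aq); E = E0 - (RT/nF) * ln(a_Cl-).
# The chloride activity below is an assumed placeholder, not the value from the exam.

R = 8.314        # J mol^-1 K^-1
T = 298.15       # K
F = 96485        # C mol^-1
n = 1            # electrons transferred
E0_AgCl = 0.222  # V vs SHE; commonly tabulated standard potential for AgCl/Ag

a_Cl = 0.10      # assumed chloride activity (approximated by concentration)
E = E0_AgCl - (R * T) / (n * F) * math.log(a_Cl)
print(f"E(Ag|AgCl) = {E:.3f} V")   # about 0.281 V with these assumed values
```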


Table 3. Description of Analysis of Student Work from the Spectroscopy Question

Question Part | Applied Code^a | Evidence from Student Work
SP-2 | IP: interprets data in model | Statement: "Dye 1 shows a higher absorbance than Dye 2"
SP-2 | IP: interprets data in text | Statement: "Bromothymol Blue has a higher molar absorvity [sic] than Methyl Red"
SP-2 | PS: lists equation | The student listed the Beer's law equation of absorbance.
SP-2 | CT: makes conclusion | Statement: "Dye 1 is Bromothymol Blue and Dye 2 is Methyl Red"
SP-2 | CT: provides justification | Statement: "Dye 1 shows a higher absorbance than Dye 2, due to the fact that Bromothymol Blue has a higher molar absorvity [sic] than Methyl Red"
SP-3 | IP: interprets data in model | Student determined the maximum absorbance to be 0.35 and the wavelength at the maximum to be 600 nm from the information provided in the spectra.
SP-3 | IP: interprets data in text | Student used the value for molar absorptivity provided in the question prompt.
SP-3 | IP: uses provided data in answer | Student used the numbers provided in the question prompt and determined from the spectra in the Beer's law equation.
SP-3 | PS: decides on a process | Student chose to calculate the concentration for dye 1; they also used Beer's law.
SP-3 | PS: lists equation | Student wrote out the equation they used.
SP-3 | PS: substitutes numbers | Values were substituted into the Beer's law equation (note the wrong value for b was used).
SP-3 | PS: organized solution | Work was organized and easy to follow.
SP-3 | PS: clearly arrives at an answer | Final answer was clearly indicated by boxing in the value of the concentration calculated.
SP-4 | IP: interprets data in model; IP: sketches graph/figure | Student sketched a spectrum for Methyl Red. (Note: This curve is highlighted in light orange in Figure 3 to distinguish it from the curves in the original question prompt.)
SP-4 | CT: makes conclusions | The sketched curve represented a prediction/conclusion as to the shape of the new spectrum.
SP-4 | CT: provides justification | Statement: "Doubling the concentration of Methyl red will also double the Absorbance." (Note: While the student's logic was correct, the execution is not. Rather than doubling each point, they doubled the initial values and then shifted the entire curve by a consistent amount.)

^a Process skills are indicated by the following codes: IP (information processing), PS (problem solving), and CT (critical thinking).
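The SP-4 entry above notes that the student shifted the curve by a constant amount rather than doubling each point. A minimal sketch contrasting the two transformations, using a made-up digitized spectrum (the real spectra appear only as figures), shows why the correct scaling keeps zero-absorbance points at zero.

```python
# Contrast between the correct Beer's law transformation for a doubled concentration
# and the constant-offset curve described in the SP-4 row of Table 3.
# The digitized spectrum below is made up for illustration only.

wavelengths = [450, 500, 550, 600, 650]       # nm
absorbance  = [0.10, 0.30, 0.05, 0.00, 0.00]  # original curve (zero past 550 nm)

# Correct: absorbance is proportional to concentration, so double every point.
doubled = [2 * a for a in absorbance]

# What the student did: shift the whole curve up by a constant amount.
offset = 0.10   # arbitrary constant shift, standing in for the "one box height" spacing
shifted = [a + offset for a in absorbance]

for wl, a, d, s in zip(wavelengths, absorbance, doubled, shifted):
    print(f"{wl} nm: original {a:.2f}  doubled {d:.2f}  constant offset {s:.2f}")
# The doubled curve keeps zero-absorbance points at zero; the offset curve does not.
```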

Problem Solving


While problem solving is a skill that most chemists recognize as key to chemistry, and most would say that they require students to engage in problem solving on a regular basis, fewer exam question parts elicited consistent evidence of problem solving in the students' responses compared to information processing. However, all but one of the topics elicited some evidence, as shown in Table 4. The complete analysis of evidence of problem solving is shown in Table S3 in the Supporting Information. Examination of the questions provides insights as to why this would be true. Question parts that elicited consistent evidence of problem solving typically had evidence of decides on process/procedure to use, substitutes numbers in the equation, organized/clear solution steps, and clearly arrives at an answer in student responses. As can be seen in the electrochemistry question parts EC-5A and EC-5B shown in Figure 5, these questions direct students to complete calculations. All of the questions that elicited at least one aspect of problem solving produced evidence of decides on process/procedure to use. This makes sense because the first step of problem solving, in the way that it was defined, was for students to show their thought process for how they approached and solved the problem. For questions AT-1 and ST-1 (Figure 6), decides on process/procedure to use was the only code that was applied to these questions. These questions prompted students to explain the approach


Table 4. Summary of Evidence of Process Skills in Student Responses for Each Exam Question Part^a

^a ⧫ indicates that evidence was seen in 70% or more of responses. ◊ indicates that evidence was seen in 30−69% of responses.  indicates that evidence of the skill was seen in fewer than 30% of responses. Light blue shading indicates that the question part was intended to elicit evidence of a particular skill.

they would take to solve the problem but did not ask students to complete the calculations. As question part ST-1 (Figure 6) shows, students had to describe their approach to solving the problem but not necessarily solve the problem. Question parts that were most successful in consistently eliciting evidence of multiple aspects of problem solving in student responses, such as in SP-3 (Figure 1) or ST-4 (Figure 6), explicitly directed students to show their work or support their conclusion with a calculation, so students might have been more deliberate in showing all aspects of their problem solving process. However, even in these cases, the students often did not include the equation they were using in their answers. The other aspect of problem solving for which there was minimal evidence in a majority of the students' responses was provides explanation. This is perhaps not surprising given that students were working under timed conditions and were not directly prompted to explain why they chose a particular problem solving approach in their response. The question part that was most successful in eliciting evidence for provides explanation for solution was ME-6 (Figure 4), which directs students to identify and justify any assumptions made in completing the calculation. However, even with this directive, only 60% of students stated the assumptions they made and even fewer (13%) provided an explanation for how they approached the problem. This provides an opportunity for instructors to reflect on the degree to which this process has been modeled for students and whether

additional opportunities for student practice and feedback should be provided. It may also suggest that other tasks, such as problem sets or take-home exam questions, would be more appropriate for assessing problem solving so that students have time to provide evidence of all the components.

Critical Thinking

Each of the exam questions had at least three question parts where evidence of critical thinking was consistently present in the students' responses, as shown in Table 4. The complete distribution of evidence of critical thinking in the exam questions is shown in Table S4 in the Supporting Information. Most of the question parts that elicited evidence of makes conclusions also elicited evidence of provides justification for claim, with a few exceptions: ME-5 (shown in Figure 4), CH-3, and CH-4 (both shown in Figure 7). ME-5 (shown in Figure 4) elicited provides justification for claim without makes conclusions. These question prompts provided students with a claim for which they had to provide an explanation. Two questions, EC-7 and ME-1 (shown in Figure 4), elicited student responses that were coded as makes conclusions but not provides justification for claim. These question prompts included language to have students come up with a conclusion but did not include words like "explain", "elaborate", or "support your answer" to encourage students to support the conclusions they made.


Figure 4. Model provided for students to use on the multiple equilibria questions and the text for the ME-1, ME-4, and ME-5 questions.

The last two dimensions of this process skill, compares and contrasts solutions and evaluates the reasonableness of an answer, were only seen to an appreciable extent in student responses to AT-5C and AT-5D (both shown in Figure 8), where evidence of evaluates the reasonableness of an answer was elicited in 60% and 40% of the students' responses, respectively. In these questions, students were directed to analyze a provided explanation for errors in a data set and determine if the suggestions were reasonable. There were two codes, explains trends and states assumptions, that are part of critical thinking but were rarely evident in students' responses. There were two question parts that elicited evidence of explains trends in student responses, AT-3 and CH-3 (Figure 7). Both of these question parts included text that prompted students to interpret graphs. Only two question parts, ME-4 and ME-6 (Figure 4), elicited evidence of states assumptions in a significant number of students' responses. These were also the only question parts that explicitly directed students to list their assumptions. The provides alternative solution/explanation code was only applied to student responses for three question parts, AT-4, EC-7, and SP-6. These question parts included direct language for students to describe two reasons why a certain phenomenon was observed or two ways of approaching a problem. Question

parts AT-4 and SP-6 also elicited evidence of makes conclusions and provides justification for claim because students had to provide detailed explanations for why their two proposed methods would work (or not) in the given scenarios.

Summary Themes from Exam Question Analysis

Information processing is generally a precursor to problem solving and critical thinking, and it was almost always found paired with those other skills. However, there were four exam questions where information processing was the only process skill elicited in the majority of the student responses: AT-1, ME-2, CH-1, and CH-2 (CH-1 and CH-2 are shown in Figure 7). Where information processing was the only process skill, these questions tended to focus on interpretation of data and not on problem solving or the generation of conclusions. Problem solving was always paired with information processing, while critical thinking sometimes appeared alone. Where critical thinking was the sole process skill elicited, the format of these question parts asked students to generate a conclusion or way to approach the data without prompting students to explicitly draw upon the data provided directly in the question prompt. The only questions that elicited all three of the process skills were ST-1 (Figure 6), ST-3, and ST-4 (Figure 6). These statistics question parts prompted students to make decisions using information provided to them (interprets data in model

Information processing is generally a precursor to problem solving and critical thinking, and it was almost always found paired with those other skills. However, there were four exam questions where information processing was the only process skill elicited in the majority of the student responses: AT-1, ME2, CH-1, and CH-2 (both shown in Figure 7). Where information processing was the only process skill, these questions tended to focus on interpretation of data and not on problem solving or the generation of conclusions. Problem solving was always paired with information processing, while critical thinking sometimes appeared alone. Where critical thinking was the sole process skill elicited, the format of these question parts asked students to generate a conclusion or way to approach the data without prompting students to explicitly draw upon the data provided directly in the question prompt. The only questions that elicited all three of the process skills were ST-1 (Figure 6), ST-3, and ST-4 (Figure 6). These statistics question parts prompted students to make decisions using information provided to them (interprets data in model H


Figure 5. Model that was provided for students to use on the electrochemistry questions and the text for the EC-5B and EC-7 questions.

Figure 6. Set of data that was provided for students to use on the statistics questions and the text for ST-1 and ST-4.

and/or interprets data in text) and required students to explain or support their answers. Question ST-3 indicated that students could use a calculation in their response while ST-4 (Figure 6) required a calculation as part of the support. There were only two questions that did not elicit evidence of any process skills in the students' responses. Both of these questions came from the analytical tools set of questions, AT-5A and AT-5B. To answer these questions, students were able to use memorized information and did not provide evidence that they used higher-level reasoning to answer these questions.

Alignment of Process Skills with Faculty-Identified Process Skills

As indicated earlier, faculty identified the process skill(s) they expected students to use in answering the parts of most of the exam questions. Overall, all but four of the question parts were at least partially successful in eliciting evidence of the intended process skills. As can be seen in Table 4, instructors were more successful in eliciting evidence of information processing and critical thinking than they were for problem solving. For information processing, all but one of the questions, ME-4, elicited consistent evidence of the intended process skill. The


Figure 7. Set of data that was provided for students to use on the chromatography question parts and the text for CH-1, CH-2, CH-3, and CH-4.

text for ME-4 was the following: "To calculate the pH of solutions made from solid Na2A (the notation represented sodium aspartate), at least one critical assumption must be made to simplify the process. Identify the most critical assumption that must be made." However, it was observed that most students just described an assumption to simplify the calculation and did not incorporate the information that was presented in the diagram. In this case, students did not complete the question as prompted.

Figure 8. Stem of question part AT-5 with AT-5C and AT-5D.

For critical thinking, all but one of the questions, CH-2 (shown in Figure 7), elicited evidence of the intended process skill in the students' responses. The prompt for CH-2 was as follows: "Label each curve with the name of the source of band broadening that it represents." Students' written responses for this prompt included labeling the curves, but there were no additional explanations of why they labeled the curves the way that they did. In this case, the question was not written in such a way as to prompt students to provide evidence of their thinking. When looking at how well the problem solving codes aligned with the faculty intentions, only four of the eight elicited consistent evidence of problem solving. One possible reason for the misalignment is that the faculty thought students were going to use evidence of problem solving to support their reasoning but students actually supported their answer by employing heuristics, using memorized knowledge, or simply did not show evidence of their problem solving process in their responses. For example, the spectroscopy question part SP-5 stated, "Using the Dye 1 absorption spectrum shown, sketch with rough accuracy the calibration curves (plot of absorbance vs. concentration of



dye) expected for a set of data collected at 395 nm and for a set of data collected at 616 nm. Label the graph appropriately.” While the faculty expected students to show Beer’s law and provide support for the differing lines, many students sketched curves without showing the calculations that they used (if at all) in graphing the data. Another example is spectroscopy question part SP-6: “Brad is trying to measure the absorbance of a high concentration Dye 1 solution at 616 nm and the absorbance reading was too high. There are three possible changes you could make to his method that would allow him to measure the absorption of this solution. Describe two of these changes and explain why each would be an acceptable change.” This question asked students to describe hypothetical changes to the measurement method and how these changes would influence the data collected. While students demonstrated critical thinking in their answers, many of them did not show evidence of deciding on a procedure, the use of an equation to support their decisions, or an explanation for their solution (and consequently, no evidence of problem solving as defined by the faculty in this project). Students might have viewed the question as just asking for hypothetical reasoning to answer the question. The other two question parts that did not elicit consistent evidence were from the multiple equilibria category. The first question part, ME-4, did not ask students to solve a problem, although it did require critical thinking. The last question was ME-5 (shown in Figure 4). In their responses to these question parts, some students did exhibit evidence of problem solving, but most of them relied on skills coded as critical thinking. In summary, the difference between what faculty had intended the exam prompts to elicit and what evidence was found in students’ responses was caused either by the student not showing evidence of the intended skill (even though prompted) or by a misalignment between the assessment questions and the intended skills. The former may be due to students running out of time or not noting all the components of the question part that needed to be addressed. It may also be due to students having insufficient instruction or practice with feedback before being assessed, which resulted in student uncertainty in how to complete the task as prompted.

Implications for Instruction

The results provide some insights and recommendations for how instructors can elicit evidence of specific process skills in students' responses. Including a model or data in the question, whether it be a graph, figure, or table, can help elicit evidence of information processing in student responses, particularly if students are required to indicate what information was used in their response and how. To elicit information processing, the questions should include terms such as "label the graph", "using the graph", or "using the information given." It was also seen that if the question referred to data in the prompt, students were more likely to include this data in their answer. This is an example of "parallel structure", where students mirror the structure of the question prompt. To elicit evidence of problem solving, questions should include terms that prompt students to show their work or provide support for decisions made. Phrases such as "determine the concentration", "show your work", and "support your answer" were all successful in getting students to provide some evidence of their thought processes. However, even though these terms elicited some evidence of problem solving, the questions must be carefully constructed to require students to actually solve a problem that contains multiple parts. To better elicit evidence of critical thinking, including phrases such as "support your work", "explain your reasoning", or "explain why that would be the most appropriate" can be effective. However, there are ways to structure questions to elicit only one of these process skills if desired. To isolate the reasoning a student might use to support their answer, the question can ask students to support a specific claim that is provided by the instructor. Specific characteristics in students' responses, such as state assumptions or evaluate the reasonableness of an answer, are only likely to be elicited if the prompt includes targeted phrases such as "state your assumptions" or if this has become normative practice for the class. Including models of data can help students provide evidence of critical thinking by giving them opportunities to support their conclusions with data. Writing questions designed to elicit evidence of process skills provides instructors an opportunity to reflect upon how students responded to the questions and identify strengths and areas where the intended outcomes were not achieved. This should lead to both better question design and identification of aspects of process skills that need to be better developed, which in turn should lead to revisions to course activities.



LIMITATIONS

Due to the nature of the ANA-POGIL project and the frequency of courses being offered at the participating institutions, the question usage distribution varied across the multiple semesters. A limitation of using handwritten exam responses is the difficulty in deciphering students' handwriting and following the students' thought processes. In addition, using exam questions to assess process skills does create limitations for interpretation. Students had a limited amount of time to complete the exams, which might have caused them to not exhibit the full extent of their use of process skills in their written answers. There were no other sources of data to triangulate the students' responses, such as one-on-one student interviews or classroom observations, to determine whether the evidence the students presented in their exam data matched the process that students used when they constructed their answers.





CONCLUSIONS

The findings indicate that questions can be successfully designed to elicit evidence of process skills in student responses. They also highlight the importance of aligning assessments with curricular goals and classroom pedagogies. The majority of the exam question parts elicited evidence of at least one process skill, with only one exam question part showing no evidence of process skills in the majority of the student responses. The analysis also shows that if instructors wish to see evidence of process skills in student responses, then these expectations must be clearly communicated through the written question prompt; additional success may be found when the process skill is the dominant portion of the question. The results also suggest that some scientific practices such as providing explanations for problem solving processes, comparing and contrasting solutions, and evaluating the reasonableness of solutions/answers have not become part of typical exam structures or students' normal practice and require more support from instructional activities.

ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.8b00877.



Expanded coding scheme definitions (PDF, DOCX)
Exam questions and analysis (PDF, DOCX)

AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

ORCID

Jennifer A. Schmidt-McCormack: 0000-0003-1488-601X
Renée S. Cole: 0000-0002-2807-1500

Notes

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors declare no competing financial interest.



ACKNOWLEDGMENTS

This work was supported in part by the National Science Foundation under Grants 0717492 and 1524965. We thank the members of the ANA-POGIL consortium who played key roles in the creation of the exam questions and collection of student responses. We also thank the students who allowed us to document and analyze their classroom experiences and artifacts.



REFERENCES

(1) Freeman, S.; Eddy, S. L.; McDonough, M.; Smith, M. K.; Okoroafor, N.; Jordt, H.; Wenderoth, M. P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. U. S. A. 2014, 111 (23), 8410−8415.
(2) National Research Council. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; National Academies Press: Washington, DC, 2012.
(3) Biggs, J. Aligning teaching and assessing to course objectives. Teaching and Learning in Higher Education: New Trends and Innovations 2003, 13−17.
(4) Laverty, J. T.; Underwood, S. M.; Matz, R. L.; Posey, L. A.; Carmel, J. H.; Caballero, M. D.; Fata-Hartley, C. L.; Ebert-May, D.; Jardeleza, S. E.; Cooper, M. M. Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol. PLoS One 2016, 11 (9), No. e0162333.
(5) Matz, R. L.; Fata-Hartley, C. L.; Posey, L. A.; Laverty, J. T.; Underwood, S. M.; Carmel, J. H.; Herrington, D. G.; Stowe, R. L.; Caballero, M. D.; Ebert-May, D. Evaluating the extent of a large-scale transformation in gateway science courses. Sci. Adv. 2018, 4 (10), eaau0554.
(6) Boud, D. Reframing assessment as if learning were important. In Rethinking Assessment in Higher Education: Learning for the Longer Term; Boud, D., Falchikov, N., Eds.; Routledge: London, 2007; pp 24−36.
(7) Boud, D.; Falchikov, N. Aligning assessment with long-term learning. Assessment and Evaluation in Higher Education 2006, 31 (4), 399−413.
(8) Ye, L.; Oueini, R.; Lewis, S. E. Developing and Implementing an Assessment Technique To Measure Linked Concepts. J. Chem. Educ. 2015, 92 (11), 1807−1812.
(9) Crooks, T. J. The impact of classroom evaluation practices on students. Review of Educational Research 1988, 58 (4), 438−481.
(10) Smit, R.; Birri, T. Assuring the quality of standards-oriented classroom assessment with rubrics for complex competencies. Studies in Educational Evaluation 2014, 43, 5−13.
(11) Stowe, R. L.; Cooper, M. M. Practicing What We Preach: Assessing "Critical Thinking" in Organic Chemistry. J. Chem. Educ. 2017, 94 (12), 1852−1859.
(12) Reed, J. J.; Brandriet, A. R.; Holme, T. A. Analyzing the Role of Science Practices in ACS Exam Items. J. Chem. Educ. 2017, 94 (1), 3−10.
(13) Sanabria-Ríos, D.; Bretz, S. L. Investigating the relationship between faculty cognitive expectations about learning chemistry and the construction of exam questions. Chem. Educ. Res. Pract. 2010, 11 (3), 212−217.
(14) Fuentealba, C. The role of assessment in the student learning process. Journal of Veterinary Medical Education 2011, 38 (2), 157−162.
(15) Suskie, L. Understanding the nature and purpose of assessment. In Designing Better Engineering Education through Assessment; Spurlin, J., Rajala, S. A., Lavelle, J. P., Eds.; Stylus: Herndon, VA, 2008; pp 3−19.
(16) American Chemical Society Undergraduate Professional Education in Chemistry. ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs; 2015. https://www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf (accessed May 2019).
(17) Kondo, A. E.; Fair, J. D. Insight into the Chemistry Skills Gap: The Duality between Expected and Desired Skills. J. Chem. Educ. 2017, 94 (3), 304−310.
(18) POGIL. Process Skills Definitions. https://pogil.org/educators/additional-resources#processskills (accessed Apr 2019).
(19) Cole, R.; Lantz, J.; Ruder, S. PO: The Process. In POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish To Empower Learners; Stylus Publishing: Sterling, VA, 2019.
(20) POGIL Project; POGIL, 2018. https://pogil.org/ (accessed May 2019).
(21) Moog, R. S.; Spencer, J. N. POGIL: An Overview. In Process Oriented Guided Inquiry Learning (POGIL); Moog, R. S., Spencer, J. N., Eds.; American Chemical Society: Washington, DC, 2008; Vol. 994, Chapter 1, pp 1−13. DOI: 10.1021/bk-2008-0994.ch001 (accessed May 2019).
(22) Smith, K. C.; Nakhleh, M. B.; Bretz, S. L. An expanded framework for analyzing general chemistry exams. Chem. Educ. Res. Pract. 2010, 11 (3), 147−153.
(23) Raker, J. R.; Towns, M. H. Benchmarking problems used in second year level organic chemistry instruction. Chem. Educ. Res. Pract. 2010, 11 (1), 25−32.
(24) Zoller, U. Algorithmic, LOCS and HOCS (chemistry) exam questions: Performance and attitudes of college students. International Journal of Science Education 2002, 24 (2), 185−203.
(25) Tsaparlis, G.; Zoller, U. Evaluation of higher vs. lower-order cognitive skills-type examinations in chemistry: implications for university in-class assessment and examinations. University Chemistry Education 2003, 7 (2), 50−57.
(26) Schurmeier, K. D.; Atwood, C. H.; Shepler, C. G.; Lautenschlager, G. J. Using Item Response Theory To Assess Changes in Student Performance Based on Changes in Question Wording. J. Chem. Educ. 2010, 87 (11), 1268−1272.
(27) Zhou, S.; Han, J.; Koenig, K.; Raplinger, A.; Pi, Y.; Li, D.; Xiao, H.; Fu, Z.; Bao, L. Assessment of scientific reasoning: The effects of task context, data, and design on student reasoning in control of variables. Thinking Skills and Creativity 2016, 19, 175−187.
(28) Crisp, V.; Sweiry, E. Can a picture ruin a thousand words? The effects of visual resources in exam questions. Educational Research 2006, 48 (2), 139−154.
(29) Gibson, V.; Jardine-Wright, L.; Bateman, E. An investigation into the impact of question structure on the performance of first year physics undergraduate students at the University of Cambridge. Eur. J. Phys. 2015, 36 (4), 045014.
(30) Crisp, V.; Sweiry, E.; Ahmed, A.; Pollitt, A. Tales of the expected: the influence of students' expectations on question validity and implications for writing exam questions. Educational Research 2008, 50 (1), 95−115.
(31) Broman, K.; Bernholt, S.; Parchmann, I. Analysing task design and students' responses to context-based problems through different analytical frameworks. Research in Science & Technological Education 2015, 33 (2), 143−161.
(32) Weston, M.; Haudek, K. C.; Prevost, L.; Urban-Lurain, M.; Merrill, J. Examining the impact of question surface features on students' answers to constructed-response questions on photosynthesis. CBE Life Sciences Education 2015, 14 (2), No. ar19.


(33) Crisp, V.; Johnson, M.; Constantinou, F. A question of quality: Conceptualisations of quality in the context of educational test questions. Research in Education 2018, 003452371775220.
