Article pubs.acs.org/jchemeduc

Concept Learning versus Problem Solving: Evaluating a Threat to the Validity of a Particulate Gas Law Question

Michael J. Sanger,* C. Kevin Vaughn, and David A. Binkley
Department of Chemistry, Middle Tennessee State University, Murfreesboro, Tennessee 37132, United States

ABSTRACT: Three different samples of students were asked to answer five multiple-choice questions concerning the properties of a sample of helium gas (particle speed, state of matter, sample volume, sample pressure, and particle distribution), including a particulate question first used by Nurrenbern and Pickering (particle distribution). In the first experiment, half of the students were given the boiling point of helium under these conditions while the other half were not; in the second experiment, half of the students were explicitly told that the cooled gas sample would not liquefy or solidify under these conditions while the other half were not; in the third experiment, half of the students received instruction that asked them to focus on whether the container for the gas sample was rigid or nonrigid while the other half received traditional instruction that did not focus on the rigidity of the gas container. The responses from students in these three experiments were compared. The first experiment was unable to show any significant difference in students’ responses to the five questions and found that the proportion of correct answers for the two groups was equivalent for the particle speed, sample volume, and sample pressure questions. The second experiment found that students given the explicit information were more likely to correctly predict the state of matter for the sample, but the responses from the two groups of students were equivalent for the other four questions. The third experiment suggested that students receiving instruction regarding the rigidity of the gas container were more likely to choose the correct answer for the questions related to the sample volume, sample pressure, and particle distribution. The results of the first two experiments suggest that choosing an incorrect state of matter for the gas sample does not appear to be a major threat to the validity of Nurrenbern and Pickering’s particulate question. The last two experiments suggest that providing students with instructional or assessment cues with the goal of helping students activate schema related to the behavior and properties of gases appeared to improve their answers to some of the five multiple-choice questions asked in these studies.

KEYWORDS: High School/Introductory Chemistry, First-Year Undergraduate/General, Chemical Education Research, Misconceptions/Discrepant Events, Testing/Assessment, Gases, Liquids, Solids
FEATURE: Chemical Education Research



INTRODUCTION

In recent years, chemical education researchers have recognized that the reliability and validity of test instruments and surveys used to collect data involving students must be tested.1−3 The need for tests of reliability and validity is clear:1

    Research on how students learn must be based upon the data provided by instruments that have been proven to measure what they say they are measuring [validity] and are shown to provide stable results [reliability]. Anything less could lead to erroneous results or interpretation of data.

Since that time, several chemical education research studies testing the reliability and validity of newly created data collection instruments have been reported.4−11 The goal of this study is to continue our analysis12,13 of the validity of a multiple-choice particulate gas law question first used by Nurrenbern and Pickering.14 This question was shown to be fairly reliable,12 with a stable distribution of student responses based on four different samples from four different universities spanning 20 years.

Nurrenbern and Pickering14 first used this particulate multiple-choice question (and others) in a paper titled “Concept Learning versus Problem Solving: Is There a Difference?” to show that students who can solve mathematical chemistry problems often have difficulty answering conceptual problems covering the same topics, especially if these problems address concepts at the particulate level. These particulate questions spawned a revolution in chemical education research in the United States, and several chemical education researchers have used these questions and others to analyze and evaluate students’ particulate-level understanding of chemistry.12−35 This particular gas question has been used by chemical education researchers for over 20 years.12,13,16−19,31,33 Many of these studies16−19,31 have corroborated Nurrenbern and Pickering’s assertion that students are generally more successful at answering algorithmic problems correctly compared to particulate conception questions. While Pickering17 concluded that this difference was due to students’ lack of factual knowledge and not inherent ability, Cracolice et al.31 concluded that students’ scientific reasoning skills seem to be at least partly responsible for this difference in ability to answer algorithmic versus particulate conceptual questions.

© XXXX American Chemical Society and Division of Chemical Education, Inc. | dx.doi.org/10.1021/ed200809a | J. Chem. Educ. XXXX, XXX, XXX−XXX

TESTING THE VALIDITY OF THE PARTICULATE GAS LAW QUESTION

Testing the validity of this multiple-choice gas law question requires chemical education researchers to investigate whether students’ answers to this question represent a valid assessment of their understanding of gas laws or the kinetic molecular theory. In 2007, Sanger and Phelps12 reported students’ molecular-level explanations of their answers to this question, and identified four parameters (particle speed, sample volume, gas pressure, and state of matter) that students used to explain their choice at the particulate level. These authors noted that 80% of students choosing the correct answer did not show any misconceptions, but 90% of students choosing one of the three incorrect answers demonstrated at least one misconception regarding the behavior of gas particles. These results suggest that this question has reasonable construct validity.36 However, Sanger and Phelps12 identified several possible threats to the validity of this question’s ability to measure students’ molecular understanding of gas particle behavior.

• The critical attribute of change in this question (a decrease in particle speeds as the gas sample was cooled) is not depicted in the static multiple-choice question. As a result, students are reluctant to choose the correct answer because it shows no change in particle behavior with respect to the initial picture.
• Students who do not recognize that helium gas will not liquefy or solidify at −20 °C and 3 atm may choose answers that are incorrect for gas particles but represent appropriate choices for the behavior of solid or liquid particles.
• Students may believe that more than one choice is correct, depending on whether the cross-section of the tank in the question was cut horizontally or vertically.

Each of these scenarios could represent a threat to the validity of the question, in which students having a misconception related to gas particle behavior may choose the correct answer for an incorrect reason and students choosing an incorrect answer may be exhibiting an error that is not the result of a misconception based on gas particle behavior. Sanger et al.13 analyzed the first validity threat described above, and found that showing particle motions (including the fact that particles in the initial picture moved faster than in all four distractors) improved the proportion of students answering the question correctly. Several students in that study admitted to being reluctant to choose the correct answer for the static picture because it looked the same as the initial picture, but were willing to choose it in the animated question because they could see the particles moving slower in the distractor compared to the initial picture. These results suggest that not showing particle motions can significantly affect the validity of this question.

The goal of this paper is to evaluate the second threat to the validity of this question. To test this possible validity threat, we randomly provided half of the students with information that should aid them in recognizing that the helium sample would remain a gas at the lower temperature. As a way to track their beliefs about the behavior of the gas as the sample was cooled, all students in this study were asked to answer five multiple-choice questions regarding the average speed of the particles, the state of matter present in the container, the volume of the sample, the pressure of the sample, and the distribution of particles in the tank after cooling the helium gas sample from 20 to −20 °C (Figure 1).

Figure 1. Multiple-choice questions used in this study (adapted from a question first reported by Nurrenbern and Pickering14). For the first experiment, half of the students were given the underlined sentence and none were given the italicized sentence. For the second experiment, half were given the italicized sentence and none were given the underlined sentence. For the third experiment, none of the students were given the underlined or italicized information.

The first experiment described in this study tests the hypothesis that providing students with the boiling point of helium at 3 atm (−268 °C, taken from a phase diagram of helium)37 would help them recognize that helium would remain as a gas at −20 °C, and that this would change their views regarding some of the other properties of the gas particles. For the second experiment, we tested the hypothesis that providing an explicit statement that this temperature was not cold enough to liquefy or solidify the helium sample would change these students’ views regarding the properties of the cooled gas sample. The third experiment in this study, which is not tied to testing the second validity threat described above, tests the hypothesis that providing instruction focusing on the difference between rigid and nonrigid containers would change students’ answers to these multiple-choice questions regarding the properties of the gas particles.



THEORETICAL PERSPECTIVE

This study uses constructivism38−40 as part of its theoretical framework, in which learners construct their own knowledge as they interact with the learning environment and attempt to make sense of their experiences and interactions. This study also uses the information-processing theory of learning to explain student learning.41−46 This theory postulates that external information (ideas, events, concepts) must be attended to by learners in their sensory memory and must be selected by the learners (based on their previous knowledge, interests, prejudices, beliefs, etc.) before it can be used in their working memory. Working memory is limited, but this is where learners bring together and incorporate new information from sensory memory and stored information from long-term memory.45,46 The use of schema-activation methods,45 which assist learners in connecting new material with relevant stored information, can improve student learning. Cueing is one example of a schema-activation method, and involves the use of instructional (or assessment) methods to activate relevant prior knowledge from long-term memory. These instructional cues can be visual47−50 or symbolic.51−53 The cues used in this study include the additional information provided to some of the students during the survey (assessment) or the instruction regarding rigid and nonrigid containers (instruction).

Table 1. Distribution of Student Responses to the Five Multiple-Choice Questions in Figure 1 for the First Experiment
a Correct answer for a gaseous sample.

METHODOLOGY

Subjects

For the first experiment, a group of students (N = 391) enrolled in a first-semester general chemistry course were asked to answer the five multiple-choice questions appearing in Figure 1 after instruction on the properties and behaviors of ideal gases. Students were randomly assigned to one of two groups; one group of students was given the boiling point of helium at 3 atm (the underlined sentence in Figure 1) while the other group was not given this information. None of these students was given the information in the italicized sentence in Figure 1. Student answer sheets with missing data were excluded from this study. This study included 187 students who were not given the boiling point information (no bp) and 191 students who were given the boiling point data (bp).

In the second experiment, a group of 306 students enrolled in a second-semester general chemistry course were asked to answer the questions appearing in Figure 1 a few months after instruction on the properties and behaviors of ideal gases. These students were also randomly assigned to two groups; one group was told that −20 °C would not be cold enough to solidify or liquefy the helium sample (the italicized sentence in Figure 1), while the other group was not given this information. None was given the underlined sentence in Figure 1. There were 154 students who were not given the state of matter information (not told) and 152 students who were given this information (told). A sample of 67 students (34 not told and 33 told) were interviewed regarding their answers to these five multiple-choice questions.

For the third experiment, 161 students enrolled in two different sections of a first-semester general chemistry course taught by two different instructors were asked to answer the multiple-choice questions appearing in Figure 1. Students in the first section (N = 86) received traditional instruction on the properties and behaviors of ideal gases (control); students in the second section (N = 75) received instruction on the properties and behaviors of ideal gases that focused on the difference between rigid and nonrigid containers (experimental). The instructor emphasized that gas samples in perfectly rigid containers (e.g., steel tanks) had constant volumes but could undergo changes in pressure, while gas samples in perfectly nonrigid containers (e.g., rubber balloons) had constant pressures but could undergo changes in volume.

Data Analysis

Students’ responses for the five multiple-choice questions appearing in Figure 1 were collected and analyzed for the students in each experiment. The distribution of responses to each question was compared for the two groups of students in each experiment using χ2 tests of homogeneity.60 The proportion of students in the two groups choosing the correct answer for each question was also compared using z-tests, and these values were compared for equivalence using a method described in this Journal by Lewis and Lewis61 based on the work of Schuirmann.62 When determining the interval for the equivalence tests, we followed the more conservative suggestion of Lewis and Lewis61 and chose to set these boundaries based on an effect size of 0.2. We also compared the distribution of student responses to the particulate distribution question for the students in the first two experiments with those reported in the 2007 study by Sanger and Phelps12 using a χ2 test of homogeneity.

Table 2. Results of the Equivalence Tests for the Questions Used in the First Experiment

Question                X̅ no bp   X̅ bp    Interval^a       z1^b   z2^b   Result
Particle speed          0.840     0.853   −0.072, 0.072    2.32   1.57   Equivalent
State of matter         0.738     0.775   −0.086, 0.086    2.78   1.11   Not within interval
Sample volume           0.519     0.513   −0.100, 0.100    1.84   2.07   Equivalent
Sample pressure         0.476     0.487   −0.100, 0.100    2.16   1.73   Equivalent
Particle distribution   0.471     0.539   −0.100, 0.100    3.28   0.62   Not within interval

^a These boundaries represent the standard deviation of the two groups combined, multiplied by 0.2. ^b The critical score for these values is zα=0.10 = 1.28.
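The χ2 test of homogeneity named above compares the full answer distributions of two groups. As a minimal sketch of the calculation, the Pearson statistic can be computed directly in Python; the counts below are hypothetical (the study reports proportions rather than raw counts), so this illustrates the procedure, not the study's own data.

```python
# Pearson chi-square test of homogeneity for a groups-by-responses
# count table.  A minimal sketch; the counts below are hypothetical,
# not data from this study.

def chi_square_homogeneity(table):
    """Return the chi-square statistic and its degrees of freedom."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# two groups of students x four answer choices (hypothetical counts)
stat, df = chi_square_homogeneity([[90, 40, 30, 27],
                                   [105, 35, 28, 23]])
```

The statistic is then compared against the χ2 critical value for the computed degrees of freedom (here 3) at the chosen significance level.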



Reliability and Validity of the Data Collection Instrument

To test the reliability and measure the stability of the five questions in the data collection instrument, we used the test−retest method.54,55 A sample of 56 students enrolled in a first-semester organic chemistry course were asked to answer the five questions, and then were asked to answer the same questions four weeks later. The reliability indices for the five questions appearing in Figure 1 (expressed as the fraction of students choosing the same answer for both testing events) were 0.98, 0.71, 0.64, 0.68, and 0.66, respectively. These values fall in the ranges reported by other chemical education researchers.5,7,56−59 The validity for the content in these questions was established by a content analysis12 of students’ molecular-level explanations for their answers to Nurrenbern and Pickering’s14 particulate gas law question. The content validity of these questions was also checked by asking three college-level faculty members who were not involved in this study to evaluate each question (face validity). All three faculty members agreed that each test question represented information that would be relevant to evaluate students’ particulate-level understanding of the properties and behaviors of ideal gases. Finally, the validity of these questions to measure students’ beliefs regarding the properties and behaviors of ideal gases was tested by interviewing 67 students and asking them to explain their choices for each question at the particulate level. The interviews showed that the cognitive processes used by each student in the explanations of his or her answers to all five questions were consistent with those skills intended for use by the test developer and were relevant to the content appearing in each of the five questions.

RESULTS AND DISCUSSION

First Experiment

The first experiment compares the responses from students who were given the boiling point of helium at 3 atm to the responses from students who were not given this information. The distribution of these students’ responses to the five questions appears in Table 1. These distributions appear very similar for the two groups for all five questions. In fact, we were unable to find significant differences in the distribution of student responses for any of the five questions.

Because these distributions looked so similar, we decided to test whether the responses from the two sets of students could be deemed equivalent.61 Table 2 contains the data from the equivalence tests. Based on these tests, we can conclude that the proportions of correct responses for the particle speed, sample volume, and sample pressure questions are equivalent for the two groups of students. However, we were unable to conclude that the proportion of correct responses was equivalent for the two groups for the questions concerning state of matter and particle distribution. As we failed to find a significant difference in the distribution of student responses to these two questions and failed to find the proportion of correct responses to these two questions equivalent for the two groups of students, the only conclusion we can reach from these comparisons is that the sample size (N = 378) was insufficient to determine whether there was a difference between the responses of these two groups of students to these two questions.
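The equivalence procedure described above (Schuirmann-style two one-sided z-tests, with boundaries set at 0.2 times the standard deviation of the combined group) can be sketched as follows, using the particle-speed proportions from Table 2 (0.840 of 187 no-bp students and 0.853 of 191 bp students). Because the published proportions are rounded, the computed z scores land near, but not exactly at, the reported 2.32 and 1.57.

```python
from math import sqrt

# Two one-sided z-tests (TOST) for equivalence of two proportions,
# with the equivalence boundary set from an effect size of 0.2,
# following the procedure described in the Data Analysis section.

def equivalence_test(p1, n1, p2, n2, effect_size=0.2, z_crit=1.28):
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    sd_combined = sqrt(pooled * (1 - pooled))  # SD of the combined group
    theta = effect_size * sd_combined          # boundary: +/- theta
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    z_lower = (diff + theta) / se   # tests H0: diff <= -theta
    z_upper = (theta - diff) / se   # tests H0: diff >= +theta
    return theta, z_lower, z_upper, min(z_lower, z_upper) > z_crit

# particle-speed row of Table 2 (first experiment)
theta, z_lo, z_up, equivalent = equivalence_test(0.840, 187, 0.853, 191)
```

Here theta reproduces the ±0.072 boundary in Table 2, and both z scores exceed the critical value of 1.28, so the two proportions are declared equivalent.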


Table 3. Distribution of Student Responses to the Five Multiple-Choice Questions in Figure 1 for the Second Experiment
a Correct answer for a gaseous sample.

The equivalence of the students’ responses to questions 1 (particle speed) and 4 (sample pressure) is not surprising, given that the cooled particles should have decreased speeds and a decreased sample pressure regardless of whether they exist in the solid, liquid, or gas phase. On the other hand, students’ responses to questions 2 (state of matter), 3 (sample volume), and 5 (particle distribution) should be different depending on the state of matter present in the container. For example, in question 3, the sample volume should stay the same if it remained as a gas, but should decrease if the gas sample deposited as a solid or condensed as a liquid. For question 5, only the first picture would be acceptable if the sample were still a gas, but any of the four pictures could be acceptable if the students believed the container now contained a solid or liquid sample. It is encouraging that roughly three-fourths of all students (74% for no bp and 77% for bp) believed that the helium sample would remain gaseous; this does suggest that predicting an incorrect state of matter is not a major mistake made by students when answering these questions.

The lack of a significant difference in the students’ responses to question 2 (state of matter) suggests that students who were given the boiling point of helium either did not attend to this information, or did not know how to use this information to predict the correct state of matter present in the container at −20 °C and 3 atm. This interpretation is also supported by the fact that providing the boiling point data did not significantly impact the students’ responses to question 3 (sample volume) or question 5 (particle distribution) either, where we had expected to see differences in student responses if they assumed different states of matter were present.

Second Experiment

Because the first experiment failed to find a significant difference in student responses when the boiling point of helium was given, and there was some question as to whether students were able to use this boiling point information to correctly decide the state of matter present in the container at −20 °C, we decided to perform another experiment with a different sample of students in which half of them were told that −20 °C is not cold enough to liquefy or solidify a sample of helium gas. This experiment compares the responses from students who were told that helium would not liquefy or solidify (told) to the responses from students who were not told this information (not told). The distribution of these students’ responses to the five questions appears in Table 3. Although the distribution of student responses from the two groups to question 2 (state of matter) is very different and statistically significant (χ2(2) = 27.74, p < 0.0001), the distributions appear to be very similar for the other four questions and these differences are not statistically significant.

Table 4 contains the data from the equivalence tests for questions 1, 3, 4, and 5. Based on these tests, we can conclude that the proportions of correct responses for the particle speed, sample volume, sample pressure, and particle distribution questions are equivalent for the two groups of students.

Table 4. Results of the Equivalence Tests for the Questions Used in the Second Experiment

Question                X̅ not told   X̅ told   Interval^a       z1^b   z2^b   Result
Particle speed          0.961        0.954    −0.040, 0.040    1.44   2.06   Equivalent
State of matter         0.675        0.941                                   Statistically different
Sample volume           0.617        0.638    −0.097, 0.097    2.13   1.37   Equivalent
Sample pressure         0.532        0.546    −0.100, 0.100    1.99   1.51   Equivalent
Particle distribution   0.526        0.513    −0.100, 0.100    1.53   1.97   Equivalent

^a These boundaries represent the standard deviation of the two groups combined, multiplied by 0.2. ^b The critical score for these values is zα=0.10 = 1.28.

Table 5. Distribution of Student Responses to the Particulate Gas Law Question
a See ref 12. b Correct answer for a gaseous sample.

Student Interviews

A sample of 34 students in the not told group and 33 students in the told group were asked to explain their answers to question 2 (state of matter) and question 5 (particle distribution) using semistructured interviews. For the not told group, 15 students (44%) believed the sample was a gas that occupied the entire container, 8 students (24%) believed the sample was a gas that was condensed into a smaller volume, 1 student (3%) believed the sample was a gas that expanded into a larger volume, 5 students (15%) believed the sample was a liquid that occupied a smaller volume, and 5 students (15%) believed the sample was a solid that occupied a smaller volume. For the told group, 16 students (48%) believed the sample was a gas that occupied the entire container, 14 students (42%) believed the sample was a gas that was condensed into a smaller volume, 1 student (3%) believed the sample was a gas that expanded into a larger volume, 1 student (3%) believed the sample was a liquid that occupied a smaller volume, and 1 student (3%) believed the sample was a solid that occupied a smaller volume. Most of the students in the told group who stated that the sample would remain a gas chose their answer based on the explicit statement included in the survey (13/16 = 81% occupying the entire container, 11/14 = 79% with a smaller volume, and 1/1 = 100% with a larger volume). The explanations from the students in the two groups were very similar within each category; the major difference appeared to be the number of students from the two groups in each category. Although there were more students in the told group choosing “gas” as the state of matter for question 2 compared to the not told group (94% versus 74%, respectively), the proportion of students recognizing that the gas sample would occupy the entire container, the first choice for question 5, was similar for the two groups (48% vs 44%, respectively).

The significant difference in the distribution of students’ responses to question 2 (state of matter), identified from the statistical comparison and the student interviews, is not surprising and suggests that telling students that −20 °C was not cold enough to liquefy or solidify the helium gas sample does improve the proportion of students recognizing that helium would remain in the gaseous state under these conditions. This result also appears to corroborate our interpretation of the results from the first experiment that students provided with the boiling point of helium at 3 atm either did not attend to this information or did not know how to use this information to predict the correct state of matter present in the container. It is interesting that even though the distribution of students’ responses to question 2 (state of matter) is different, the distribution of their responses to the other four questions is equivalent. These results, along with the nonsignificant differences seen in the first experiment, suggest that predicting an incorrect state of matter for the gas sample at −20 °C and 3 atm does not appear to affect students’ responses to the other four questions. As helping students recognize that the helium sample will remain in the gas phase at this lower temperature did not significantly impact their answers to question 5 (particle distribution), it does not appear that predicting an incorrect state of matter is a significant threat to the validity of the particulate multiple-choice question.
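The significant told versus not told difference on question 2 can also be checked with a pooled two-proportion z-test, the kind of z-test named in the Data Analysis section. The counts below are reconstructed from the rounded Table 4 proportions (0.675 of 154 not told, 0.941 of 152 told), so the statistic is approximate; the authors report the three-category χ2(2) = 27.74 rather than this two-category statistic.

```python
from math import sqrt

# Pooled two-proportion z-test comparing the fraction of correct
# answers in two groups.  The counts are reconstructed from rounded
# published proportions, so the result is approximate.

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# question 2 (state of matter): not told vs told groups
z = two_proportion_z(round(0.675 * 154), 154, round(0.941 * 152), 152)
```

The z score comes out far above 1.96, consistent with the significant difference the authors report for this question.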


Table 6. Distribution of Student Responses to the Five Multiple-Choice Questions in Figure 1 for the Third Experiment
a Correct answer for a gaseous sample.

Comparison to Previously Reported Data

In the first two experiments, roughly half of all students (47−54%) chose the correct answer for question 5 (particle distribution). This is higher than the proportion of students choosing the correct answer as reported in the literature,12−14,16 which ranges from 30% to 36% without intervention. We compared the distribution of students’ responses to the particulate question from these two experiments to those from another study12 performed at the same university using students enrolled in the same general chemistry course sequence from a different semester. The data from these three experiments (using the combined data from both groups for each experiment in this study) appear in Table 5. The χ2 test of homogeneity showed that these three groups are significantly different: χ2(6) = 37.44, p < 0.0001. An analysis of residuals showed that students in the 2007 study were less likely to choose the first picture (correct choice) and more likely to choose the second picture, while students in this second experiment were less likely to choose the second picture. We believe that the four questions answered by students in these two experiments before answering the particle distribution question may have served as symbolic cues51−53 that assisted students in activating their schema45 concerning the properties and behaviors of ideal gases before answering the particle distribution question and ultimately helped them arrive at the correct answer.
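The analysis of residuals mentioned above examines cell-level residuals after a significant χ2 test to see which responses drive the difference. Below is a sketch of adjusted standardized residuals, one common formulation (the paper does not specify which variant it used); the counts are hypothetical, since the cell counts from Table 5 are not reproduced here.

```python
from math import sqrt

# Adjusted standardized residuals for a contingency table -- a common
# follow-up to a significant chi-square test.  Counts are hypothetical,
# not data from this study.

def adjusted_residuals(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    residuals = []
    for i, row in enumerate(table):
        out = []
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            denom = sqrt(expected * (1 - row_totals[i] / n)
                         * (1 - col_totals[j] / n))
            out.append((observed - expected) / denom)
        residuals.append(out)
    return residuals

# hypothetical 2x2 table: two samples x (correct, incorrect) answers
res = adjusted_residuals([[10, 20], [20, 10]])
```

Cells with an absolute residual above roughly 2 are the ones contributing most to a significant overall statistic.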

Third Experiment

Although the first two experiments suggest that predicting an incorrect state of matter for the helium sample did not significantly affect students’ answers to questions regarding the particulate behavior and properties of gases, they do not suggest which instructional strategies will be effective in improving students’ answers to these questions. Student quotes from the content analysis study12 suggested that students believing the sample would remain a gas but would occupy a smaller volume (second choice for question 5) might have confused the behavior of the rigid (constant volume) steel tank with those of a nonrigid container, such as a balloon. To test this hypothesis, we performed an experiment with a different sample of students in which some of them were taught in lecture to consider whether the container holding a gas sample was rigid or nonrigid before answering additional questions. This experiment compares the responses from students who were taught using this method (experimental) to the responses from students who were not taught using this method (control). The distribution of these students’ responses to the five questions appears in Table 6. The distributions of student responses from the two groups to question 1 (particle speed) and question 2 (state of matter) G

dx.doi.org/10.1021/ed200809a | J. Chem. Educ. XXXX, XXX, XXX−XXX

Journal of Chemical Education

Article

Table 7. Results of the Equivalence Tests for the Questions Used in the Third Experiment

a

Question

X̅ no bp

X̅ bp

Intervala

z1b

z2b

Result

Particle speed State of matter Sample volume Sample pressure Particle distribution

0.919 0.767 0.581 0.523 0.465

0.933 0.787 0.813 0.733 0.640

−0.053, 0.053 −0.084, 0.084   

1.62 1.56   

0.91 0.97   

Not within interval Not within interval Significantly different Significantly different Significantly different

These boundaries represent the standard deviation of two groups combined, multiplied by 0.2. bThe critical score for these values is zα=0.10 = 1.28.



The distributions of responses to question 1 (particle speed) and question 2 (state of matter) were similar and were not found to be statistically significantly different: χ²(1) = 0.13, p = 0.72; and χ²(2) = 0.44, p = 0.81, respectively. The distributions of responses to question 3 (sample volume) and question 4 (sample pressure) were found to be significantly different: χ²(2) = 10.92, p = 0.004; and χ²(2) = 8.40, p = 0.015, respectively. The distributions of responses to question 5 (particle distribution) were not found to be significantly different based on the χ² test of homogeneity (χ²(2) = 5.65, p = 0.059), but the test of the proportion of correct responses (needed to perform the equivalence testing) showed that the proportion of correct responses to question 5 was significantly different (z = 2.22, p = 0.026). Table 7 contains the data from the equivalence tests for questions 1 and 2. On the basis of the χ² homogeneity tests and the equivalence tests, it appears that the sample size (N = 161) was insufficient to determine whether there was a difference between the responses of these two groups of students to these two questions. Compared to students in the control group, instruction focusing on the behaviors and properties of gases in perfectly rigid and nonrigid containers helped students in the experimental group recognize that the volume of the sample would remain the same (58% for control, 81% for experimental), that the sample pressure would decrease (52% for control, 73% for experimental), and that the particles would be distributed throughout the entire volume of the container (47% for control, 64% for experimental).

These results suggest that one reason students may answer the particulate distribution question incorrectly is that they did not consider that this question involves a rigid container; if the container were nonrigid, its overall volume would have decreased and the distribution of particles would look more like the second answer in question 5 (with the walls contracted to a smaller volume along with the gas particles). We have not labeled this as a threat to the validity of the particulate distribution question because it is possible that the authors of this question intended it to measure this difference (i.e., it is not a threat to the validity of the question; it is a measure of the validity of the question). In any event, instruction based on rigid and nonrigid containers appeared to improve students' conceptual understanding of the properties and behaviors of gas particles with respect to the volume, pressure, and particle distribution within a rigid container. As with the results from the second experiment, we believe that the instructional method that focused students' attention on whether the container for a gas sample was rigid or nonrigid may have served as an instructional cue51−53 that assisted students in activating their schema45 concerning the properties and behaviors of ideal gases before answering the questions on sample volume, sample pressure, and particle distribution, ultimately helping them arrive at the correct answers to these three questions.

CONCLUSIONS

Our original hypothesis for the first experiment was that providing students with the boiling point of helium at 3 atm (−268 °C) would help them recognize that helium would remain a gas under these conditions (3 atm and −20 °C), and that this would subsequently affect their answers regarding sample volume and particle distribution. Based on an analysis of student responses, we were unable to identify any significant differences between the responses of students who were and were not given this information. In fact, we were able to show that the proportion of correct responses from the two groups of students was equivalent for the particle speed, sample volume, and sample pressure questions. We believe that this lack of difference in their answers to the sample volume and particle distribution questions occurred because students who were given the boiling point data did not know how to use this information to improve their conceptual understanding of how the properties of a gas sample change upon cooling. In the second experiment, students who were explicitly told that −20 °C was not cold enough to liquefy or solidify the helium sample were more likely than students who were not told this information to recognize that the sample would remain in the gaseous state. However, having this information did not lead to a significant difference in the distributions of students' responses to questions related to particle speed, sample volume, sample pressure, or particle distribution; in fact, the distributions of answers to each of these questions were found to be equivalent.

These results appear to corroborate our interpretation of the first experiment's results: (i) students provided with the boiling point of helium at 3 atm either did not attend to this information or did not know how to use it to predict the correct state of matter present in the container; and (ii) having additional information about the state of matter present in the cooled sample did not affect students' answers to the other conceptual questions. On the basis of the results of the first two experiments, it appears that predicting the wrong state of matter present in the container at −20 °C is not a major factor in explaining why students choose incorrect answers for the particulate distribution question. First, only about 30% of all students who were not given explicit information regarding the state of matter present in the cooled container believed that the sample would liquefy or solidify, while roughly 70% assumed it would remain in the gaseous state. Also, while students who were explicitly told that the sample would not change states of matter were more likely than students who were not told this information to correctly predict that the helium sample would remain a gas at −20 °C, having this information did not lead to a difference in the distribution of answers from these two groups to any of the other questions in this study. As a result, it appears that predicting an incorrect state of matter is not a significant threat to the validity of the particulate multiple-choice question used in this study. Sanger et al.13 described how showing particle motions dramatically improved students' responses to this particulate question and suggested that not showing these motions was a significant threat to the validity of this question. This study extends that work by suggesting that predicting the wrong state of matter for the helium sample in the cooled container did not affect students' responses to the particulate question and therefore did not appear to be a significant threat to its validity. Roughly half of the students in the first two experiments (47−54%) correctly predicted that the particles would be distributed throughout the entire container. This correct response rate is actually quite high for this particulate question (typically reported in the literature at 30−36%). The data presented here may suggest that simply answering the four previous questions regarding particle speed, state of matter, sample volume, and sample pressure significantly improved students' responses to the particulate question compared to students answering the particulate question alone. The third experiment in this study suggests that instruction regarding the properties and behavior of gas particles in perfectly rigid (constant volume) and perfectly nonrigid (constant pressure) containers may help improve students' conceptual understanding of gas properties, which in turn may improve their performance on the particulate gas law question appearing in this study.
We believe that the assessment technique of asking students to answer the four related questions before answering the particle distribution question, and the instructional technique of asking students to consider whether the container holding a gas sample was rigid or nonrigid, served as assessment and instructional cues that helped students activate relevant schema related to the properties and behaviors of ideal gases and answer the questions posed in this study more correctly.
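The χ² tests of homogeneity and two-proportion z tests used throughout these comparisons can be sketched with standard-library code. The contingency table and counts below are invented for illustration and do not reproduce the study's data; the function names are likewise hypothetical (a library such as SciPy would report these statistics directly, with p-values for the χ² test).

```python
from math import erf, sqrt

def chi_square_homogeneity(table):
    """Chi-square test of homogeneity for an r x c contingency table.

    table: one row of response counts per group of students.
    Returns (chi2 statistic, degrees of freedom).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            # expected count under the hypothesis of identical distributions
            exp = row_totals[i] * col_totals[j] / grand
            chi2 += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(col_totals) - 1)
    return chi2, dof

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z test comparing two independent proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For a 2 × c table (two groups, c answer choices), the homogeneity test has c − 1 degrees of freedom, matching the χ²(1) and χ²(2) values reported for the two- and three-choice questions in this study.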



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

Notes

The authors declare no competing financial interest.

REFERENCES

(1) Bunce, D. M. J. Chem. Educ. 2008, 85, 1439.
(2) Bunce, D. M.; Cole, R. S. Using This Book To Find Answers to Chemical Education Questions. In Nuts and Bolts of Chemical Education Research; Bunce, D. M., Cole, R. S., Eds.; ACS Symposium Series 976; American Chemical Society: Washington, DC, 2008; pp 1−10.
(3) Cooper, M. M. Drawing Meaningful Conclusions from Education Experiments. In Nuts and Bolts of Chemical Education Research; Bunce, D. M., Cole, R. S., Eds.; ACS Symposium Series 976; American Chemical Society: Washington, DC, 2008; pp 171−182.
(4) Adams, W. K.; Wieman, C. E.; Perkins, K. K.; Barbera, J. J. Chem. Educ. 2008, 85, 1435−1439.
(5) Lacosta-Gabari, I.; Fernández-Manzanal, R.; Sánchez-González, D. J. Chem. Educ. 2009, 86, 1099−1103.
(6) Lewis, S. E.; Shaw, J. L.; Freeman, K. A. Chem. Educ. Res. Pract. 2011, 12, 158−166.
(7) Stains, M.; Escriu-Sune, M.; de Santizo, M. L. M. A.; Sevian, H. J. Chem. Educ. 2011, 88, 1359−1365.
(8) Knaus, K.; Murphy, K.; Blecking, A.; Holme, T. J. Chem. Educ. 2011, 88, 554−560.
(9) Cooper, M. M.; Underwood, S. M.; Hilley, C. Z. Chem. Educ. Res. Pract. 2012, 13, 195−200.
(10) Christian, B. N.; Yezierski, E. J. Chem. Educ. Res. Pract. 2012, 13, 384−393.
(11) Bauer, C. F.; Cole, R. J. Chem. Educ. 2012, 89, 1104−1108.
(12) Sanger, M. J.; Phelps, A. J. J. Chem. Educ. 2007, 84, 870−874.
(13) Sanger, M. J.; Campbell, E.; Felker, J.; Spencer, C. J. Chem. Educ. 2007, 84, 875−879.
(14) Nurrenbern, S. C.; Pickering, M. J. Chem. Educ. 1987, 64, 508−510.
(15) Gabel, D. L.; Samuel, K. V.; Hunn, D. J. Chem. Educ. 1987, 64, 695−697.
(16) Sawrey, B. A. J. Chem. Educ. 1990, 67, 253−254.
(17) Pickering, M. J. Chem. Educ. 1990, 67, 254−255.
(18) Nakhleh, M. B. J. Chem. Educ. 1993, 70, 52−55.
(19) Nakhleh, M. B.; Mitchell, R. C. J. Chem. Educ. 1993, 70, 190−192.
(20) Zoller, U.; Lubezky, A.; Nakhleh, M. B.; Tessier, B.; Dori, Y. J. J. Chem. Educ. 1995, 72, 987−989.
(21) Smith, K. J.; Metz, P. A. J. Chem. Educ. 1996, 73, 233−235.
(22) Lee, K.-W. L. J. Chem. Educ. 1999, 76, 1008−1012.
(23) Taber, K. S.; Watts, M. Chem. Educ. Res. Pract. 2000, 1, 329−353.
(24) Sanger, M. J. J. Chem. Educ. 2000, 77, 762−766.
(25) Mulford, D. R.; Robinson, W. R. J. Chem. Educ. 2002, 79, 739−744.
(26) Pinarbasi, T.; Canpolat, N. J. Chem. Educ. 2003, 80, 1328−1332.
(27) Sanger, M. J. J. Chem. Educ. 2005, 82, 131−134.
(28) Tasker, R.; Dalton, R. Chem. Educ. Res. Pract. 2006, 7, 141−159.
(29) Onwu, G. O. M.; Randall, E. Chem. Educ. Res. Pract. 2006, 7, 226−239.
(30) Othman, J.; Treagust, D. F.; Chandrasegaran, A. L. Int. J. Sci. Educ. 2008, 30, 1531−1550.
(31) Cracolice, M. S.; Deming, J. C.; Ehlert, B. J. Chem. Educ. 2008, 85, 873−878.
(32) Davidowitz, B.; Chittleborough, G.; Murray, E. Chem. Educ. Res. Pract. 2010, 11, 154−164.
(33) Waner, M. J. J. Chem. Educ. 2010, 87, 924−927.
(34) Nyachwaya, M. J.; Mohamed, A.-R.; Roehrig, G. H.; Wood, B. N.; Kern, L. A.; Schneider, L. J. Chem. Educ. Res. Pract. 2011, 12, 121−132.
(35) Liang, J.-C.; Chou, C.-C.; Chiu, M.-H. Chem. Educ. Res. Pract. 2011, 12, 238−250.
(36) Borg, W. R.; Gall, M. D. Educational Research, 4th ed.; Longman: New York, 1983; pp 280−281.
(37) Thuneberg, E. Helium. http://ltl.tkk.fi/research/theory/helium.html (accessed Apr 2013).
(38) Bodner, G. M. J. Chem. Educ. 1986, 63, 873−878.
(39) Herron, J. D.; Nurrenbern, S. C. J. Chem. Educ. 1999, 76, 1354−1361.
(40) Ferguson, R. L. Constructivism and Social Constructivism. In Theoretical Frameworks for Research in Chemistry/Science Education; Bodner, G. M., Orgill, M., Eds.; Prentice-Hall: Upper Saddle River, NJ, 2007; pp 28−49.
(41) Johnstone, A. H. J. Chem. Educ. 1997, 74, 262−268.
(42) Tsaparlis, G. J. Chem. Educ. 1997, 74, 922−925.
(43) Gabel, D. J. Chem. Educ. 1999, 76, 548−554.
(44) Mayer, R. E. Multimedia Learning; Cambridge University Press: New York, 2001; pp 41−62.
(45) Mayer, R. E.; Wittrock, M. C. Problem Solving. In Handbook of Educational Psychology, 2nd ed.; Alexander, P. A., Winne, P. H., Eds.; Routledge: New York, 2009; pp 287−303.
(46) Johnstone, A. H. J. Chem. Educ. 2010, 87, 22−29.
(47) Mayer, R. E.; Gallini, J. J. Educ. Psych. 1990, 82, 715−727.
(48) Patrick, M. D.; Carter, G.; Wiebe, E. J. Sci. Educ. Technol. 2005, 14, 353−365.
(49) Cook, M.; Wiebe, E.; Carter, G. J. Educ. Multimedia Hypermedia 2011, 20, 21−42.
(50) Lin, L.; Atkinson, R. K. Comp. Educ. 2011, 56, 650−658.
(51) Beck, I. L.; McKeown, M. G.; Sinatra, G. M.; Loxterman, J. A. Read. Res. Q. 1991, 26, 251−276.
(52) Conner, L. N. Res. Sci. Educ. 2007, 37, 1−16.
(53) Crump, M. J. C.; Milliken, B.; Ansari, I. Psicológica 2008, 29, 97−114.
(54) Murphy, K. R.; Davidshofer, C. O. Psychological Testing: Principles and Applications; Prentice Hall: Englewood Cliffs, NJ, 1988; pp 65−67.
(55) Kline, T. J. B. Psychological Testing: A Practical Approach to Design and Evaluation; Sage: Thousand Oaks, CA, 2005; pp 168−171.
(56) Bauer, C. F. J. Chem. Educ. 2005, 82, 1864−1870.
(57) Bauer, C. F. J. Chem. Educ. 2008, 85, 1440−1445.
(58) Cooper, M. M.; Sandi-Urena, S. J. Chem. Educ. 2009, 86, 240−245.
(59) Oloruntegbe, K. O.; Ikpe, A. J. Chem. Educ. 2011, 88, 266−271.
(60) Hinkle, D. E.; Wiersma, W.; Jurs, S. G. Applied Statistics for the Behavioral Sciences, 3rd ed.; Houghton Mifflin: Boston, MA, 1994; pp 542−551.
(61) Lewis, S. E.; Lewis, J. E. J. Chem. Educ. 2005, 82, 1408−1412.
(62) Schuirmann, D. J. J. Pharmacokin. Biopharm. 1987, 657−680.


dx.doi.org/10.1021/ed200809a | J. Chem. Educ. XXXX, XXX, XXX−XXX