Testing Students' Use of the Particulate Theory

Vickie Williamson,* Jason Huffman, and Larry Peck
Department of Chemistry, Texas A&M University, College Station, TX 77846-3255; *[email protected]

Chemistry is based on the premise that matter is particulate in nature. Recent studies have shown that students hold a variety of misunderstandings in chemistry, many of which are related to a failure to acknowledge the particulate nature of matter (1, 2). Various attempts have been made to determine the source of these misunderstandings and to determine what chemistry teachers can do to increase sound scientific understanding (3–6).

To build on this effort, Haidar and Abraham (7) investigated high school students' understanding of the particulate theory of matter and their use of particulate terminology. Six concepts were tested: dissolution, insolubility, saturation, diffusion, states of matter, and effusion. The Physical Changes Concepts Test (PCCT) was administered in two forms, an applied version and a theoretical version, to determine whether students understood the concepts well enough to apply them to everyday-life examples and whether particulate terminology was used in their responses. The applied version tested student knowledge by asking questions in everyday language, and the theoretical version tested student knowledge using scientific language. The Test of Logical Thinking (TOLT) (8) was administered to assess students' reasoning ability. The results showed that over 40% of the students held alternate conceptions (ACs) to the scientifically accepted views about the concepts tested and that their ACs were related to their formal reasoning abilities and preexisting notions about particulate theory. There was a significant difference between students' applied and theoretical knowledge, which suggests compartmentalization of knowledge within the students' mental structures.

In addition to student understanding, Haidar and Abraham found that for questions using everyday language (applied form), students tended to answer in everyday and macroscopic terms, whereas for questions using scientific language (theoretical form), students tended to explain or reason in scientific and microscopic (particulate theory) terms. This selective use of the particulate theory in students' responses was additional evidence for compartmentalization. Haidar and Abraham's findings are corroborated by Lewis and Linn (9), who also found that students have separate school (scientific) and everyday understandings and vocabulary. Lewis and Linn found that giving attention to students' intuitive conceptions and giving students the opportunity to integrate their everyday-world experiences with scientific examples and explanations helped to prevent compartmentalization. Similar findings and recommendations were reported by Reif and Larkin (6). Nicoll (10) found that even chemistry seniors have not integrated scientific and popular definitions of "a chemical". A case study by Garnett and Treagust (11) illustrated that high school chemistry students often overgeneralize and apply scientific definitions too literally. In some cases, students confuse the words used by the teacher or read in textbooks because of this compartmentalization of everyday and scientific meanings.

The form of a question has been found to elicit different responses. For example, the terminology within a question has been shown to affect how students respond to proportional questions (12), to analytical chemistry questions (13), and to requests for definitions of science terminology (14–16). Chemical education researchers and chemistry instructors use questions to assess a student's understanding of chemistry; therefore, it is important to investigate how the phrasing of a question may influence the response. While Haidar and Abraham investigated questions with either everyday or scientific wording, we investigate whether questions using a combination of everyday and scientific language stimulate students to explain phenomena in particulate terms.

Purpose

The goal of this study was to investigate at what point students are prompted to answer questions in particulate terms, as described by Haidar and Abraham (7), by including an increasing number of scientific terms in a set of questions based on the PCCT. Specifically, the questions addressed are:

1. Does the question format prompt an everyday or a scientific response?
2. At what point, if at any point, are students cued by the content of the question to answer in scientific (particulate) terms?
3. Do the data collected from this study show a correlation between reasoning ability and particulate theory responses similar to the results of the Haidar and Abraham study?

Methods

The same six concepts of the PCCT used by Haidar and Abraham were used in this study. The six questions for each concept included the original two questions from the PCCT and four new questions with progressively more scientific content, proceeding from the everyday question format to the scientific question format used in the PCCT. (The six questions addressing each concept are given in the Appendix in the Supplemental Material.) In each series of questions, question A was analogous to Haidar's applied-type question; question B used the word "particle(s)"; question C used the words "atom(s)" or "molecule(s)"; question D was like C but included more scientific apparatus and more scientific detail; question E relied more heavily on the terminology of "atom(s)" or "molecule(s)" and depicted particles in a process; and question F was analogous to Haidar's theoretical-type question. The set of six questions for the dissolution concept is shown in Figure 1.


1A. A spoonful of sugar is added to a clear plastic cup half filled with water. The sugar and the water are stirred with a spoon. The sugar dissolves in water. Briefly explain what happens when sugar dissolves in water.

1B. A tablespoonful of sugar is added to a clear plastic cup half filled with water. The sugar and the water are stirred and the sugar particles dissolve in the water. Briefly explain what happens when sugar particles dissolve in water.

1C. A tablespoonful of sugar molecules is added to a clear plastic cup half filled with water. The sugar and the water are stirred and the sugar molecules dissolve in the water. Briefly explain what happens when sugar molecules dissolve in water.

1D. 20.0 g of sugar molecules are added to 125 mL of distilled water in a 250-mL beaker. The sugar is stirred in the water and the sugar molecules dissolve in the water. Briefly explain what happens when sugar molecules dissolve in the water.

1E. The 250-mL beakers below, filled with 125 mL of water, will be used to represent the dissolving of sugar in water. Solid circles represent sugar molecules and open circles represent water molecules. If 4 molecules of sugar (solid state) are in beaker 1 and 12 molecules of water are in beaker 2, which drawing best represents the product of this process in beaker 3? Very briefly explain (in 5 lines or less) your answer by discussing the relationship between the molecules in each of the three beakers. The explanation is as important as the drawings.

[Answer choices (A)–(E): each drawing shows Beaker 1, Beaker 2, and Beaker 3 with a possible arrangement of sugar and water molecules.]

1F. The following models will represent the process of dissolving sugar in water. Solid circles represent sugar molecules and open circles represent water molecules. If 4 molecules of sugar (solid state) are in Box 1 and 12 molecules of water (liquid state) are in Box 2, select the drawing that best represents the product of this process in Box 3. Very briefly explain (in 5 lines or less) your selection by discussing the relationship between the molecules in each of the three boxes. The explanation is as important as the drawings.

[Answer choices (A)–(E): each drawing shows Box 1, Box 2, and Box 3 with a possible arrangement of sugar and water molecules.]

Figure 1. Questions 1A–1F for the dissolution concept.


Two experts agreed that the progression from A to E moved from Haidar's everyday question to his scientific question. Question A was not considered to cue the student to use the particulate theory. All other questions were assumed to cue to some extent owing to the inclusion of various levels of scientific language.

The arrangement of the six test forms (i–vi) that were administered online is shown in Table 1. For example, question 2B concerned insolubility and contained the word "particles". Each subject received a random series of six questions (one of the forms in Table 1) via an interactive Internet Web site, with one question from each concept. The series proceeded from everyday (question A) to scientific (question F) content, with rotating concepts to prevent cueing from a previous question. Subjects were also prevented from returning to a previous question. The responses to the questions were sent to an online database to be analyzed. A mistake made in transferring the questions to the online test Web site caused questions 3E and 4F to be repeated; these appear in forms ii and iv and are marked in Table 1. As a result, form ii presented questions 3D and 3E in sequence, but this form ultimately showed no real difference from the other forms. Form iv contained questions 4A and 4F, but these were not given consecutively.

The TOLT was administered to determine the students' level of reasoning (8). The TOLT examines students' performance with respect to proportional reasoning, controlling variables, probabilistic reasoning, correlational reasoning, and combinatorial reasoning. It is scored from 0 to 10, with one point given if both the answer selection and the explanation of the selection are correct. The students answered the six questions in their test form and then took the TOLT; they were not permitted to view the TOLT until the six questions had been completed. The TOLT was chosen because it measures the same variables as the Group Assessment of Logical Thinking (GALT) (17), with the exception of the conservation items, and the two instruments have a number of common items. With this population, the shorter TOLT is the better choice because few if any students would be predicted to lack conservation of matter or conservation of volume. Our data support this choice, as shown by the mean TOLT score discussed in the Results. Haidar and Abraham also used the TOLT.

Table 1. Order of Questions for Each Online Test Form

Online Form    Questions(a)
i              1A   2B   3C   4D   5E      6F
ii             6A   1B   2C   3D   3E(b)   5F
iii            5A   6B   1C   2D   3E      4F
iv             4A   5B   6C   1D   2E      4F(b)
v              3A   4B   5C   6D   1E      2F
vi             2A   3B   4C   5D   6E      1F

(a) Numbers refer to the concept (1, dissolution; 2, insolubility; 3, saturation; 4, diffusion; 5, states of matter; 6, effusion) and letters refer to the degree of scientific content in the questions (see text).
(b) Inadvertently repeated concept question.
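The rotation in Table 1 is simply a cyclic shift of the concept number as the question letter advances. A minimal sketch of how such forms could be generated and randomly assigned is given below; the function and variable names are illustrative assumptions, not part of the study's actual software, and the sketch produces the intended rotation rather than the two inadvertently repeated questions noted above.

```python
import random

LETTERS = "ABCDEF"  # everyday (A) -> scientific (F)
# Concepts: 1 dissolution, 2 insolubility, 3 saturation,
#           4 diffusion, 5 states of matter, 6 effusion

def build_form(start_concept: int) -> list[str]:
    """Build one test form: the concept number advances by one
    (wrapping 6 -> 1) each time the question letter advances."""
    form = []
    concept = start_concept
    for letter in LETTERS:
        form.append(f"{concept}{letter}")
        concept = concept % 6 + 1  # rotate concepts to avoid cueing
    return form

# Forms i-vi start with concepts 1, 6, 5, 4, 3, 2, respectively (Table 1).
forms = {roman: build_form(start)
         for roman, start in zip(["i", "ii", "iii", "iv", "v", "vi"],
                                 [1, 6, 5, 4, 3, 2])}

# A subject logging on would be assigned one form at random.
assigned = random.choice(list(forms))
print(assigned, forms[assigned])  # e.g., iii ['5A', '6B', '1C', '2D', '3E', '4F']
```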

Subjects

The study included approximately 1066 students enrolled in freshman chemistry for science majors, of whom approximately 400 were first-semester students. Subjects had completed about two-thirds of either the first or the second semester of general chemistry. All of the concepts used are usually taught prior to college; any direct instruction on these topics occurs at the beginning of first-semester general chemistry. The first- and second-semester groups were not separated because there were no significant differences in their responses. Minimal course credit was given for participating in the study. Certain biographical information was obtained from the students at the time they logged onto the test site: name, identification number, birth date, country of origin, current GPA, and the number and level of mathematics and chemistry courses taken or currently enrolled in. Between 157 and 173 subjects took each of the six forms of the test.

Scoring

The students' answers to each question were categorized with respect to their particulate content and scored according to the categories shown in Table 2. These categories, which were established by Haidar and Abraham, are no response (NR), general (G), particulate general (PG), particulate specific (PS), and particulate specific correct (PSC).

Table 2. Scoring Criteria Used by Haidar and Abraham

Degree of Understanding              Criteria for Scoring                                                                             Score
No Response (NR)                     Blank; "I do not know."; "I do not understand."                                                    0
General (G)                          Responses without any particulate terms                                                            1
Particulate General (PG)             Responses using particulate terms other than "atoms" or "molecules" (e.g., particles or grains)    2
Particulate Specific (PS)            Responses that use the terms "atoms" or "molecules" but do not match the scientific conception     3
Particulate Specific Correct (PSC)   Responses that use the terms "atoms" or "molecules" and match the scientific conception            4


Figure 2. Use of particulate theory in total battery of all six concepts. NR: no response; G: general; PG: particulate general; PS: particulate specific; PSC: particulate specific correct.

Each of the categories was given a numerical score for use in statistical calculations, as in Haidar and Abraham (7). PS responses were then separated from PSC responses by reading through the responses for correctness. Haidar and Abraham previously validated the correct scientific responses to the six concept questions. (See the correct-responses section in the Supplemental Material. The validated response for question 1, dissolution, is, "Sugar dissolves in water because sugar molecules interact with water molecules. This interaction is stronger than the interaction among sugar molecules.") An independent evaluator scored randomly selected responses with an inter-rater reliability of 90.0%.
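For the statistical analysis, each categorized response simply becomes the integer in Table 2. A minimal sketch of that mapping, together with a naive keyword screen of the kind a rater might use as a first pass, is shown below; the actual categorization in this study was done by human raters, and the keyword lists here are illustrative assumptions, not the authors' rubric.

```python
# Score values follow Table 2 (Haidar and Abraham's categories).
CATEGORY_SCORE = {"NR": 0, "G": 1, "PG": 2, "PS": 3, "PSC": 4}

# Illustrative keyword lists only; real scoring used human judgment,
# and PS vs PSC was decided by reading each response for correctness.
SPECIFIC_TERMS = ("atom", "molecule")
GENERAL_PARTICULATE_TERMS = ("particle", "grain")

def first_pass_category(response: str) -> str:
    """Rough first-pass category for a free-text response."""
    text = response.strip().lower()
    if not text or text in ("i do not know.", "i do not understand."):
        return "NR"
    if any(term in text for term in SPECIFIC_TERMS):
        return "PS"  # a rater would upgrade to "PSC" if scientifically correct
    if any(term in text for term in GENERAL_PARTICULATE_TERMS):
        return "PG"
    return "G"

response = "The sugar molecules spread out evenly among the water molecules."
category = first_pass_category(response)
print(category, CATEGORY_SCORE[category])  # PS 3
```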

Results

TOLT

The average TOLT score was 6.57 (SD 2.52) out of a possible 10, which is similar to values found by other researchers for general chemistry (3). The correlation between the TOLT score and the use of particulate theory for all questions was slight, explaining less than 7% of the variance at best. Some of the correlations between the TOLT and the A–F forms of each of the six questions were significant, but no pattern was established; specifically, a small but significant correlation occurred between the TOLT and questions 1B, 1C, 2C, 3A, 4A, 4C, 5A, 5B, and 6A. This variance is less than that of the moderate but significant correlation found by Haidar and Abraham (7), which accounted for 2–13% of the variance on the different forms and concepts of their PCCT.
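The "variance explained" figures above are the squared correlation coefficients. A minimal sketch of that calculation is shown below; the arrays are made-up placeholders standing in for the real paired TOLT and particulate-use scores.

```python
from scipy.stats import pearsonr

# Placeholder data only; the real study paired each student's TOLT score
# (0-10) with his or her particulate-use score (0-4) on a given question.
tolt_scores = [4, 7, 9, 6, 8, 5, 10, 3, 7, 6]
particulate_scores = [1, 3, 4, 2, 3, 1, 4, 1, 2, 3]

r, p_value = pearsonr(tolt_scores, particulate_scores)
variance_explained = r ** 2  # e.g., r = 0.26 explains about 7% of the variance

print(f"r = {r:.2f}, p = {p_value:.3f}, variance explained = {variance_explained:.1%}")
```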




Complete Battery

The combined percentage data from the question series for all concepts are shown in Figure 2 to illustrate the general trends; analyses of the individual concepts can be found in the Supplemental Material. The wording of the questions stimulated a specific type of response from students. The general (G) responses were highest for the everyday-worded question A (56.58%) and lowest for the scientifically worded questions (C–F). Note how particulate general (PG) responses were stimulated by the B question, which used the word "particles": the PG maximum (25.58%) occurs at question B, where the particulate specific (PS) minimum also occurs (29.19%). The PS responses are significantly higher at question C, the first question to introduce the terms atoms or molecules. In fact, whatever terms (PG or PS) are used in a question immediately stimulate the student to respond using the same terms; it can be surmised that the students mimicked the questions, responding with the same terminology that was used in the question. For questions C–F there tended to be a continued increase in PS responses cued by the questions. The PS responses were lowest for the everyday questions (A) and highest for the scientific questions (C–F).

The C and D questions in the battery provided a higher percentage of particulate specific correct (PSC) responses than did the other questions of the various series; they also showed a comparatively high percentage of PS responses. Specifically, the students gave the highest percentage of PSC responses to questions 1B (dissolution of sugar), 2C (insolubility), 3C (saturation), 4A (diffusion), 5C (states of matter), and 6D (effusion). This may indicate that questions similar in format to question C in each series could be useful for consistently eliciting both PS and PSC responses from students. The general (G) responses decrease as the PS responses increase between questions A and C–D. Notably, however, some students continued to respond in general terms even when the question used the terms atoms or molecules. Haidar and Abraham (7) likewise found that about 23% of the students responded in general terms even when the question was posed in the theoretical form using particulate terminology. This supports the idea that while most students are cued by the content of the question, a certain percentage of students are not.

The ANOVA (analysis of variance) results show that there is a significant difference at the 95% confidence level in the use of the particulate theory across this series of questions (p = 0.0001). The Games–Howell post-hoc test was chosen because it is resistant to differences in cell sizes; the Games–Howell analysis is an option in common statistics programs such as StatView or SAS (18). ANOVA and post-hoc results showing significant differences are found in Table 3. (Individual results for each concept are given in the Supplemental Material.) The A questions that used everyday language were significantly different from the C, D, E, and F questions that increasingly used scientific language and the particulate terms atoms or molecules. Additionally, the B questions that used particulate general terms (particles) were significantly different from the C, D, E, and F questions. The A and B questions were not significantly different from each other. There was also a difference between the D questions and the F questions: students gave a lower percentage of PS and PSC responses on F than on D. One reason for this is that questions E and F required students to make a choice and give an explanation, and a problem with the online administration was that students often justified their choice without explaining it in terms of molecular interactions; thus they did not use particulate terminology or scientifically correct particulate explanations. For example, a response to question 1E on dissolution stating that "the top option contained molecules with the correct spacing" uses particulate terms but does not give a scientifically correct explanation as described by Haidar and Abraham (7).



Table 3. Significant Differences for Questions A–F of All Six Concepts

Questions Compared    Difference(a)    Critical Difference(b)    Significant
A vs B                0.0963           0.1206                    no
A vs C                0.4076           0.1269                    yes
A vs D                0.5263           0.1243                    yes
A vs E                0.4870           0.1230                    yes
A vs F                0.3941           0.1251                    yes
B vs C                0.3113           0.1221                    yes
B vs D                0.4300           0.1194                    yes
B vs E                0.3907           0.1180                    yes
B vs F                0.2978           0.1202                    yes
C vs D                0.1187           0.1257                    no
C vs E                0.0794           0.1244                    no
C vs F                0.0135           0.1264                    no
D vs E                0.0393           0.1218                    no
D vs F                0.1322           0.1239                    yes
E vs F                0.0929           0.1225                    no

(a) Difference is the difference between the means of the two types of questions.
(b) Critical Difference is the difference required between the means to make the assumption that the questions are significantly different from each other, with a 95% confidence level.
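An analysis like the one summarized in Table 3 can be reproduced with current open-source statistics packages as well as with StatView or SAS. The sketch below assumes the scored responses are in a long-format table with one row per response; the file name, column names, and the use of the pingouin package are illustrative assumptions, not the software actually used in the study.

```python
import pandas as pd
from scipy.stats import f_oneway
import pingouin as pg  # provides a Games-Howell post-hoc test

# Assumed long format: one row per response, with the question letter (A-F)
# and the 0-4 particulate-use score from Table 2.
df = pd.read_csv("responses.csv")  # hypothetical file with columns: letter, score

# One-way ANOVA across the six question types (A-F).
groups = [grp["score"].values for _, grp in df.groupby("letter")]
f_stat, p_value = f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Games-Howell post-hoc comparisons, chosen here (as in the study) because
# the test tolerates unequal cell sizes across the question types.
posthoc = pg.pairwise_gameshowell(data=df, dv="score", between="letter")
print(posthoc)
```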

Conclusions

The supposition for the study was that the more scientific the wording used in a question, the more the student would be cued to answer in the same format in which the question was worded. The results of this study indicate that students are cued immediately when particulate theory words such as atoms or molecules are used in the questions. If a question contains even minimal particulate terminology, students will tend to answer in particulate terms. However, this does not mean that the response is correct, as indicated by the overall low percentages of PSC responses. None of the questions elicited responses with particulate terminology unless such terminology was used in the question. As Haidar and Abraham (7) suggested, this shows that students seem to compartmentalize information about chemistry. As they also found, formal reasoning ability, while associated, did not seem to play a large role in the students' use of the particulate theory. Because chemistry requires thought at the microscopic level and students tend toward macroscopic explanations unless cued, efforts should be made to integrate everyday and scientific thought so that students will think spontaneously in microscopic terms and understand why and when to use particulate theory.

Implications for Instructors

This study does not indicate how to eliminate compartmentalization, but it does add to the understanding of how questioning affects students' use of the particulate theory. Students seem to be immediately cued by the words in the questions; it is therefore not necessary to ask very abstract scientific questions to elicit particulate theory responses. Particulate terminology usage will be about the same whether the question simply mentions atoms or molecules or is elaborate and contains more scientific terms. Instructors need only ask about everyday phenomena using the terms atoms or molecules, as in the C questions, to procure responses in particulate terms (PS); these questions also appear to yield the highest percentage of scientifically correct (PSC) responses without much scientific elaboration. This could help in developing sounder testing methods, tailoring questions to elicit the desired responses. The series of questions in their different forms could be used as an evaluation tool to determine which students have integrated everyday and scientific understanding, and to test at which point, if at any point, a student will be cued to respond using particulate theory and give the scientifically correct response. This could aid instructors in determining how to guide students to become more expert-like and to limit or alleviate compartmentalization in their understanding of everyday phenomena.

Future Research

This study was conducted in hopes of building upon previous research and adding to the body of knowledge about how students respond to certain question types and how they use the particulate theory in their responses. In future studies it would be useful to correlate students' level of understanding of these six concepts with their use of the particulate theory. Additional research is needed to give insight into how to increase the number of scientifically correct responses. The connections between compartmentalization, level of understanding, and reasoning ability should also be explored. Studies are needed that lead to a further understanding of what factors contribute to students' compartmentalization, to aid them in linking everyday and scientific knowledge. These factors may help explain the small number of students who responded with scientifically correct particulate answers (PSC). Helping students scientifically understand phenomena at the particulate level is the goal of most instructors.




Acknowledgments

We would like to thank Larry Brown and Justin Graham for providing expert computer support for development of the data collection Web site and for server space.

Supplemental Material

The correct responses to the six concept questions validated by Haidar and Abraham, detailed discussion of the responses by the students to the concept questions, and the six questions for each concept are available in this issue of JCE Online.


Literature Cited

1. Abraham, M. R.; Williamson, V. M.; Westbrook, S. L. J. Res. Sci. Teach. 1994, 31, 147–165.
2. de Vos, W.; Verdonk, A. H. J. Chem. Educ. 1987, 64, 692–694.
3. Williamson, V. M.; Abraham, M. R. J. Res. Sci. Teach. 1995, 35, 512–534.
4. Gabel, D. L.; Bunce, D. M. Proceedings of the 64th Annual NARST Conference; Lake Geneva, Wisconsin, 1991.
5. Strike, K. A.; Posner, G. J. In Philosophy of Science, Cognitive Psychology, and Educational Theory and Practice; Duschl, R. A., Hamilton, R. J., Eds.; State University of New York Press: Albany, NY, 1992; pp 147–176.
6. Reif, F.; Larkin, J. H. J. Res. Sci. Teach. 1991, 28, 733–760.
7. Haidar, A. H.; Abraham, M. R. J. Res. Sci. Teach. 1991, 28, 919–938.




8. Tobin, K.; Capie, W. Educ. Psychol. Meas. 1981, 41, 413–423.
9. Lewis, E. L.; Linn, M. C. J. Res. Sci. Teach. 1994, 31, 657–677.
10. Nicoll, G. J. Coll. Sci. Teach. 1999, 28, 382–387.
11. Garnett, P. J.; Treagust, D. F. J. Res. Sci. Teach. 1992, 29, 1079–1099.
12. Johnson-Laird, P. N. Mental Models: Towards a Cognitive Science of Language, Inference, and Consciousness; Cambridge University Press: Cambridge, U.K., 1983.
13. Mellon, M. G. J. Chem. Educ. 1987, 64, 735–739.
14. Pushkin, D. B. J. Res. Sci. Teach. 1997, 34, 661–668.
15. Lynch, P. P.; Chipman, H. H.; Pachaury, A. C. J. Chem. Educ. 1985, 22, 7675–7686.
16. Wolfe, R.; Lopez, A. J. Reading 1993, 36, 315–317.
17. Bunce, D. M.; Hutchinson, K. D. J. Chem. Educ. 1993, 70, 183–187.
18. StatView, vers. 5.0.1; SAS Institute, 2000–2001.
