Getting the Argument Started: A Variation on the Density Investigation

Joi P. Walker* and Steven F. Wolf

Department of Chemistry, East Carolina University, Greenville, North Carolina 27858, United States
Department of Physics, East Carolina University, Greenville, North Carolina 27858, United States
ABSTRACT: The ability to "engage in argument from evidence" is one of the eight practices identified in the Next Generation Science Standards as well as an emerging focus of undergraduate chemistry curricula. Guiding students to make evidence-based claims that engender argumentation will require faculty to revise conventional expository laboratory experiments. The type of data collected and the method of collecting the data, as well as the question to be answered, must be carefully considered. This paper describes a simple yet effective change to a standard laboratory, the determination of density, which we have used since 2006 to engage students in argumentation about both method and outcomes.

KEYWORDS: First-Year Undergraduate/General, Inquiry-Based/Discovery Learning
INTRODUCTION

Teaching students to engage in and practice scientific argumentation has taken center stage with the progressive adoption of the Next Generation Science Standards (NGSS).1 While specific pedagogical approaches such as Argument-Driven Inquiry (ADI),2,3 Claim, Evidence, Reasoning, and Rebuttal (CER),4 and the Science Writing Heuristic (SWH)5 have been implemented in both high school and introductory college chemistry classes, scientific argumentation remains a difficult practice for teachers to teach and for students to engage in.

The science laboratory is an ideal setting for students to be given the opportunity to struggle with the practices of science: asking questions, developing models, analyzing data, constructing explanations, problem solving, engaging in argument from evidence, and communication. While "good" results may be the goal of research, learning to "do" science involves more than getting the right answer; it also involves understanding the unexpected answer. In designing laboratory experiences, educators need to allow students to stumble and struggle a bit if they want to create a context where students can engage in science practices.

Laboratory courses are ubiquitous in the undergraduate chemistry curriculum and across the sciences. The experiments often employ a prescriptive approach that provides students an opportunity to learn how to collect data and use it to draw conclusions, while at the same time ensuring that students know what data to collect, how to collect it, and how to make sense of the data. Rather than designing investigations, students follow detailed procedures. Instead of analyzing and interpreting data, students have tests, calculations, and analysis methods clearly laid out (e.g., repeat 10 times, subtract value 2 from value 1, report the average and the standard deviation). Rather than engaging in scientific argumentation and communicating ideas with others, students copy and paste their results into a lab report that is dispiriting to both create and grade. In order to prepare students to actually "do" chemistry or achieve a broader goal of science literacy, we need to change our approach as faculty.
Students, for example, need to learn how chemists design investigations, analyze and interpret data, engage in argumentation, and communicate their ideas to others, what one might refer to as "chemistry know-how"6 or a "grasp of practice".7 In other words, students should be given opportunities to participate in the practices of science during their laboratory courses so they will be able to develop and refine new ideas and products without being told what to do and how to do it. These same principles are manifest in the New Framework for K−12 Science Education,1 which calls for all students to learn science and engineering practices. The term science practice is used rather than "science processes" or "inquiry skills" in this framework to emphasize that engaging in scientific investigation requires not only skill but also knowledge that is specific to each practice.

This paper presents a common laboratory experiment that was adapted in order to promote production of evidence-based claims and group argumentation (see the student handout in the Supporting Information). The investigation was developed for a laboratory course taught using Argument-Driven Inquiry (ADI), an instructional model designed to give a more central place to the role of argumentation in the social construction of scientific knowledge while promoting inquiry.2,3,8 The ADI approach consists of eight stages:

1. identification of the task and a guiding question by the instructor;
2. students work in groups to design and implement a method to collect the data needed to answer the guiding question;
3. students analyze their data and develop a tentative argument (a claim supported by evidence and a justification of the evidence);
4. students share their arguments and critique the arguments of their peers during an argumentation session;
5. the instructor leads an explicit and reflective discussion about the content and the nature of scientific inquiry;
6. each student writes an investigation report;
7. the reports go through a double-blind group review;
8. students are given an opportunity to revise and submit their report to the instructor for evaluation.

This process is modeled on the scientific discovery and peer-review process. Just as professional scientists discover truths about the physical world by designing experiments and analyzing experimental results, so do our students. While the density investigation described herein was developed for use with a specific pedagogical approach, we provide this example to illustrate how the broader goal of engaging students in argumentation can be realized.
DENSITY INVESTIGATION

There are multiple examples of reformed laboratory activities focused on the density theme. The primary goal is identifying an unknown material, either solid or liquid, by measuring its density and comparing it to a list of known materials. Samsa9 described an investigation based on the story of Archimedes that required students to use water overflow to determine the density of a rubber stopper that would not fit in a 10 mL graduated cylinder. Prilliman10 described an inquiry-based density experiment with understanding experimental error as the primary objective. While these experiments are useful for understanding density as a physical property and perhaps the practical utility of density, they were not designed to engage students in scientific practice.

We were interested in getting students to explore how experimental methods might influence results. We achieved this by making two changes to traditional implementations. First, we changed the focus from "identify" to "distinguish". Student groups were provided with three objects (commercially available for density experiments from many sources, e.g., Flinn Scientific Inc.11,12) and asked, "Are these objects made of the same material?" In addition, some groups received an object with a density less than 1.0 g/mL. This presented the groups with a particular challenge when trying to find volume by water displacement since the object floated.

The second variation was introduced by using large cubes that would not fit in a typical graduated cylinder and providing an overflow or spill can (Figure 1) for volume displacement. No instructions were provided on using the spill can; it was simply added to the list of materials, and students were given the responsibility of figuring out how it might be used to measure an object's volume. Students were also provided with calipers so that they would have a choice to make regarding how the objects' volumes would be determined.

We would also like to point out that at no point did we consider giving students alternate means of measuring the objects' masses. Most students are quite familiar with making mass measurements from their experience in middle and high school laboratories. Moreover, even a device like a triple beam balance can measure to a fraction of a tenth of a gram, so the only discussions that would ensue would (at best) be about how to use a piece of technology that has not been used in modern chemistry laboratories for decades. Furthermore, since we already know that the volume measurement is the largest source of uncertainty, that is where we decided student effort would be most meaningfully and productively focused.
Figure 1. Overflow or spill can.
The role of the instructor, as with any inquiry-based curriculum, is less directive and more Socratic (see the instructor notes in the Supporting Information). The instructor should begin the lab by identifying the guiding question ("Are these objects made of the same material?") and leading a discussion of how the materials provided might be used to answer the question. The instructor should emphasize that it is good scientific practice to determine values in more than one way, so that students develop two methods (volume displacement and calculation from caliper measurements) for determining the densities of their samples. This point is noted in the Getting Started section of the student handout.

Once the students begin work, they will question how to find the displacement volume of the large cube, which is when the instructor may point out the spill can. The instructor should observe how the students use the spill can and make suggestions as needed. Students eventually work out that they need to fill the can to overflowing, let the water stop dripping, add the object, and collect the water that "spills" to find the object's volume. While we might not want to openly state that a lab can be rollicking fun, the spill-can dilemma can serve as an icebreaker for the lab experience and for group learning in general. Once students have mass and volume data, they move on to argument construction, and most groups arrive at the conclusion that two of the objects are made of the same material.
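For instructors who want to lay out the arithmetic behind the two-method comparison, a minimal Python sketch is given below. The masses, dimensions, and variable names are hypothetical illustrations, not values from the student handout or from our sections; the only facts assumed are the definition of density and the equivalence 1 cm^3 = 1 mL.

# Illustrative measurements for one cube (hypothetical values)
cube_mass = 46.3          # g, from the balance
cube_side = 2.54          # cm, edge length from the calipers
cube_spill_volume = 15.8  # mL, water collected from the spill can

# Method 1: volume from caliper geometry (for a cube, V = s**3; 1 cm^3 = 1 mL)
volume_calipers = cube_side ** 3

# Method 2: volume from water displacement with the spill can
volume_displacement = cube_spill_volume

density_method1 = cube_mass / volume_calipers
density_method2 = cube_mass / volume_displacement

print(f"Density from caliper volume:      {density_method1:.2f} g/mL")
print(f"Density from displacement volume: {density_method2:.2f} g/mL")

Because the two routes should converge on the same value, any disagreement between them becomes a natural starting point for the argumentation session.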
HAZARDS

The use of glassware means that protective eyewear is required at all times. Adding the heavier metal pieces to the graduated cylinder must be done carefully to avoid cracking or breaking the glass.
STUDENT PRODUCTS

The students in ADI laboratory courses develop their argument in preparation for a presentation to their peers at the end of the lab. Figure 2 is an example of a whiteboard that presents the claim that two of the three objects have similar enough density values to support the conclusion that they are made of the same substance and are distinct from the third object. This board provides density values determined using calipers (Method 1) and displacement (Method 2) to find the volume. In this instance, the data are complementary, so although the lab investigation was not designed to consistently produce "good" data, it can. During the peer-to-peer argumentation sessions, students are exposed to the nature of science in the variety of methods, the influence of humans on data, the variability of empirical data, and the need for revision in light of new evidence.
Figure 2. Student whiteboard (left) and corresponding dialogue (right).
Following the lab investigation and argumentation session, students write individual lab reports that answer three questions: (1) What are you trying to do and why? (2) How did you go about your work and why? (3) What is your argument? Below are exemplars of student responses to questions 2 and 3, which illustrate the challenges they encountered with the floating object as well as the large cube that required the spill can. Some groups used a beaker rather than the spill can, resulting in even less precision. It is important to note that although we intentionally set obstacles to finding volume by water displacement, the volume could be obtained with a fair degree of precision using the calipers. Some groups would decide that the volume-by-displacement data were unreliable and support their claim with values obtained using direct measurement with the calipers.
How did you go about your work and why?
The water technique was not as simple for all the objects. When the red cylinder was placed in the graduated cylinder it floated as opposed to the purple cylinder which sank. To submerse the red cylinder, we used a small measuring spatula to press the object just below the surface until the meniscus could be clearly read. This observation lead us to hypothesize that if the red cylinder floated and others sank, they could not all possibly be made of the same substance. The irregular shape of the cube made it somewhat difficult to calculate its volume through displacement. The approach used to determine the volume was by use of a spill-can. The spill-can was filled with water until water began to flow out of the spout. When the water stopped flowing from the spout, the spill-can had then reached its maximum capacity and the cube was ready to be submerged into the water. A 50 mL graduated cylinder was placed underneath the spout to collect the water after the object was submerged. The cube was lowered into the water and the displaced water was caught in the graduated cylinder and measured giving the volume of the cube.
What is your argument?

Our measurements for volume and density after using the water [displacement] method came out to be completely different than the measurement we calculated using the formula. This could partly be because when measuring the amount of water spilt out of the spill-can we then poured that into the graduated cylinder which could have caused a loss of displaced water. It would be safe to say that our water method for the cube is not a reliable source of data. While the blue and black cylinders were very close, their densities were just too different for the material to be the same. The cube didn't look the same and it definitely did not have a density close to either cylinder, although it is difficult to say exactly because of the displacement error in our findings.
Both of these passages demonstrate that students were able to be skeptics when it came to accepting the validity of their volume measurements. Spill cans are notoriously imprecise, and these students noticed that in their measurements both within and between groups. Floating objects presented a different challenge that students handled in two different ways. Either students simply determined that the portion of the object not submerged could be included as measurement error, or they pushed the object under the water using a paper clip, another standard technique. A particularly creative group used the heavy cube to submerge the floating cylinder in the spill can, obtaining a combined volume from which they subtracted the volume of the heavy cube. The group felt that pushing the floating cylinder down with a spatula introduced error.
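The bookkeeping behind that combined-displacement approach amounts to a single subtraction. The short sketch below uses made-up numbers chosen only to illustrate the arithmetic; they are not measurements from our students.

# Illustrative values only (volumes in mL, mass in g)
combined_spill_volume = 31.5   # heavy cube and floating cylinder submerged together
cube_spill_volume = 16.2       # heavy cube submerged alone
cylinder_mass = 12.7

cylinder_volume = combined_spill_volume - cube_spill_volume   # volume of the floating cylinder
cylinder_density = cylinder_mass / cylinder_volume            # about 0.83 g/mL, below 1.0 g/mL, consistent with floating

print(f"Floating cylinder: V = {cylinder_volume:.1f} mL, density = {cylinder_density:.2f} g/mL")

A result below 1.0 g/mL gives students a simple consistency check against their observation that the object floats.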
The sets of three objects all have two objects that are made of the same material. There are also paired sets within the lab so that students have a peer group with which to make a direct comparison. The majority of groups conclude that two objects are made of the same material. However, some groups will claim that all three objects are different on the basis of their data. Sometimes these claims are supported by the data, but an unexpected point of argument was what constitutes a "real" difference in density. For example, 11.26 g/mL and 11.21 g/mL agree within the estimated digit, but some groups argued that these were different "enough". The argumentation session, a peer-to-peer presentation and discussion, is instrumental in resolving this issue. Each set of objects is duplicated within the lab, so groups are able to directly compare multiple measurements and claims. Following the argumentation session, students have the opportunity to revise their claim or repeat measurements.
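Whether two densities such as 11.26 and 11.21 g/mL are "really" different is ultimately a question about measurement uncertainty. One way an instructor might frame the comparison, using standard quadrature propagation for a quotient, is sketched below; the masses, volumes, and instrument uncertainties are assumed for illustration and are not prescribed in the handout or drawn from our sections.

import math

def density_with_uncertainty(mass, d_mass, volume, d_volume):
    """Density rho = m/V with uncertainty propagated in quadrature for a quotient."""
    rho = mass / volume
    d_rho = rho * math.sqrt((d_mass / mass) ** 2 + (d_volume / volume) ** 2)
    return rho, d_rho

# Assumed readings and instrument uncertainties (illustrative only)
rho1, u1 = density_with_uncertainty(mass=45.0, d_mass=0.1, volume=4.00, d_volume=0.05)
rho2, u2 = density_with_uncertainty(mass=44.8, d_mass=0.1, volume=4.00, d_volume=0.05)

agree = abs(rho1 - rho2) <= (u1 + u2)
print(f"rho1 = {rho1:.2f} +/- {u1:.2f} g/mL")
print(f"rho2 = {rho2:.2f} +/- {u2:.2f} g/mL")
print("Difference lies within the combined uncertainty:", agree)

Whether the comparison is made formally, as here, or simply by appeal to the estimated digit, the pedagogical point is the same: a claimed difference has to be defended against the precision of the measurements.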
The student products presented are taken from an ADI context, but the elements that stimulated argumentation, e.g., the guiding question focused on distinguishing rather than obtaining a specific value, the use of unfamiliar equipment such as the spill can, and the challenges created with the floating object and the large cube, could be incorporated into other pedagogical approaches.
CONCLUSION

The ability to "engage in argument from evidence" is one of the eight practices identified in the Next Generation Science Standards1 as well as an emerging focus of undergraduate chemistry curricula. Guiding students to make evidence-based claims that engender argumentation is not as simple as having them collect data and answer a question. The type of data collected and the method of collecting the data, as well as the question to be answered, must be carefully considered. The challenge is developing meaningful science experiences that provide sufficient opportunity for variation in method or outcomes while still leading to understanding of a scientific concept.

Counterintuitively, in order to engender authentic science experiences, especially to develop argumentation skills, students need "bad data", that is, data that will lead them to think about what they have measured and how they have gone about measuring it. This can be facilitated if we present an appropriate context and support students' investigative natures. Students will then begin to participate in the practices of science during laboratory courses, fully realizing the potential of such courses.
ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.6b00621.
Student handout (PDF, DOCX)
Instructor notes and unknown key (PDF, DOC)
AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected]

ORCID
Joi P. Walker: 0000-0001-7783-4706

Notes
The authors declare no competing financial interest.
REFERENCES
(1) NGSS Lead States. Next Generation Science Standards: For States, By States; The National Academies Press: Washington, DC, 2013.
(2) Sampson, V. Argument-Driven Inquiry in Chemistry: Lab Investigations for Grades 9−12; NSTA Press: Arlington, VA, 2014.
(3) Walker, J. P.; Sampson, V.; Zimmerman, C. Argument-Driven Inquiry: An Introduction to a New Instructional Model for Use in Undergraduate Chemistry Labs. J. Chem. Educ. 2011, 88 (10), 1048−1056.
(4) Zembal-Saul, C.; McNeill, K. L.; Hershberger, K. What's Your Evidence? Engaging K−5 Students in Constructing Explanations in Science; Pearson Higher Education: Boston, MA, 2013.
(5) Burke, K.; Greenbowe, T.; Hand, B. Implementing the Science Writing Heuristic in the Chemistry Laboratory. J. Chem. Educ. 2006, 83 (7), 1032−1038.
(6) Buntine, M. A.; Read, J. R.; Barrie, S. C.; Bucat, R. B.; Crisp, G. T.; George, A. V.; Jamie, I. M.; Kable, S. H. Advancing Chemistry by Enhancing Learning in the Laboratory (ACELL): A Model for Providing Professional and Personal Development and Facilitating Improved Student Laboratory Learning Outcomes. Chem. Educ. Res. Pract. 2007, 8 (2), 232−254.
(7) Ford, M. J. Disciplinary Authority and Accountability in Scientific Practice and Learning. Sci. Educ. 2008, 92 (3), 404−423.
(8) Walker, J. P.; Sampson, V. Using the Laboratory to Improve Undergraduates' Science Writing Skills through Meaningful Science Writing, Peer-Review and Revision. J. Chem. Educ. 2013, 90 (10), 1269−1274.
(9) Samsa, R. A. An Investigative Density Experiment. J. Chem. Educ. 1993, 70 (2), 149.
(10) Prilliman, S. G. An Inquiry-Based Density Laboratory for Teaching Experimental Error. J. Chem. Educ. 2012, 89 (10), 1305−1307.
(11) Density Identification Set. https://www.flinnsci.com/density-identification-set/ap7204/ (accessed Jan 8, 2017).
(12) Density Cube Set. https://www.flinnsci.com/density-cube-set/ap6058/ (accessed Jan 8, 2017).