
Why Ask Why?

Melanie M. Cooper*
Department of Chemistry, Michigan State University, Lansing, Michigan 48824, United States

ABSTRACT: There is a strong case to be made that the goal of science is to develop explanatory theories that help us organize our understanding and make predictions about the natural world. What, then, is the goal of science education? What is it that we want students to know and be able to do, and how do we achieve these goals? Here, I argue that one overarching goal is to help students construct causal, mechanistic explanations of phenomena. In chemistry this means we are working to help students use their understanding of molecular-level interactions to explain and predict macroscopic events. Furthermore, while constructing explanations is an important goal in itself, the very act of constructing explanations helps students develop a deeper understanding and provides the kind of intellectual satisfaction that memorizing facts cannot. I hope to convince you that our current approaches to assessing student learning are, in fact, all too often counterproductive, that they almost certainly contribute to students' inability to connect ideas and develop a useful understanding of chemistry, and that these assessments send the wrong message about what chemistry means (and why it is valuable). I will offer some suggestions for how we might design more meaningful approaches to curriculum development and assessment of student understanding. After reading this essay, I hope that I will have convinced you that: (i) if we value something, we must assess it; (ii) we cannot assume students will construct a coherent framework from the fragments we teach; and (iii) we must design assessments that provide us with enough evidence to make an argument that the student understands.

KEYWORDS: First-Year Undergraduate/General, Chemical Education Research, Testing/Assessment, Curriculum
FEATURE: Award Address



INTRODUCTION

There is general agreement that learning environments that support active student engagement are effective in improving course grades and retention, particularly for underprepared students,1 but there is less evidence about what students are actually learning in these courses. There is an implicit assumption that the grades students earn in our courses reflect their understanding of the subject matter, and that increased success and retention rates in a course must therefore mean that more students have learned more deeply. But we know that even "good" students, who have done all that we asked of them, often emerge from our courses with profound misconceptions and a fragmented understanding of important concepts, and there is still little evidence that students are able to transfer what they have learned to a new situation in the same course, never mind across courses or disciplines.2,3 The literature is rife with descriptions of what students cannot do,4 but sadly, there are far fewer reports of strategies supported by strong evidence of sustainable improvements in student understanding. While it is easy to blame students (for laziness, lack of motivation, or general inability to do the work), in fact the only constant from year to year is our efforts to teach these students, and in this, we seem to be failing a large fraction of them.

We know that experts in a field have a robust underlying theoretical understanding of the concepts in their discipline.5 Certainly most scientists have a large database of information at their fingertips and, even more importantly, they have this knowledge organized into coherent frameworks and contextualized so that it is readily accessible. This contrasts with the often fragmented jumble of facts and concepts that beginning learners (and, as it turns out, many who are not beginners) frequently work with.2,6 Yet in most of our introductory courses there is a tendency to favor breadth over depth, often trying to provide courses that "cover" everything that might be deemed important for future chemists, despite the fact that most students in introductory courses will never take another chemistry course. There is often a focus on an extensive array of facts and skills that are duly tested, but almost never synthesized into a deeper, resilient, and applicable understanding. Without an underlying explanatory framework, students are often unable to put the pieces together themselves; it is no wonder, then, that students are rarely able to make sense of what we tell them, and often resort to memorization, not because it is easier, but because they do not have any alternative. The current model, in which most introductory courses are a "mile wide and an inch deep", is, in essence, providing students with the fragments of a coherent framework without providing the experiences and context that allow the necessary connections to be made. By analogy, we are providing students with the building materials (bricks) and expecting them to build the Taj Mahal by themselves, as shown in Figure 1. The question, then, is what features characterize the curricular structures and instructional activities that can provide the necessary scaffolding so that students can build a strong foundation for their subsequent knowledge?


Figure 1. Experts have assembled their fragments of knowledge (the marble blocks) into a coherent, contextualized, and useful framework (the Taj Mahal).

A "blueprint" for how STEM education might be redesigned based upon the best available evidence is outlined in the National Research Council (NRC) report A Framework for K−12 Science Education: Practices, Crosscutting Concepts, and Core Ideas.7 While the NRC Framework was developed for K−12, the research and learning theories on which it is based are also applicable to at least the first two years of college (and probably beyond). The NRC Framework approach is designed to support students as they build their own robust underlying knowledge frameworks by identifying the core ideas in each discipline on which all other concepts can be built.8 Students' knowledge cannot be installed but must be built up over time, a process that enables the student to organize and link these core ideas together to form a solid foundation for subsequent learning. However, at the college level, most current efforts aimed at improving student understanding tend to be focused on developing student-centered, interactive learning environments that are often layered on top of the existing curriculum.9,10 While there are many potential ways to support students as they develop a deep understanding, in this paper I will discuss an approach that focuses on helping them develop scientific explanations. Over the years, evidence has accumulated indicating that getting students to articulate not just that something happens, but why something happens, is exceedingly valuable for a variety of reasons. Researchers from cognitive science, philosophy, and science education have all weighed in on the value of explanation as an instructional practice, and while each discipline has a somewhat different approach and rationale, there is agreement that helping students construct explanations is crucial.



WHAT IS A SCIENTIFIC EXPLANATION?

Just what we mean by explanation may depend on the context, but it has been noted that explanations can be characterized by the kinds of thought that they evidence. The Institute of Education Sciences (IES) states:11 "Shallow knowledge taps basic factual or skill knowledge, whereas deep knowledge is expressed when learners are able to answer 'why' questions and describe causal relationships between facts or concepts," and "By (deep) explanations, we mean explanations that appeal to causal mechanisms, planning, well-reasoned arguments, and logic." Similarly, the NRC Framework for Science Education7 defines constructing explanations as one of the eight science practices that can be considered as the disaggregated parts of inquiry. The Framework states:7 "Scientific explanations are accounts that link scientific theory with specific observations or phenomena; for example, they explain observed relationships between variables and describe the mechanisms that support cause and effect inferences about them."

This approach is particularly appropriate in chemistry, where the causal mechanisms lie at the unseen molecular level. Although we chemists tend to reserve "mechanism" for the curved arrows that represent the flow of electrons in organic reactions, it means so much more. A major goal of chemistry is to provide molecular-level, mechanistic explanations for macroscopic phenomena, which includes, for example, how and why energy is transferred, why particles interact, why chemical reactions occur, and why, in a closed system, reactions reach equilibrium. I should also note that the scientific practices of constructing explanations, arguments,12 and models13 are closely linked, and while in this paper I will be focusing mainly on explanations, much of what I have to say is also relevant to these other practices.

The science education literature is clear about the benefits of constructing explanations.14−17 For example, "asking deep explanatory questions" is one of only two instructional practices considered to have strong evidence for their efficacy, as reported in the IES practice guide18 (strong evidence requires multiple studies, across a wide range of disciplines and age ranges, that show improvement in learning). A wide range of instructional approaches that support students in their development of explanations have been proposed. Some combine the practices of explanation and argumentation, which are closely linked, using a claim-evidence-reasoning framework that originated with Toulmin's argumentation schemes.19 While there has been some discussion of the differences between the two,12,16,20 for our purposes it is not necessary to go beyond the idea that having students construct explanations (and/or arguments) using evidence or scientific principles and reasoning is supported by a large body of evidence showing improved student learning. It is the act of constructing explanations that provides the well-documented beneficial effects. That is, simply reading or hearing an explanation does not promote the same kind of cognitive engagement, nor does summarizing the textbook or taking notes. Constructing explanations requires students to engage in a wide range of cognitive activities and skills; it requires that students articulate their thoughts and that they connect, reflect on, and consider their ideas.

Figure 2. Sequence by which properties can be determined from a molecular structure.

For these reasons, a number of researchers have incorporated constructing explanations into proposed pedagogical frameworks. For example, Chi has proposed that as student engagement with learning materials progresses "from passive to active to constructive to interactive" (ICAP), their learning will improve.21 The "constructive" and "interactive" parts of this framework rely on student-generated explanations, predictions, and arguments. Linn's knowledge integration framework22,23 also supports students as they make connections between key concepts to develop causal explanations. Philosophers of science have also weighed in on the value and nature of explanations. Strevens24 has written that most contemporary philosophers believe that "to understand a phenomenon is to grasp how the phenomenon is caused" and that there is "no understanding without explanation." Indeed, if we know how something is caused, then it is much more likely that we will be able to make testable predictions about what will happen when conditions change. Gopnik25,26 goes even further and proposes that understanding is an evolutionary adaptation. She postulates that, in order to survive, humans have evolved the need to develop theories about how the world works, and that the pleasure we get when we understand something (that is, when we construct a causal explanation) is our built-in reward system. So, there is general agreement that helping students construct causal, empirically based explanations of phenomena is crucial for a number of reasons: it is an important goal of science education in itself, it improves learning, and, perhaps most intriguingly, it can provide an intellectual satisfaction that memorizing facts and performing rote calculations typically cannot.

However, it is a truth (almost) universally acknowledged (at least by students) that the tests and quizzes we use to assign grades signify what is actually important. Regardless of our intent, the activities we use to grade students send an important, and sadly perhaps the only, message that some students hear. Even the most well-constructed and evidence-based learning activities and curricular materials may not produce observable or meaningful learning gains if we do not develop assessments that actually assess what is important. In fact, it is entirely possible that many reform efforts have failed to show improvements because the assessments used to measure reform were not commensurate with the reform itself.

An Example from Structure−Property Relationships

Much of our work has focused on student understanding of structure−property relationships, and therefore I will use this construct as an example. As might be expected, we have documented a litany of problems, including student difficulties with drawing Lewis structures,27 the problems that students have with decoding the information contained in such structures,27−29 and the startling finding that the majority of students in our studies represent intermolecular forces as interactions within small molecules.30 Indeed, when we interviewed students about how they use structures to predict properties, we found little evidence that they used much of what they had been taught; instead they relied on rules and heuristics.2

If we step back and think about just what is involved, it becomes clearer why students have such difficulty. Consider, for example, the sequence of steps that students must follow to predict the properties of a substance from its structure (Figure 2). The student must be able to (1) draw the Lewis structure accurately; (2) use the Lewis structure appropriately to determine the electron pair geometry and molecular shape; (3) make predictions based on relative atomic electronegativities to determine bond polarities and use molecular shape to determine overall molecular polarity; (4) use the molecular polarity to predict the types and strengths of intermolecular forces; and (5) synthesize these factors, together with an understanding of intermolecular potential energy changes and the influence of thermal energy, to predict the stabilities of these interactions and their implications for the macroscopic physical and chemical properties of a substance. The road from structure to properties is complex, and although experts can and do "chunk" parts of this pathway, we must acknowledge how difficult it is for beginners to perform these operations in sequence. Consequently, we dutifully assess each step, separately, as if each were the important idea (Table 1). For most of us, and particularly those of us who teach large-enrollment courses, the most common types of questions (the ones that appear on many of our tests and that are prevalent in publishers' test banks) typically target the recall of these fragments of knowledge, the steps in the pathway, rather than the end goal.


Table 1. Steps in the Structure−Property Pathway and the Ways They Are Commonly Assessed

Step 1. Draw the Lewis structure.
Common instructional strategies: usually rules based, often where completing "octets" is the goal.
Common assessment items: identify which is the correct structure, or construct a Lewis structure.

Step 2. Determine electron pair geometry and molecular shape.
Common instructional strategies: usually rules based (VSEPR).
Common assessment items: identify or list the electron pair geometry and/or molecular shape of a particular molecule.

Step 3. Determine bond and molecular polarities.
Common instructional strategies: know electronegativity differences and the vector addition of bond dipoles.
Common assessment items: identify whether a particular molecule is polar.

Step 4. Determine the strengths and types of intermolecular forces.
Common instructional strategies: understand the electron density distribution in the molecule.
Common assessment items: identify which molecule will exhibit hydrogen bonding, or what intermolecular forces are present in the liquid phase of a particular compound.

Step 5a. Determine physical properties.
Common instructional strategies: identify the (relative) strengths of interactions between molecules.
Common assessment items: ranking tasks (identify which compound has the highest boiling point, etc.).

Step 5b. Determine chemical properties (e.g., acidity).
Common instructional strategies: look for particular groupings of atoms.
Common assessment items: ranking tasks (identify which compound is most acidic, etc.); predicting outcomes of reactions.

However, as the saying goes, "when everything is assessed, everything becomes important", and when we emphasize the pieces rather than the whole, it undermines the ultimate goal. We ask questions about identifying correct structures, or determining electron pair geometry, or which of these molecules will exhibit hydrogen bonding, but rarely do we ask students to synthesize what they know to answer the far more important questions about how and why molecular structure predicts macroscopic behavior. No one is disputing that students must have the skills and knowledge to perform each step, but there is no reason to teach students these intermediate steps if we do not help them understand the ultimate purpose for learning them. Most questions do not provide explicit linkages from one step to another and appear (to the student) to be isolated fragments. We should not be surprised, then, that many students do not know the purpose of Lewis structures:27 without context, they have no meaning. The tenets of meaningful learning31,32 hold that any new knowledge must be solidly connected to students' prior knowledge, and that students must also understand the purpose of the new knowledge, so that they can choose to learn meaningfully rather than in a rote, shallow fashion. While it is clearly easier to assess the individual components, doing so sends a message to students not only about what is important, but also that each task exists in isolation with no ultimate purpose.

Certainly, multiple-choice tests are a reasonable way to test low-level knowledge, providing reliable and valid information about some aspects of what students know.23 Although multiple-choice questions can be made difficult (a common, but flawed, measure of "rigor"), there is little evidence that they are useful for measuring deep thinking33,34 or the complex constructs that are involved in structure−property relationships. Even multiple-choice questions that are intended to assess students' ability to use their knowledge to make predictions may not evoke the kind of thinking that is intended. As shown in Table 1, a common approach is to ask students to rank, for example, boiling points (or solubility, or acidity, or any number of other properties), the implication being that students who can do so correctly are making predictions based on their understanding of why the compounds have that particular property. However, several studies2,35,36 have shown that even students who make the correct choices often use strategies that are not scientifically valid; students often rely on heuristics, rules, and test-taking strategies. As noted previously,2 we asked students to use molecular structures to predict relative melting and boiling points. Many students chose the right answer (in fact, they mentioned that they had seen similar questions on tests), while providing faulty reasoning to justify their choices.

For example, some students told us that the more bonds a compound has, the higher its boiling point, because they thought that it takes energy to break bonds. So, while ranking tasks may seem like an efficient way to test student understanding, unless they are accompanied by extensive student reasoning, even correct answers do not provide evidence that the student understands. Clearly, there is a need for other types of assessment that probe deeper learning.



ASSESSMENT AS A PROCESS FOR ELICITING EVIDENCE ABOUT WHAT STUDENTS KNOW AND CAN DO

We can never really know what a student knows; all we can do is make inferences from the answers that they give on tests and quizzes. As we have seen, most tests and quizzes assess facts or skills; they may test whether a student knows that something happens, but rarely why it happens. The question, then, is how do we design tasks that both help students to (for example) explain how and why a molecular structure can be used to predict properties, and also provide evidence that the student has indeed used scientific reasoning to engage with the task? While it is beyond the scope of this paper to discuss assessment design in detail, the NRC report Knowing What Students Know37 is an excellent resource on this topic. The report identifies three essential aspects of assessment: cognition (what it is that you want students to know and be able to do), observation (what you will ask students to do and what observations you will make), and interpretation (how you will interpret those observations). There are a number of approaches to the development of assessments that make use of this so-called assessment triangle,38−42 including Evidence-Centered Design (ECD).41 The central tenet of these approaches is that we must collect evidence that students understand the construct being assessed, so that we can make an argument about what it is that students know and can do. There are many possible ways to elicit such evidence, but a well-constructed explanation should provide evidence that the student understands a phenomenon.

Scaffolding Student-Constructed Explanations

Asking students to construct explanations is an excellent formative assessment strategy and, with appropriate rubrics, can also be useful in summative assessments. One thing is certain, however: we cannot expect students to develop the ability to construct explanations without coaching and practice. In our experience, asking students to explain how a phenomenon is caused, without rather extensive support and practice, usually results in shallow responses that lack the reasoning component we are looking for.


Table 2. An Example of a Question Designed To Elicit Evidence That Students Can Reason about the Link between Structures and Properties

Question: Dimethyl ether (CH3OCH3) and ethanol (CH3CH2OH) have the same molecular formula, but one of these compounds is a liquid at room temperature and the other is a gas. Draw Lewis structures for each substance and use them to help you determine which substance is a liquid. Provide a molecular-level explanation for your choice, being sure to include a discussion of the interactions and energy changes involved.

Rubric:
Correct Lewis structures.
Claim: Ethanol is a liquid at room temperature, while dimethyl ether is not.
Evidence: Ethanol is capable of hydrogen bonding, while dimethyl ether is not.
Reasoning: The interactions between ethanol molecules are stronger than the interactions between dimethyl ether molecules; therefore, more energy is needed to overcome the attractions between ethanol molecules than between dimethyl ether molecules. More energy needed to overcome the attractive forces between molecules corresponds to a higher temperature, meaning that ethanol will boil at a higher temperature.

If the goal is to have students develop the ability to generate coherent causal explanations for complex phenomena, then they must be given ample opportunities to practice this kind of activity while receiving appropriate feedback. There are a number of studies reporting ways to help students develop complete explanations,14,15 and most agree that scaffolding the explanation can produce a more coherent and richer response,20 for example by situating the explanation in a phenomenon, or by having students draw models and diagrams. We have also been investigating how to encourage students to construct coherent explanations using our online system, beSocratic.43,44 We pose questions that ask students to draw diagrams and molecular-level representations and to write explanations in response to prompts. We learned that simply asking students to "explain" did not elicit the kind of rich discussion we were hoping for. This is not surprising, since most of our students are acculturated to assessment items that typically require them to choose an answer or calculate a number. One approach involves a scaffolded framework in which we tell students that an explanation should have (i) a target or claim (what the explanation is about), (ii) the scientific principle or evidence on which the explanation is based, and (iii) the reasoning that links the two. It is the reasoning part of the explanation that not only provides the most useful insight into student thinking, but is also almost always missing from student-generated explanations. Using this scaffold, our goal is to help students develop the connections that lead them back to the core ideas that underlie all that they are learning. Other approaches involve asking students to draw a molecular-level picture, diagram, or graph and to use it to help them explain a phenomenon. In general, we try to provide "hints" in the question about what we are looking for in an explanation. So, for example, a question about why one substance has a higher boiling point than another might include a reminder to discuss the forces and energy changes involved when a substance boils.

Currently, I teach sections of over 400 students, a less than optimal situation that precludes individual feedback on complex student responses. After every class, students construct their answers to homework questions, drawing and writing using our beSocratic system.44 In the next class, we show examples of student homework to discuss what is required for a complete answer. Our examinations combine multiple-choice and free-response questions that require students to construct explanations, arguments, models, diagrams, and molecular pictures. Student responses on the constructed-response section of the exam are graded by graduate students and instructors using rubrics developed by considering what we deem to be acceptable evidence of student understanding. These rubrics evolve over time as we collect more information about how students respond to the questions.

Although this is more time-consuming than grading all multiple-choice questions or hand-grading calculations, it sends an important message to students: the constructed-response sections of the exam are significant and must be taken seriously (especially since they are worth 50% of the exam grade). Table 2 gives an example of a potential question and grading rubric that emphasizes the reasoning that we expect students to use. It is also possible to write multiple-choice questions that approximate this claim-evidence-reasoning scaffold for explanation questions. For example, Box 1 shows a question where students must make a claim and choose the correct explanation.

Box 1. A Multiple-Choice Question That Uses an Explanation Format in the Answer Choices (Claim, Evidence, Reasoning)

Which is a stronger base, CH3NH2 or CH3OH?
A. CH3NH2, because N is more electronegative than O, and therefore is not as able to donate its lone pair into a bond with an acid.
B. CH3NH2, because N is less electronegative than O, and therefore is better able to donate its lone pair into a bond with an acid.
C. CH3OH, because O is more electronegative than N, and therefore is not as able to donate its lone pair into a bond with an acid.
D. CH3OH, because O is less electronegative than N, and therefore is better able to donate its lone pair into a bond with an acid.

This type of question is easier to grade, but if its convenience tempts us to return to all-multiple-choice tests, we may also be tempted to abandon formative explanation tasks, and then we are back to square one. This brings me to one final (important) point: if we expect students to develop deep understanding, it is highly unlikely to happen with the current "mile wide and an inch deep" curricula encompassed by traditional textbooks. Unless students have the opportunity and time to develop the ability to reason about phenomena, they are unlikely to be able to produce coherent explanations. Focusing on the "big ideas" in the context of scientific practices that put knowledge to use takes time, and it makes it impossible to "cover" 25 or 30 chapters. There is a growing body of evidence that these important ideas should be developed over time in a carefully scaffolded progression.8,45


One possible approach is exemplified by our general chemistry curriculum, Chemistry, Life, the Universe and Everything (CLUE). CLUE is organized around three interconnected core ideas (structure, properties, and energy) that are linked together by the idea of forces and interactions.46 Each of these ideas is developed and connected throughout the curriculum, starting with the structure, properties, and energy changes associated with atoms and progressing to the interconnected networks of chemical reactions that are the basis of simple biological processes. These core ideas of chemistry are developed as students use their knowledge to explain and model chemical phenomena, helping them construct for themselves a framework on which to build in the future.47 We have emerging evidence from this curriculum that not only are students more likely than those in traditional courses to understand fundamental concepts, but also that this improvement is maintained throughout organic chemistry.48




SUMMARY

I hope that I have convinced you of the importance of asking students to explain: to themselves, to others, and, perhaps most importantly (if we want lasting change), on the assessments that we use to assign grades. There is an enormous evidence base for the efficacy of student-constructed explanations as a learning tool, and yet we rarely use them in our assessment practices. I hope you will take up the banner, design assessments that provide explicit evidence for the construct you want to measure, and not be satisfied by merely assessing low-level knowledge. In conjunction with this, I hope you will be inspired to redesign your curricula so that students can build a foundation of core ideas that can be used, when needed, as the basis for predicting how novel systems will behave. If we do not ask students to put together coherent explanations, we cannot be surprised when even our best students do not understand.



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

Notes

The author declares no competing financial interest.

Melanie M. Cooper, Professor of Chemistry and Lappan-Phillips Professor of Science Education at Michigan State University, Lansing, Michigan, received the 2014 American Chemical Society Award for Achievement in Research for the Teaching and Learning of Chemistry, sponsored by Pearson Education, on March 17, 2014, in Dallas, Texas. This paper is adapted from her award address.



ACKNOWLEDGMENTS

The author would like to thank Mike Klymkowsky and Sonia Underwood for helpful suggestions and edits. This work is supported by the National Science Foundation under DUE 0816692, DUE 1043707 (1420005), and DUE 1122472 (1341987). Any opinions, findings, conclusions, or recommendations expressed here are those of the author and do not necessarily reflect the views of the National Science Foundation.



REFERENCES

(1) Freeman, S.; Eddy, S. L.; McDonough, M.; Smith, M. K.; Okoroafor, N.; Jordt, H.; Wenderoth, M. P. Active Learning Increases Student Performance in Science, Engineering, and Mathematics. Proc. Natl. Acad. Sci. U.S.A. 2014, 111, 8410−8415.
(2) Cooper, M. M.; Corley, L. M.; Underwood, S. M. An Investigation of College Chemistry Students' Understanding of Structure−Property Relationships. J. Res. Sci. Teach. 2013, 50, 699−721.
(3) National Research Council. Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century; National Academies Press: Washington, DC, 2012.
(4) National Research Council. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; Singer, S. R., Nielson, N. R., Schweingruber, H. A., Eds.; National Academies Press: Washington, DC, 2012.
(5) National Research Council. How People Learn: Brain, Mind, Experience, and School; National Academies Press: Washington, DC, 1999.
(6) Bodner, G. M. I Have Found You an Argument: The Conceptual Knowledge of Beginning Chemistry Graduate Students. J. Chem. Educ. 1991, 68 (5), 385−388.
(7) National Research Council. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas; National Academies Press: Washington, DC, 2012.
(8) Krajcik, J. S.; Sutherland, L. M.; Drago, K.; Merritt, J. The Promise and Value of Learning Progression Research. In Making It Tangible: Learning Outcomes in Science Education; Bernholt, S., Neumann, K., Nentwig, P., Eds.; Waxmann: Münster, 2012; pp 261−284.
(9) Gafney, L.; Varma-Nelson, P. Peer-Led Team Learning: Evaluation, Dissemination, and Institutionalization of a College Level Initiative. In Innovations in Science Education and Technology; Cohen, K., Ed.; Springer: Weston, MA, 2008.
(10) Process Oriented Guided Inquiry Learning (POGIL); Moog, R. S., Spencer, J. N., Eds.; American Chemical Society: Washington, DC, 2008.
(11) Organizing Instruction and Study to Improve Student Learning: What Works Clearinghouse. http://ies.ed.gov/ncee/wwc/PracticeGuide.aspx?sid=1 (accessed Feb 2015).
(12) Osborne, J. F.; Patterson, A. Scientific Argument and Explanation: A Necessary Distinction? Sci. Educ. 2011, 95 (4), 627−638.
(13) Clement, J.; Rea-Ramirez, M. A. Model Based Learning and Instruction in Science; Springer: Secaucus, NJ, 2008.
(14) McNeill, K. L.; Krajcik, J. S. Supporting Grade 5−8 Students in Constructing Explanations in Science: The Claim, Evidence, and Reasoning Framework for Talk and Writing; Pearson: Boston, MA, 2011.
(15) Songer, N. B.; Gotwals, A. W. Guiding Explanation Construction by Children at the Entry Points of Learning Progressions. J. Res. Sci. Teach. 2012, 49 (2), 141−165.
(16) Berland, L. K.; Reiser, B. J. Making Sense of Argumentation and Explanation. Sci. Educ. 2009, 93 (1), 26−55.
(17) Chi, M. T.; Bassok, M.; Lewis, M. W.; Reimann, P.; Glaser, R. Self-Explanations: How Students Study and Use Examples in Learning To Solve Problems. Cogn. Sci. 1989, 13 (2), 145−182.
(18) Karpicke, J. D.; Roediger, H. L. The Critical Importance of Retrieval for Learning. Science 2008, 319 (5865), 966−968.
(19) Toulmin, S. E. The Uses of Argument; Cambridge University Press: Cambridge, U.K., 2003.
(20) Kang, H.; Thompson, J.; Windschitl, M. Creating Opportunities for Students To Show What They Know: The Role of Scaffolding in Assessment Tasks. Sci. Educ. 2014, 98 (4), 674−704.
(21) Chi, M. T.; Wylie, R. The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes. Educ. Psychol. 2014, 49 (4), 219−243.
(22) Linn, M. C. The Knowledge Integration Perspective on Learning and Instruction. In The Cambridge Handbook of the Learning Sciences; Sawyer, R. K., Ed.; Cambridge Handbooks in Psychology; Cambridge University Press: Cambridge, U.K., 2005; pp 243−264.
(23) Lee, H.-S.; Liu, O. L.; Linn, M. C. Validating Measurement of Knowledge Integration in Science Using Multiple-Choice and Explanation Items. Appl. Meas. Educ. 2011, 24 (2), 115−136.
(24) Strevens, M. No Understanding without Explanation. Stud. Hist. Philos. Sci., Part A 2013, 44 (3), 510−515.


(25) Gopnik, A. Explanation as Orgasm and the Drive for Causal Knowledge: The Function, Evolution, and Phenomenology of the Theory Formation System. In Explanation and Cognition; Keil, F. C., Wilson, R. A., Eds.; MIT Press: Cambridge, MA, 2000.
(26) Gopnik, A. Explanation as Orgasm. Minds Mach. 1998, 8 (1), 101−118.
(27) Cooper, M. M.; Grove, N.; Underwood, S. M.; Klymkowsky, M. W. Lost in Lewis Structures: An Investigation of Student Difficulties in Developing Representational Competence. J. Chem. Educ. 2010, 87, 869−874.
(28) Cooper, M. M.; Underwood, S. M.; Hilley, C. Z. Development and Validation of the Implicit Information from Lewis Structures Instrument (IILSI): Do Students Connect Structures with Properties? Chem. Educ. Res. Pract. 2012, 13, 195−200.
(29) Underwood, S. M.; Reyes-Gastelum, D.; Cooper, M. M. Answering the Questions of Whether and When Student Learning Occurs: Using Discrete-Time Survival Analysis To Investigate How College Chemistry Students' Understanding of Structure−Property Relationships Evolves. Sci. Educ., in press.
(30) Cooper, M. M.; Williams, L. C.; Underwood, S. M. Student Understanding of Intermolecular Forces: A Multimodal Study. J. Chem. Educ. 2015, DOI: 10.1021/acs.jchemed.5b00169.
(31) Novak, J. D. A Theory of Education; Cornell University Press: Ithaca, NY, 1977.
(32) Bretz, S. L. Novak's Theory of Education: Human Constructivism and Meaningful Learning. J. Chem. Educ. 2001, 78, 1107−1117.
(33) Smith, C. L.; Wiser, M.; Anderson, C. W.; Krajcik, J. S. Implications of Research on Children's Learning for Standards and Assessment: A Proposed Learning Progression for Matter and the Atomic-Molecular Theory. Meas. Interdiscip. Res. Perspect. 2006, 4, 1−98.
(34) Stern, L.; Ahlgren, A. Analysis of Students' Assessments in Middle School Curriculum Materials: Aiming Precisely at Benchmarks and Standards. J. Res. Sci. Teach. 2002, 39 (9), 889−910.
(35) Maeyer, J.; Talanquer, V. The Role of Intuitive Heuristics in Students' Thinking: Ranking Chemical Substances. Sci. Educ. 2010, 94, 963−984.
(36) McClary, L.; Talanquer, V. Heuristic Reasoning in Chemistry: Making Decisions about Acid Strength. Int. J. Sci. Educ. 2011, 33 (10), 1433−1454.
(37) National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment; Pellegrino, J. W., Chudowsky, N., Glaser, R., Eds.; National Academies Press: Washington, DC, 2001.
(38) National Research Council. Developing Assessments for the Next Generation Science Standards; National Academies Press: Washington, DC, 2014.
(39) Wilson, M. Constructing Measures: An Item-Response Modeling Approach; Erlbaum: Mahwah, NJ, 2005.
(40) Claesgens, J.; Scalise, K.; Wilson, M.; Stacy, A. Mapping Student Understanding in Chemistry: The Perspectives of Chemists. Sci. Educ. 2009, 93, 56−85.
(41) Mislevy, R. J.; Almond, R. G.; Lukas, J. F. A Brief Introduction to Evidence-Centered Design; National Center for Research on Evaluation, Standards, and Student Testing (CRESST), Center for Studies in Education, UCLA: Los Angeles, CA, 2003.
(42) Towns, M. H. Guide To Developing High-Quality, Reliable, and Valid Multiple-Choice Assessments. J. Chem. Educ. 2014, 91 (9), 1426−1431.
(43) Bryfczynski, S. P. BeSocratic: An Intelligent Tutoring System for the Recognition, Evaluation, and Analysis of Free-Form Student Input. Doctoral Dissertation, Clemson University, 2012.
(44) Cooper, M. M.; Underwood, S. M.; Bryfczynski, S. P.; Klymkowsky, M. W. A Short History of the Use of Technology to Model and Analyze Student Data for Teaching and Research. In Tools of Chemistry Education Research; Cole, R., Bunce, D., Eds.; ACS Symposium Series, Vol. 1166; American Chemical Society, 2014; pp 219−239.

(45) Corcoran, T.; Mosher, F. A.; Rogat, A. Learning Progressions in Science: An Evidence Based Approach to Reform; RR-63; Consortium for Policy Research in Education: Philadelphia, PA, 2009.
(46) Cooper, M. M.; Klymkowsky, M. W. Chemistry, Life, the Universe and Everything: A New Approach to General Chemistry, and a Model for Curriculum Reform. J. Chem. Educ. 2013, 90, 1116−1122.
(47) Cooper, M. M.; Underwood, S. M.; Hilley, C. Z.; Klymkowsky, M. W. Development and Assessment of a Molecular Structure and Properties Learning Progression. J. Chem. Educ. 2012, 89, 1351−1357.
(48) Cooper, M. M.; Williams, L. C.; Underwood, S. M.; Klymkowsky, M. W. Are Non-Covalent Interactions an Achilles Heel in Chemistry Education? A Comparison of Instructional Approaches. Proc. Natl. Acad. Sci. U.S.A., submitted for publication.
