Modifying Laboratory Experiments To Promote Engagement in Critical Thinking by Reframing Prelab and Postlab Questions

Jon-Marc G. Rodriguez and Marcy H. Towns*
Department of Chemistry, Purdue University, West Lafayette, Indiana 47907, United States

ABSTRACT: As described by the National Research Council, science practices reflect, in part, the way science is done. When researchers are developing an explanation for a phenomenon, they are using a combination of knowledge and skills reflected in the science practices. Laboratory-based chemistry courses provide the opportunity for students to move beyond surface-level thinking as they use quantitative reasoning skills, analyze data, and draw connections between observations and explanations. Thus, when students in a laboratory course are utilizing science practices, they are making use of the shared set of tools used by researchers. Here we discuss laboratory curriculum development that focuses on taking current laboratory experiments and modifying them by framing the prelab and postlab questions in terms of science practices, which provides the opportunity for students to engage in critical thinking. This approach toward creating a practice-centered curriculum is an easy way instructors can promote engagement in science practices without having to expend the time and resources needed to completely redesign laboratory experiments.

KEYWORDS: First-Year Undergraduate/General, Laboratory Instruction
Chemistry education is implicitly associated with laboratory instruction; however, it is worth examining the assumption that laboratory instruction is effective at promoting student learning. A review of the literature reveals a lack of evidence directly linking student learning to laboratory instruction, and some researchers have used this gap to argue that chemistry courses should not have a laboratory component.1−5 Given the considerable cost and time that laboratory courses demand, there is a need for research that evaluates the role of laboratories in learning chemistry. As indicated in the literature, part of the problem with learning in laboratory courses is the need for the goals of laboratory work to be more intentional, communicated more explicitly, and assessed more systematically, with multiple studies confirming a general disconnect between student and faculty goals in chemistry laboratory courses.6−12 In order to address these claims and state whether or not learning is occurring, it is necessary to be explicit about the nature of the learning expected from these courses. In agreement with the National Research Council,13 this work asserts that the primary purpose of science education (e.g., chemistry laboratory coursework) is to emphasize the nature of science as the confluence of knowledge and skills by focusing on engagement in the following science practices:

1. Asking questions
2. Developing and using models
3. Planning and carrying out investigations
4. Analyzing and interpreting data
5. Using mathematics and computational thinking
6. Constructing explanations
7. Engaging in argument from evidence
8. Obtaining, evaluating, and communicating information

The importance placed on student engagement in skills such as these in university-level science education, particularly in chemistry education, has been recognized in the literature and has been described in the ACS guidelines for bachelor’s degree programs as critical for a “rigorous and excellent” chemistry program that adequately prepares students for careers in chemistry.14−19 Therefore, our intent is to provide a resource for instructors to develop a laboratory curriculum that explicitly promotes student engagement in science practices, which makes the goals of laboratory coursework clear, concise, and measurable.
PRACTICE-CENTERED LABORATORY COURSEWORK
Why Practices?
As seen in the descriptions of the science practices provided in Table 1, there is some overlap among them. For example, developing and using models often involves describing a process using mathematical formalisms that fit empirical data (i.e., using mathematics and computational thinking). The blending of the descriptions is a result of the nature of scientific inquiry: scientists have a set of tools they use to address questions, often making use of multiple tools at once. The key point, however, is that students are engaging in activities that reflect the nature and process of science, and the practices (as a unit) are clearly defined by the National Research Council (NRC)13 and thus lend themselves to being systematically measured, assessed, and identified.

Table 1. Science Practices (Adapted from the NRC Framework)13
1. Asking questions: Asking scientific questions that can be investigated and addressed empirically.
2. Developing and using models: Using a variety of models (mathematical, conceptual, etc.) as tools that reflect evidence, explain phenomena, and have predictive power; understanding the limits of models and the appropriate context in which to use a particular model.
3. Planning and carrying out investigations: Taking into account experimental design considerations; gathering data that address research questions; deciding on appropriate tools/instruments to collect data.
4. Analyzing and interpreting data: Using tools to look for patterns and trends in data; considering the implications of sources of error or of evidence that conflicts with existing models; refining models based on emerging evidence.
5. Using mathematics and computational thinking: Expressing relationships mathematically; using computational methods for data analysis and mathematical modeling; using mathematical models to make predictions and describe phenomena.
6. Constructing explanations: Providing coherent explanations for processes; making connections between evidence and models.
7. Engaging in argument from evidence: Collaborating with peers; critically considering the validity of proposed explanations; communicating connections between claims and data.
8. Obtaining, evaluating, and communicating information: Oral and written dissemination of concepts and results (including the use of data, figures, graphs, and mathematical formalism to communicate meaning); using the primary literature as a source of information; critically evaluating claims presented in the primary literature and other media.

We acknowledge the important role laboratory coursework plays in chemistry education.18,19 On the basis of the nature of the science practices presented in Table 1, which center on the generation, analysis, and discussion of empirical data, laboratory work provides an excellent opportunity for students to engage in science practices. When looking at the NGSS science practices, the ACS guidelines for B.S. degree programs, and the goals chemistry faculty have for laboratory coursework, there is an emerging consensus: each views critical thinking as an important part of instruction (see Figure 1).6−8,13,18,19 As mentioned by Stowe and Cooper,21 “critical thinking” is a term that is often vaguely defined but can be reframed in a readily measurable way by describing course objectives using the “language of practices”. In terms of the NGSS practices, critical thinking skills are involved in each of the practices, such as when analyzing and interpreting data, constructing explanations, or engaging in argument from evidence. Similarly, critical thinking skills are involved when engaging in multiple practices simultaneously, for example, when using new data (analyzing and interpreting data) to refine existing models (developing and using models).
Development of Tasks That Promote Engagement in Science Practices
Research has identified the need for explicitly articulated and systematically measurable goals, and qualitative and quantitative surveys have identified three common goals among faculty who teach undergraduate chemistry laboratory courses: in addition to engaging in critical thinking, students should become familiar with considerations of experimental design and should learn laboratory skills/techniques.6−8 Experimental design considerations are encompassed in the science practice planning and carrying out investigations, and although learning chemistry laboratory skills/techniques is not explicitly outlined in the NGSS science practices, it is implied by planning and carrying out investigations: designing an experiment involves considering which instruments to use and which techniques will be useful for collecting data that address the research question. Because the science practices are domain-general, they do not explicitly address learning specific data-collection techniques (creating a standard curve, pipetting, titrating, etc.) and instead emphasize the overarching practices used in all science domains. For the purposes of this work, we focus on the development and emphasis of science practices that are relevant to all scientific fields. As mentioned previously, faculty goals are addressed and clearly articulated in the NGSS science practices. Thus, from an assessment standpoint, the recent movement from science education framed around the vaguely defined act of inquiry (National Science Education Standards, released in 1996) to a practice-centered view (Next Generation Science Standards, released in 2012) is a positive step for educators because the set of practices is clearly defined.15,20,22,23 The clearly defined practices have the potential to improve the ability to create and assess goals in laboratory courses, with a single exam item potentially assessing student ability with respect to multiple practices. The challenge then becomes assessing engagement in science practices in courses with large enrollments, which is not trivial because practices are non-content goals that involve more than simple recall; this requires adaptation of current assessments without losing the validity, reliability, and convenience afforded by traditional forms of assessment.14,17,22
Figure 1. Diagram illustrating the common emphasis placed on critical thinking, as presented in the literature: NGSS science practices, ACS guidelines for B.S. degree programs, and faculty goals.
On a similar note, because information is so readily accessible, it becomes increasingly important to develop assessments and tasks that go beyond content knowledge and require more than factual recall, rote memorization, or simply looking up answers online.15,24,25 Addressing the need for assessments that measure engagement in the science practices, a group of researchers designed the Three-Dimensional Learning Assessment Protocol (3D-LAP), which provides a systematic way to evaluate test items based on whether an item involves student engagement with the three dimensions of the NRC framework: science practices, crosscutting concepts, and core ideas.16 Given the scope of this work, we focus on the portion of the 3D-LAP related to science practices. The protocol is straightforward to use, involving a simple and practical set of criteria for each science practice; in order for an assessment item to be described as involving student engagement in a specific science practice, the item must meet each of the criteria outlined by the 3D-LAP.16 It is also important to note that the researchers created two different sets of criteria for evaluating test items, one for “constructed response” tasks (free-response or short-answer questions) and one for “selected response” tasks (multiple-choice questions); the development of two separate sets of criteria acknowledges the need for assessments that cater to large-enrollment courses.16

COMPARISON OF TRADITIONAL AND MODIFIED PRELAB AND POSTLAB QUESTIONS

In order to clearly communicate what it would entail to develop a modified laboratory curriculum, we provide an example of a traditional laboratory experiment26 and a modified laboratory experiment that has been adapted to promote student engagement in science practices. The laboratory experiments are placed in the context of acid−base titrations, a context chosen because it is ubiquitous in undergraduate general chemistry. A survey of acid−base titration experiments from different general chemistry courses reveals that the protocol is largely the same (i.e., students perform multiple colorimetric and/or potentiometric titrations to determine the concentration of an unknown solution); the difference between courses is how the experiment is framed in terms of the prelab questions and the postlab discussion/report. Taking a standard protocol and framing it in terms of science practices provides a simple model instructors can follow: rather than developing entirely new laboratory experiments, which takes significant time and resources, they can modify their existing experiments by adapting the prelab/postlab questions to engage students in critical thinking. This is analogous to the suggestion made by Underwood et al.25 that faculty adapt current assessments to more explicitly engage students in critical thinking rather than develop completely new assessments. Therefore, the modified laboratory experiment was developed by using the 3D-LAP to adapt the traditional laboratory prelab and postlab questions.

In the sections that follow we discuss the extent to which a traditional laboratory experiment promotes engagement in science practices. This analysis was informed by the work of Reed, Brandriet, and Holme,17 in which the authors used the 3D-LAP to analyze the role of science practices in ACS Exams Institute (ACS-EI) test items. In the context of this paper, the prelab and postlab questions for the traditional and modified laboratory experiments were characterized using the “Criteria for Constructed Response Assessment Tasks” from the 3D-LAP developed by Laverty et al.;16 each question was assigned the appropriate code if it met the criteria for engagement in a particular science practice (see Table 2).

Table 2. Assigned Codes for Analysis of Prelab/Postlab Questions
N/A: Not applicable
NSP: No science practice
SP1: 1. Asking questions
SP2: 2. Developing and using models
SP3: 3. Planning and carrying out investigations
SP4: 4. Analyzing and interpreting data
SP5: 5. Using mathematics and computational thinking
SP6/7: 6. Constructing explanations; 7. Engaging in argument from evidence (science practices 6 and 7 are combined in the 3D-LAP)
SP8: 8. Obtaining, evaluating, and communicating information

After the initial assignment of codes to each prelab and postlab question, an additional researcher independently coded the questions, and the code assignments were adjusted as necessary to reach consensus between researchers.27 Having an additional researcher involved in this process was particularly useful for operationalizing the criteria provided by Laverty et al.16 From a practical standpoint, when adapting prelab and postlab questions using the 3D-LAP, we suggest that instructors collaborate with other faculty members to help decide what would need to be added to a question in order to elicit engagement in science practices.
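To make the consensus-coding step concrete, the short Python sketch below is our own illustration rather than part of the study's materials: the question labels and code assignments are hypothetical, and the study itself resolved disagreements through discussion, not software. The sketch tallies two raters' independent assignments of the Table 2 codes and flags the questions that must be discussed to reach consensus.

# Minimal sketch (hypothetical data) of the consensus-coding step described above.
# Each question may receive more than one science-practice code from Table 2.

rater_1 = {
    "prelab 1": {"SP5"},
    "prelab 2": {"NSP"},
    "postlab 1": {"SP4", "SP5"},
}
rater_2 = {
    "prelab 1": {"SP5"},
    "prelab 2": {"SP5"},
    "postlab 1": {"SP4", "SP5"},
}

def flag_disagreements(codes_1, codes_2):
    """Return the questions whose assigned code sets differ between the two raters."""
    assert codes_1.keys() == codes_2.keys(), "Both raters must code the same questions"
    return {q: (codes_1[q], codes_2[q]) for q in codes_1 if codes_1[q] != codes_2[q]}

for question, (c1, c2) in flag_disagreements(rater_1, rater_2).items():
    print(f"{question}: rater 1 assigned {sorted(c1)}, rater 2 assigned {sorted(c2)} -> discuss to consensus")

Running the sketch would flag only “prelab 2”, mirroring the workflow in which agreed-upon codes stand and disagreements are resolved jointly against the 3D-LAP criteria.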
Traditional Prelab and Postlab Questions
Initial analysis of the traditional laboratory questions revealed that, among the six prelab questions, four required the students to do calculations, one acted as a reminder for students to bring a USB memory storage device, and one asked the students to copy the procedures into their laboratory notebook. Further analysis using the 3D-LAP indicated that only one of the prelab questions elicited engagement in science practices (SP5, using mathematics and computational thinking). As stated by Reed, Brandriet, and Holme,17 it is important to keep in mind that practices involve content and skills, that is, demonstrating what one can do with content knowledge. For this reason, simply performing a calculation, without any subsequent prompting that requires the student to think about the value or demonstrate an understanding of it, is not adequate evidence that the student is engaging in the science practice of using mathematics and computational thinking. This is encompassed in the 3D-LAP criteria; in order for a task to promote engagement in the science practice using mathematics and computational thinking, it must meet the following criteria:16

1. Question gives an event, observation, or phenomenon.
2. Question asks student to perform a calculation or statistical test, generate a mathematical representation, or demonstrate a relationship between parameters.
3. Question asks student to give a consequence or an interpretation (not a restatement) in words, diagrams, symbols, or graphs of their results in the context of the given event, observation, or phenomenon.

That said, although multiple prelab questions had students perform calculations, in most cases students were not asked to assign any meaning to the values, so the questions did not involve engagement in science practices. Applying this analysis to the four postlab questions, only the first postlab question elicited engagement in science practices, SP4 (analyzing and interpreting data) and SP5 (using mathematics and computational thinking). Table 3 provides a summary of the codes assigned to each prelab and postlab question. Overall, the general impression of the traditional laboratory experiment was that it emphasized calculations without connecting the mathematics to the chemistry, and science practices as a whole were not well represented.

Table 3. Traditional Prelab and Postlab Questions

In previous work analyzing ACS exam items for engagement in science practices, science practices were generally not common among the test items, but when the items were sorted using the ACS Anchoring Concepts Content Map (ACCM), certain science practices were more common for specific big ideas.17 In their analysis, Reed and colleagues17 noted that for test items assessing content related to the big idea category of “Chemical Reactions”, the category relevant to our discussion of acid−base reactions, developing and using models was the most common science practice. The idea that certain practices may be more relevant or better suited to a particular topic is useful to keep in mind for assessment and was perhaps a missed opportunity in this traditional laboratory experiment, which did not involve student engagement in developing and using models.
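To illustrate the distinction drawn above between merely performing a calculation and engaging in using mathematics and computational thinking, consider a hypothetical prelab prompt of our own (not one of the questions from the laboratory manual analyzed here) that asks students to compute the concentration of an HCl sample titrated to the equivalence point with standardized NaOH:

c_{\mathrm{HCl}} = \frac{c_{\mathrm{NaOH}}\, V_{\mathrm{NaOH}}}{V_{\mathrm{HCl}}} = \frac{(0.100\ \mathrm{M})(25.0\ \mathrm{mL})}{20.0\ \mathrm{mL}} = 0.125\ \mathrm{M}

On its own, such a prompt satisfies only the second criterion. A version that also describes the titration scenario (criterion 1) and asks students to interpret the result, for example, to explain what the equivalence-point volume implies about the relative moles of acid and base or to predict how overshooting the endpoint would bias the calculated concentration, would satisfy all three criteria.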
Modified Prelab and Postlab Questions
The development of the modified laboratory experiment centered on science practices involved starting with the traditional laboratory experiment discussed above and focusing on modifying the prelab and postlab questions to more explicitly engage students in science practices. Additionally, the modified laboratory experiment was designed so that students would engage in each of the eight science practices over the course of the experiment. Furthermore, in addition to using the 3D-LAP criteria to write these questions, a recently published review by Agustian and Seery28 provided input regarding how to frame prelab assignments. Agustian and Seery28 discussed how students have too much to think about in the laboratory (chemistry concepts, procedure, laboratory techniques, safety, etc.) and argued that it is best to minimize the amount of information given to students before the laboratory session by focusing prelabs on “supportive information” (what students need in order to have a general understanding and overview of the theory behind the task) rather than “procedural information” (the specific information students need to complete each step of the task). These ideas were used to help frame the prelab questions in the modified laboratory experiment by primarily focusing on the connection between the content, the question at the center of the experiment, and how the method/approach is well suited to answer the question of interest. Because the science practices are rooted in both content and skill, focusing on conceptual understanding in the prelab fits well with how the science practices are conceptualized.

Table 4. Modified Prelab and Postlab Questions

As shown in Table 4, in comparison to the traditional prelab and postlab questions, the general format for the modified laboratory experiment involves representation of each of the science practices, fewer questions, and a greater emphasis on conceptual understanding. In addition, the first prelab question prompts students to consider the general question being investigated in the experiment and how the data generated by the method used will help answer this question. This reflects engagement in the science practice planning and carrying out investigations, and it addresses Agustian and Seery’s28 suggestion to emphasize a general overview of the experiment instead of specific details (such as how to calculate the concentration of analyte in a titration or simply copying all of the procedures from the laboratory manual).
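As a rough illustration of the kind of data such a prelab question points students toward, the Python sketch below is our own (it is not part of the modified experiment and assumes a strong monoprotic acid, ideal behavior at 25 °C, and the same hypothetical concentrations used earlier). It computes the pH of an HCl sample as standardized NaOH is added; the sharp jump in pH locates the equivalence point, from which the unknown concentration follows.

# Minimal sketch (hypothetical values): pH of a strong acid titrated with a strong base.
import math

def strong_acid_base_ph(c_acid, v_acid_mL, c_base, v_base_mL):
    """pH during titration of a strong monoprotic acid with a strong base (25 C, activities ignored)."""
    n_acid = c_acid * v_acid_mL / 1000.0   # mol of H+ initially present
    n_base = c_base * v_base_mL / 1000.0   # mol of OH- added
    v_total_L = (v_acid_mL + v_base_mL) / 1000.0
    if n_acid - n_base > 1e-12:            # before the equivalence point: excess H+
        return -math.log10((n_acid - n_base) / v_total_L)
    if n_base - n_acid > 1e-12:            # after the equivalence point: excess OH-
        return 14.0 + math.log10((n_base - n_acid) / v_total_L)
    return 7.0                             # at the equivalence point (strong acid/strong base)

# Example: 20.0 mL of ~0.125 M HCl titrated with 0.100 M NaOH (equivalence at 25.0 mL)
for v_b in (0.0, 10.0, 20.0, 24.9, 25.0, 25.1, 30.0):
    print(f"{v_b:5.1f} mL NaOH added -> pH {strong_acid_base_ph(0.125, 20.0, 0.100, v_b):5.2f}")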
CONCLUSION

In placing this study in the context of science practices, we integrated recommendations from the National Research Council, input from the American Chemical Society, and research on the goals communicated by chemistry faculty, but we did not address student-generated goals. It is important to note that there is ample literature on student goals for laboratory coursework. A review of this literature reveals that students often do not develop a deeper conceptual understanding in the laboratory, in part because they are primarily motivated by affective constructs (such as the desire to leave the laboratory early or to get a good grade); although currently overlooked by faculty, these goals can be addressed by creating space for students to engage in laboratory work and make conceptual connections without concern about academic consequences for making “mistakes”.9−11,29−32 Currently there is a lack of literature supporting the claim that chemistry laboratory coursework helps students learn, suggesting the need for changes at the level of the curriculum. By using science practices as a lens to make connections between lecture content and laboratory coursework, a practice-centered curriculum aims to help students develop a more sophisticated view of chemistry and, more generally, of science. This work serves as a resource for practitioners interested in providing the opportunity for students to engage in critical thinking in a way that can be easily implemented and readily assessed.
AUTHOR INFORMATION
Corresponding Author
*E-mail: [email protected]

ORCID
Jon-Marc G. Rodriguez: 0000-0001-6949-6823
Marcy H. Towns: 0000-0002-8422-4874

Notes
The authors declare no competing financial interest.
ACKNOWLEDGMENTS

We wish to thank the Towns research group for their support and helpful comments on the paper and gratefully acknowledge Kinsey Bain and Kathleen Jeffery for their suggestions on this work.
REFERENCES
(1) Elliott, M. J.; Stewart, K. K.; Lagowski, J. J. The Role of the Laboratory in Chemistry Instruction. J. Chem. Educ. 2008, 85 (1), 145−149.
(2) Hawkes, S. J. Chemistry Is Not a Laboratory Science. J. Chem. Educ. 2004, 81 (9), 1257.
(3) Hofstein, A.; Lunetta, V. N. The Laboratory in Science Education: Foundations for the Twenty-First Century. Sci. Educ. 2004, 88 (1), 28−54.
(4) Hofstein, A.; Lunetta, V. N. The Role of the Laboratory in Science Teaching: Neglected Aspects of Research. Review of Educational Research 1982, 52 (2), 201−217.
(5) Singer, S. R.; Nielson, N. R.; Schweingruber, H. A. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; National Academies Press: Washington, DC, 2012. DOI: 10.17226/13362.
(6) Bretz, S.; Fay, M.; Bruck, L. B.; Towns, M. H. What faculty interviews reveal about meaningful learning in the undergraduate laboratory. J. Chem. Educ. 2013, 90 (3), 281−288.
(7) Bruck, A. D.; Towns, M. Development, Implementation, and Analysis of a National Survey of Faculty Goals for Undergraduate Chemistry Laboratory. J. Chem. Educ. 2013, 90 (6), 685−693.
(8) Bruck, L. B.; Towns, M.; Bretz, S. L. Faculty perspectives of undergraduate chemistry laboratory: Goals and obstacles to success. J. Chem. Educ. 2010, 87 (12), 1416−1424.
(9) DeKorver, B. K.; Towns, M. H. General Chemistry Students’ Goals for Chemistry Laboratory Coursework. J. Chem. Educ. 2015, 92 (12), 2031−2037.
(10) Galloway, K. R.; Bretz, S. Development of an Assessment Tool to Measure Students’ Meaningful Learning in the Undergraduate Chemistry Laboratory. J. Chem. Educ. 2015, 92 (7), 1149−1158.
(11) Galloway, K. R.; Bretz, S. Measuring Meaningful Learning in the Undergraduate Chemistry Laboratory: A National, Cross-Sectional Study. J. Chem. Educ. 2015, 92 (12), 2006−2018.
(12) Reid, N.; Shah, I. The role of laboratory work in university chemistry. Chem. Educ. Res. Pract. 2007, 8 (2), 172−185.
(13) National Research Council. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas; National Academies Press: Washington, DC, 2012. DOI: 10.17226/13165.
(14) Brandriet, A.; Reed, J. J.; Holme, T. A Historical Investigation into Item Formats of ACS Exams and Their Relationships to Science Practices. J. Chem. Educ. 2015, 92 (11), 1798−1806.
(15) Cooper, M. M. Chemistry and the Next Generation Science Standards. J. Chem. Educ. 2013, 90 (6), 679−680.
(16) Laverty, J. T.; Underwood, S. M.; Matz, R. L.; Posey, L. A.; Jardeleza, E.; Cooper, M. M.; et al. Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol. PLoS One 2016, 11 (9), e0162333.
(17) Reed, J. J.; Brandriet, A. R.; Holme, T. A. Analyzing the Role of Science Practices in ACS Exam Items. J. Chem. Educ. 2017, 94 (1), 3−10.
(18) Wenzel, T. J.; Larive, C. K.; Frederick, K. A. Role of Undergraduate Research in an Excellent and Rigorous Chemistry Curriculum. J. Chem. Educ. 2012, 89 (1), 7−9.
(19) Wenzel, T. J.; McCoy, A. B.; Landis, C. R. An Overview of the Changes in the 2015 ACS Guidelines for Bachelor’s Degree Programs. J. Chem. Educ. 2015, 92, 965−968.
(20) Stowe, R.; Cooper, M. Practicing What We Preach: Assessing “Critical Thinking” in Organic Chemistry. J. Chem. Educ. 2017, 94 (12), 1852−1859.
(21) Stowe, R. L.; Cooper, M. M. Practicing What We Preach: Assessing “Critical Thinking” in Organic Chemistry. J. Chem. Educ. 2017, 94 (12), 1852−1859. DOI: 10.1021/acs.jchemed.7b00335.
(22) National Research Council. Developing Assessments for the Next Generation Science Standards; Pellegrino, J. W., Wilson, M. R., Koenig, J. A., Beatty, A. S., Eds.; The National Academies Press: Washington, DC, 2014. DOI: 10.17226/18409.
(23) Osborne, J. Teaching Scientific Practices: Meeting the Challenge of Change. Journal of Science Teacher Education 2014, 25, 177−196.
(24) Reed, J. J.; Holme, T. A. The Role of Non-Content Goals in the Assessment of Chemistry Learning. In Innovative Uses of Assessment for Teaching and Research; Kendhammer, L. K., Murphy, K. L., Eds.; American Chemical Society: Washington, DC, 2014; pp 147−160.
(25) Underwood, S.; Posey, L.; Herrington, D.; Carmel, J.; Cooper, M. Adapting Assessment Tasks To Support Three-Dimensional Learning. J. Chem. Educ. 2018, 95 (2), 207−217.
(26) Chemistry Department, Purdue University. Acid-Base Equilibria: Monoprotic Acids. In Chemistry 12901: Laboratory Manual; Hayden-McNeil, LLC, 2016; pp 115−132.
(27) Charmaz, K. Constructing Grounded Theory: A Practical Guide through Qualitative Analysis; Sage Publications: Thousand Oaks, CA, 2006.
(28) Agustian, H. Y.; Seery, M. K. Reasserting the role of prelaboratory activities in chemistry education: a proposed framework for their design. Chem. Educ. Res. Pract. 2017, 18, 518−532.
(29) Galloway, K. R.; Bretz, S. Measuring Meaningful Learning in the Undergraduate General Chemistry and Organic Chemistry Laboratories: A Longitudinal Study. J. Chem. Educ. 2015, 92 (12), 2019−2030.
(30) Galloway, K. R.; Bretz, S. Using cluster analysis to characterize meaningful learning in a first-year university chemistry laboratory course. Chem. Educ. Res. Pract. 2015, 16 (4), 879−892.
(31) Galloway, K. R.; Bretz, S. Video episodes and action cameras in the undergraduate chemistry laboratory: eliciting student perceptions of meaningful learning. Chem. Educ. Res. Pract. 2016, 17, 139−155.
(32) Galloway, K. R.; Malakpa, Z.; Bretz, S. Investigating Affective Experiences in the Undergraduate Chemistry Laboratory: Students’ Perceptions of Control and Responsibility. J. Chem. Educ. 2016, 93 (2), 227−238.