Chapter 6


Faculty Goals, Inquiry, and Meaningful Learning in the Undergraduate Chemistry Laboratory

Stacey Lowery Bretz,* Kelli Rush Galloway, Joanna Orzel, and Elizabeth Gross

Department of Chemistry & Biochemistry, Miami University, Oxford, Ohio 45056, United States

*E-mail: [email protected]

Faculty goals for learning in the undergraduate General Chemistry and Organic Chemistry laboratory were measured. The experiments they selected for the laboratory courses were characterized with regard to inquiry. Students in these courses were asked to report their expectations and experiences with regard to meaningful learning. Analysis of these three data sets showed that faculty goals do not always align with their experiments and that there is little connection between faculty goals and students’ learning.

Introduction

From Ira Remsen’s wonder at his remarkable observations upon dropping a copper penny into nitric acid, to Oliver Sacks’ tales of his childhood explorations in Uncle Tungsten (1), chemists have long understood the importance of hands-on experimentation in the laboratory to learning chemistry. Several reviews document the chronology of the role of laboratory in chemistry education from the early 19th century through the next two centuries (2–6). Given the nearly ubiquitous existence of the teaching laboratory in undergraduate chemistry courses, it is surprising how little evidence exists to support the widely held view that laboratory courses are essential:


“Laboratories are one of the characteristic features of education in the sciences…rare to find any science course…without a substantial component of laboratory activity. However, very little justification is normally given… assumed to be necessary and important (7).”


“…research has failed to show simple relationships between experiences in the laboratory and student learning (5).”

“Duplicating what we chemists do in our laboratories (or what chemists of earlier generations used to do) does not enhance students’ understanding of chemistry’s centrality, but makes chemistry an irrelevance. Laboratory classes do not help students to understand how chemical principles affect their universe...The most important issue in the context of laboratory classes is whether there needs to be a laboratory program at all (8).”

In addition to sparse evidence regarding its effectiveness, the costs of laboratory instruction must also be considered: reagents that are ordered and consumed each year for hundreds of thousands of students, disposal and treatment of waste, as well as stipends and tuition for graduate student teaching assistants. The question must be asked: do laboratory courses in chemistry warrant their costs? Surely an argument could be constructed that labs are worth this significant financial infrastructure: carefully articulated goals for learning in the chemistry laboratory would lead to purposefully chosen experiments in the laboratory curriculum, culminating in meaningful learning experiences for students (Figure 1). Framing such an argument requires data, of course. This paper reports the findings of a research study designed to investigate alignment between faculty goals for laboratory learning and the experiments selected for students to carry out in undergraduate General Chemistry and Organic Chemistry laboratory courses.

Figure 1. Ideally, students’ expectations for learning in the undergraduate chemistry laboratory, and their subsequent experiences, would be shaped by a laboratory curriculum constructed to align with faculty goals.


Methods

Two studies were designed to investigate the alignment (or lack thereof) across faculty goals, laboratory curriculum, and student expectations and experiences (Figure 2). Study 1 analyzed faculty goals for the undergraduate laboratory using a previously published instrument by Bruck and Towns (9) and investigated whether these goals corresponded to the experiments chosen for the laboratory curriculum, which were characterized for their level of inquiry using a previously published rubric (10, 11). Study 2 examined these same faculty goals by comparing them to their students’ responses on a previously published instrument that measures meaningful learning in the undergraduate chemistry teaching laboratory (12). The research protocol was approved by the Institutional Review Board and all respondents provided informed consent.

Figure 2. The trio of instruments used to collect data and analyze alignment across faculty goals, laboratory experiments, and students’ expectations and experiences in the undergraduate chemistry laboratory.

Research Questions

Study 1: How are faculty goals for General Chemistry and Organic Chemistry laboratory aligned with their selected laboratory experiments as characterized by their level of inquiry?

Study 2: How are faculty goals for General Chemistry and Organic Chemistry laboratory aligned with students’ expectations and experiences?

Sample

Faculty whose students had completed the Meaningful Learning in the Laboratory Instrument (13) were invited to also complete the Faculty Goals Survey (9). A total of 34 faculty responded (Table 1). These faculty provided copies of the experiments for their General Chemistry (GC) or Organic Chemistry (OC) laboratory courses. A total of 289 experiments were analyzed across four courses:

• General Chemistry I (N=145 experiments)
• General Chemistry II (N=61 experiments)
• Organic Chemistry I (N=73 experiments)
• Organic Chemistry II (N=10 experiments)


Table 1. Chemistry faculty who responded to the Faculty Goals Survey

Institution Type       General Chemistry (GC)   Organic Chemistry (OC)   Total
Community College      1                        0                        1
Liberal Arts           1                        2                        3
Comprehensive          10                       3                        13
Research University    10                       7                        17
Total                  22                       12                       34

Instruments

Faculty Goals Survey

The Faculty Goals Survey (FGS) was developed by Bruck and Towns (9) to quantitatively measure the prevalence of chemistry faculty’s goals for learning in the undergraduate chemistry laboratory that had previously been reported in in-depth qualitative studies (14, 15). The FGS consists of 29 items across 7 factors (research experience, group work, error analysis, connections between laboratory and lecture, transferable skills both lab specific and not, and laboratory writing). Faculty respond to FGS items using a Likert scale of 1 (strongly disagree) to 6 (strongly agree). The FGS was administered using the online survey tool Qualtrics. Data were analyzed using the statistics package SPSS, and descriptive statistics, including histograms, were calculated for each FGS item by course. Each FGS item was coded a priori as cognitive, affective, or psychomotor using Novak’s meaningful learning framework (16, 17). Interrater agreement was calculated among three researchers until consensus was reached regarding the cognitive, affective, and psychomotor codes. The same FGS data set was drawn upon for both Study 1 and Study 2.
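As an illustration of the kind of per-item descriptive analysis described above, the following minimal sketch summarizes Likert responses by course and plots a histogram for a single item. It is only a sketch: the authors used SPSS, and the file name and column names ("course", "item_01" through "item_29", "item_17") are hypothetical.

```python
# Hypothetical sketch of per-item descriptive statistics for FGS Likert data.
# Assumes a CSV export with a "course" column (e.g., "GC", "OC") and one column
# per FGS item ("item_01" ... "item_29"), each scored 1 (strongly disagree) to
# 6 (strongly agree). File and column names are invented for illustration.
import pandas as pd
import matplotlib.pyplot as plt

fgs = pd.read_csv("fgs_responses.csv")
item_cols = [c for c in fgs.columns if c.startswith("item_")]

# Descriptive statistics (mean, sd, quartiles) for each item, split by course
summary = fgs.groupby("course")[item_cols].describe()
print(summary)

# Histogram of responses to one item for each course, mirroring Figures 5-7
for course, group in fgs.groupby("course"):
    group["item_17"].plot(kind="hist", bins=range(1, 8), alpha=0.5, label=course)
plt.xlabel("Likert response (1 = strongly disagree, 6 = strongly agree)")
plt.ylabel("Number of faculty")
plt.legend()
plt.show()
```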

Inquiry Rubric

The inquiry rubric characterizes six dimensions of a laboratory experiment (see Table 2), classifying each as either provided to students or not. The sum of these six elements provides an indication of the “degrees of freedom” that a student has in making choices about what problem to investigate, what data to collect, what data to analyze and against what theory, as well as how to communicate results and what conclusions can be drawn. As the degrees of freedom increase, the experiment offers more opportunities for inquiry, ranging from confirmation (all procedural details have been selected by faculty and provided to students) to authentic inquiry (no details are provided regarding the theoretical, experimental, or analytical choices to be made).


Table 2. Inquiry rubric to evaluate undergraduate chemistry laboratory experiments*

                        Level 0        Level ½              Level 1          Level 2        Level 3
                        Confirmation   Structured Inquiry   Guided Inquiry   Open Inquiry   Authentic Inquiry
Problem/Question        Provided       Provided             Provided         Provided       Not Provided
Theory/Background       Provided       Provided             Provided         Provided       Not Provided
Procedures/Design       Provided       Provided             Provided         Not Provided   Not Provided
Results Analysis        Provided       Provided             Not Provided     Not Provided   Not Provided
Results Communication   Provided       Not Provided         Not Provided     Not Provided   Not Provided
Conclusions             Provided       Not Provided         Not Provided     Not Provided   Not Provided

* Data sourced from Bruck, L. B.; Bretz, S. L.; Towns, M. H. Characterizing the level of inquiry in the undergraduate laboratory. J. Coll. Sci. Teach. 2008, 37, 52–58.
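To make the rubric’s “degrees of freedom” idea concrete, the sketch below scores an experiment from flags indicating which of the six elements are provided to students; the mapping from the count of student-decided elements to a named level follows Table 2. This is an illustration under stated assumptions, not the authors’ code, and the example experiment is invented.

```python
# Illustrative sketch: map the six rubric dimensions of Table 2 to an inquiry level.
# The flags and the example experiment below are hypothetical.

DIMENSIONS = [
    "problem_question", "theory_background", "procedures_design",
    "results_analysis", "results_communication", "conclusions",
]

# Number of elements NOT provided to students -> rubric level (per Table 2)
LEVELS = {
    0: "Level 0: Confirmation",
    2: "Level 1/2: Structured Inquiry",
    3: "Level 1: Guided Inquiry",
    4: "Level 2: Open Inquiry",
    6: "Level 3: Authentic Inquiry",
}

def inquiry_level(provided: dict) -> str:
    """provided maps each dimension to True (given to students) or False."""
    degrees_of_freedom = sum(not provided[d] for d in DIMENSIONS)
    # Experiments falling between the canonical patterns of Table 2 would need
    # a judgment call by the raters; here we simply report the count.
    return LEVELS.get(degrees_of_freedom,
                      f"{degrees_of_freedom} degrees of freedom (between levels)")

# Example: everything provided except how to analyze, communicate, and conclude
example = {
    "problem_question": True, "theory_background": True, "procedures_design": True,
    "results_analysis": False, "results_communication": False, "conclusions": False,
}
print(inquiry_level(example))   # -> "Level 1: Guided Inquiry"
```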

In Study 1, each of the 289 experiments was evaluated using the inquiry rubric. Inter-rater reliability was calculated between two researchers using Cohen’s Kappa, which equaled 0.875 (18).
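An inter-rater statistic of this kind can be computed, for example, with scikit-learn’s cohen_kappa_score once the two raters’ level codes are tabulated. The sketch below is only an illustration: the rating lists are invented placeholders rather than the 289-experiment data set, and no claim is made that this was the authors’ tooling.

```python
# Hypothetical check of inter-rater reliability for assigned inquiry levels.
from sklearn.metrics import cohen_kappa_score

rater_1 = ["0", "0", "1/2", "1/2", "1", "1", "1", "0", "1/2", "1"]
rater_2 = ["0", "0", "1/2", "1",   "1", "1", "1", "0", "1/2", "1"]

# Unweighted kappa by default; a weighted variant (cf. ref 18) is available
# via the weights argument if partial credit between adjacent levels is desired.
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa = {kappa:.3f}")  # the study reports 0.875 for two raters
```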

Meaningful Learning in the Laboratory Instrument (MLLI)

The Meaningful Learning in the Laboratory Instrument (MLLI) measures students’ expectations for, and experiences of, the cognitive and affective domains of learning within the context of the “doing” of laboratory experiments. Each of the 30 MLLI items was coded a priori using the meaningful learning framework as cognitive (e.g., I expect to focus on concepts, not procedures), affective (e.g., I worry about finishing on time), or a combination cognitive/affective (e.g., I felt unsure about the purpose of the procedures). The MLLI was administered online via Qualtrics, and students indicated their agreement with each statement on a scale from 0% (Completely Disagree) to 100% (Completely Agree). The MLLI was administered to students twice: once at the beginning of the semester, prior to completing any laboratory work, to measure students’ expectations for learning, and again at the end of the semester to capture students’ learning experiences in the laboratory. The verbs in the items were changed to past tense for the end-of-semester administration. Data collection and analyses for multiple MLLI studies have been previously reported (12, 13, 19).

Results and Discussion


Inquiry Rubric

Figure 3 depicts the level of inquiry across the 289 experiments evaluated in Study 1. Confirmation and Structured Inquiry were the two most common levels in both GC I and GC II courses, while Structured Inquiry and Guided Inquiry were most common in OC I. All experiments (N=10) in OC II were from one university and judged to be Guided Inquiry. None of the 289 experiments evaluated in Study 1 were at the level of Open or Authentic Inquiry, meaning that students were never asked to generate procedures, to consider elements of experimental design, or to pose their own question to investigate.

Figure 3. Levels of Inquiry across four types of courses in Study 1.

Figure 4 shows how the level of inquiry varied across schools, ranging from 100% Confirmation (School 14, S14) to 80% Guided Inquiry (School 06, S06) in GC I. (Each school that volunteered to collect MLLI and/or FGS data was assigned a two-digit number. Not all schools are included in Figure 4 because some schools did not collect and/or return complete data sets.) For the five schools that provided GC II experiments, the level of inquiry ranged from 58% Confirmation (School 24) to 40% Guided Inquiry (School 19). The five schools providing data for OC I were predominantly Guided Inquiry.

Figure 4. Levels of Inquiry across schools in Study 1.

Faculty Goals Survey

GC and OC faculty responded similarly to each of the FGS items. Data for three items are included here as representative of their responses: “Laboratory activities and experiments selected for this course are designed to focus on skills that are transferable to research-oriented laboratories” (Figure 5), “…have students present data in multiple formats” (Figure 6), and “…teach students to build logical arguments based on their data” (Figure 7).


Figure 5. GC (left, N=22) and OC (right, N=12) faculty responses to FGS item that laboratories should be focused on skills that are transferable to research-oriented laboratories. (1 = strongly disagree, 6 = strongly agree).

Figure 6. GC (left, N=22) and OC (right, N=12) faculty responses to FGS item that the laboratory should be designed to have students present data in multiple formats, such as PowerPoint, posters, laboratory reports, etc. (1 = strongly disagree, 6 = strongly agree).



Figure 7. GC (left, N=22) and OC (right, N=12) faculty responses to FGS item that the laboratories should teach students to build logical arguments based on their data. (1 = strongly disagree, 6 = strongly agree).

Meaningful Learning in the Laboratory Instrument (MLLI)

In order to explore the data sets for both Study 1 and Study 2, one school (School 02) was randomly selected from amongst those that submitted FGS responses for two or more faculty, experiments for levels analysis using the rubric, and MLLI data for at least GC I and GC II. Plots of representative MLLI responses for GC I students (N=138) at School 02 can be found in Figures 8, 9, and 10. Pre-semester expectations are plotted on the x-axis and post-semester experiences on the y-axis. The diagonal line represents responses where experiences matched expectations; points below the diagonal line represent responses where expectations exceeded experiences. In Figure 8, nearly half of the students (N=66) reported that their experiences with regard to making decisions about what data to collect failed to meet their expectations. In Figure 9, more than 75% of the students (N=108) reported that, while they had expected to be excited about doing chemistry, their experiences failed to meet these expectations. In Figure 10, more than 73% of the students (N=101) reported that, while they had expected to be required to interpret their data beyond only doing calculations, their experiences failed to meet these expectations.



Figure 8. GC I students’ experiences vs. expectations for MLLI item 3 “to make decisions about what data to collect.”

Figure 9. GC I students’ experiences vs. expectations for MLLI item 8 “to be excited to do chemistry.”



Figure 10. GC I students’ experiences vs. expectations for MLLI item 22 “to interpret my data beyond doing calculations.”
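Expectation-versus-experience plots of the kind shown in Figures 8–10 can be generated from paired pre/post MLLI responses. The sketch below is only an illustration of the plotting convention described above; the file name and the "pre"/"post" column names (both on the 0–100% agreement scale) are hypothetical.

```python
# Illustrative sketch of an MLLI expectations-vs-experiences plot (cf. Figures 8-10).
# "pre" = beginning-of-semester expectation, "post" = end-of-semester experience,
# both expressed as 0-100 percent agreement. File and column names are invented.
import pandas as pd
import matplotlib.pyplot as plt

item = pd.read_csv("mlli_item22_gc1.csv")       # hypothetical paired responses

fig, ax = plt.subplots(figsize=(5, 5))
ax.scatter(item["pre"], item["post"], alpha=0.6)
ax.plot([0, 100], [0, 100], color="gray")       # diagonal: experience = expectation
ax.set_xlabel("Expectation (% agreement, pre-semester)")
ax.set_ylabel("Experience (% agreement, post-semester)")
ax.set_xlim(0, 100)
ax.set_ylim(0, 100)

# Students below the diagonal reported experiences that fell short of expectations
n_below = (item["post"] < item["pre"]).sum()
print(f"{n_below} of {len(item)} students fell below the diagonal")
plt.show()
```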

Faculty Goals Survey and Inquiry Rubric

In Study 1, the authors examined each of the FGS items to identify those that corresponded to the characterization of experiments as confirmation or inquiry experiences. Five items were identified and agreed upon by all the authors. Three FGS items corresponded to inquiry experiences in the laboratory:

• Teach students to build logical arguments based on data.
• The laboratory is designed to encourage the development of scientific reasoning skills.
• Laboratory is a place for students to learn to analyze data.

while two FGS items corresponded to confirmation experiences in the laboratory:

• Explore concepts already discussed in lecture.
• The goal for laboratory instruction is to reinforce lecture content.

These five items were designated, respectively, as Logic, Reasoning, Analyze Data, Concepts, and Lecture (Table 3).


Table 3. Faculty Goals Survey responses for School 02

                GC I             GC II
Logic           Strongly Agree   Strongly Agree
Reasoning       Disagree         Strongly Agree
Analyze Data    Agree            Strongly Agree
Concepts        Strongly Agree   Disagree
Lecture         Strongly Agree   Agree

To explore alignment between faculty goals and the levels of inquiry in laboratory experiments, data were analyzed for a randomly selected school from among those with FGS and Levels data for both GC I and GC II. The responses of the two faculty from School 02 on these five FGS items (Logic, Reasoning, Analyze Data, Concepts, Lecture) were summarized and compared to the levels of inquiry in the GC I and GC II experiments carried out at School 02. Table 3 and Figure 11 summarize the FGS and Levels data for School 02, respectively.

Figure 11. Levels of Inquiry in GC I and GC II for School 02.

Faculty Goals Survey and Meaningful Learning in the Laboratory Instrument

In Study 2, because the FGS items and the MLLI items were all coded based on meaningful learning theory, we expected to find cognitive or affective items that appeared both as faculty goals and as elements of meaningful learning for students. Surprisingly, however, the items on the FGS could not be mapped reliably onto MLLI items, despite being coded as cognitive or affective by multiple researchers.


Limitations


There are limitations to this study. We asked faculty to answer the FGS, to provide copies of their experiments, and to have their students complete the MLLI twice in one semester, ideally for GC I, GC II, OC I, and OC II. Our data set consisted almost entirely of “partial” responses, e.g., faculty who did not answer the FGS but sent experiments and whose students completed the MLLI, or who provided MLLI data for just one of the courses. This limited the set of complete responses in which FGS goals from a given instructor could be mapped to their selected experiments and to their students’ MLLI responses.

Conclusions

Figure 1 depicts a rational argument for how faculty goals for learning in the laboratory should drive experiment selection and, ultimately, students’ experiences. Two research studies were carried out to compare faculty goals with the degree of inquiry in their experiments (Study 1) and with meaningful learning for their students (Study 2).

In Study 1, the level of inquiry for the ‘same’ course varied across universities, with a general trend that as students moved from GC to OC, the degree of inquiry in their experiments increased. A comparison of Table 3 and Figure 11 reveals little consistency at School 02 between faculty goals and their selected experiments: some goals aligned with the selected experiments while others did not.

In Study 2, while many of the students’ experiences failed to meet their expectations for cognitive and affective learning, we were ultimately unable to answer the original research question of how faculty goals for General Chemistry and Organic Chemistry laboratory are aligned with students’ expectations and experiences, despite using previously published data collection tools that generated reliable and valid data in previous studies. The FGS was developed from a voluminous corpus of interview data with faculty teaching General Chemistry, Organic Chemistry, and upper division (physical, analytical, biochemistry) laboratories, both at institutions whose laboratory program had remained unchanged for many years and at institutions that had successfully procured external funding to innovate their laboratory programs. The FGS items represent the consensus of chemistry faculty at a wide variety of institutions and courses about what is important to learn in the undergraduate chemistry laboratory. Meanwhile, the MLLI items represent cognitive and affective dimensions of the undergraduate chemistry laboratory consistent with theory about how human beings learn. The fact that these two methodological approaches resulted in non-overlapping data sets is not the result of poor research design, but rather an important piece of evidence that faculty approaches to choosing laboratory experiments for students are not aligned with opportunities for cognitive and affective learning.

To further pursue the aims of Study 2, a new study was planned in which faculty were asked to answer the MLLI as they hoped their students would. Data analysis is underway and will be published in a future manuscript.

Acknowledgments

This work was supported by the Volwiler Family Endowment to the Miami University Department of Chemistry & Biochemistry and by National Science Foundation grant number 0733642. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References


1. Sacks, O. Uncle Tungsten: Memories of a Chemical Boyhood; Vintage Books: New York, 2002.
2. Good, H. G. On the early history of Liebig’s laboratory. J. Chem. Educ. 1936, 13, 557–562, DOI: 10.1021/ed013p557.
3. Hofstein, A.; Lunetta, V. N. The role of the laboratory in science teaching: Neglected aspects of research. Rev. Educ. Res. 1982, 52, 201–217.
4. Hofstein, A.; Lunetta, V. N. The laboratory in science education: Foundations for the twenty-first century. Sci. Educ. 2004, 88, 28–54, DOI: 10.1002/sce.10106.
5. Hofstein, A.; Mamlok-Naaman, R. The laboratory in science education: The state of the art. Chem. Educ. Res. Pract. 2007, 8, 105–107, DOI: 10.1039/B7RP90003A.
6. Elliot, M. J.; Stewart, K. K.; Lagowski, J. J. The role of the laboratory in chemistry instruction. J. Chem. Educ. 2008, 85, 145–149, DOI: 10.1021/ed085p145.
7. Reid, N.; Shah, I. The role of laboratory work in university chemistry. Chem. Educ. Res. Pract. 2007, 8, 172–185, DOI: 10.1039/B5RP90026C.
8. Rice, J. W.; Thomas, S. M.; O’Toole, P. In Tertiary Science Education in the 21st Century; Australian Council of Deans of Science: Melbourne, 2009; p 13.
9. Bruck, A. D.; Towns, M. H. Development, implementation, and analysis of a national survey of faculty goals for undergraduate chemistry laboratory. J. Chem. Educ. 2013, 90, 685–693, DOI: 10.1021/ed300371n.
10. Fay, M. E.; Grove, N. P.; Towns, M. H.; Bretz, S. L. A rubric to characterize inquiry in the undergraduate chemistry laboratory. Chem. Educ. Res. Pract. 2007, 8, 212–219, DOI: 10.1039/B6RP90031C.
11. Bruck, L. B.; Bretz, S. L.; Towns, M. H. Characterizing the level of inquiry in the undergraduate laboratory. J. Coll. Sci. Teach. 2008, 37, 52–58.
12. Galloway, K. R.; Bretz, S. L. Development of an assessment tool to measure students’ meaningful learning in the undergraduate chemistry laboratory. J. Chem. Educ. 2015, 92, 1149–1158, DOI: 10.1021/ed500881y.
13. Galloway, K. R.; Bretz, S. L. Measuring meaningful learning in the undergraduate chemistry laboratory: A national, cross-sectional study. J. Chem. Educ. 2015, 92, 2006–2018, DOI: 10.1021/acs.jchemed.5b00538.
14. Towns, M.; Bretz, S. L.; Bruck, L. B. Faculty perspectives of undergraduate chemistry laboratory: Goals and obstacles to success. J. Chem. Educ. 2010, 87, 1416–1424, DOI: 10.1021/ed900002d.



15. Bretz, S. L.; Fay, M. E.; Bruck, L.; Towns, M. H. What faculty interviews reveal about meaningful learning in the undergraduate laboratory. J. Chem. Educ. 2013, 90, 281–288, DOI: 10.1021/ed300384r.
16. Novak, J. D. Human constructivism: A unification of psychological and epistemological phenomena in meaning making. Inter. J. Pers. Const. Psych. 1993, 167–193, DOI: 10.1080/08936039308404338.
17. Bretz, S. L. Human constructivism and meaningful learning. J. Chem. Educ. 2001, 78, 1107, DOI: 10.1021/ed078p1107.6.
18. Cohen, J. Weighted kappa: Nominal scale agreement with provision for scaled disagreement or partial credit. Psych. Bull. 1968, 70, 213–220, DOI: 10.1037/h0026256.
19. Galloway, K. R.; Bretz, S. L. Using cluster analysis to characterize meaningful learning in a first-year university chemistry laboratory course. Chem. Educ. Res. Pract. 2015, 16, 879–892, DOI: 10.1039/C5RP00077G.
