
Development of the Quantization and Probability Representations Inventory as a Measure of Students' Understandings of Particulate and Symbolic Representations of Electron Structure

Zahilyn D. Roche Allred and Stacey Lowery Bretz*

Department of Chemistry and Biochemistry, Miami University, Oxford, Ohio 45056, United States



ABSTRACT: This article describes the development of the Quantization and Probability Representations Inventory (QuPRI) as a measure of student understanding of the electron structure of the atom. The QuPRI was created using a mixed-method sequential design such that the items and distractors were generated on the basis of the analysis of semi-structured interviews in which students were asked to interpret multiple representations of the electron structure of hydrogen, helium, and carbon atoms. The QuPRI was administered to first-semester general chemistry students (N = 655) and physical chemistry/biophysical chemistry students (N = 38). Descriptive statistics and item function are presented for each sample, including evidence for the reliability and validity of the data generated by the QuPRI. Students' confidence in their responses and reasoning about the electron structure of atoms is discussed.

KEYWORDS: High School/Introductory Chemistry, First-Year Undergraduate/General, Upper-Division Undergraduate, Chemical Education Research, Physical Chemistry, Misconceptions/Discrepant Events, Testing/Assessment, Atomic Properties/Structure, Quantum Chemistry

FEATURE: Chemical Education Research



INTRODUCTION

Students are first introduced to atoms during grade school. High school students in chemistry courses are taught more detailed model(s) of the electron structure of the atom, such as the Bohr model and/or the quantum model. Ausubel and Novak's Meaningful Learning Theory posits that, in order for students to learn meaningfully, they must be able to make connections between new material to be learned and what they already know (i.e., their prior knowledge).1,2 Because university students will bring a variety of ideas regarding the electron structure of the atom to their introductory chemistry courses, and this prior knowledge will shape how they learn other concepts in chemistry,3−7 it is important for instructors to assess what students already know and teach them accordingly (to paraphrase Ausubel).8 Although asking students to create concept maps9 or answer open-ended questions has the potential to yield insights into students' understandings, these can be very time-consuming for teachers to read and analyze. Assessments such as concept inventories offer an efficient alternative for measuring students' understandings.10,11

In the past two decades, nine assessments have been developed to measure students' ideas about a variety of concepts related to quantum mechanics. Eight instruments were developed in physics,12−19 and one instrument was devised for upper-level chemistry students.20 The instruments that measure physics students' knowledge focused on their mathematical understanding of quantum mechanics using multiple choice items12,13,15−19 or short answer questions.14 The singular chemistry instrument, known as the Quantum Chemistry Concept Inventory (QCCI),20 was developed with a focus on quantum concepts as typically taught in physical chemistry courses. None of these measures assess students' interpretations of electron structure representations, despite the large body of literature in chemistry education and science education reporting not only students' difficulties with understanding the quantum model of the atom but also their preferences for more concrete models such as the Bohr model.3,5−7,21−31 Many of these studies have documented how students invoke quantum mechanics concepts to describe their mental model of the atom, which often resembles the Bohr model. A common finding in all of these studies is students' difficulties with distinguishing between classical and quantum mechanics concepts and terms.6,7,25,30,32−35 Research by Park and Light (2009) regarding students' mental models of the atom proposed that, for students to develop a sound, conceptual understanding of the quantum model, they must first develop an understanding about the concepts of probability and energy quantization and how these two concepts are related.7


Figure 1. Phases of the semi-structured interviews including the representations shown to students to elicit their ideas about energy quantization and probability. (A) Common energy level diagram for the hydrogen atom. (B) Multiple representations of the helium atom created by the authors. (C) Boundary surface representations used to depict the carbon atom. Reproduced with permission from Stoker’s General, Organic, and Biological Chemistry, 4E. Copyright 2007 Brooks/Cole, a part of Cengage, Inc., www.cengage.com/permissions. (D) Electron probability diagrams to depict the carbon atom. Reproduced with permission from https://www.chemguide.co.uk/atoms/properties/atomorbs.html. Copyright 2000 Jim Clark.

Furthermore, research in mathematics education and cognitive psychology has demonstrated how students' intuition and everyday ideas influence their probabilistic reasoning, resulting in inconsistent reasoning when thinking about events involving likelihood.36−43 Given these previous findings and the importance of students developing a conceptual understanding of the quantum model of the atom, we sought to create an assessment to investigate students' thinking about energy quantization and probability using multiple representations. Interpreting representations and forging meaningful connections among them are difficult tasks for students.44,45 Students struggle to see similarities and differences among multiple representations of the same phenomenon, instead focusing on the surface features of representations.44−46 Several instruments have been developed by our research group in the past decade to characterize students' thinking at the interface of multiple representations for a variety of phenomena and reactions, including flame tests,47 enzyme−substrate interactions,48 chemical bonding,49 redox reactions,50 and dissolution and precipitation.51 Given the prevalence of representations involved in learning the electron structure of the atom and the absence of an assessment to measure student thinking about those representations, the Quantization and Probability Representations Inventory (QuPRI) was developed. The research questions that shaped this inquiry follow:

1. What evidence exists for the validity and reliability of the data collected during the administration of the QuPRI?
2. What are the prevalent interpretations of general chemistry and physical chemistry students about multiple representations of the electron structure of the atom, particularly with regard to electron probability and energy quantization?



METHOD

The previously published assessments that measure students' thinking about quantum mechanics concepts were developed by using answers to open-ended questions,15 modifications to end-of-chapter questions,12 and previous research findings about students' misconceptions.12,20 The QuPRI was developed using a mixed-method sequential design,52,53 following the recommendations of the National Research Council.54 Rather than generating assessment items, responses, and distractors by asking experts to identify salient content and draft items accordingly,55−58 the QuPRI was developed by analyzing students' interviews to formulate attractive distractors using students' language to ensure the elicitation of common student misconceptions.11 Previous literature26,27,35,59,60 and the general chemistry textbook61 (in use at the authors' institution when the interviews were designed) were surveyed to identify typical representations of both the atom and energy level diagrams. The authors generated an energy level diagram for the hydrogen atom and four typical representations of the atom, namely, the Bohr model, electron cloud model, probability model, and boundary surface model. The diagram and representations were used during semi-structured interviews to elicit students' interpretations of particulate and symbolic representations of the electron structure of the atom with regard to both probability and energy quantization. Students' explanations and discussions of the energy level diagram and four representations of the atom were analyzed in order to construct the items (including distractors) for the QuPRI.

Inventory Development and Structure

Students enrolled in the general chemistry (GC), physical chemistry (PC), and biophysical chemistry (BPC) courses were invited to participate in semi-structured interviews,62 as the quantum model of the atom is taught in these courses. The students in the GC course were enrolled in one of four large lecture sections that met three times per week and a laboratory section that met once per week. The lectures were taught by different instructors, but they all used the same commercially available textbook.61 Students were assessed with three multiple choice exams during the semester and an American Chemical Society standardized final exam. In the GC course, students were introduced to the concept of energy quantization and the historical development of the electron models of the atom. Students enrolled in their third or fourth year of their chemistry or biochemistry major had the option to enroll in either the PC or the BPC course, both of which were offered in the Department of Chemistry and Biochemistry.




Both the PC course and the BPC course consisted of large lectures that met twice per week. The PC and BPC courses were taught by two different instructors. In these courses, students were taught a more detailed explanation of the mathematical foundation for the quantum model of the atom. The primary difference between the PC and BPC courses was that students enrolled in the BPC course were taught all the concepts in the context of biomacromolecules.

For this study, a total of 34 interviews (26 GC, 3 PC, and 5 BPC) were conducted to elicit students' ideas about the electron structure of the atom. All students were interviewed postinstruction on the quantum model of the atom and after they had been assessed by their course instructor. Institutional Review Board approval was obtained for the research, and all interview participants were informed of their rights as participants in the study. All students were assigned a pseudonym for reporting purposes.

The interview guide was designed to contain four phases in which students' ideas about both electron probability and energy quantization were elicited using both particulate and symbolic representations (Figure 1). The macroscopic domain of Johnstone's triangle44 was not investigated in this study because connections between the macroscopic and symbolic domains have been previously reported in the development of the Flame Test Concept Inventory.47 The semi-structured interviews began by exploring the students' understandings about electron probability and energy quantization, because they had been taught and tested about these ideas in general chemistry and either physical chemistry or biophysical chemistry. That is, phase I explored the students' prior knowledge before asking them to interact with any representations of electron probability and energy quantization. Students were asked whether (or not) electron probability and/or energy quantization were related to the atom, and if so, how these concepts were related to the atom. Students' responses were explored with follow-up questions in order to ensure that the interviewer understood their answers.

In phase II of the interview, students were shown an energy level diagram for the hydrogen atom (Figure 1A) and were asked to explain each feature of the representation, e.g., horizontal lines, n values, and energy values. Phase III of the interview guide asked students to interact with two sets of representations: helium atoms (Figure 1B) and orbitals in a carbon atom (Figure 1C,D). During phase III, students were asked to describe and interpret the salient features of four representations of the helium atom: an electron probability model, an electron cloud model, a boundary surface model, and a Bohr model. They were asked to rank these four representations from most preferred to least preferred when thinking about a model of the helium atom.25 They were also asked to interpret two electron probability statements.63 Students were then shown two sets of atomic orbitals (Figure 1C,D) and asked to choose the orbitals that they would use to represent the carbon atom, to explain why they chose those orbitals, and to explain why they did not choose the others. To conclude the interview, students were asked in phase IV to describe any connections that might exist between the energy level diagram in phase II and the atomic orbital representations shown in phase IIIb.

Each interview was transcribed verbatim. The transcripts were augmented both with notes from any drawings the students made using a Livescribe digital pen and with additional information gleaned from watching the video (e.g., annotating what specific part of a particular representation a student was pointing to when saying "this represents..."). The transcripts were then analyzed to identify students' explanations and interpretations of the representations by using the constant comparative method.64,65 A codebook was generated by the authors, who met on a weekly basis to discuss and revise the codes.66 Students' interpretations of the multiple particulate and symbolic representations were compared to each other in order to examine similarities and differences between and within students' responses. In addition to inductive coding, students' responses were also compared to previously reported findings in the literature. To ensure the trustworthiness of the findings, a detailed audit trail was created, and the authors met monthly with chemistry education researchers who were not involved in the data collection and analysis for peer debriefing and scrutiny.67 Findings from these qualitative data analyses have been reported elsewhere.25,63

The analyses of students' interpretations of the multiple representations resulted in the identification of both accurate and inaccurate reasoning about probability and energy quantization; these ideas were used to develop each item on the QuPRI and its distractors. For example, during phase III of the interview, students were asked to explain the meaning of the dots of the electron probability model. The explanations offered by three students included the following:

"I feel like they [dots] are just everything, like the electrons, the protons, and neutrons, like all of them, but just like sporadically placed." (Agustina, BPC student)

"...I've seen electrons depicted kind of in this form. So, I assume that's what they were trying to convey that these [dots] were individual electrons." (Mariana, GC student)

"I mean, I would say dots are just basically where the mass of the nucleus is centered because directly in the middle is where the concentration of the dots is the highest." (Tomás, GC student)

These explanations were used to craft distractors B, C, and D in Box 1.


For each item, direct quotes were used as distractors, whenever possible, in order to reflect the students' descriptions of probability and energy quantization and their interpretations of the features of the representations. A total of 53 items were initially developed from the data analysis. After multiple rounds of revisions between the authors, the number of items was reduced to 31. These 31 items were sent to 11 faculty members who were experienced in teaching first-year chemistry and/or physical chemistry at multiple universities. The instructors were asked to review (1) the accuracy of the chemistry content, (2) the clarity of the items and distractors, (3) which items best represented the concepts of probability and energy quantization, and (4) any topics that might have been omitted but that they considered to be essential for an inventory focused on representations of electron structure with a focus on energy quantization and probability. Revisions to the 31 items were made on the basis of the feedback of the faculty experts. For example, three experts suggested the authors more clearly distinguish between an orbital representation that depicts "the region of space in which there is a high probability of finding the electron"61 and the definition of an orbital, which "is the space where the electron IS because the integral of [Ψ2] over all space is 1" (Professor D). Although these three experts suggested describing the orbital in terms of a mathematical expression, the authors decided not to mention the wave function as part of the inventory because not all first-year general chemistry students are introduced to wave functions. Instead, the item stems and distractors were modified to ensure sufficient distinction between an orbital representation and an orbital. After addressing additional suggestions and comments from the faculty experts and multiple rounds of revisions between the authors, the QuPRI was ready to be pilot tested with students.

The pilot-test version of the QuPRI consisted of 26 items, including three items intended to measure students' interpretations of the Bohr model, nine items that aimed to assess students' ideas about electron probability, and nine items that were designed to assess students' ideas about energy quantization and their understanding of the energy level diagram for the hydrogen atom. In addition, six of these items afforded students the opportunity to conflate the ideas of energy quantization and probability based on the distractors (see Table S.1). The representations included on the QuPRI are shown in Figure 2. The pilot-test version of the QuPRI also included one item that asked students to choose, from among four different models of the atom, the one representation that comes to mind when thinking about the helium atom, and another item that asked the students which of the four models they thought was most scientifically accurate. The last item on the QuPRI asked the students to select, from among four possibilities, the phrase that comes to their mind when they hear the word "quantization". These three final items did not have a singular correct answer but rather were used to gather data about the students' preferences. Each of the 26 items also asked students to indicate their confidence in their response by marking an "X" on the scale in Box 1 from 0% (just guessing) to 100% (absolutely certain).50,68,69

Figure 2. Representations included on the Quantization and Probability Representations Inventory (QuPRI).

Concept Inventory Administration

The 26-item QuPRI was pilot-tested with both GC (N = 862) and PC/BPC (N = 46) students at a large, predominantly undergraduate institution in the midwestern United States. Responses were collected after the students had been formally taught and assessed in their GC or PC/BPC class regarding the quantum model of the atom. Students required approximately 15 min to complete the inventory. After the administration of the QuPRI, 18 of these students (8 GC, 10 PC/BPC) participated in individual response process validation interviews. These students were purposefully sampled67 from all the students who volunteered to participate in the study, primarily on the basis of their total score on the QuPRI, in order to allow the authors to investigate whether all students, regardless of their total score, were interpreting the items and distractors as intended. Students' college major, year in college, race/ethnicity, and gender were also taken into consideration when selecting interviewees, in order to interview a group of students who were representative of those enrolled in both the GC and the PC/BPC courses. Students were interviewed 1 week after they had answered the QuPRI. Each interview lasted approximately 45 min. During the interviews, students were given a blank copy of the QuPRI and asked to answer each item while thinking aloud about how they interpreted the item and each response option. The validation interviews offered insights into the clarity of the items and responses, including the distractors. Results from the student validation interviews are discussed below.

Data Analysis

The raw data for responses to the 26-item QuPRI from students who provided consent to participate in the study were entered into SPSS.70 Students' confidence ratings were entered as a percentage for each item. Students who omitted confidence responses or who created patterns with their responses (e.g., ABABAB...) were removed from the study. A total of 655 GC, 7 PC, and 31 BPC students provided consent, answered every item, and provided a confidence rating for each item (see Figure S.1 for the flow of participants in the study).



Correct answers for each of the 23 assessment items were scored as 1, and incorrect answers were scored as 0. The answers for the three final items that asked for students' preferences were not scored as correct or incorrect. Descriptive statistics and reliability coefficients were calculated using SPSS. Item- and test-level psychometrics such as item difficulty, item discrimination, and Ferguson's δ were calculated using Excel.
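These item- and test-level statistics follow standard classical test theory definitions. As an illustration only (the authors report using SPSS and Excel, not code), a minimal Python sketch of item difficulty, item discrimination based on a top/bottom 27% split, and Ferguson's δ might look like the following; the function name and the 0/1 score matrix are hypothetical, and the Ferguson's δ expression is one commonly used form rather than necessarily the authors' exact procedure.

```python
import numpy as np

def item_statistics(scores, split=0.27):
    """Classical test theory statistics for a 0/1 score matrix.

    scores: 2D array, shape (n_students, n_items); 1 = correct, 0 = incorrect.
    Returns item difficulty p, discrimination D (top vs bottom 27% groups),
    and Ferguson's delta for the total-score distribution.
    """
    scores = np.asarray(scores)
    n_students, n_items = scores.shape
    total = scores.sum(axis=1).astype(int)

    # Item difficulty: proportion of students answering each item correctly.
    difficulty = scores.mean(axis=0)

    # Discrimination: difference in difficulty between the top and bottom 27%
    # of students ranked by total score.
    order = np.argsort(total)
    k = max(1, int(round(split * n_students)))
    bottom, top = scores[order[:k]], scores[order[-k:]]
    discrimination = top.mean(axis=0) - bottom.mean(axis=0)

    # Ferguson's delta: how broadly the total scores spread over the
    # possible range 0..n_items (one common form of the formula).
    freqs = np.bincount(total, minlength=n_items + 1)
    delta = (n_students**2 - np.sum(freqs**2)) / (
        n_students**2 - n_students**2 / (n_items + 1)
    )
    return difficulty, discrimination, delta
```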




RESULTS AND DISCUSSION

Descriptive Statistics

The central tendencies and range of scores are shown in Table 1. The distributions of scores and mean confidence ratings for the GC students are shown in Figure 3.

Table 1. Descriptive Statistics for QuPRI Responses

                       GC (N = 655)                PC/BPC (N = 38)
Statistic          Items    Confidence (%)     Items    Confidence (%)
Mean               10.4     58.5               15.1     73.9
Std Dev             3.5     17.0                2.9     11.5
Minimum             2.0      0.2                7.0     51.2
Median             10.0     58.9               15.0     74.7
Maximum            19.0    100.0               20.0     95.7
Ferguson's δ        0.96                        0.92

The scores for the GC sample (Figure 3) were not normally distributed according to the Kolmogorov−Smirnov test for normality (D = 0.088, p < 0.001). However, neither a ceiling effect (i.e., the QuPRI was very easy for many students) nor a floor effect (i.e., the QuPRI was very difficult for many students) was observed. No student earned a score of 0 or the maximum possible score of 23. Initially, the PC and BPC samples were evaluated independently, but the results of a t-test indicated that these two samples did not differ significantly.71 (The descriptive statistics and tests of normality for both the PC and the BPC samples are available in the Supporting Information, Tables S.2 and S.3.) Therefore, the two samples were combined for additional data analyses. The distributions of total scores and mean confidence ratings for the PC/BPC students are shown in Figure 4. The Shapiro−Wilk (S−W) test was used to assess the normality of the total score distribution for the combined PC/BPC sample because the sample size was below 50, as recommended for small samples.72 The result of the S−W test suggested a normally distributed sample (W = 0.957, p = 0.238).72 Inspection of the distribution of scores for the GC students in Figure 3 and the PC/BPC students in Figure 4 shows that the PC/BPC scores are left skewed in comparison to those of the GC students, but no student in the PC/BPC sample answered all the questions correctly. These results suggest that the QuPRI is also challenging for PC/BPC students.

Validity

The validity of the data generated by the QuPRI was examined in multiple forms.73 Test content validity was established when 11 instructors with years of experience teaching general chemistry and/or physical chemistry reviewed the accuracy of the items. Comments and suggestions from these experts were used to modify questions and reduce the number of items. After pilot-testing the QuPRI with both samples, response process validity interviews were conducted with n = 18 students (n = 8 GC, n = 10 PC/BPC) to ensure that students interpreted the item stems and distractors as intended. Two items were modified on the basis of students' comments during the response process interviews that they found distractors to be confusing. For example, item 6 in the pilot version of the QuPRI asked students to choose the best interpretation of the statement that there is a 90% probability of finding an electron within a three-dimensional representation depicting an orbital. About one-fourth of the students who participated in the response process validity interviews found one of the distractors "contradictory", subsequently leading them to examine each of the options using test-taking strategies. Students' thoughts and comments about the item and its multiple choice options were used to modify the item. Both content and response process validity are essential and common practices in the development of an assessment, as these forms of validity enable researchers to generate evidence against threats to the validity of the results obtained from the assessment.73−75

The concurrent validity of the QuPRI data was examined by comparing the GC sample and the PC/BPC sample. Concurrent validity refers to the extent to which a measurement compares to some criterion, specifically, an expected outcome.73,76 Students in the PC/BPC sample were expected to perform better than students in the GC sample, and as expected, the more experienced PC/BPC students generated a higher mean score of 15.1 ± 2.9, whereas the GC students' mean score was 10.4 ± 3.5 (Table 1). In order to determine if this difference in the scores was statistically significant, a Mann−Whitney U-test was conducted because the distribution of scores for the GC sample was not normally distributed. Results from the Mann−Whitney U-test (U = 20,976.5, p < 0.001, η2 = 0.074) suggest that, as students receive additional instruction regarding the quantum model of the atom, they perform better on the QuPRI.
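The normality checks and the Mann−Whitney comparison described above were run in SPSS. A rough Python equivalent, shown only as a sketch, could use scipy; the score arrays below are placeholders, and the η² value is estimated from the normal approximation to the U statistic, which is one common convention rather than necessarily the authors' procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical arrays of total QuPRI scores (placeholders for illustration).
rng = np.random.default_rng(0)
gc_totals = rng.integers(2, 20, size=655)
pc_totals = rng.integers(7, 21, size=38)

# Normality checks: Shapiro-Wilk for the small PC/BPC sample; a
# Kolmogorov-Smirnov test against a fitted normal for the GC sample
# (SPSS applies the Lilliefors correction, which kstest does not).
w, p_sw = stats.shapiro(pc_totals)
d, p_ks = stats.kstest(gc_totals, "norm",
                       args=(gc_totals.mean(), gc_totals.std(ddof=1)))

# Mann-Whitney U test comparing the two samples, with an eta-squared
# effect size from the normal approximation z = (U - mu_U) / sigma_U.
u, p_u = stats.mannwhitneyu(pc_totals, gc_totals, alternative="two-sided")
n1, n2 = len(pc_totals), len(gc_totals)
mu_u = n1 * n2 / 2
sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
eta_squared = ((u - mu_u) / sigma_u) ** 2 / (n1 + n2)
```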

Figure 3. Distributions of GC students' total scores and mean confidence ratings for their QuPRI responses.


Figure 4. Distributions of PC/BPC students’ total scores and mean confidence ratings for their QuPRI responses.

Reliability and Item Function


The internal consistency of the QuPRI scores for both the GC sample and the PC/BPC sample was examined for all 23 items, for the probability items, and for the energy quantization items (Table 2). Cronbach's α is the most widely used statistic to measure how closely related items measure the same construct, and α ≥ 0.70 has been the generally accepted standard to indicate internal consistency. Inspection of Table 2 indicates that none of the calculated α values reach this threshold. However, previous research has suggested that Cronbach's α may not be an appropriate measure of reliability for assessments that are not unidimensional, including concept inventories,10,50,68,77−79 because individual items are constructed to detect students' fragmented knowledge. Therefore, additional measures of reliability were examined by administering the final version of the QuPRI to GC students at multiple institutions across the United States of America. Results from the national sample were similar to those presented for the GC sample as discussed in this paper. (See Supporting Information for results of the national study, Table S.4 and Figure S.2.)

Table 2. Internal Consistency (Cronbach's α) for the QuPRI

Items                    GC (N = 655)    PC/BPC (N = 38)
All                      0.63            0.56
Probability              0.45            0.28
Energy quantization      0.65            0.56
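Cronbach's α was reported from SPSS. For readers who want to reproduce the calculation, a minimal, hypothetical Python sketch of the usual formula, α = (k/(k−1))(1 − Σσᵢ²/σₜ²), applied to a 0/1 item-score matrix is shown below; it is not the authors' code.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a 0/1 score matrix (rows = students, columns = items)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)   # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```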

Figure 5. Difficulty and discrimination indices for each of the 23 QuPRI items.


Table 3. Descriptive Statistics for GC and PC/BPC on QuPRI Items Involving Representations of the Bohr Model, Probability, and Energy Quantization

                      GC (N = 655)                                PC/BPC (N = 38)
Scores       Bohr(a)  Probability(b)  Quantization(c)    Bohr(a)  Probability(b)  Quantization(c)
Mean          1.2        5.0              2.9              1.5        6.1              5.6
Std Dev       0.8        1.8              2.0              0.9        1.4              1.8
Minimum       0.0        0.0              0.0              0.0        3.0              1.0
Median        1.0        5.0              3.0              2.0        6.0              5.5
Maximum       3.0        9.0              8.0              3.0        8.0              8.0

(a) Maximum possible score of 3. (b) Maximum possible score of 9. (c) Maximum possible score of 8.

Figure 6. Bubble plots for GC students and PC/BPC students on items involving representation of probability and energy quantization. The red lines represent the mean scores.


Difficulty and discrimination indices were calculated to examine the QuPRI at the item level. Figure 5 depicts the item difficulty (p) and discrimination indices (D) for both the GC sample and the PC/BPC sample. Item difficulty was calculated as the proportion of students who answered an item correctly. Items with difficulty indices of 0.8 or higher are considered easier items, whereas items with difficulty indices of 0.3 or lower are considered difficult items.10,80 The QuPRI items show a wide range of difficulty indices. As can be seen in Figure 5, six items had a difficulty below 0.30 for the GC students, while only two items fell below this threshold for the PC/BPC students. However, many of the items were more difficult for the GC students, suggesting that the QuPRI items become easier for students who have completed more chemistry classes. Discrimination (D) values of 0.30 or greater suggest that an item can differentiate between the top-performing (27%) and low-performing (27%) students.10,80 Values below the 0.30 threshold might indicate that an item cannot adequately discriminate between the top and bottom students, for example, when the item is so difficult (or so easy) that most students, regardless of their overall performance, answered the item incorrectly (or correctly). Figure 5 shows that several items do not discriminate well between the low- and high-performing students for both the GC and the PC/BPC samples. Most of the items that discriminate poorly for the PC/BPC sample are also among the easier items. Two items (question 3 and question 23) were quite challenging for both samples. Item 23 asked students to determine what connection, if any, exists between an energy level diagram and a 2s atomic orbital representation. This item was difficult for many students because, as we know from both the interviews conducted to develop the QuPRI and from the response process interviews, students tend to focus on the surface features of the representations, leading them to incorrectly conclude that both representations depict energy levels or that there is no connection between the two. It is essential to retain items 3 and 23 on the QuPRI, despite their poorer discrimination indices, because these items can detect incorrect ideas that both low- and high-performing students reason with, even after being taught and assessed on these ideas.10

Ferguson's delta (δ) was calculated for both the GC and the PC/BPC samples (Table 1) as a measure of the degree to which the scores earned by the students reflect the range of possible scores. A δ value ≥ 0.90 indicates a broadly distributed sample, meaning that students earned a variety of scores across the possible range of total scores.81 Ferguson's δ for both samples exceeded the ideal value of 0.90 (Table 1), indicating that students earned more than 90% of the possible scores. However, caution should be taken when interpreting δ, as it has been previously reported to depend on the population of students used to calculate it and, in some cases, to be difficult to interpret when comparing populations.82

Students' Interpretations of Multiple Representations of Electron Structure As Measured by the QuPRI

Students' interpretations of items that included representations of the Bohr model, of probability, and of energy quantization were also analyzed. Table 3 summarizes the descriptive statistics for both the GC sample and the PC/BPC sample on each of these groups of items. Students' thinking about probability and energy quantization was inspected further using bubble plots (Figure 6). Note that 10% of the GC students demonstrated poor understanding of the idea of energy quantization, with 66 GC students earning a score of zero on the items designed to assess students' thinking about energy quantization. (Curiously, 16 of these students scored above average on the items that measured students' ideas about representations of probability.) These 66 students were unable to accurately interpret the features of an energy level diagram and were highly attracted to distractors. For example, when asked to interpret the negative sign for energy values on an energy level diagram (Box 2, item 18), 43.5% of GC students selected distractor D (Figure 7). (See Figure S.3 in Supporting Information for PC/BPC results.)

Figure 7. General chemistry students' responses to item 18.


This distractor was drawn from analysis of the semi-structured interviews during the qualitative portion of the study, where multiple students associated the negative values with energy being released:

"...when the energy is negative ah... it's usually like in thermodynamics it's like an exothermic process. Usually when like the energy is... it's either notated plus or negative umm... so, [when] it's negative [the atom] is like releasing energy and it happens spontaneously and when it's positive is an endothermic process, and it requires energy to like put in." (Diego, GC)

Although some PC/BPC students also performed poorly on items designed to measure their interpretations of an energy level diagram and their ideas about energy quantization, on average, the PC/BPC students demonstrated better understanding of these ideas than did the GC students. As can be seen in Table 4, a substantial number of students did not understand the meanings of essential features in an energy level diagram such as n values, the spaces in between the horizontal lines, and the negative sign for energy values. However, it can also be observed that fewer PC/BPC students misinterpreted the features of the energy level diagram when compared to the number of GC students who misinterpreted these features. These results suggest that students who have completed more chemistry course work show progression in their understanding of energy quantization. The data in Table 4 also indicate that PC/BPC students continue to be challenged to interpret both the Bohr model and probability representations that depict the electron structure of the atom, even after three or four years of college chemistry courses.

Both GC and PC/BPC students had difficulties with interpreting multiple representations of the electron structure of the atom. Many items on the QuPRI were designed to assess students' interpretations of one representation associated either with energy quantization or with probability. However, because the distractors for each item were developed to reflect the descriptions and explanations offered by students in the semi-structured interviews, some QuPRI items can detect students' reasoning with inappropriate connections between classical and quantum models. For example, item 1 asks students to choose the statement that best describes the position of an electron relative to the nucleus, and item 4 asks students whether probability is related to an electron and, if so, how. The majority of students in both samples answered these items correctly, suggesting that both GC and PC/BPC students understand that there is uncertainty associated with describing/depicting where electrons are within the atom. However, 12.7% of GC students (Table 4A) inappropriately associated the idea of probability with the spaces between lines in an energy level diagram. It is especially important to note that even upper-level students are still influenced by ideas from classical mechanics when interpreting quantum representations of the atom, and conversely, they inappropriately invoke the concept of probability to interpret classical representations.

Students have difficulty decoding representations of the quantum model such as the electron cloud (Table 4C) and probability diagrams of the 1s and 2s orbitals (Table 4D,E). For the electron cloud representation (Table 4C), 29.2% of GC students and 18.4% of PC/BPC students thought the difference in shaded regions depicted energy levels. Data from the semi-structured interviews used to develop the QuPRI indicate that students consider the shaded regions to be analogous to the concept of an orbit in the Bohr model of the hydrogen atom. Conversely, when students were asked to select the statement that best depicted the 2s orbital of a carbon atom, 12.2% of GC students interpreted representation E in Table 4 to mean that there is a 90% chance of finding two electrons in the center and four electrons in the outer region of the 2s orbital. Although students might be very familiar with the Bohr model representation (Table 4B), they still find it challenging to decode the features of this figure. Both the GC and the PC/BPC students, on average, correctly answered one or two of the three items (Table 3) that assess students' interpretations of the Bohr model. The distractors associated with representation B and representation F in Table 4 point to the prevalence of mixing ideas by using language from the quantum model to describe a planetary model of the atom.

Students’ Confidence

The students' confidence in their answers to each of the QuPRI items was also investigated. Descriptive statistics for the average confidence per student in both samples are presented in Table 1. Figure 8 plots each student's average confidence across all items against their total score, including horizontal and vertical lines to indicate the mean confidence and QuPRI scores for each sample, respectively. Inspection of Figure 8 shows that the confidence of the students in both samples is not well calibrated to reflect their knowledge. Many students in our sample, both high-performing and low-performing, cannot accurately differentiate between what they know and what they do not know.83 This phenomenon is known as the Dunning−Kruger effect84 and has been reported elsewhere in the chemistry education research literature.50,85 The students' self-reported confidence about their response to each QuPRI item provides additional evidence of the robustness of the reasoning reported above regarding energy quantization, probability, and the electron structure of the atom.

Mental Models vs Expert Models

In order to further investigate students' reasoning with multiple representations of the atom, students were asked, during the semi-structured interviews conducted to develop the QuPRI, to rank four given representations from their most preferred to their least preferred when thinking about the helium atom. No context was specified for choosing a preferred representation in order to not predispose students toward choosing or rejecting any of the representations.


Table 4. Common Misinterpretations of Energy Level Diagram Features and Representations of Atomic Models

Analyses of the findings in this part of the interviews identified that many students were torn between the electron cloud model and the Bohr model, with 20 out of 34 students ranking these two representations as their two most preferred representations.25 Interestingly, some students made a distinction during the interviews between the model that they "think about" (their mental model) versus the one they think is "more accurate" (an expert model).25 In order to examine the prevalence of this "dual model" distinction ("think about" vs "more accurate"), two items were specifically created for the QuPRI (Box 3, questions 24 and 25).



Figure 8. Scatterplots comparing students’ average reported confidence across all items on the QuPRI vs their total scores. The horizontal lines indicate the mean confidence scores, and the vertical lines indicate the mean QuPRI scores for each sample.

Item response curves were created to explore how GC student responses to questions 24 and 25 related to overall performance on the QuPRI (Figure 9). Inspection of the item response curve for question 24 shows that, regardless of the total score on the QuPRI, GC students think of the helium atom using the Bohr model. This is consistent with previous research.7,21,26−28,31 However, when asked to identify the most accurate representation of the helium atom, the GC item response curve looks quite different, indicating a wide variety of responses from GC students across both low scores and high scores on the QuPRI. Students with lower total scores identified the Bohr model as the most accurate representation, while students with higher total scores recognized the electron cloud and electron probability models as accurate models of the atom. Similar results were found for the PC/BPC sample (see Supporting Information Figure S.4). The combination of student responses to questions 24 and 25 suggests that some students may be able to reason with multiple models and multiple representations of the electron structure of the atom. For example, as suggested by the results presented in Figure 9, a high-performing student might identify the representation of the Bohr model as their mental model and still recognize the limitations of this model and have a sound understanding of quantum ideas. However, these data also suggest that some students who have been taught the quantum model of the atom do not consider it to be accurate and struggle to separate features of representations for the classical, planetary model from those of the quantum model.
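An item response curve of the kind shown in Figure 9 simply plots, for one item, the fraction of students who selected each response option as a function of total score. As an illustration only, a small Python sketch of that bookkeeping is given below; the data layout and names are hypothetical and are not taken from the authors' analysis.

```python
import numpy as np
import pandas as pd

def item_response_curve(total_scores, responses, options=("A", "B", "C", "D")):
    """Proportion of students selecting each option at each total score.

    total_scores: 1D array of QuPRI total scores (0-23).
    responses: 1D array of the option letter each student chose on one item.
    Returns a DataFrame indexed by total score, one column per option.
    """
    df = pd.DataFrame({"total": total_scores, "choice": responses})
    counts = df.groupby(["total", "choice"]).size().unstack(fill_value=0)
    counts = counts.reindex(columns=list(options), fill_value=0)
    return counts.div(counts.sum(axis=1), axis=0)
```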



LIMITATIONS

There are several limitations to the methods and findings of this research.


Figure 9. Item response curves contrasting students' most preferred representation of the helium atom (question 24) vs the representation they consider to be most accurate (question 25).



First, there are several concepts related to the quantum model of the atom that are not assessed in this representations inventory. Although atomic spectroscopy, quantum numbers, electron configurations, periodic trends, and d orbitals are often taught in concert with the concepts and representations of energy quantization and probabilistic models of the atom, these constructs were intentionally not examined in this research. By establishing these boundary conditions for our research, we were able to focus the research on Park and Light's proposed threshold concepts of probability and energy quantization. Second, the QuPRI data reported herein were gathered from a single institution. Findings from students in other institutions or who have learned chemistry using a different curriculum or pedagogy, e.g., students in an atoms-first general chemistry curriculum or students in a POGIL86,87 chemistry course, might differ.

The response process validation interviews resulted in the addition of one new item (for a total of 27 items) and minor wording modifications to two other items. We expect that these changes to the QuPRI will further improve the validity and reliability of the data generated by the QuPRI. Data using the modified QuPRI have been collected from several institutions across the United States and are in the process of being analyzed, but they are not presented in this paper.



CONCLUSIONS AND IMPLICATIONS

This paper reports the development of the Quantization and Probability Representations Inventory (QuPRI) to measure students' ideas about probability and energy quantization as they relate to their understanding of the electron structure of the atom. Both quantitative and qualitative data suggest that the items have generated valid and reliable data for two samples with different levels of content instruction, namely, GC students and PC/BPC students. A comparison of students' scores between the two samples provides evidence for concurrent validity in that students who have completed more chemistry coursework perform better on the QuPRI, even though some alternative conceptions are still prevalent among PC/BPC students and can be detected by the QuPRI. The attractiveness of the distractors in Table 4 provides evidence for face validity in that many students misinterpret the representations used to communicate the concept of probability across the variety of representations used to depict the quantum model of the atom. Although both samples performed similarly on the items that assess their understanding about the Bohr model and probability, the PC/BPC students demonstrated more knowledge than the GC students on the items designed to measure their interpretations of the quantum model and of an energy level diagram.

The findings reported herein demonstrate that students invoke elements of the classical model when reasoning about the quantum model and vice versa, even among students who are nearing the completion of their undergraduate studies in chemistry. Faculty need to be aware that, regardless of students' understandings of quantum concepts, they may well still hold the Bohr model as their mental model of the atom. The results of items 24 and 25 in Figure 9 indicate that multiple models of the atom can coexist in some students' minds. However, only some students were able to distinguish between their mental model of the atom and what experts would consider a more accurate depiction of the electron structure of the atom. Other students answer questions about the quantum model and yet still indicate that the Bohr model is the most accurate representation of the electron structure of the atom.

Given that the QuPRI is easy to administer and requires approximately 15 min of class time, faculty could use it as a formative assessment to measure the prior knowledge regarding the quantum model of the atom of their GC students coming from high school, as well as the prior knowledge of their PC/BPC students coming from general chemistry. As David Ausubel wrote, "[t]he most important single factor influencing learning is what the learner already knows. Ascertain this and teach him accordingly."8 Individual items from the QuPRI could be used during class time as clicker questions to assess students' thinking during instruction. Faculty could generate in-class discussion by asking students to rank the four representations as in questions 24 and 25, or they could ask students to discuss the strengths and weaknesses of each representation for communicating the electron structure of the atom. In the absence of any particular context, the Bohr model may indeed be sufficient, but faculty have the responsibility to help students identify the limitations of every model, including the Bohr model. These activities would afford instructors the opportunity to better understand how students interpret and reason with these representations of the electron structure of the atom.

Research by Orgill and Crippen88 suggests that students can use energy level diagrams to solve algorithmic problems such as calculating an energy difference with the Rydberg equation, but that students find it difficult to interpret the diagram qualitatively. Students have been taught to manipulate symbols and values without understanding the underlying meaning of these values. Instructors could use the quantization-related items in the QuPRI to gain insights into their students' understandings of the features and concepts encoded in an energy level diagram. Future research studies could use the QuPRI to measure the effectiveness of curricular changes involving models of the atom, such as the use of an atoms-first textbook vs a traditional textbook in the course. Park and Light proposed that students must have a conceptual understanding of the concepts of probability and energy quantization in order to interpret and understand quantum models of the atom.7 While the findings reported herein do not test Park and Light's premise, the QuPRI provides a data collection tool for future studies to do just that. Colleagues interested in obtaining a copy of the QuPRI for classroom use or research should contact the corresponding author.



ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.9b00098. Additional test statistics and item responses (PDF, DOCX)



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

ORCID

Zahilyn D. Roche Allred: 0000-0003-2971-4878
Stacey Lowery Bretz: 0000-0001-5503-8987

Notes

Any opinions, findings, and conclusions or recommendations are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors declare no competing financial interest.



ACKNOWLEDGMENTS

The authors of this paper thank the students who volunteered to be part of the study. The research presented herein is based upon work supported by the National Science Foundation under Grant 1432466.



REFERENCES

(1) Novak, J. D. Meaningful Learning: The Essential Factor for Conceptual Change in Limited or Inappropriate Propositional Hierarchies Leading to Empowerment of Learners. Sci. Educ. 2002, 86 (4), 548−571. (2) Bretz, S. L. Novak’s Theory of Education: Human Constructivism and Meaningful Learning. J. Chem. Educ. 2001, 78 (8), 1107. (3) Griffiths, A. K.; Preston, K. R. Grade-12 Students’ Misconceptions Relating to Fundamental Characteristics of Atoms and Molecules. J. Res. Sci. Teach. 1992, 29 (6), 611−628. (4) Sewell, A. Cells and Atoms - Are they related? Aust. Sci. Teach. J. 2002, 48 (2), 26−30. (5) McKagan, S. B.; Perkins, K. K.; Wieman, C. E. Why We Should Teach the Bohr Model and How to Teach it Effectively. Phys. Rev. Spec. Top. Educ. Res. 2008, 4, 010103. (6) Taber, K. S. Conceptualizing Quanta: Illuminating the Ground State of Student Understanding of Atomic Orbitals. Chem. Educ. Res. Pract. 2002, 3 (2), 145−158. (7) Park, E. J.; Light, G. Identifying Atomic Structure as a Threshold Concept: Student Mental Models and Troublesomeness. Int. J. Sci. Educ. 2009, 31 (2), 233−258. (8) Ausubel, D. P. Educational Psychology: A Cognitive View; Holt, Rinehart and Winston, Inc.: New York, 1968. (9) Novak, J. D.; Gowin, D. B. Learning How to Learn; Cambridge University Press: London, 1984. (10) Adams, W. K.; Wieman, C. E. Development and Validation of Instruments to Measure Learning of Expert-like Thinking. Int. J. Sci. Educ. 2011, 33 (9), 1289−1312. (11) Bretz, S. L. Designing Assessment Tools to Measure Students’ Conceptual Knowledge of Chemistry. In Tools of Chemistry Education Research; Bunce, D. M., Cole, R. S., Eds.; American Chemical Society: Washington, DC, 2014; pp 155−168. (12) Cataloglu, E.; Robinett, R. W. Testing the Development of Student Conceptual and Visualization Understanding in Quantum Mechanics through the Undergraduate Career. Am. J. Phys. 2002, 70 (3), 238−251. (13) Frank, J. Developing a Quantum Mechanics Concept Inventory. Unpublished Master Thesis, Uppsala University, 2004. (14) Goldhaber, S.; Pollock, S. J.; Dubson, M.; Beale, P.; Perkins, K.; et al. Transforming Upper-Division Quantum Mechanics: Learning Goals and Assessment. AIP Conference Proceedings 2009, 145−148. (15) Wuttiprom, S.; Sharma, M. D.; Johnston, I. D.; Chitaree, R.; Soankwan, C. Development and Use of a Conceptual Survey in Introductory Quantum Physics. Int. J. Sci. Educ. 2009, 31 (5), 631−654. (16) McKagan, S. B.; Perkins, K. K.; Wieman, C. E. Design and Validation of the Quantum Mechanics Conceptual Survey. Phys. Rev. Spec. Top. - Phys. Educ. Res. 2010, 6 (2), 020121. (17) Zhu, G.; Singh, C. Surveying Students’ Understanding of Quantum Mechanics in One Spatial Dimension. Am. J. Phys. 2012, 80 (2), 252−259. (18) Marshman, E. M. Improving the Quantum Mechanics Content Knowledge and Pedagogical Content Knowledge of Physics Graduate Students. Unpublished Dissertation, University of Pittsburgh, 2015. (19) Sadaghiani, H. R.; Pollock, S. J. Quantum Mechanics Concept Assessment: Development and Validation Study. Phys. Rev. Spec. Top. Phys. Educ. Res. 2015, 11 (1), 010110. (20) Dick-Perez, M.; Luxford, C. J.; Windus, T. L.; Holme, T. A Quantum Chemistry Concept Inventory for Physical Chemistry Classes. J. Chem. Educ. 2016, 93 (4), 605−612. (21) Akaygun, S. Is the Oxygen Atom Static or Dynamic? The Effect of Generating Animations on Students’ Mental Models of Atomic Structure. Chem. Educ. Res. Pract. 2016, 17 (4), 788−807. L


Theory and Practice in Science Education; Gilbert, J. K., Reiner, M., Nakhleh, M., Eds.; Springer: Dordrecht, 2008; pp 191−208.
(47) Bretz, S. L.; Murata Mayo, A. V. Development of the Flame Test Concept Inventory: Measuring Student Thinking about Atomic Emission. J. Chem. Educ. 2018, 95 (1), 17−27.
(48) Bretz, S. L.; Linenberger, K. J. Development of the Enzyme−Substrate Interactions Concept Inventory. Biochem. Mol. Biol. Educ. 2012, 40 (4), 229−233.
(49) Luxford, C. J.; Bretz, S. L. Development of the Bonding Representations Inventory to Identify Student Misconceptions about Covalent and Ionic Bonding Representations. J. Chem. Educ. 2014, 91 (3), 312−320.
(50) Brandriet, A. R.; Bretz, S. L. The Development of the Redox Concept Inventory as a Measure of Students’ Symbolic and Particulate Redox Understandings and Confidence. J. Chem. Educ. 2014, 91 (8), 1132−1144.
(51) Abell, T. N.; Bretz, S. L. Development of the Enthalpy and Entropy in Dissolving and Precipitation Inventory. J. Chem. Educ. 2019, DOI: 10.1021/acs.jchemed.9b00186.
(52) Towns, M. H. Mixed Methods Designs in Chemical Education Research. In Nuts and Bolts of Chemical Education Research; Bunce, D. M., Cole, R. S., Eds.; American Chemical Society: Washington, DC, 2008; pp 135−148.
(53) Creswell, J. W. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches; Sage Publications: Thousand Oaks, CA, 2003.
(54) National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment; The National Academies Press: Washington, DC, 2001.
(55) Tamir, P. An Alternative Approach to the Construction of Multiple Choice Items. J. Biol. Educ. 1971, 5, 305−307.
(56) Treagust, D. F. Development and Use of Diagnostic Tests to Evaluate Students’ Misconceptions in Science. Int. J. Sci. Educ. 1988, 10 (2), 159−169.
(57) Villafañe, S. M.; Bailey, C. P.; Loertscher, J.; Minderhout, V.; Lewis, J. E. Development and Analysis of an Instrument to Assess Student Understanding of Foundational Concepts before Biochemistry Coursework. Biochem. Mol. Biol. Educ. 2011, 39 (2), 102−109.
(58) Mulford, D. R.; Robinson, W. R. An Inventory for Alternate Conceptions among First-Semester General Chemistry Students. J. Chem. Educ. 2002, 79 (6), 739−744.
(59) Justi, R.; Gilbert, J. History and Philosophy of Science through Models: Some Challenges in the Case of “the Atom”. Int. J. Sci. Educ. 2000, 22 (9), 993−1009.
(60) Budde, M.; Niedderer, H.; Scott, P.; Leach, J. “Electronium”: A Quantum Atomic Teaching Model. Phys. Educ. 2002, 37 (3), 197−203.
(61) Silberberg, M. S.; Amateis, P. Chemistry: The Molecular Nature of Matter and Change, 7th ed.; McGraw-Hill Education: New York, 2014.
(62) Bowen, C. W. Think-Aloud Methods in Chemistry Education: Understanding Student Thinking. J. Chem. Educ. 1994, 71 (3), 184−190.
(63) Roche Allred, Z. D. Investigating Students’ Understandings about the Electronic Structure of the Atom with Regards to Energy Quantization and Probability. Ph.D. Dissertation, Miami University, 2019.
(64) Strauss, A.; Corbin, J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory; Sage Publications: Thousand Oaks, CA, 1998.
(65) Fram, S. M. The Constant Comparative Analysis Method Outside of Grounded Theory. Qual. Rep. 2013, 18, 1−25.
(66) Lincoln, Y. S.; Guba, E. G. Naturalistic Inquiry; Sage Publications: Newbury Park, CA, 1985.
(67) Bretz, S. L. Qualitative Research Designs in Chemistry Education Research. In Nuts and Bolts of Chemical Education Research; Bunce, D. M., Cole, R. S., Eds.; American Chemical Society: Washington, DC, 2008; pp 79−99.
(68) McClary, L. M.; Bretz, S. L. Development and Assessment of a Diagnostic Tool to Identify Organic Chemistry Students’ Alternative Conceptions Related to Acid Strength. Int. J. Sci. Educ. 2012, 34 (15), 2317−2341.

(69) Brandriet, A. R.; Bretz, S. L. Measuring Meta-ignorance Through the Lens of Confidence: Examining Students’ Redox Misconceptions about Oxidation Numbers, Charge, and Electron Transfer. Chem. Educ. Res. Pract. 2014, 15 (4), 729−746.
(70) SPSS. https://www.ibm.com/products/spss-statistics (accessed Jun 10, 2019).
(71) Lewis, S. E.; Lewis, J. E. The Same or Not the Same: Equivalence as an Issue in Educational Research. J. Chem. Educ. 2005, 82 (9), 1408−1412.
(72) Razali, N. M.; Wah, Y. B. Power Comparisons of Shapiro-Wilk, Kolmogorov-Smirnov, Lilliefors and Anderson-Darling Tests. J. Stat. Model. Anal. 2011, 2 (1), 21−33.
(73) Arjoon, J. A.; Xu, X.; Lewis, J. E. Understanding the State of the Art for Measurement in Chemistry Education Research: Examining the Psychometric Evidence. J. Chem. Educ. 2013, 90 (5), 536−545.
(74) Wren, D.; Barbera, J. Gathering Evidence for Validity during the Design, Development, and Qualitative Evaluation of Thermochemistry Concept Inventory Items. J. Chem. Educ. 2013, 90 (12), 1590−1601.
(75) American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. Standards for Educational and Psychological Testing; American Educational Research Association: Washington, DC, 2014.
(76) Trochim, W. M. K. Measurement Validity Types. http://www.socialresearchmethods.net (accessed Jun 10, 2019).
(77) Streiner, D. L. Starting at the Beginning: An Introduction to Coefficient Alpha and Internal Consistency. J. Pers. Assess. 2003, 80 (1), 99−103.
(78) Sijtsma, K. On the Use, the Misuse, and the Very Limited Usefulness of Cronbach’s Alpha. Psychometrika 2009, 74 (1), 107−120.
(79) Taber, K. S. The Use of Cronbach’s Alpha When Developing and Reporting Research Instruments in Science Education. Res. Sci. Educ. 2018, 48, 1273−1296.
(80) Towns, M. H. Guide to Developing High-Quality, Reliable, and Valid Multiple-Choice Assessments. J. Chem. Educ. 2014, 91 (9), 1426−1431.
(81) Ferguson, G. A. On the Theory of Test Discrimination. Psychometrika 1949, 14 (1), 61−68.
(82) Terluin, B.; Knol, D. L.; Terwee, C. B.; de Vet, H. C. W. Understanding Ferguson’s δ: Time to Say Goodbye? Health Qual. Life Outcomes 2009, 7, 38.
(83) Caleon, I. S.; Subramaniam, R. Do Students Know What They Know and What They Don’t Know? Using a Four-Tier Diagnostic Test to Assess the Nature of Students’ Alternative Conceptions. Res. Sci. Educ. 2010, 40 (3), 313−337.
(84) Kruger, J.; Dunning, D. Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. J. Pers. Soc. Psychol. 1999, 77 (6), 1121−1134.
(85) Pazicni, S.; Bauer, C. F. Characterizing Illusions of Competence in Introductory Chemistry Students. Chem. Educ. Res. Pract. 2014, 15, 24−34.
(86) Moog, R. S.; Farrell, J. J. Chemistry: A Guided Inquiry, 7th ed.; John Wiley & Sons, Inc.: Hoboken, NJ, 2017.
(87) Moog, R. S.; Spencer, J. N. POGIL: An Overview. In Process Oriented Guided Inquiry Learning (POGIL); American Chemical Society: Washington, DC, 2008; pp 1−13.
(88) Orgill, B. M.; Crippen, K. Teaching With External Representations: The Case of a Common Energy-Level Diagram in Chemistry. J. Coll. Sci. Teach. 2001, 40 (1), 78−84.
