Article Cite This: J. Chem. Educ. XXXX, XXX, XXX−XXX
pubs.acs.org/jchemeduc
Historical Analysis of the Inorganic Chemistry Curriculum Using ACS Examinations as Artifacts

Shalini Srinivasan,† Barbara A. Reisner,‡ Sheila R. Smith,§ Joanne L. Stewart,∥ Adam R. Johnson,⊥ Shirley Lin,∇ Keith A. Marek,# Chip Nataro,○ Kristen L. Murphy,◆ and Jeffrey R. Raker*,†,¶

† Department of Chemistry, University of South Florida, Tampa, Florida 33620, United States
‡ Department of Chemistry and Biochemistry, James Madison University, Harrisonburg, Virginia 22807, United States
§ Department of Natural Sciences, University of Michigan−Dearborn, Dearborn, Michigan 48128, United States
∥ Department of Chemistry, Hope College, Holland, Michigan 49423, United States
⊥ Department of Chemistry, Harvey Mudd College, Claremont, California 91711, United States
∇ Department of Chemistry, The United States Naval Academy, Annapolis, Maryland 21401, United States
# Department of Chemistry, Bemidji State University, Bemidji, Minnesota 56601, United States
○ Department of Chemistry, Lafayette College, Easton, Pennsylvania 18042, United States
◆ Department of Chemistry and Biochemistry, University of Wisconsin−Milwaukee, Milwaukee, Wisconsin 53201, United States
¶ Center for the Improvement of Teaching and Research in Undergraduate STEM Education, University of South Florida, Tampa, Florida 33620, United States
* Supporting Information
ABSTRACT: ACS Examinations provide a lens through which to examine historical changes in topic coverage via analyses of course-specific examinations. This study extends work completed previously by the ACS Exams Research Staff and collaborators in general chemistry, organic chemistry, and physical chemistry to explore content changes in the principal courses of the postsecondary chemistry curriculum. In this study, we consider how inorganic chemistry content coverage has varied over the 55-year period since the first inorganic chemistry ACS Examination was released in 1961. A total of 860 items was evaluated on the basis of problem type (i.e., algorithmic, conceptual, or recall), use of visual-spatial or reference components, and content coverage. Our analyses identify core content areas in the inorganic chemistry curriculum, consistent with those reported in faculty surveys. Each examination also contained questions addressing a variety of specialty areas that varied widely within the discipline between 1961 and 2016. Unlike the results from historical reviews of general chemistry and organic chemistry ACS Examinations, we observe great variability across the 13 inorganic chemistry examinations analyzed, with an absence of strong trends in the inclusion or exclusion of problem types, visual-spatial or reference components, or content. Our results offer a framework for using historical ACS Examinations as a tool to make decisions about the future of content coverage in postsecondary inorganic chemistry education.

KEYWORDS: Upper-Division Undergraduate, Inorganic Chemistry, Testing/Assessment
Received: October 19, 2017
Revised: January 21, 2018
© XXXX American Chemical Society and Division of Chemical Education, Inc.
DOI: 10.1021/acs.jchemed.7b00803

■ INTRODUCTION

Since the 1930s, the American Chemical Society, Division of Chemical Education, Examinations Institute (ACS Exams) has developed standardized chemistry examinations. The release of the first general chemistry examination in 1934 paved the way for new tests and test series, eventually leading to the development of examinations for undergraduate chemistry courses across the curriculum and for upper-level high school chemistry courses.1−3 At present, ACS Exams offers examinations for general chemistry, for traditional chemistry courses (i.e., analytical chemistry, biochemistry, inorganic chemistry, organic chemistry, physical chemistry), and for several specialized courses (e.g., general−organic−biochemistry).1−3 The iterative process of developing, rewriting, and revising examination content is accomplished by committees of faculty volunteers.1 Consistent development processes across the history of ACS Exams allow examinations to be used as artifacts to characterize curricular changes in the subdisciplines.2 The first such review, of physical chemistry examinations, was conducted by Schwenz and reported in the Advances in Teaching Physical
Chemistry ACS Symposium Series.4 Similar analyses have been conducted for organic chemistry;3 results, spanning a 60-year period, revealed a stabilization of the organic chemistry curriculum in the 1970s.3 Using a similar approach, four versions of general chemistry examinations (i.e., first-semester, second-semester, full-year, and conceptual) across a 20-year time period were evaluated;2 development of the Anchoring Concepts Content Map (ACCM) in General Chemistry allowed that study to organize content by assigning examination items to specific places (content areas) on the map.5,6 Results for content coverage on each examination revealed item distributions that were representative of topics covered in the course and aligned with corresponding ideas on the map (e.g., second-term general chemistry examinations emphasized content related to equilibrium, thermodynamics, and kinetics; when aligned with ideas on the map, 60% of all second-term examination items corresponded to these three content areas).2 Results from the study of general chemistry examinations further indicated that topics such as atoms, bonds, reactions, and intermolecular forces were assessed more frequently than other topics (e.g., kinetics and experiments) in general chemistry.2

The current study evaluates all 13 ACS inorganic chemistry examinations, spanning a 55-year history and a total of 860 items. Our analyses are presented in the context of survey research7−9 and statements made about the postsecondary inorganic chemistry curriculum during the time period in which the 13 inorganic chemistry examinations were developed and released.14,17 We utilize three frameworks to analyze items from each reviewed examination:
• Algorithmic, conceptual, recall typology
• Use of visual-spatial or reference components in prompts and answer choices
• Content coverage

ACS Examinations as Pedagogical Artifacts

The topics covered on an examination are set by an examination writing committee, an autonomous and diverse group of faculty members from a variety of academic institutions; previous examination writers constitute roughly half of any committee.1,3,10 The diversity of committee members ensures that content coverage on examinations reflects the course curriculum at the time of examination development.2,3 Development of an ACS Examination entails a multistep process in which the examination committee sets content coverage, writes examination items, and edits and selects items for trial testing.1 Following trial testing, the committee evaluates and discusses field testing results and item-level statistics to select the final set of items for the released examination.1 The timing between examinations is based on a number of factors, including the use of the test and the time required for test development.3 At the time of this paper, inorganic chemistry examinations are developed approximately every seven years;7−9 in contrast, general chemistry and organic chemistry examinations are developed every two to four years and physical chemistry examinations every six to eight years.3

While textbooks are an important type of historical artifact, the encyclopedic nature and breadth of coverage in textbooks are generally unrepresentative of content coverage in most chemistry courses;11 this unrepresentativeness is especially pronounced in inorganic chemistry courses. Although this could also be true of ACS Examinations, we argue that these examinations offer a more accurate view of content coverage from a multi-institutional perspective by virtue of the examination development process.2,3 As additional evidence to support our claim, all examination items undergo trial testing. It can be assumed that content not universally covered will not test well; this would then be observed in the item difficulty and discrimination statistics used to make decisions about items to be included in the released version of an examination. Thus, we argue that the trial testing process provides data that helps the examination writing committee identify appropriate and representative topic coverage. Although earlier versions of ACS Examinations incorporated open-response, essay-type items, examinations in recent decades typically have a multiple-choice format with either four or five answer options (ACS Inorganic Chemistry Examinations have not included open-response items since 1972). Relevant data (e.g., graphs, tables of data, or other representations) in the item prompt, along with distractors (i.e., incorrect responses), constitute the entirety of an examination item.3 Distractors are developed by the examination committees on the basis of answer options most likely to be selected by a student due to common misconceptions or errors.2,3 Each ACS Examination contains 40−120 items, with the majority of examinations containing 60 or 70 items.1−3

■ INORGANIC CHEMISTRY CURRICULUM

Tracking changes in the inorganic chemistry curriculum in the United States is challenging due to the variation in coverage of topics and themes among major inorganic chemistry textbooks. Unlike the topics consistently covered in general chemistry and organic chemistry,3,12,13 the inorganic chemistry curriculum is diverse in both the number of courses offered and the topics addressed in each course. In 2001, Pesterfield and Hendrickson showed that 56% of surveyed universities and colleges offered only one inorganic lecture course and 39% offered two courses;14 this alone suggests variability in the time devoted to inorganic chemistry content in the undergraduate curriculum. Pesterfield and Hendrickson asked about content coverage specifically in "senior-level lecture courses". Topics covered by most faculty included symmetry, covalent bonding−valence bond theory, and transition-metal complexes, while areas such as bioinorganic chemistry, oxidation−reduction, materials, and general descriptive chemistry were of minimal interest and not as frequently addressed in the in-depth course.14

Commentaries have noted the challenge of standardizing the inorganic chemistry curriculum because of variability over time in the guidelines provided by the ACS Committee on Professional Training (ACS CPT) on the number of required inorganic courses (from zero to two), which likely led to the diverse curricular offerings at institutions.7−9,15−18 However, there is a consensus among the community of inorganic chemists about the topics that are crucial to the study of inorganic chemistry;7−9 the core topics are (1) atoms and electronic structure, (2) covalent bonding and molecular orbital theory, (3) transition-metal complexes and coordination chemistry, (4) acids−bases and solvents, (5) symmetry and group theory, and (6) solids and solid-state chemistry.7−9 A tiered, scaffolded, two-semester model of content coverage, analogous to those in many physical and organic chemistry curricula, has been argued to be a potentially valuable step toward an effectual inorganic chemistry curriculum that, while diverse, covers critical concepts somewhere in the undergraduate curriculum.9
Content areas for the analyses conducted in the present study mirror those in recent survey research studies of foundation- and in-depth-level inorganic chemistry courses.7−9 By using this framework, we are able to make direct comparisons between the analyzed examinations and current faculty-reported topic coverage.

■ RESULTS

The analyses reported herein are derived from 13 examinations, released between 1961 and 2016, spanning the inorganic chemistry curriculum over the past 55 years; these examinations include a total of 860 items, with 60 to 100 items per examination. The 12 examinations released between 1961 and 2014 were designed to target a third-year/fourth-year-level inorganic chemistry course; in 2016, the Inorganic Chemistry Foundations examination was designed to target a second-year/third-year course. It should therefore be noted that, for the analyses conducted in this study, the target population for the 2016 examination was different from that of all the rest. The examinations in 1969 and 1972 included open-response items that were utilized in these analyses; examinations after 1972 contained multiple-choice items exclusively.

Examination items were classified using three frameworks. The first involves classifying items into algorithmic, conceptual, and recall problem types. Algorithmic problems are solved using a rule-based, heuristic approach,19,20 while conceptual problems require student understanding of ideas beyond recall, consequently requiring a justification or explanation of an answer choice.19,20 Recall-type problems typically require memorization of content with little application of the content.19 Guided by these definitions, items in the current study that required the application of a set of rules to solve the problem, for instance, identifying ions that contained a certain number of unpaired electrons or calculating formal charge, were coded as algorithmic. Items that focused on predicting products or recognizing appropriate reagents for a reaction sequence were consistently categorized as conceptual problems, while those that required memorization, such as identifying the appropriate analytical technique to reveal information about the physical properties of materials, were classified as recall problems.

Items which did not initially arrive at a categorical consensus included problems involving the evaluation of periodic properties, reasons for the occurrence of certain phenomena, preparation of compounds given a set of reactants, and calculations of bond order changes for a species. While the conceptual category was common to these problems, these items were also identified as potentially algorithmic (e.g., periodic properties and occurrence of phenomena) or recall (e.g., reactions and bond order). Adjudication of these items was based on an evaluation of the steps a student might engage in to solve the problem. For instance, while calculating a bond order might appear sufficiently formulaic, the bond order is predicated on a correct Lewis structure, which might pose a challenge given the species in question; thus, the problem type would ultimately be more conceptual than algorithmic.

The second framework involves classifying items by the use of visual-spatial or reference components in item prompts and responses.2,3 Visual-spatial or reference components include the following: chemical equations, graphs, tables, chemical structures, pictures, mathematical expressions, or particulate nature of matter (PNOM) diagrams.2 A coding example for items that incorporate overlapping and slightly nebulous categories of structures and diagrams is shown in Figure 1. The wedge and dash structure in Figure 1 is coded as a structure; this categorization was also extended to reference components in which the wedge and dash notations were not explicitly depicted. Thus, a reference component displaying a simple line structure with a functional group formula was also coded as a structure. In contrast, reference components B and C are coded as both structures and diagrams, the latter coding being used to highlight the three-dimensional mode of representing information about chemical constitution.21

Figure 1. Visual-spatial coding examples: (A) structure and (B, C) structure and diagram.

The third framework involves classifying items into content areas: transition-metal complexes and coordination chemistry, main-group and descriptive chemistry, covalent bonding and molecular orbital theory, acids−bases and solvents, atoms and electronic structure, solids and solid-state chemistry, redox chemistry, organometallic chemistry, symmetry and group theory, analytical techniques, bioinorganic chemistry, materials chemistry and nanoscience, thermodynamics, nuclear chemistry, and green chemistry. These content areas mirror self-reported data from faculty teaching inorganic chemistry courses.7−9

A random selection of 200 items across the 13 examinations was independently rated on problem type and content area by four raters. The remaining 660 items were rated by one rater on problem type and content area. All 860 items were rated independently by authors Srinivasan and Raker on visual-spatial or reference components. Inter-rater reliability was determined on the basis of one-way random-effects intraclass correlation (ICC) models with absolute agreement for the content areas.22 ICC values of at least 0.70 are required for a scale to be deemed "adequate"; a value of 0.80 or higher is required for a "good" scale.22,23 ICC values ranged from 0.53 to 0.96, with 2 of the 14 content areas falling below the 0.80 threshold (i.e., thermodynamics, and materials chemistry and nanoscience; see Table 1). For the two content areas that displayed low inter-rater reliability, all items in those content areas were reevaluated by two raters to agree on a final content area assignment. Final assignments were adjudicated by first author Srinivasan.

Algorithmic, Conceptual, and Recall (ACR) Framework

Utilization of the ACR framework to analyze assessment items dates back to the late 1980s.24 Many researchers at the time (and many since) have been interested in the association between the ability to solve algorithmic problems versus conceptual problems, with research suggesting that students can solve algorithmic problems but lack the ability to solve conceptual problems.19,24−27 The ACR model has been used to evaluate items in organic chemistry,28 and general chemistry and organic chemistry ACS Examination items.2,3,20,29,30 The percentage of algorithmic, conceptual, and recall problems for each of the ACS inorganic chemistry examinations since 1961 is reported in Figure 2. Unlike with the historical
Table 1. Inter-Rater Reliability Statistics for Assignment of Content Areas

Content Area                                            Single Rater [95% CI]     Average of Raters [95% CI]   F Value   Significance
Transition-Metal Complexes and Coordination Chemistry   0.6997 [0.6453, 0.7506]   0.9031 [0.8792, 0.9233]      10.32
Main-Group and Descriptive Chemistry                    0.6218 [0.5593, 0.6819]   0.8680 [0.8354, 0.8955]       7.58
Covalent Bonding and Molecular Orbital Theory           0.7039 [0.6500, 0.7543]   0.9048 [0.8813, 0.9247]      10.51
Acids, Bases and Solvents                               0.6979 [0.6432, 0.7490]   0.9023 [0.8782, 0.9227]      10.24
Atoms and Electronic Structure                          0.4923 [0.4214, 0.5635]   0.7950 [0.7444, 0.8378]       4.88
Solids and Solid-State Chemistry                        0.7281 [0.6772, 0.7752]   0.9146 [0.8935, 0.9324]      11.71
Redox Chemistry                                         0.6427 [0.5821, 0.7004]   0.8780 [0.8478, 0.9034]       8.19
Organometallic Chemistry                                0.7467 [0.6983, 0.7912]   0.9218 [0.9025, 0.9381]      12.79
Symmetry and Group Theory                               0.7329 [0.6827, 0.7794]   0.9165 [0.8959, 0.9339]      11.98
Analytical Techniques                                   0.8007 [0.7603, 0.8372]   0.9414 [0.9269, 0.9536]      17.07
Bioinorganic Chemistry                                  0.7753 [0.7310, 0.8157]   0.9325 [0.9158, 0.9465]      14.80
Materials Chemistry and Nanoscience                     0.2203 [0.1504, 0.2972]   0.5306 [0.4147, 0.6284]       2.13
Thermochemistry                                         0.3934 [0.3200, 0.4696]   0.7218 [0.6531, 0.7798]       3.59
Nuclear Chemistry                                       0.8422 [0.8088, 0.8719]   0.9552 [0.9442, 0.9646]      22.34
Green Chemistry                                         0.7996 [0.7591, 0.8362]   0.9410 [0.9265, 0.9533]      16.96
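The single-rater and average-of-raters columns in Table 1 are connected by standard one-way random-effects ICC identities: the single-rater value can be recovered from the ANOVA F ratio as ICC(1,1) = (F − 1)/(F + k − 1), and the reliability of the mean of k raters follows from the Spearman−Brown step-up formula. The sketch below (our own illustration, not code from the paper; it assumes k = 4 raters, as described in the Results, and hypothetical function names) checks two rows of the table:

```python
def icc_single_from_f(f_value: float, k: int) -> float:
    """One-way random-effects single-rater ICC, i.e. ICC(1,1), from the
    ANOVA F ratio (between-targets MS over within-targets MS)."""
    return (f_value - 1.0) / (f_value + k - 1.0)

def icc_average(icc_single: float, k: int) -> float:
    """Spearman-Brown step-up: reliability of the mean rating of k raters."""
    return k * icc_single / (1.0 + (k - 1.0) * icc_single)

# Check against two rows of Table 1 (F values from the table, k = 4 raters)
for area, f in [("Transition-Metal Complexes and Coordination Chemistry", 10.32),
                ("Materials Chemistry and Nanoscience", 2.13)]:
    single = icc_single_from_f(f, 4)
    average = icc_average(single, 4)
    print(f"{area}: single = {single:.4f}, average = {average:.4f}")
```

The computed values reproduce the tabulated single-rater and average-of-raters ICCs to within the rounding of the reported F ratios, which is a quick internal-consistency check on the table.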