Repairing Leaks in the Chemistry Teacher Pipeline: A Longitudinal Analysis of Praxis Chemistry Subject Assessment Examinees and Scores

Lisa Shah,† Jie Hao,‡ Jeremy Schneider,† Rebekah Fallin,‡ Kimberly Linenberger Cortes,§ Herman E. Ray,‡,∥ and Gregory T. Rushton*,†,⊥

†Department of Chemistry, Stony Brook University, Stony Brook, New York 11794, United States
‡Department of Statistics and Analytical Science, Kennesaw State University, Kennesaw, Georgia 30144, United States
§Department of Chemistry and Biochemistry, Kennesaw State University, Kennesaw, Georgia 30144, United States
∥Analytics and Data Science Institute, Kennesaw State University, Kennesaw, Georgia 30144, United States
⊥Institute for STEM Education, Stony Brook University, Stony Brook, New York 11794, United States
* Supporting Information
ABSTRACT: Teachers play a critical role in the preparation of future science, technology, engineering, and mathematics majors and professionals. What teachers know about their discipline (i.e., content knowledge) has been identified as an important aspect of instructional effectiveness; however, studies have not yet assessed the content knowledge of aspiring chemistry teachers in the United States. The Praxis Chemistry Subject Assessment, used in 39 U.S. states in the past decade, is the most nationally representative measure of teacher content knowledge. In the presented study, we report findings concerning (i) the demographics of Praxis Chemistry Subject Assessment examinees (i.e., prospective chemistry teachers) and (ii) longitudinal trends in exam performance across several demographic test-taker characteristics. These findings reveal substantial differences in performance and pass rates among examinees of different genders, races/ethnicities, undergraduate majors, undergraduate GPAs, and geographic locales in which they intend to teach. We identify potential leaks in the teacher pipeline that may impact the quality and diversity of chemistry teachers in the United States and suggest ways to improve the chemistry teaching workforce.

KEYWORDS: Chemical Education Research, High School/Introductory Chemistry, First-Year Undergraduate/General

FEATURE: Chemical Education Research
INTRODUCTION

Teachers have been identified as the single most impactful factor on student achievement in the United States. The effect of individual teacher qualifications on students’ science, technology, engineering, and mathematics (STEM) achievement is especially critical; among these qualifications, certification and a degree in the field being taught are most strongly correlated with student outcomes.1−3 In-field degrees have been taken as evidence of substantial, subject-specific coursework in the fundamentals of the discipline, and in-field certifications signify a level of proficiency adequate for teaching the subject. It is therefore not surprising that most states administer certification exams as an approximate measure of a teaching candidate’s subject-specific content knowledge prior to awarding teaching licenses. While some states have elected to administer state-specific exams,4−7 the Praxis Chemistry Subject Assessment series, administered by Educational Testing Service (ETS) and
used in 39 states over the past decade, is the most nationally representative measure of teacher content knowledge available.8 These licensure exams are offered for several subjects, including chemistry, biology, physics, earth science, mathematics, and computer science. Though some states offer alternative pathways to certification, for most of those hoping to teach high school chemistry, the Praxis Chemistry Subject Assessment serves as one of the final barriers to entry into the profession. The exam is a 100-question multiple-choice instrument that “is designed to measure the knowledge and competencies necessary” for beginning high school chemistry teachers.9 Questions are intended to assess fundamental knowledge of thermodynamics, atomic and nuclear structure, chemical periodicity, and acid−base chemistry. Publicly available exam items are typically at the
level of an introductory college chemistry course, which prospective chemistry teaching candidates are likely to have taken.9 What teachers themselves know about the subject they teach will understandably impact what their students know when they leave the classroom. However, while the importance of teacher content knowledge and its impact on student outcomes have been reported,1−3 just how much subject-specific knowledge chemistry teachers in the United States have as they enter their first chemistry teaching assignment has yet to be established. Determining how knowledgeable incoming chemistry educators are may help ensure that beginning teachers are sufficiently prepared to teach chemistry students from day one. Using Praxis Chemistry Subject Assessment data for test-takers between 2006 and 2016, we sought to quantify the subject-specific content knowledge of novice chemistry teachers. Certification exams like those in the Praxis Subject Assessment series often serve as the final hurdle on the path to becoming a teacher.

Previous reports on the high school chemistry teaching workforce have determined that, over the past two decades, the percentage of females teaching chemistry has come to exceed that of males, while minority groups remain underrepresented in the workforce.10 Additionally, our most up-to-date analyses revealed that the majority of the chemistry teaching workforce still lacks in-field (chemistry) degrees. We wondered whether the pipeline for producing a qualified and diverse chemistry teaching workforce suffers from a lack of qualifications and/or diversity among prospective candidates in the population of Praxis Chemistry Subject Assessment examinees.11 We therefore sought to use Praxis Chemistry Subject Assessment data to additionally determine how the demographic characteristics and performance of those intending to teach chemistry (i.e., those taking the exam) compare with those who likely go on to teach the subject (i.e., successful examinees).
RATIONALE AND RESEARCH QUESTIONS

An understanding of what chemistry teachers in the United States know about their field going into teaching, and of who intends to teach high school chemistry, should help inform future policy measures aimed at improving the preparedness and diversity of the chemistry teaching workforce. Our work analyzes Praxis Chemistry Subject Assessment demographic and performance data for all examinees over the past decade to provide a national picture of who has intended to teach high school chemistry during this time and how they have performed. We investigated (i) the subject-specific content knowledge of those intending to teach high school chemistry and (ii) whether their demographic characteristics were consistent with our previous analyses of the chemistry teaching workforce.10,11 Insights from these findings should help assess whether chemistry teachers in the United States are sufficiently prepared to teach students, whether efforts to encourage underrepresented students to pursue chemistry teaching are observable, and whether equally qualified candidates are also equally likely to become certified. Specifically, we sought to answer the following research questions:

1. What have been the demographic characteristics of Praxis Chemistry Subject Assessment examinees (i.e., those intending to teach high school chemistry) in the past decade? How have these compared with the characteristics of those who are likely to teach chemistry (i.e., successful Praxis Chemistry Subject Assessment examinees)?

2. How have demographic characteristics correlated with Praxis Chemistry Subject Assessment performance in the past decade?
THEORETICAL FRAMEWORK

While teacher quality is certainly multifaceted, the centrality of content knowledge in effective teaching is well-accepted. Shulman’s12 theory on teacher knowledge subcategorizes what teachers should know into three main areas: (i) content knowledge (CK); (ii) pedagogical content knowledge (PCK); and (iii) curricular knowledge. Shulman suggests that teachers should be at least as knowledgeable as a “mere subject matter major”, given that they must be able to teach students both what is so and why it is so in the discipline. PCK and curricular knowledge are described as specialized offshoots of a teacher’s foundational CK, suggesting that teachers are unlikely to be effective without a strong understanding of the content: “[W]hat pedagogical prices are paid when the teacher’s subject matter competence itself is compromised by deficiencies of prior education...?”12 This emphasis on teacher CK is, of course, motivated by a concern for its impact on student outcomes. In fact, several highly cited studies have reported strong correlations between STEM teachers with degrees/majors/minors in the subject being taught (i.e., indicators of subject matter competence) and student achievement in the subject.13−16 Darling-Hammond, for example, reported that fourth and eighth grade students of teachers with in-field majors in the subject being taught outperformed their peers on the National Assessment of Educational Progress (NAEP).13 At the high school level, Sharkey and Goldhaber have shown that 10th grade mathematics students outperformed their peers on a national assessment if their teacher had a bachelor’s degree in the subject as opposed to an out-of-field degree.14 The results of these studies further validate the critical role of CK in effective STEM teaching.
ANALYTICAL METHODOLOGY
Study Sample
All three principal investigators were granted exemptions for this study from the human subjects research committees at their respective institutions. The study population includes all test-takers of the Praxis Chemistry Subject Assessment between June 2006 and May 2016 (i.e., academic years 2006−2015).17 Demographic information was extracted from examinees’ self-reported responses to embedded questions about their demographic backgrounds (e.g., gender, race/ethnicity, undergraduate major, undergraduate GPA) at the time of their exam. Details about the demographic characteristics analyzed in this work are listed in the Supporting Information (Table S1). To protect the privacy of examinees, geographical information is presented at the census region level, grouped by the state postal codes of each test-taker’s home address. The exam uses a 100−200 score scale, where the scaled score is a function of the raw percentage of correct answers and exam difficulty. Only the highest score for repeat examinees was included in our analyses to avoid multiple scores for individuals.18 When demographic-specific analyses were performed on test-takers (e.g., the impact of race/ethnicity on exam performance), only individuals who explicitly responded to those prompts were included in the analysis, regardless of whether or not they answered all of the demographic questions (i.e., pairwise deletion). Only 47.1% of test-takers answered all demographic questions; thus, limiting all analyses to just this population would have substantially (and unnecessarily) reduced the available population size. As a result, each analysis conducted for the results displayed in this work includes a slightly different population. For each analysis, a minimum of 97.3% of the total test-taking population was analyzed, except in the case of undergraduate major, where 78.7% of this population was analyzed. Given that the entire test-taking population from June 2006 to May 2016 who reported particular demographics is represented in our analysis, statistical significance parameters (e.g., p-values) are not reported. These are generally used to support inferential analyses of representative samples, whereas our analysis, which includes the entire population in question, is purely descriptive.
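The score-deduplication and pairwise-deletion choices described above determine which examinees enter each analysis, so a brief sketch may help make them concrete. The snippet below is a minimal, hypothetical illustration in Python/pandas; the column names, values, and data structure are our own and are not the ETS data set or the analysis code used in the study. It keeps only the highest score for each repeat examinee and then applies pairwise deletion for each demographic variable.

```python
import pandas as pd

# Hypothetical records: one row per exam attempt (column names and values are illustrative).
records = pd.DataFrame({
    "examinee_id":    [1, 1, 2, 3, 3, 4],
    "scaled_score":   [148, 157, 162, 151, 149, 170],
    "gender":         ["F", "F", "M", None, None, "F"],
    "race_ethnicity": ["White", "White", "Black", "Hispanic", "Hispanic", None],
})

# Keep only the highest score for repeat examinees (one score per individual).
best = (records.sort_values("scaled_score", ascending=False)
               .drop_duplicates("examinee_id", keep="first"))

# Pairwise deletion: for each demographic analysis, drop only rows missing the
# variable under study rather than requiring complete demographic responses.
by_gender = best.dropna(subset=["gender"])
by_race   = best.dropna(subset=["race_ethnicity"])

print(len(best), len(by_gender), len(by_race))
```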
Standard for Passing

ETS evaluates the psychometrics of each administered Praxis Chemistry Subject Assessment and has made data publicly available for some exams (e.g., Cronbach’s α = 0.90 in 2015).18 For each test form administered, ETS convenes a panel of approximately 20 experienced K−12 and/or college educators who are familiar with the competencies required of beginning high school chemistry teachers.19 Briefly, panelists individually assign each item a score reflecting the probability that a “just qualified candidate” would answer it correctly. After several rounds of discussion among panelists and subsequent revisions, these judgments are summed and averaged to produce a final recommended passing score. This information is disseminated to individual states, which consider the recommendation when setting their own passing standard. A candidate’s scaled score is then compared with the passing criterion (a threshold “cut” score) of the state in which the candidate intends to teach to determine whether the candidate passed the exam. Our analyses focused on all test-takers, regardless of passing status, because information about whether individual candidates were ultimately awarded certification was not made available to us. It was therefore necessary to establish a standard for determining who was likely to have passed the exam. Although 74.7% of prospective chemistry teaching candidates indicated that they intended to teach in the same state where they took the exam, it was necessary to assume a common passing standard for all candidates for modeling purposes. Gitomer20 has set a precedent for using the median cut score of the states and territories that require the exam as a common passing standard. While individual states have the option to change their cut scores, our analysis of publicly available data on individual states’ yearly cut scores over the past decade revealed relatively small differences across years: of the 39 states analyzed, 35 maintained identical cut scores across all 10 years, and the remaining four states had cut scores that varied by no more than 10 points during this time (see Supporting Information, Table S2). We therefore used the median of each year’s median cut score as our standard for passing. Using this method, we found that the median Praxis Chemistry Subject Assessment cut score was a scaled score of 152. To estimate the corresponding raw percentage, we used the median raw percent correct of all individuals who scored exactly at the passing standard of 152; the corresponding percent correct was estimated to be 59.0%.
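The construction of the common passing standard (the median of each year’s median cut score, followed by conversion to an approximate raw percent correct) can be sketched as follows. The cut scores and raw percentages in the example are placeholders rather than the Table S2 values, and the code is an illustration of the procedure, not the analysis actually performed.

```python
import statistics

# Hypothetical cut scores by academic year: {year: [state cut scores]} (illustrative values only).
cut_scores = {
    2006: [151, 152, 152, 154],
    2007: [151, 152, 153, 154],
    # ... remaining years omitted for brevity
}

# The median of each year's median cut score serves as the common passing standard.
yearly_medians = [statistics.median(scores) for scores in cut_scores.values()]
passing_standard = statistics.median(yearly_medians)  # reported value: scaled score of 152

# Estimate the raw percent correct at the standard from examinees who scored exactly there
# (these raw percentages would come from the full record set; values here are illustrative).
raw_percent_at_standard = [57.5, 59.0, 60.0, 59.0]
estimated_percent = statistics.median(raw_percent_at_standard)  # reported value: ~59.0%

print(passing_standard, estimated_percent)
```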
The set of candidate independent variables included demographic attributes of interest such as race/ethnicity, gender, and undergraduate major, as well as the corresponding two-way and three-way interactions, resulting in a total of 469 candidate interactions screened by the model. A stepwise linear regression was employed to determine the best linear model from the entire set of candidate independent variables. The stepwise procedure either enters or removes one variable at each step on the basis of a specified information criterion, such as Akaike’s Information Criterion (AIC)21 or the Schwarz Bayesian Criterion (SBC).22 We chose the SBC as the selection criterion and 10-fold cross-validation (CV) as the stopping criterion; the SBC tends to suggest simpler, lower-dimensional models than the AIC, whereas 10-fold CV is employed to reduce bias introduced by the variable selection technique. The SBC is defined as

SBC = n × ln(SSE/n) + p × ln(n)    (1)

where SSE is the sum of squared errors, n is the sample size, and p is the number of parameters included in the model. The stepwise procedure yields a single optimal model based on the specified criterion; however, there is usually more than one equivalent optimal model with slightly different combinations of effects. It was therefore necessary to manually include several important variables (e.g., race/ethnicity, gender) based on previous experience and knowledge of the field. This process was used to build an aggregate model that identifies which demographic characteristics were most strongly correlated with Praxis Chemistry Subject Assessment performance in the past decade. The top four characteristics or interactions (i.e., combinations of characteristics) most strongly associated with exam performance were included as variables in the final model (and are presented below in the Findings section), which explained 21% of the variation in scores (total η2, Table 1).
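For readers who wish to apply eq 1, the short helper below computes the SBC for a fitted linear model and compares two nested ordinary-least-squares fits on simulated data. All names and data here are illustrative, and this sketch is not the stepwise implementation used in the study (which also employed 10-fold cross-validation as a stopping rule).

```python
import numpy as np

def sbc(y, y_hat, p):
    """Schwarz Bayesian Criterion (eq 1): n*ln(SSE/n) + p*ln(n)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    n = y.size
    sse = np.sum((y - y_hat) ** 2)
    return n * np.log(sse / n) + p * np.log(n)

# Illustrative comparison of two nested models fitted by ordinary least squares.
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=200), rng.normal(size=200)
y = 150 + 4 * x1 + rng.normal(scale=5, size=200)   # simulated scaled scores

X_small = np.column_stack([np.ones_like(x1), x1])        # intercept + x1
X_big   = np.column_stack([np.ones_like(x1), x1, x2])    # adds an irrelevant term

for X in (X_small, X_big):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(sbc(y, X @ beta, p=X.shape[1]))
# The smaller model typically yields the lower SBC, mirroring the SBC's
# preference for lower-dimensional models relative to the AIC.
```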
Table 1. Stepwise Linear Regression Models Including Top Examinee Characteristics Most Strongly Associated with Performance on the Praxis Chemistry Subject Assessment from 2006 to 2016

Stepwise Models                                                                Total η2
Gender·GPA                                                                     0.09
Gender·GPA, UMajor                                                             0.16
Gender·GPA, UMajor, Race/Ethnicity                                             0.20
Gender·GPA, UMajor, Race/Ethnicity, (Planned) Geographic Area to Teach In      0.21

The decision to include just the top four variables was motivated by little to no increase (i.e.,