It’s Just Math: Research on Students’ Understanding of Chemistry and Mathematics

Chapter 8

Transition of Mathematics Skills into Introductory Chemistry Problem Solving

Benjamin P. Cooke1 and Dorian A. Canelas2,*

1Academic Resource Center, Department of Mathematics, Duke University, Durham, North Carolina 27708, United States
2Department of Chemistry, Duke University, Durham, North Carolina 27708, United States
*E-mail: [email protected]

We investigate and discuss the math-chemistry link for a subpopulation of undergraduate learners present at every institution: students who have extremely limited experience with chemistry problem solving or have standardized test math scores in the lowest quartile for their incoming matriculating class. Our institutional experience has been that additional college-level coursework in mathematics, prior to enrollment in chemistry, does not improve student outcomes. In this case study, we created a matrix of problems. Each row of the matrix included three categories of problems: (1) symbolic math algebra problems that involved solving or simplifying an equation with variables such as x and y, (2) word problems using the same type of equation or systems of equations, but with domain-general, everyday terms, and (3) analogous problems using domain-specific chemistry terminology. We explored precourse student responses on these items, final exam scores, and course grade outcomes collected over eight years in a first-semester introductory chemistry course. Most of the students answered 100% of the algebra problems correctly, but very few of the students submitted correct answers for 100% of the analogous word problems, even in the domain-general category. With our instrument, the domain-general word problem outcomes were the best predictor of student success in the course: these were more predictive than the chemistry problems on the instrument. Results are discussed within the frameworks of Johnstone’s triangle and Vygotsky’s zone of proximal development.

Like all natural sciences, the discipline of chemistry occupies space at the intersection of quantitative and qualitative conceptual reasoning. It is not surprising, then, that many studies show a correlation between a student’s high school or college general chemistry grades and their scores on standardized math assessments (1–17). National (6, 10, 11, 18–20) and international (7–9, 12,
13, 21) concerns about learner attrition from science, technology, engineering, and mathematics (STEM) during the first year of college call attention to the urgency of providing suitable scaffolding in introductory coursework. Studies have shown not only that math and chemistry scores correlate, but also that “ability differences in math are important because they affect students’ chemistry competency beliefs” (22). Mastery of even the most rudimentary applied math skills, such as dimensional analysis (23), is crucial to success in introductory quantitative chemistry coursework. Researchers have shown that stoichiometry is a particularly discriminating topic for predicting student success in the first semester of college general chemistry (6, 15), suggesting that being adept at creating and solving stoichiometric ratio problems is an essential factor in novice chemistry learner achievement.

Moreover, fluency in mathematics underpins many more advanced chemistry applications (24–26). For learners who continue in science majors, college-level chemistry and advanced calculus courses are often taken in parallel or in rapid succession, with expectations that students can immediately apply newly acquired mathematical operations, which are often introduced in the abstract, symbolic domain, to physical problems in chemistry. As an illustration, Becker and Towns explored higher-level mathematics skills: student interpretations of partial derivatives in thermodynamics expressions (26). They noted that “more work is needed that explores what resources students bring from mathematics courses, and how chemistry and physics instructors can aid students as they reinterpret mathematics in chemistry contexts” (26).

In light of the correlation between math achievement and chemistry course performance, we in the chemistry education research community must ask ourselves: how well do we understand the nature of this relationship? Moreover, what barriers exist to facilitating the transition of students in the application of math concepts to chemistry problems? It is imperative that the chemistry education community probe this relationship and design activities to smooth the assimilation of math skills into quantitative chemistry problem-solving skills.

As faculty design chemistry courses for the first two years of college, information regarding their particular student body’s science and mathematical experience in high school is particularly relevant. Several math assessments to gather data of this type have been reported in the literature (27–29), including a recent test that has been widely deployed at postsecondary schools in the state of Texas (30). Along these lines, we designed a short quantitative assessment for students in our most introductory course, Introduction to Chemistry and Chemical Problem Solving (described below). SAT or ACT math score and high school chemistry background were already being used as a primary placement guide for first-year undergraduate chemistry classes, but we wanted to take assessments of math skills one step further. We did this by probing the specific links between problem-solving proficiencies on algebra problems, domain-general word problems, and domain-specific chemistry word problems. Our goal was to understand the patterns in the problem solving of our students in this course so that we can develop future learning activities to meet the students where they are. The case study presented herein provides data from a pilot study of that assessment.
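As a concrete illustration of the kind of rudimentary applied math skill referred to here, a basic stoichiometric ratio calculation looks like the following (this worked example is ours and is not drawn from any instrument discussed in this chapter):

```latex
% Illustrative stoichiometric ratio calculation (our example): moles of water
% produced when 3.0 mol of oxygen react completely with excess hydrogen.
\[
2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O},
\qquad
3.0\ \mathrm{mol\ O_2} \times \frac{2\ \mathrm{mol\ H_2O}}{1\ \mathrm{mol\ O_2}} = 6.0\ \mathrm{mol\ H_2O}.
\]
```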

Theoretical Framework

In examining variations in the presentation of mathematically rooted problems, we employ a range of theories from the tradition of constructivism. Vygotsky’s theory of the zone of proximal development (31), scaffolding in learning (32), and generative learning theory (33) play key roles. In addition, we explore the movement of students around the space defined by Johnstone’s triangle (34) from the purely symbolic domain of mathematical expressions to additional symbolic

representations of chemical reaction equations with the addition of the submicro- and macrodomains. The invocation of Johnstone’s triangle in chemical education research has become so widespread that it has rightly found a place as a critical paradigm of our discipline (35, 36). This model was further refined by Mahaffy (37), who created a tetrahedron with the “human element” as the apex of the pyramidal structure.

Methods

Description of the Course

Various aspects of the course, Introduction to Chemistry and Chemical Problem Solving, have been previously described (10, 11, 38–41). Since its inception in 2009, the classroom structure has combined live lectures, chemical demonstrations, and student-centered, cooperative learning activities for small student groups. Creation of this course as part of a larger departmental curriculum revision has led to substantially higher performance and retention of students who matriculate with SAT or ACT math scores in the lowest quartile of scores for their class (11, 41). The department now offers three starting levels of college general chemistry (42); this particular course is designed for students who have a limited background in chemistry but who intend to enroll in additional courses in chemistry.

All sections of the course were taught by the same instructor, and students had access to the same required course materials, such as the textbook (43), online resources (Sakai resources in all years, and Coursera video lectures with embedded questions starting in 2014) (38, 44–46), unit plans with explicit desired learning outcomes (25), graded online homework assignments (WebAssign), and in-class activities drawing from POGIL (47–49), SCALE-UP (50–52), and problem manipulation (53) pedagogies. The class did not have a laboratory component. There were three in-class midterm exams, spaced to cover approximately one-third of the course material at a time; while these covered similar material, every section had a different version of each midterm exam to reduce cheating. All students took a common, cumulative final exam that was administered at the conclusion of the term. These final exams were not returned to students. Exams contained a combination of short answer, multiple choice, and free-response problems. The exam grades (which were out of 100 points) and final course averages were not curved prior to the assignment of letter grades. Rather, as written in the syllabus, letter grades were assigned according to preestablished percentage ranges: 90–100% A; 80–89% B; 70–79% C; 60–69% D; 0–59% F.

In terms of mathematical reasoning in chemistry, the course marches through key qualitative and quantitative concepts that the literature suggests are crucial to success in future chemistry problem solving (6, 15). In all cases, the mathematical concepts are reviewed in the context of chemistry: the first weeks start with reviewing dimensional analysis in the context of medical dosing, and the course moves through stoichiometric ratios and on to more complex ideas such as logarithms and systems of equations.

Description of the Assessment

Three categories of student skills in quantitative problem solving were deemed critical in undergirding the processes used repeatedly in this particular course:

1. Proportions;
2. Unit analysis;
3. Systems of equations.

In each of these categories, three types of problems were constructed:

1. Symbolic math problems;
2. Domain-general word problems;
3. Domain-specific word problems in chemistry.

Thus, a 3 × 3 matrix of problem types was created. The entire quantitative assessment instrument is available, upon request, from the corresponding author. Illustrative examples for the “system of equations” category of problems are shown in Figure 1.

The assessment was the second WebAssign assignment due. Learners had only one submission (as opposed to the five tries normally allowed on homework), and the assignment was marked as extra credit toward the homework total. Assessment directions informed students that they would receive the 2 points of extra credit for submitting their answers regardless of their score on the assessment. Problems were delivered in random order, each student received a selection of 10–15 of the problems, and, while the assignment could be saved, all answers had to be submitted at the same time using the submit button at the end of the assignment.
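Figure 1 contains the actual items; as a purely hypothetical illustration of the general form of the symbolic column in the “systems of equations” row (this example is ours, not an instrument item), consider:

```latex
% Hypothetical symbolic "system of equations" item of the same general form
% (not an actual instrument item): solve for x and y.
\[
\begin{aligned}
x + y  &= 10\\
2x + 3y &= 26
\end{aligned}
\qquad\Longrightarrow\qquad
y = 10 - x,\quad 2x + 3(10 - x) = 26 \;\Rightarrow\; x = 4,\; y = 6.
\]
```

A domain-general analogue would pose the same pair of equations as an everyday word problem (for example, about counts and totals), and a domain-specific analogue would pose them in chemistry language, while the underlying algebra remains identical.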

Figure 1. Examples of items in one category showing symbolic math, domain-general, and domain-specific components of the matrix of problems.

Description of Participants

The participants in this experiment were undergraduates at a large private university in the southeastern United States. All participants were enrolled in the course Introduction to Chemistry and

Chemical Problem Solving during the fall semester. These students completed an online quantitative assessment as part of their normal homework during the first two weeks of the fall semester. The class size ranged from 49 to 95 students, and the quantitative assessment and final exam score data were collected from all sections of the course over an eight-year period and aggregated for the overall statistical analysis. Of the students enrolled in the course, 92% completed the quantitative assessment. The online assignment analyzed was a normal assignment in the course, so no recruitment materials were used to incentivize participation. Investigators did not recruit subjects for this work because it was initially conceived as an internal assessment project, and the team did not want to introduce any selection bias. For all enrolled students, researchers collected the answers on the online assessment assignment, final exam scores, and final course grades for analysis.

Experimental Protocol

Multivariable regression and hierarchical cluster analyses were performed using R statistical software. We used a multivariable regression with the algebra, word, and chemistry subscores as independent variables and the final exam score as the dependent variable. From the 500 individually completed assessments over the eight total years, the final regression analysis included 425 observations (75 observations were deleted because one or more pieces of data were missing). Some small changes to the items deployed in each assessment version occurred over the first three years of the study. Moreover, as it became evident that proportional reasoning was especially important, additional proportional reasoning word problems of both the domain-general and domain-specific type were added over this time. The instrument then remained constant over the last five years of the study.

For the cluster analysis, the 11 problems in the version of the assessment given in the five years between 2013 and 2017 were grouped based on student answers. To make the alluvial diagram in the results section, the students were grouped by the percentage of each type of problem (symbolic, general, or chemistry domain) that they answered correctly: 100%, 50–100%, or less than 50%.

More than 90% of enrolled undergraduate students were of traditional college age (under 24 years old). Most of the students enrolled in the course were either first-year or second-year undergraduates. Due to the characteristics of the populations studied herein, caution must be exercised in attempting to extrapolate the findings to populations of more advanced undergraduate students or graduate or professional students. To protect individual confidentiality and due to the small numbers in some groups, data were not disaggregated using demographics.
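As a minimal sketch of the analyses described above (this is not the authors’ actual script; the file name, column names, and data layout are assumptions made for illustration), the regression and clustering steps could be carried out in R as follows:

```r
# Minimal sketch of the analyses described above; file and column names are assumed.
dat <- read.csv("assessment_scores.csv")   # hypothetical file: one row per student
dat <- na.omit(dat)                        # drop observations with missing data

# Multivariable regression: final exam score on the three assessment subscores
fit <- lm(final_exam ~ algebra + word + chemistry, data = dat)
summary(fit)                               # coefficient table of the kind shown in Table 1

# Hierarchical clustering of the 11 items by student response patterns
# (assumed layout: item columns named item01 ... item11, coded 1 = correct, 0 = incorrect)
item_matrix <- t(as.matrix(dat[, grep("^item", names(dat))]))
hc <- hclust(dist(item_matrix, method = "binary"))
plot(hc)
```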

Results and Discussion

We had two main goals in analyzing the results from the quantitative assessment: (1) to see which problems on the quantitative assessment correlated with success on the final exam, and (2) to
determine which types of problems on the quantitative assessment gave students more difficulty and to analyze for patterns in those problems.

For the first goal, the alluvial plot (Figure 2) shows the flow of correct and incorrect answers on the different problem types. Each path through the diagram from left to right represents one student. Each column represents a given problem type (symbolic algebra, chemistry, or general word problems), and the final column is the final exam score. In the first column of the alluvial diagram, one can see that most of the students answered 100% of the symbolic algebra problems correctly. In contrast, in the third column, one can see that very few of the students submitted correct answers for 100% of the word problems.

The alluvial diagram shows that students who answered more than 50% of each quantitative assessment problem type correctly earned more than half of the total number of A (90+%) and B (80–89%) letter grades on the final exam. In addition, students who either did not answer 50% of the quantitative assessment’s domain-general word problems correctly or did not answer 50% of the quantitative assessment’s chemistry word problems correctly still earned some of the As on the final exam. Also, there were students who answered 100% of the quantitative assessment items correctly in one or more of the categories and who nonetheless earned a C or lower grade on the final exam.

A noteworthy outcome clearly emerging from this study was that the ability to proficiently solve an algebra problem underpinning a chemistry problem does not necessarily translate into the ability to set up that mathematical expression from reading the problem. Indeed, in this case, most of the students had good success solving the purely symbolic math problems. This is consistent with the earlier reported finding that many students feel that their mathematics backgrounds prepared them well for college-level chemistry (54). Indeed, once the students have the proper equation(s) set up, they can solve those equations.

The regression results given in Table 1 show the correlation between each problem category and the final exam score, which is also visualized in Figure 3. This quantitative assessment was given in the first week of class, so it is not, by itself, a good predictor of success on the final exam: it does not take into account the range of student experiences, other courses, conceptual reasoning skills, time management and behavior, or the many other factors that influence a student’s success in transitioning to college. We nevertheless ran a regression to see whether some of the problem types were better correlated with final exam scores than other problem types. In the regression results, we can see that, of the three problem types, the domain-general word problems were most predictive for this linear model. Figure 3 plots the percentage of each problem type correct versus final exam scores: the slope of the fitted line for the domain-general word problems is the steepest. Some professors lament that increasing reliance upon calculators and computers for basic problem solving could be implicated in the general deterioration of the mathematical skills demonstrated by chemistry students (55), but we agree with more recent findings (30) and do not think that is the root of the problem here.
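Written out explicitly, the linear model behind Table 1 and Figure 3 has the familiar multiple-regression form; the coefficient symbols below are ours, and only the intercept estimate is taken from Table 1:

```latex
% Linear model relating the three assessment subscores (percent correct) to the
% final exam score; the beta symbols are ours, and 72.945 is the intercept
% estimate reported in Table 1.
\[
\widehat{\text{FinalExam}} = \beta_0
  + \beta_1\,(\text{symbolic algebra \% correct})
  + \beta_2\,(\text{domain-general word \% correct})
  + \beta_3\,(\text{chemistry word \% correct}),
\qquad \hat{\beta}_0 = 72.945.
\]
```

In this framing, the observation that the domain-general word problems were most predictive corresponds to that term having the steepest fitted slope in Figure 3.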
Any students who cannot solve purely algebraic problems are outside the zone of proximal development for applying algebra concepts to chemical phenomena, and these learners would benefit from additional instruction in algebra (56) before beginning college-level chemistry. The firm correlation between standardized math test scores and learner outcomes in first-semester general chemistry has led many institutions to adopt this strategy (more math before chemistry). However, is this the best approach for all, or even most, college learners?

Figure 2. Alluvial diagram tracing pathways through problem type (symbolic algebra problems, domain-specific chemistry problems, and domain-general word problems) and final exam score outcomes for each student.
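For readers who wish to construct a similar visualization, one possible approach uses the ggalluvial package in R. This is only a sketch under assumptions: the chapter does not state which plotting package was used, and the counts in the toy data frame below are made-up placeholders, not the study data.

```r
# Sketch of an alluvial diagram in the style of Figure 2, assuming the ggalluvial
# package. The counts below are made-up placeholders, not the study data.
library(ggplot2)
library(ggalluvial)

groups <- data.frame(
  symbolic   = c("100%", "100%", "50-100%"),
  chemistry  = c("100%", "<50%", "50-100%"),
  general    = c("50-100%", "<50%", "50-100%"),
  final_exam = c("A", "C or below", "B"),
  n          = c(12, 5, 9)                  # placeholder pathway counts
)

ggplot(groups,
       aes(axis1 = symbolic, axis2 = chemistry, axis3 = general,
           axis4 = final_exam, y = n)) +
  geom_alluvium(aes(fill = final_exam)) +
  geom_stratum() +
  geom_text(stat = "stratum", aes(label = after_stat(stratum))) +
  scale_x_discrete(limits = c("Symbolic", "Chemistry", "General", "Final exam")) +
  theme_minimal()
```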

Table 1. Results of Regression Analysis^a

Residuals:
     Min       1Q   Median       3Q      Max
 -63.744   -5.451    1.361    8.044   18.362

Coefficients:
              Estimate   Std. Error   t Value   Pr(>|t|)
(Intercept)     72.945        2.772     26.316