Investigating the Effect of Complexity Factors in Gas Law Problems

Jennifer D. Schuttlefield,† John Kirk,‡ Norbert J. Pienta, and Hui Tang*

Department of Chemistry, University of Iowa, Iowa City, Iowa 52242-1219, United States
ABSTRACT: Undergraduate students were asked to complete gas law questions using a Web-based tool as a first step in our understanding of the role of cognitive load in chemistry word questions and in helping us assess student problem-solving. Each question contained five different complexity factors, which were randomly assigned by the tool so that a different question was created for each attempt. Data were collected in general or preparative chemistry courses at four universities. The results were analyzed using logistic regression. Based on the student responses, the regression showed that the students' ability to achieve a correct answer for their assigned question was dependent on three of the five complexity factors: number format, volume unit, and temperature unit. The creation of this tool provides a platform for developing a testing process and understanding student difficulties related to cognitive skills in chemistry.

KEYWORDS: First-Year Undergraduate/General, High School/Introductory Chemistry, Chemical Education Research, Problem Solving/Decision Making, Gases

FEATURE: Chemical Education Research
■ INTRODUCTION

Cognitive load is a term that describes the number of information variables in the working memory at a given time during problem solving, thinking, and reasoning.1 According to cognitive load theory (CLT), the working memory can operate on only two to four variables at a time and can hold up to seven variables, often for less than 20 s.2,3 Thus, the working memory is short-term and has limitations when storing and operating on information. In contrast, long-term memory (LTM) is believed to be unlimited in capacity and is where all knowledge and information are stored as schemas.4 Schemas combine variables into organized information. Working memory has no known limitations on retrieving information from the LTM;2,5−9 the latter can help reduce the load on the working memory because all "packages" of information coming from the LTM are apparently treated as one variable.2

To better understand the role of memory, the information variables operated on by the working memory must be qualified. The information being processed depends on the complexity of the content or problem and on the expertise of the learner, that is, how familiar the learner is with the information. For a novice learner, the information may represent unorganized pieces of essential details. For an expert, some information has already been organized into a "chunk" of knowledge via the schemas that arose from prior experience. Therefore, for the same problem, an expert operates on fewer variables than a novice does. This is especially true when the expert has repeated practice with a similar problem, during which the schemas become automated.2

Cognitive load theory describes the learning process and proposes that the best learning occurs when the number of variables in the working memory is optimized.2,5 This permits learners to commit information to long-term memory in the most efficient manner, thus allowing them to build on what they already understand.
These cognitive models are applied in the present case, using an online tool that gives students the opportunity to solve a series of chemistry questions over a range of topics. For this study, we have chosen a basic gas law problem applying Charles's law, as shown in eq 1. As a word problem, it can be summarized as, "Given an initial volume of gas V1 at an initial temperature T1, what is the volume, V2, at a different temperature, T2?"
V1/T1 = V2/T2   (1)
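As a concrete illustration (the numbers here are ours and are not taken from the problems in Box 1 or Box 2 used by the tool), suppose V1 = 250 mL at T1 = 25 °C and the final temperature is T2 = 50 °C, with the answer requested in liters. Both temperatures must first be converted to kelvins before the ratio is taken:

T1 = 25 + 273.15 = 298.15 K;   T2 = 50 + 273.15 = 323.15 K
V2 = V1 (T2/T1) = 250 mL × (323.15 K / 298.15 K) ≈ 271 mL = 0.271 L

Solving the ratio directly in degrees Celsius (250 mL × 50/25 = 500 mL) or dropping the mL-to-L conversion gives an incorrect answer; these are exactly the manipulations that the complexity factors described below are designed to probe.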
Determining or understanding the difficulty of these questions is one of the goals of this study. Some semantic issues about problems and their definitions are worth considering first. Previous reports have sought better differentiation of the terms "problem" and "exercise".10−12 If one knows what to do when processing a question, it is an "exercise"; otherwise, it is a "problem". According to this definition, a problem can only be answered by seeking novel solutions rather than simply using algorithms. Furthermore, Bodner proposed that the difference between an exercise and a problem depends on whether one is familiar with the task.13 However, these two terms are relative. Exercises to an expert may be problems for novices because of the differences in familiarity and difficulty of the same questions to the two groups.14 Our data show (see Results, below) a substantial range of rates of success depending on the exact nature of the question selected from a set of analogous ones that some would casually characterize as exercises. This means that the questions in this study introduced difficulties and complexities to the participants that raised the questions to the level of problems. As a result, we refer to them as problems and to the ability to solve them as problem-solving skills.
Students' ability to solve the problems was monitored as a function of five different complexity factors (gas identity, number format, volume, temperature, and pressure) that were varied within the problem to determine the effect of those different factors. Consequently, the level of difficulty associated with these different factors, the cognitive load that students encountered, and the expertise of students in solving the problem could be investigated. To complete the problem, students were asked to solve for the final volume, V2, of a system in which the pressure and the amount of gas were both held constant while the temperature was varied. Equation 1 was not given to the students, as they were expected to identify the correct equation to use from the instruction on gas laws in their chemistry course. No separate assessment was made of their conceptual understanding of the problem. They were expected to understand the idea, select an approach, and perform the necessary calculations; the tool simply asks for a numerical answer. As part of the variation of complexity factors automatically assigned by the questions tool, students could be asked either to do a unit conversion for volume (e.g., from mL to L, or L to mL) or to know that, to correctly solve for V2, the temperature was required to be in units of K (kelvins).

■ METHODOLOGY

Participants

Data were acquired at four different institutions for a total of 4246 attempts over four years. Of these attempts, 2321 were from students enrolled in preparative chemistry courses: the University of Wisconsin–Milwaukee (UWM, 1452 attempts), the University of Iowa (UI, 794 attempts), and the University of Wisconsin–Platteville (UWP, 75 attempts). Preparative chemistry is a one-semester course taken to fulfill a requirement or to gain entry into a two-semester sequence. Another 1925 attempts were from general chemistry courses (i.e., in the first semester of a two-term sequence): Iowa State University (ISU, 1761 attempts) and the University of Iowa (UI, 164 attempts). All subjects had learned gas laws in lecture before participating in this study. Students at UWM and ISU were required to perform the problem for homework credit, whereas students at the other universities were encouraged but not required to use the online tool as practice for an exam.

Instruments

The Web-based tool was created using Adobe Macromedia Flash software and was implemented within various browsers via the plug-in available from Adobe. The Flash plug-in is commonly available on most popular browsers. A student would be given the URL that hosted the tool and a user ID to enable access to the tool. Two different implementations of identification were used: either (i) a single user ID for an entire class that just identified the school and term or (ii) an individual user ID for each student; the latter was necessary to give individual students homework credit for completing the task. For each problem completed, the software archives all of the variables and student activities within the tool. This includes each complexity factor variable, the numbers associated with the randomly varied complexity factors in the problem (i.e., volumes, temperatures, and pressure), the steps performed on the online calculator, the answer submitted by the student, and whether the answer was correct. The students were required to use the provided calculator in order to submit their answers for grading. Because they did this work on their own, the students could use external sources (e.g., a personal calculator or textbook) to aid in their attempt to solve the problem.

Following student login, the Flash software tool assigns a word problem by randomly picking among the variables of the five complexity factors. The possible variables for each complexity factor are listed in Table 1.

Table 1. Variables in Each Complexity Factor

Gas Identity:    An ideal gas; A mixture of gases; An unknown, ideal gas
Number Format:   General; Scientific notation; Decimal
Vol:             L to L; mL to L; L to mL; mL to mL
Temp:            K to K; °C to K; K to °C; °C to °C
Pressure:        atm; Torr; Blank (no unit)

Two questions demonstrate the randomized assignment of the variables (Box 1 and Box 2). The text in bold type is used here to identify the complexity factors; the bold type did not appear in the problems to the students. For gas identity, three different possibilities were used. A student's problem could begin with "An ideal gas", "Ideal mixtures of gases (methane, carbon dioxide, and oxygen)", or "An unknown, ideal gas with a molecular weight of X g/mol", where X was 32, 28, 48, 44, 46, 30, 64, 80, 62, 20, 40, or 16. For number format, the volumes were given as one of three different possibilities: in a general number format (e.g., 1.23), in decimal format (e.g., 0.0012), or in scientific notation (e.g., 1.23E6). For the volume and temperature complexity factors, four different combinations of units were used. The volume complexity factor had the following possibilities: mL to mL, L to L, L to mL, and mL to L; the first unit represents the volume of the initial state, V1, while the second unit is for the final state, V2. The temperature complexity factor had similar possibilities: °C to °C, °C to K, K to °C, and K to K. The last complexity factor, pressure, also had three different options. The pressure was always referred to as constant, but stated in different ways. One option stated that the pressure was constant
(e.g., "pressure of the system is maintained at a constant value"), while the other two options added a numerical value for the pressure in either atmospheres (atm) or Torr (e.g., "pressure of the system is maintained at a constant value of 1.2 atm").
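To make the randomization concrete, the sketch below shows one way the assignment described above could be reproduced. This is our illustration, not the authors' Flash implementation; the helper names, the numeric ranges, and the reuse of 1.2 as the example pressure value (taken from the sample wording above) are assumptions of the sketch.

```python
import random

# Factor levels taken from Table 1; the value ranges below are arbitrary
# stand-ins chosen only to produce readable example questions.
GAS_IDENTITY = [
    "an ideal gas",
    "an ideal mixture of gases (methane, carbon dioxide, and oxygen)",
    "an unknown, ideal gas with a molecular weight of {mw} g/mol",
]
MOLECULAR_WEIGHTS = [32, 28, 48, 44, 46, 30, 64, 80, 62, 20, 40, 16]
NUMBER_FORMAT = ["general", "scientific", "decimal"]
VOLUME_UNITS = [("L", "L"), ("mL", "L"), ("L", "mL"), ("mL", "mL")]
TEMP_UNITS = [("K", "K"), ("°C", "K"), ("K", "°C"), ("°C", "°C")]
PRESSURE = ["blank", "atm", "Torr"]

def fmt(x, style):
    # Render a number in one of the three number formats used by the tool.
    return {"scientific": f"{x:.2E}", "decimal": f"{x:.4f}"}.get(style, f"{x:.3g}")

def report_temp(unit, t_kelvin):
    # Report a temperature (generated internally in kelvins) in the requested unit.
    return round(t_kelvin - 273.15, 1) if unit == "°C" else round(t_kelvin, 1)

def make_question(rng=random):
    # Draw one level from each of the five complexity factors and fill in numbers.
    gas = rng.choice(GAS_IDENTITY).format(mw=rng.choice(MOLECULAR_WEIGHTS))
    style = rng.choice(NUMBER_FORMAT)
    v_unit1, v_unit2 = rng.choice(VOLUME_UNITS)
    t_unit1, t_unit2 = rng.choice(TEMP_UNITS)
    p = rng.choice(PRESSURE)
    v1 = fmt(rng.uniform(0.5, 9.5), style)
    t1 = report_temp(t_unit1, rng.uniform(250, 400))
    t2 = report_temp(t_unit2, rng.uniform(250, 400))
    p_text = "a constant value" if p == "blank" else f"a constant value of 1.2 {p}"
    return (f"A sample of {gas} occupies {v1} {v_unit1} at a temperature of "
            f"{t1} {t_unit1}. The pressure of the system is maintained at {p_text}. "
            f"What is the volume, in {v_unit2}, at {t2} {t_unit2}?")

print(make_question())
```

Each call to make_question() draws one level from each of the five factors, which is what produces the 432 distinct problem types discussed under Data Analysis.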
Table 2. Likelihood Ratio Test of Preparative Chemistry Students' Data

Effect           −2 Log Likelihood of Reduced Model   χ2 Value   Degrees of Freedom   p Value
Intercept        1124.000                             0.000      0
Gas identity     1125.000                             1.597      2                    0.450
Number format    1130.000                             5.822      2                    0.054
Vol              1204.000                             80.262     3                    0.000
Temp             1129.000                             4.644      3                    0.200
Pressure         1126.000                             2.395      2                    0.302

Data Analysis
Based on the number of possibilities for each complexity factor described above, there are a total of 432 (3 × 3 × 4 × 4 × 3) different combinations of the gas law problem. The difficulty of each of these different variables (e.g., L to L) was determined by logistic regression. We predicted that instances of complexity variables in different combinations would involve different levels of cognitive load. For example, if all the other complexity variables are identical, changing the unit in volume (e.g., mL to L) should impose more cognitive load than no unit change (e.g., L to L), because converting among measurement units introduces complexity.15,16 All analyses were done using the Statistical Package for the Social Sciences (SPSS, version 17 for Windows). Logistic regression was used to determine the statistical relevance of the different complexity factors in the gas law problem as well as the overall difficulty of each complexity factor. (See the Supporting Information.) Results from the preparative chemistry and general chemistry courses were analyzed separately. The use of logistic regression in chemistry has not been as widespread as in other disciplines. In one published example, logistic regression was used to examine the success of students in general chemistry based on diagnostic testing.17 Thus, Legg et al. showed that logistic regression could be used to better advise students by predicting student success in general chemistry based on their placement exam (i.e., diagnostic test) scores. We use logistic regression here to analyze student data acquired from the Web-based tool, examining the role of each variable and developing a method to look at the role of cognitive load in the questions. This Web-based tool and the accompanying statistical analyses examine students' abilities to correctly answer word problems by exposing the students simultaneously to a large number of variables in these chemistry problems. This use of technology allows for data collection that would not be easily accomplished with paper and pencil, as well as capturing students' steps in the problem-solving process. The regression method identifies individual complexity factors within the gas law problem that led to student success or failure, as well as some information about the types of errors the students produced. A more detailed description of logistic regression and its application in this study can be found in the Supporting Information.
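For readers who want to reproduce this kind of analysis outside SPSS, the sketch below shows an equivalent binary logistic regression in Python with statsmodels. It is illustrative only: the file name and column names are hypothetical, and the factor level labels must match whatever coding is used in the data.

```python
# Illustrative re-analysis sketch (the published analysis was run in SPSS, not this code).
# Assumes a hypothetical CSV with one row per attempt: the five complexity factors as
# text columns plus a 0/1 'correct' column.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gas_law_attempts.csv")  # hypothetical file name

# Binary logistic regression with categorical (dummy-coded) complexity factors;
# the reference level of each factor plays the role of the beta = 0 baseline.
model = smf.logit(
    "correct ~ C(gas_identity) + C(number_format) + C(volume) "
    "+ C(temperature) + C(pressure)",
    data=df,
).fit()

print(model.summary())        # beta values, standard errors, p values
print(np.exp(model.params))   # e**beta for each coefficient (odds relative to the reference level)

# Predicted probability of success for one combination of factor levels
# (level names here are hypothetical and must match those in the data file):
probe = pd.DataFrame([{"gas_identity": "ideal", "number_format": "scientific",
                       "volume": "L to mL", "temperature": "C to C",
                       "pressure": "Torr"}])
print(model.predict(probe))   # p = e**(x.beta) / (1 + e**(x.beta))
```

For a fitted coefficient β, e^β is the odds ratio relative to the factor's reference level, and a predicted probability of success follows as p = e^(xβ)/(1 + e^(xβ)); this is how the parameter estimates reported below translate into the relative difficulty of each variable.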
■ RESULTS

Preparative Chemistry

The preparative chemistry students answered 54% of the attempts correctly. The likelihood ratio test was performed as part of the logistic regression analysis and used to determine the significant complexity factors for the overall data set and to measure the deviance of the data from the logistic model (Table 2). In Table 2, the complexity factors, the model fitting criteria (i.e., the −2 log likelihood of the reduced model), the likelihood ratio test (χ2 statistic), the degrees of freedom for each complexity factor, and the significance for each factor are listed. The criterion for the level of significance is p < 0.05 for a complexity value to have a significant effect on the model.

This analysis shows that the complexity factor volume is significant within the data set, while number format is marginally significant (p = 0.054) in affecting the ability of a student to obtain a correct answer. A more detailed analysis of the data was done using the parameter estimates model in the logistic regression output. This allows determination of significant variables within a particular complexity factor category, not just the overall significance of the individual complexity factor. In addition, this model allows for the level of difficulty of each complexity factor variable to be determined. The parameter estimate results from the regression are shown in Table 3.

Table 3. Parameter Estimates of Preparative Chemistry Students' Data

Complexity Factor   Variable              β Value   Standard Error   p Value
Intercept                                  0.852    0.162            0.000
Gas identity        Ideal gas              0.028    0.112            0.804
                    Mixture gas           −0.093    0.117            0.428
                    Unknown gas            0.000    (reference)
Number format       Decimal (0.035)       −0.121    0.112            0.284
                    Scientific notation   −0.234    0.097            0.016
                    General (1.68)         0.000    (reference)
Vol                 mL to mL              −0.495    0.165            0.003
                    mL to L               −0.405    0.117            0.001
                    L to mL               −0.894    0.101            0.000
                    L to L                 0.000    (reference)
Temp                K to K                −0.019    0.138            0.889
                    °C to °C              −0.223    0.126            0.076
                    °C to K               −0.044    0.124            0.726
                    K to °C                0.000    (reference)
Pressure            Torr                  −0.169    0.113            0.134
                    atm                   −0.053    0.104            0.607
                    Blank (no unit)        0.000    (reference)

The complexity factors and variables, as well as the calculated β values with standard error, are included. Note that, just as in the case with linear regression, β values are proportionality constants that denote the magnitude of the effect of each of the multiple variables in the fit. The calculated β values represent the relative difficulty of each variable within a single complexity factor group. The convention used throughout the data analysis is that more negative β values correspond to more difficult components of the word problem. For each complexity factor group, one variable is set as a reference. The β value for the reference variable is set to zero by the statistics package, and all other values are calculated relative to that value. This allows for quick
and easy identification of the most difficult variables. For example, compared with L to L, the volume unit conversion L to mL decreases β from 0 to −0.894; the corresponding odds ratio for successfully solving the problem, e^β (the odds being defined as the probability p divided by 1 − p), decreases from 1 to 0.409. Thus, "L to mL" is more difficult than "L to L".

From the parameter estimate results shown in Table 3, the logistic regression gave an intercept value of β0 = 0.852 ± 0.162. The intercept is defined as the total difficulty when all of the reference complexity factor variables are present in the word problem. The large positive value of β0 indicates that, for these specific complexity factors, the problem is relatively easy, which is confirmed by the percentage of attempts answered correctly (i.e., 54%). Therefore, the "basic" problem is reasonably easy by virtue of how it was defined, and any change to the problem using the other variables in each group makes the problem more difficult.

Within the different complexity factor groups, the calculated β values are used to determine the difficulty of each individual complexity factor variable. For the gas identity and pressure complexity factors, the differences among variables did not show a significant level. For the temperature complexity factor, the °C to °C variable was the most difficult; this is perhaps not as surprising as the volume complexity factor results, as the °C to °C variable offers no hint of a necessary conversion to kelvins in order to correctly answer the question.

To evaluate the degree of accuracy of the logistic regression model described above, a 50-50 cross-validation was conducted. The 2321 attempts were randomly divided into two subsets: a training sample containing 50% of the attempts and a holdout sample containing the remaining 50%. The former was used to establish the logit equation, while the latter subset was plugged into the derived equation to test the accuracy of the prediction from the model. The analysis showed that the accuracy rates were 50.3% and 58.4% for the training and holdout samples, respectively. Because 58.4% > 45.3% (45.3% = 50.3% × 0.9, the minimum requirement for the holdout sample), the classification accuracy for the analysis of the full data set was supported.

General Chemistry

The general chemistry students had a higher overall rate of success than the preparative chemistry group (71% vs 54%). The parameter estimates for the general chemistry data set are shown in Table 5. Within the number format complexity factor, the scientific notation variable was again determined to be significantly different from the general or decimal formats. For the temperature complexity factor, the results showed that only one variable, °C to °C, was found to be significantly different from the baseline, similar to °C to °C in the preparative chemistry data set, which was also determined to be significant. In a manner similar to the results of the preparative chemistry data, all the other variables in the volume complexity factor were significantly different from the easiest variable, L to L. The variables are ranked from easiest to hardest: L to L, mL to mL, mL to L, and L to mL. The β value for the L to mL variable is more negative than the β value for the mL to L variable by a significant amount. Once again, students had more difficulty with the conversion from L to mL than they did with the mL to L conversion. Unlike the preparative chemistry data set, a significant difference was found in the pressure complexity factor.
One variable, atm, was found to be significantly easier than the reference variable. In other words, when given the unit of pressure in atm, the problem was easier to these students than when no unit was mentioned. This may be due to students having more experience or familiarity using atm as the unit of pressure for gas laws.

Table 5. Parameter Estimates of General Chemistry Students' Data

Complexity Factor   Variable              β Value   Standard Error   p Value
Intercept                                  1.770    0.198            0.000
Gas identity        Ideal gas             −0.067    0.128            0.599
                    Mixture gas           −0.047    0.129            0.716
                    Unknown gas            0.000    (reference)
Number format       Decimal (0.035)       −0.081    0.129            0.530
                    Scientific notation   −0.322    0.126            0.011
                    General (1.68)         0.000    (reference)
Vol                 mL to mL              −0.494    0.162            0.002
                    mL to L               −0.831    0.153            0.000
                    L to mL               −1.362    0.149            0.000
                    L to L                 0.000    (reference)
Temp                K to K                −0.029    0.153            0.850
                    °C to °C              −0.380    0.145            0.009
                    °C to K               −0.071    0.147            0.630
                    K to °C                0.000    (reference)
Pressure            Torr                   0.152    0.126            0.227
                    atm                    0.267    0.127            0.035
                    Blank (no unit)        0.000    (reference)

Table 6. Simplest and Most Difficult Combinations of Complexity Factors in the Questions

Rank of Question Difficulty   Course Level            β Value   e^β     Probability of Success, %
Simplest question             Preparative chemistry    0.88     2.41    70.7
                              General chemistry        2.037    7.67    88.5
Most difficult question       Preparative chemistry   −0.761    0.467   31.8
                              General chemistry       −0.361    0.697   41.1

■ DISCUSSION

Tables 3 and 5 demonstrate several complexity factors that significantly affected students' abilities to correctly answer the gas law word problem: scientific notation of number format, all volume variables, and °C to °C of the temperature variables. In addition, atm of the pressure variables was determined to be significantly different from the reference variable for students in general chemistry courses. These results support the prediction that different combinations of complexity variables are accompanied by different levels of cognitive load. For example, the easiest combination of the five complexity factors for preparative chemistry includes ideal gas, general numbers, L to L, K to °C, and blank (i.e., no pressure unit). For general chemistry, unknown gas, general numbers, L to L, K to °C, and atm represent that group. On the other hand, the most difficult variables for preparative chemistry are mixture of gases, scientific notation, L to mL, °C to °C, and Torr; for general chemistry, the combination includes ideal gas, scientific notation, L to mL, °C to °C, and blank (no pressure unit). In both groups, manipulations of scientific notation and unit conversions introduced additional components to remember and manage, which added cognitive load during student problem solving.

Table 6 shows how student performance was influenced by cognitive load when the two extreme combinations of the complexity factors were compared. Johnstone pointed out that when the pieces of information in a chemistry question increased from 5 to 6, the proportion of students correctly answering the question decreased dramatically; the author attributed this effect to the overload of students' working memory.18 This finding is consistent with and can be explained by CLT. In our study, each variable in a complexity factor does not necessarily equal one incremental piece of information. The quantitative relationship between the significant variables and the additional number of pieces of information being processed needs further investigation. However, the differences in probabilities of success in the easiest and hardest questions (Table 6) suggest that the pieces of information in the most difficult combinations must have exceeded the maximum number of elements that could be stored and operated on in student working memory. From the results, we observe the following:

1. Some complexity factor variables did not contribute significantly to difficulty and/or to the cognitive load of the question; for example, the description of the ideal gas that lists several gases in the mixture, or a molecular weight of an unknown gas, produced no significant differences.

2. Some complexity factor variables added either significant difficulty or cognitive load to the question; all of these appear to be related to the calculational aspects of the question.

3. The pieces of information in the most difficult questions overloaded students' working memory, as evidenced by the markedly lower success rates.

Table 6 also shows that general chemistry students performed better than the preparative chemistry group. The former had a significantly higher overall success rate than the latter (71% vs 54%). Many students enrolled in preparative chemistry are not science majors, while general chemistry students in this study are more science- or engineering-focused and often have higher admission scores in mathematics and science than their preparative chemistry counterparts. As a group, the general chemistry students did better on the easiest and hardest questions, although the factors that defined those extremes were generally identical. The rates of success for the easiest questions for both groups would certainly confirm their characterization as exercises. However, the most difficult questions confounded both groups, with evidence of the role of cognitive load in this difficulty.

In Table 7, we attempt to estimate a quantitative increment in cognitive load for each of the variables used by the tool, based on our original hypothesis yet modified based on a Bayesian interpretation of the outcomes. The last column in Table 7 contains a cognitive load item (0 = no additional load, 0.25 = small effect, 0.50 = medium effect, 1 = large effect). For each of the 432 possible word problems, a cognitive load increment can be calculated by adding the values in the last column in Table 7. The resulting cognitive load increment ranges from 0.00 to 2.50, with the larger numbers representing higher cognitive load. Note that this increment is part of the total cognitive load of the task. The other contributors to the overall cognitive load include processes such as setting up a gas
law equation and figuring out an expression of the final volume. Statistics show very high correlations (p < 0.01) between the cognitive load increment and student performance (success odds, i.e., e^β), with r = −0.925 for preparative chemistry and r = −0.844 for general chemistry. Based on these definitions, the cognitive load increment of the sample problem in Box 1 is 0, while that in Box 2 is 2.25.

Because students solved the problems in a relatively uncontrolled environment, direct measurement of cognitive load may be confounded by student use of external material, such as scratch paper or textbooks, which may have extended their short-term memory capacity.19 Collecting the data "in the wild" enabled a large enough sample size to establish this method for examining a large number of variables simultaneously, something not possible with "think aloud" or "questions on paper" strategies. The general utility of this method has been confirmed using the same tool by our group recently to study a set of stoichiometry word problems. In that case, the complexity factors number format and unit significantly affected students' abilities to correctly solve the problem.

Table 7. Cognitive Load Items Assigned to Each Variable in the Questions

Complexity Factor   Variable              e^β (Preparative Chemistry)   e^β (General Chemistry)   Cognitive Load Item
Gas identity        Ideal gas             1.028                         0.935                     0.00
                    Mixture gas           0.911                         0.954                     0.25
                    Unknown gas           1.000                         1.000                     0.25
Number format       Decimal               0.886                         0.922                     0.25
                    Scientific notation   0.791                         0.725                     0.50
                    General               1.000                         1.000                     0.00
Vol                 mL to mL              0.610                         0.610                     0.50
                    mL to L               0.667                         0.436                     1.00
                    L to mL               0.409                         0.256                     1.00
                    L to L                1.000                         1.000                     0.00
Temp                K to K                0.981                         0.971                     0.00
                    °C to °C              0.800                         0.684                     0.50
                    °C to K               0.957                         0.931                     0.00
                    K to °C               1.000                         1.000                     0.00
Pressure            Torr                  0.845                         1.164                     0.25
                    atm                   0.948                         1.306                     0.00
                    Blank (no unit)       1.000                         1.000                     0.00

■ CONCLUSIONS

An online assessment tool was developed to investigate problem difficulty in undergraduate chemistry courses using a series of gas law questions. The tool identified individual complexity factors that affect the ability of a student to correctly answer a problem. Logistic regression was used to determine the significance of the five different factors that were randomly varied for each question generated. Three factors were consistently determined to significantly affect students' ability to correctly solve the gas law problem: the number format, volume, and temperature. In addition, we found different levels of complexity for specific unit conversions. Thus, the order of complexity in the volume variable ranges from low to high: L to L, mL to mL, mL to L, and L to mL. These results were observed across institutions and at different levels of introductory chemistry. The use of the online tool and the data acquired in this study provides a novel, detailed method to test student understanding in the problem-solving process and takes a first step toward investigating the role of cognitive load in introductory chemistry courses.

■ ASSOCIATED CONTENT

*Supporting Information
Description of logistic regression and its application in this study. This material is available via the Internet at http://pubs.acs.org.

■ AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected].

Present Addresses
†Department of Chemistry, University of Wisconsin Oshkosh, Oshkosh, Wisconsin 54901, United States
‡Department of Chemistry, University of Wisconsin–Stout, Menomonie, Wisconsin 54751, United States

■ ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grants DUE CCLI 06-18600 and DUE CCLI 08-17279. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors would like to thank Ron Erickson (UI), Luke Haverhals (UI), John Picione (UWM), Thomas Holme (UWM and ISU), Sofia Carlos Cuellar (UWP), and Ramon Cuellar (UWP), who allowed the problems to be used in their classes, as well as the students at these universities for their participation.

■ REFERENCES

(1) Sweller, J.; Chandler, P. Cognit. Instr. 1994, 12, 185−233.
(2) van Merriënboer, J. G.; Sweller, J. Educ. Psychol. Rev. 2005, 17, 147−160.
(3) Miller, G. A. Psychol. Rev. 1956, 63, 81−97.
(4) van Merriënboer, J. G.; Ayres, P. Educ. Technol. Res. Dev. 2005, 53, 5−13.
(5) Brünken, R.; Plass, J. L.; Leutner, D. Educ. Psychol. 2003, 38 (1), 53−61.
(6) Grimley, M. Educ. Psychol. 2007, 27 (4), 465−485.
(7) Sweller, J. Cognitive Sci. 1988, 12, 257−285.
(8) Sweller, J. Evolution of Human Cognitive Architecture. In The Psychology of Learning and Motivation; Ross, B. H., Ed.; Academic Press: New York, 2003; Vol. 43.
(9) Ericsson, K. A.; Kintsch, W. Psychol. Rev. 1995, 102, 211−245.
(10) Zoller, U. J. Chem. Educ. 1987, 64 (6), 510−512.
(11) Bodner, G. M. J. Chem. Educ. 1987, 64 (6), 513−514.
(12) Frank, D. V.; Baker, C. A.; Herron, J. D. J. Chem. Educ. 1987, 64 (6), 514−515.
(13) Bodner, G. M. Univ. Chem. Educ. 2003, 7, 37−45.
(14) Bunce, D. M. J. Res. Sci. Teach. 1991, 28 (6), 505−521.
(15) Robinson, W. R. J. Chem. Educ. 2003, 80 (9), 978−982.
(16) Dori, Y. J.; Hameiri, M. J. Res. Sci. Teach. 2003, 40, 278−302.
(17) Legg, M. L.; Legg, J. C.; Greenbowe, T. J. J. Chem. Educ. 2001, 78 (8), 1117−1121.
(18) Johnstone, A. H. J. Chem. Educ. 1984, 61 (10), 847−849.
(19) McCalla, J. J. Chem. Educ. 2003, 80 (1), 92−98.