Article pubs.acs.org/jchemeduc

The Development of Multiple-Choice Items Consistent with the AP Chemistry Curriculum Framework To More Accurately Assess Deeper Understanding

John M. Domyancich*
Science Department, St. Bede Academy, Peru, Illinois 61354, United States

ABSTRACT: Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward the assessment of these more advanced learning outcomes. This article describes the attributes of multiple-choice questions that assess these skills and thus coincide with the AP chemistry curriculum framework. Also, methods for updating existing questions to conform to these goals are described. From this, teachers of AP chemistry may adapt their current assessments to more accurately measure these desired learning objectives and skills. This contribution is part of a special issue on teaching introductory chemistry in the context of the advanced placement chemistry course redesign.

KEYWORDS: First-Year Undergraduate/General, Curriculum, Testing/Assessment, Professional Development

The advanced placement (AP) chemistry curriculum framework has presented all teachers, even veteran ones, with a unique set of challenges to the way they prepare their students for the AP chemistry exam. The goals of the "redesign", as the framework is commonly known, reflect themes common in modern science education: reduce the scope of content, define the content that is essential, and create opportunities for inquiry-based lab experiences that foster deeper conceptual understanding of several key themes. This is being accomplished through exclusion statements that identify content that will not be assessed on the exam, a list of learning objectives (LOs) with which each exam question is aligned, and a lab manual that provides 16 guided-inquiry labs.1 These changes are shifting the way many AP chemistry teachers prepare their students and conduct their lessons, but the driving force behind them is the exam itself. The College Board describes the development of the course as a process of "backward design" in which the overall course goals and learning objectives drive the development of the curriculum.2 The AP chemistry exam is written so that items are designed specifically to assess these desired outcomes. This practice is not necessarily novel, yet it clearly shows that the AP exam shapes the curriculum. Therefore, a teacher's role in developing an effective curriculum must include the use of assessments that are clearly in line with the format and content of the actual AP exam. In order to prepare students effectively, it is likely good practice for teachers to model the assessments used in the classroom after the AP chemistry exam. This approach relies on the release of exams by the College Board. Fortunately, the free-response questions and their scoring guidelines are released every year, and several full-length exams, including the multiple-choice section, had been released before the redesign. This provided teachers with a pool of authentic questions to use or adapt for their own assessments. However, with the new curriculum, the exam has undergone significant changes that have forced teachers to modify their existing assessments substantially. This is especially problematic for multiple-choice items, whose format and emphasis have shifted to such a degree that items that fit well with the old curriculum now require extensive revision or even omission. The College Board has released one full-length practice exam in preparation for the redesign. However, this provides a relatively limited sample of 60 multiple-choice items, considering that there are 117 LOs. Fortunately, the examples provided do offer some key insights that can be used to update existing questions or develop new items.

© 2014 American Chemical Society and Division of Chemical Education, Inc.



EVALUATING AN ITEM'S COGNITIVE DEMAND

Multiple-choice items consist of two primary parts: the stem, in which the question and any information needed to solve it are provided, and the choices, which include the correct response as well as incorrect choices, known as distractors. While many variations of these items exist with respect to format, they can generally be placed into one of the following categories based on cognitive demand, as described by Zoller:3
• Algorithmic: These items require the application of a memorized routine.

Special Issue: Advanced Placement (AP) Chemistry. Published: July 16, 2014.

dx.doi.org/10.1021/ed5000185 | J. Chem. Educ. 2014, 91, 1347−1351


• Lower-order cognitive skills (LOCS): These items test recall or the application of knowledge in a familiar situation.
• Conceptual: These items require the demonstration of a basic understanding of scientific themes.
• Higher-order cognitive skills (HOCS): These items require the ability to link concepts in an unfamiliar situation.

A Shift in Focus

Before the redesign, AP chemistry exams included many multiple-choice items assessing the first two levels. Examples include naming compounds, balancing reactions, predicting solubility, simple stoichiometric calculations, and descriptive chemistry. While these are fundamental aspects of a student's chemical knowledge, they do not necessarily assess the desired learning outcomes of the redesign. The goal of the curriculum framework is to develop deeper understanding that is both "enduring" and "conceptual".4 The primary issue with the widespread use of such items is that success on algorithmic and LOCS questions has been shown not to indicate conceptual understanding.3 The redesigned exam, however, shows a shift in item types that indicates an intent to assess conceptual and higher-order cognitive skills: away from predictable items that students can answer quickly using algorithmic or memorized factual knowledge, and toward items that require applying understanding to a novel situation in which the provided information must be analyzed in a way that demonstrates deeper learning.

Identifying Lower-Order Items

Because of the redesign, teachers must now closely re-evaluate each multiple-choice question they use, determine how well it aligns with the new curriculum, and decide what modifications, if any, should be made to improve it. The primary challenge is striking a balance between assessing the foundations of chemical knowledge that all successful students must have and assessing the ability to connect concepts and reason scientifically that the curriculum framework espouses. When the goal is to assess the latter skills, some key elements make an item consistent with those seen in the redesigned exam. When one considers the relative ability level of the students, items can easily be identified as algorithmic or LOCS. For example, the following item requires a simple application of the definition of an exothermic process:

Even adaptations of this question, while more challenging, still require the same skill:

This second question type is known as a "complex multiple-choice" and was a format used before the redesign; the 2008 AP chemistry practice exam contained five such items.5 It gets more "mileage" than the previous question because it allows several questions to be asked within a single item. However, what is required of the student remains the same. And while this type of question may be more difficult, it provides no additional differentiation compared with a conventional multiple-choice item and simply requires more time to complete.6

USING CONTEXT-DEPENDENT ITEMS

If the objective is to assess a student's ability to identify an endothermic or exothermic process and demonstrate conceptual understanding, the key is to present a novel situation that requires a closer look at the stem as well as the choices:

This problem possesses some features, lacking in the previous two, that place it more in the conceptual and HOCS categories. This type of question is known as a context-dependent item and is becoming increasingly popular under the redesign. These questions usually include a graph, chart, table, scenario, or visual representation in the stem and allow the assessment of higher-level thinking and problem-solving skills.7 In order to solve the problem, the student must first identify which data are necessary and ignore those which are not. Second, the data must be analyzed; in this case, the initial and final temperatures must be compared. Lastly, the choices include not only an initial answer but also an explanation.

Including Explanations in the Choices

A widely held belief is that multiple-choice items lack the ability to assess conceptual understanding and scientific reasoning and are generally appropriate only for measuring lower-order cognitive skills such as recall of facts or procedural knowledge.8 Attempts have been made to create alternative multiple-choice items that seek to measure conceptual understanding. Generally, these approaches ask the test taker to either choose from several possible explanations for a corresponding scientific fact9 or use a two-tier approach in which an initial item requires the demonstration of knowledge and the following question asks for the corresponding explanation.10 The latter method can be combined into a single item, as is the case here. While choice (C) is the correct response, the distractors serve an important purpose: they appeal to students who hold a common misconception, interpret data incorrectly, or fail to translate between various representations. Choice (A) addresses a common misconception among students that the water is the system, and because it is absorbing energy, the process must be endothermic. The use of misconceptions in distractors has been shown to be an effective method of assessing the development of scientific understanding.9 Choice (B) is enticing because it correctly describes the endothermic nature of breaking bonds but ignores the exothermic aspect of the ion−dipole interaction. Finally, choice (D) incorrectly equates spontaneity with enthalpy while ignoring the entropic contribution. While it is well established that using plausible distractors is good practice in developing multiple-choice items,7 the use of explanations that require the demonstration of conceptual understanding beyond the initial choice (in this case, endothermic or exothermic) provides further discrimination. A criticism of this question may be that if a student recognizes the dissolution as exothermic, then he or she has a 50% chance of guessing correctly. While this may be true, getting to that point still requires a correct analysis of experimental data; considering that the previous two questions stopped at this level, this shows the advanced nature of the item. If this remains a concern for the item writer, the question can be changed to include only explanations.

Item Sets

A disadvantage of context-dependent items, in general, is the large space requirement and the increased time demand on the test taker.7 However, this can be offset by using the stem as part of an item set. The redesigned exam format includes several subsections of the multiple-choice section in which a table of data or a diagram is presented and students use it to answer two to six questions.2 Therefore, several more questions may be written to accompany this stem, such as

As most, if not all, of the questions in the set are related to the same information, the time demands, as a whole, are diminished. Context-dependent questions are a significant portion of the redesigned exam. An analysis of the 2013 released AP chemistry practice exam showed that, of the stems of the 60 multiple-choice items, 24 referred to a table of data, 7 to a graph, and 9 involved a visual representation of chemical species.2 This is in stark contrast to the 2012 released AP chemistry practice exam, which contained only 4 questions that included graphs and 1 that included a visual representation.11 Clearly, teachers who are updating multiple-choice items to coincide with the new exam format should develop these types of questions.

REDESIGNING QUESTIONS

Writing a well-designed multiple-choice item from scratch can be a time-consuming process. An alternative is to take an existing question and modify it to better coincide with the redesigned AP chemistry exam and curriculum framework. The first concern in this process is to verify that the question addresses one of the current learning objectives. However, the issue with many questions is not the content they seek to assess but the manner in which they are delivered and the cognitive skills required to answer them. The effort to make a question valid can often, in turn, make it rather predictable.

Presenting Novel Situations

In order to assess HOCS, it is generally accepted practice to present the question in a novel way.7 If exam questions are similar in structure and content to items presented in class or on homework, students may have developed mental algorithms for solving them. The following question is of this type:

Most AP chemistry students have memorized the periodic trends and will therefore think to themselves, "atomic radius decreases as you move up and to the right on the periodic table", in order to answer this question. This does not require any understanding of why the trend occurs or of its implications, but it is sufficient to answer the question; therefore, it is algorithmic. However, a question of this nature can be modified to assess deeper levels of understanding:

Here, actual values of atomic radii and dipole moments have already been given, so memorization of the periodic trends is not helpful. Second, most students will be given pause because they assume that, because HCl and HBr are both strong acids, their Ka values are simply "very large", as listed in most tables, and therefore equal. Being forced to decide between the two makes the question unfamiliar, and this is where students must demonstrate their understanding of bond strength as it relates to acid strength. Again, it is important that the distractors be reasonable and appealing to students who have gaps or errors in their understanding. For example, many students will select (A) because they incorrectly equate increasing bond polarity with weaker bonds. It is important to note that the primary difference between the original question and the updated one is a shift in what is required of the student: where the first item requires a memorized method of solution, the second requires the application of deeper understanding to a novel situation.

Discouraging Algorithmic Problem Solving

The use of algorithmic techniques is a frequently taught method in chemistry. It can be very useful but is not necessarily indicative of conceptual understanding.12−14 This has been observed even among upper-level students in an introductory chemistry course for science majors.15 The pattern is especially evident in stoichiometry. Students are often taught dimensional analysis or some other procedural way of doing these calculations early on, likely in a pre-AP or general chemistry course, and they attempt to use that same method whenever they encounter a stoichiometry problem. If the intent is to assess deeper understanding, it is important that multiple-choice questions prevent the use of these strategies while, at the same time, maintaining the validity of the item. For example, the following problem can easily be solved using dimensional analysis:

Lythcott showed specifically that the ability to solve mass−mass stoichiometry problems is not a demonstration of conceptual understanding.16 While this problem-solving method is a fundamental skill for a beginning chemistry student, the concepts of stoichiometry can still be assessed through problems that are less routine and, therefore, require deeper understanding. The following question accomplishes this by providing less information in the stem.

This problem is more challenging for students who rely heavily on dimensional analysis or algorithmic problem solving because a balanced reaction is not given. As less information is given, students must draw upon several pieces of background knowledge and link them together in order to arrive at the correct answer. The key is realizing that the carbon in the CO2 comes solely from the hydrocarbon. Once this is recognized, the correct choice, (C), becomes apparent. One of the merits of this problem is that it admits multiple solution paths: some students will apply the method just described, whereas others will use a trial-and-error approach with the choices given. The advantage of the former approach is that it is faster, and because time constraints on the multiple-choice section of the exam are an important concern for test takers, solving problems quickly is an obvious advantage.

GENERAL RECOMMENDATIONS

Overall, the following considerations offer some possible elements that can be integrated into multiple-choice questions to allow them to assess deeper understanding and, therefore, make them more in line with the AP chemistry curriculum framework:
1. Use context-dependent items frequently as part of item sets.
2. Include explanations as part of the choices, with distractors that appeal to students who have misconceptions or gaps in understanding.
3. Present the question in a way that is unfamiliar to students.
4. Avoid items that allow for an algorithmic solution.
Summative assessment should not become the sole focus in the classroom. While AP chemistry teachers generally have the goal of high scores on the AP chemistry exam, the primary goal of the redesign is to develop learners who can draw upon conceptual knowledge and apply reasoning to make predictions about the natural world.2 Nevertheless, ensuring that assessments accurately measure the desired outcomes is of utmost importance. Also, as we place new demands on our students, we must, in turn, ask more of ourselves. The development of strong multiple-choice assessments is just one aspect of this shift, although its benefits are many. Being more reflective about how a question is written forces educators to evaluate what can be done in the classroom on a day-to-day basis to promote this deeper level of understanding and creativity. This constant interplay of assessment and instruction ultimately results in a better learning environment for students.

AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected].

Notes
The authors declare no competing financial interest.

REFERENCES

(1) AP Chemistry Course and Exam Description Effective Fall 2013; The College Board: New York, NY, 2014; pp 9, 109.
(2) AP Chemistry Practice Exam and Notes, Effective Fall 2013; The College Board: New York, NY, 2013; pp 4−5, 14−39.
(3) Zoller, U.; Lubezky, A.; Nakhleh, M. B.; Tessier, B.; Dori, Y. J. Success on Algorithmic and LOCS vs. Conceptual Chemistry Exam Questions. J. Chem. Educ. 1995, 72, 987−989.
(4) AP Chemistry Curriculum Framework 2013−2014; The College Board: New York, NY, 2011; p 1.
(5) AP Chemistry Practice Exam; The College Board: New York, NY, 2008; pp 3−20.
(6) Nnodim, J. O. Multiple-Choice Testing in Anatomy. Med. Educ. 1992, 26, 301−309.
(7) Haladyna, T.; Downing, S.; Rodriguez, M. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Appl. Meas. Educ. 2002, 15, 309−334.
(8) Clark, D.; Linn, M. Designing for Knowledge Integration: The Impact of Instructional Time. J. Learn. Sci. 2003, 12, 451−493.
(9) Sadler, P. Psychometric Models of Student Conceptions in Science: Reconciling Qualitative Studies and Distractor-Driven Assessment Instruments. J. Res. Sci. Teach. 1998, 35, 265−296.
(10) Treagust, D. Diagnostic Assessment of Students' Science Knowledge. In Learning Science in the Schools: Research Reforming Practice; Glynn, S., Duit, R., Eds.; Lawrence Erlbaum Associates: Mahwah, NJ, 1995; pp 327−346.


(11) AP Chemistry Practice Exam from the 2012 Administration; The College Board: New York, NY, 2012; pp 3−22.
(12) Nurrenbern, S.; Pickering, M. Concept Learning versus Problem Solving: Is There a Difference? J. Chem. Educ. 1987, 64, 508.
(13) Pickering, M. Further Studies on Concept Learning versus Problem Solving. J. Chem. Educ. 1990, 67, 254.
(14) Nakhleh, M.; Mitchell, R. Concept Learning versus Problem Solving: There Is a Difference. J. Chem. Educ. 1993, 70, 190.
(15) Sawrey, B. Concept Learning versus Problem Solving: Revisited. J. Chem. Educ. 1990, 67, 253.
(16) Lythcott, J. Problem Solving and Requisite Knowledge of Chemistry. J. Chem. Educ. 1990, 67, 248.

