
Influence of Exam Blueprint Distribution on Student Perceptions and Performance in an Inorganic Chemistry Course

Karin J. Young,* Sarah Lashley, and Sarah Murray

Centre College, 600 West Walnut Street, Danville, Kentucky 40422, United States



ABSTRACT: Creating instructional activities and assessments that align with expected learning outcomes can improve student learning. Communicating learning expectations to students may lead to better student performance and an increase in student perceptions of the fairness of the assessment tools. One tool for communicating learning outcomes is the distribution of an exam "blueprint", a document that correlates student learning outcomes with the expected level of performance and the relative weight on the exam. In this study, exam blueprints were prepared and distributed to students in an inorganic chemistry course. Students were surveyed about their use of the blueprints after each exam and interviewed about their perceptions of the congruence between assessments and instruction as well as their use of the blueprints. Students report using the blueprints as both organizational and metacognitive tools by emphasizing heavily weighted objectives. Overall, the blueprints were an effective tool for facilitating instructor−student communication as well as generating positive student attitudes toward exams, even though exam scores did not increase.

KEYWORDS: First-Year Undergraduate/General, Second-Year Undergraduate, Upper-Division Undergraduate, Inorganic Chemistry, Testing/Assessment, Enrichment/Review Materials



INTRODUCTION

In Understanding by Design, Wiggins and McTighe argue that desired learning outcomes should drive the creation of assessments and instructional methods1 in order to create more durable learning. The development of both program- and course-level outcomes has become increasingly common as a first step in a data-driven assessment process.2 Communicating learning expectations to students with instructions for how they can meet those expectations has been shown to improve student learning in both the classroom and laboratory.2−4 This simple idea is sometimes called "transparent teaching".3 In STEM courses, the following transparent teaching practices have been shown to have significant learning benefits for students:3

• Discuss assignments' learning goals before students begin each assignment
• Explicitly connect "how people learn" data with course activities when students struggle at difficult transition points
• Gauge students' understanding during class via peer work on questions that require students to apply concepts you have taught
• Debrief graded tests and assignments in class

One method for communicating learning expectations to students in order to facilitate learning and increase transparency is the distribution of an exam "blueprint", also called a table of specifications.5−7 An exam blueprint is a document that correlates student learning outcomes with the relative weight each outcome is assigned on the exam and perhaps the expected level of performance using Bloom's taxonomy. The instructor then uses the blueprint to create exam items in accordance with each objective, weight, and cognitive level.

The benefits to instructors of aligning assessment tools like exams to learning outcomes have been well documented. By using a blueprint, an instructor can choose questions that match both the content and cognitive level the instructor intends, rather than questions that are clever or easy to write.6,8 Blueprinting can ensure that no one content area is over- or underemphasized.7 Also, once the instructor has written the blueprint, less time may be required to produce an exam that matches the instructor's stated expectations. Blueprints can also help instructors monitor and revise learning objectives and learning experiences in order to "close the assessment loop".9

When given to the students, the exam blueprint becomes a study tool. Several researchers have classified the processes that occur when students study.10−12 Though different terms may be used, effective student studying includes, among others, phases of organization and reflection. Ultimately, the student is responsible for learning, but the instructor can provide tools for promoting the phases of studying.

Received: December 17, 2018 Revised: August 2, 2019


Table 1. Portion of Exam Blueprint^a

Student Learning Objectives:
12.1 Determine whether a metal center will be labile or inert on the basis of d-electron count
12.2 Use experimental information to justify whether exchange will be associative or dissociative
12.3 Predict the position of substitution using the trans effect
12.4 Explain the origin of the trans effect using ideas about kinetics and thermodynamics
12.5 Distinguish between inner-sphere and outer-sphere electron transfer

For each objective, the blueprint lists the percentage points assigned at each level of the revised Bloom's taxonomy (Remember, Understand, Apply, Analyze, Evaluate, Create), together with a total by learning objective (rightmost column) and a total by cognitive level (bottom row); the objectives in this portion account for 34 percentage points of the exam.

^a The numerical value is the number of percentage points on the exam from one or more free-response questions at a particular cognitive level.

Many practical tools have been developed for helping students connect and structure ideas. For example, concept mapping in which students detail the relationships between ideas has been shown to help students organize material in meaningful ways while also helping students self-assess their understanding.13 Similarly, advance organizers are stories or graphics provided to students before instruction in order to create a context for incorporating new ideas.14 When provided to students before instruction, exam blueprints can be a type of advance organizer.7

In the reflective phase of studying, students self-assess their learning progress. Perhaps the most important component of this phase is metacognition, the process of monitoring one's own thinking.15 Effective metacognition is a learned skill that requires practice, but research shows that teaching students about metacognition results in deeper learning.11,15,16 Part of helping students reflect on their learning is providing a vocabulary for thinking about thinking, and the most commonly used learning hierarchy is Bloom's revised taxonomy.17 Crowe and co-workers taught students about Bloom's taxonomy in several different biology courses at several institutions. In a physiology course, students were required to rank or code classroom examples by Bloom's level. They found that giving formative feedback on the cognitive complexity as well as the content caused students to change their study habits, an essential outcome of metacognition.18 In the same way, exam blueprints provide information about the learning expectations so that students can adjust their study strategies.

Because exam blueprints contain information about the requirements for mastery, the cognitive complexity, and the relative weights of the learning objectives, they have the potential to improve the transparency of classroom assessments and provide an effective study tool for students. When given to students in advance of the exam, the blueprint allows the instructor to discuss the learning outcomes before students are graded. Also, the inclusion of the Bloom's taxonomy classification provides vocabulary for describing how students learn and for discussing how outcomes might appear on the exam. In this way, the blueprints can aid with both organization and metacognition. Finally, students can be shown how the blueprint and exam questions aligned after the exam is returned. Thus, three practices known to improve classroom transparency may be implemented, using a relatively low effort intervention that requires minimal class time. Blueprinting can be added easily to an existing course structure and complemented by a variety of other pedagogies.

Auburn describes potential gains to both students and instructors from the development and communication of an exam blueprint in her organic chemistry courses.6 Collecting data about student experiences may further clarify how exam blueprints affect student learning in chemistry courses. A better understanding of student learning experiences can in turn provide useful feedback for instructors on effective pedagogy. We describe below how we developed and implemented exam blueprints as a communication tool in an inorganic chemistry course. Furthermore, we sought to understand how distributing the blueprint to students would affect their study processes, especially with respect to organization and metacognition. In order to answer this question, we have combined student survey, interview, and exam performance data to evaluate the blueprints as a study tool. We analyzed the effect of distributing the exam blueprint to students in terms of both student perceptions of exams and their performance on exams. While this study focuses on the role of exam blueprints in an inorganic chemistry course, insights may be applicable to college courses at other levels or in other disciplines.



DEVELOPMENT OF EXAM BLUEPRINTS

The exam blueprints used in this study consist of a matrix correlating student learning objectives with the six levels of the revised Bloom's taxonomy.17 A partial blueprint showing the organization of learning objectives, cognitive levels, and points is shown in Table 1. Complete blueprints are available in the online Supporting Information.

The course-level learning objectives in these blueprints were developed by the instructor to align with the goals of the chemistry program at Centre College. In the context of our chemistry curriculum, this course expects students to apply their knowledge of quantum chemistry to the structure and reactivity of transition metal complexes. Much of the course is focused on analyzing observable data, such as spectroscopic or magnetic behavior, in order to make inferences about chemical structure. Therefore, we expect most of the assessment questions to be at the apply or analyze levels. Research has shown that inorganic chemistry courses vary widely in their content and methods, so the learning objectives described in these blueprints should be understood as a possible set of goals.19,20 Instructors wishing to implement this tool might also use textbook learning objectives or statements from the recently developed Anchoring Concepts map.21

In order to develop the blueprints, the instructor started by analyzing the prior year's exams and "reverse blueprinting" by writing a learning objective that could be assessed by each question. The cognitive level and number of points were coded for each question.


This draft blueprint was then adjusted using the instructor's professional judgment. For example, additional objectives were added that were not included in the previous year's exams. The weighting of other objectives was adjusted to bring them into alignment with the amount of class time spent discussing those objectives. We found analyzing the previous exams to be a useful starting point for reflection on the alignment between course goals and assessments.

On the blueprints, each objective was given a number that indicated the course textbook chapter and then a nominal objective number; for example, 12.2 indicates the second objective from Chapter 12. Each numerical entry in the matrix corresponds to the number of percentage points on the exam. Percentage points could represent the sum of multiple questions or of a single question. In the last column, the points for each learning objective were tallied to make it clear to students how the overall points were distributed. In this course, some topics discussed in class were intended as enrichment or extension activities beyond the primary scope of the course. Consequently, these topics were stated as learning objectives on the blueprint but given 0 points on the exams. The assignment of 0 points to some objectives highlights the utility of the blueprint to define and communicate the learning expectations by indicating what is not expected on the summative assessments.

In accordance with the blueprints, the instructor designed the three course exams to consist of free response questions that could require calculations, structure drawing, or explanation. The instructor found the blueprints useful as a checklist or organizing tool for exam development. Because the learning objectives and cognitive levels were specified in detail before the exam was written, instructor time was spent on writing an appropriate question to assess each objective. Once questions to meet each objective had been written, the instructor could be assured of a complete, well-rounded exam that was consistent with the course learning expectations.

The questions were also coded by the instructor according to the levels of Bloom's revised taxonomy.17 For example, the questions assessing objectives 12.1 and 12.5 (Table 1) were based on work by Taube and co-workers and are shown in Box 1.22 Part a was coded to assess objective 12.1 at the apply level. To answer correctly, students use an established algorithm for classifying complexes as inert or labile by determining d-electron count and then remembering how electron count correlates to exchange behavior. In part b, students must analyze that the important feature of [Cr(OH2)5Cl]Cl2 is the transfer of the chloride ligand from an inert reactant to form an inert product. Finally, part c was coded at the evaluate level because students must judge that an inner-sphere mechanism is not plausible because neither [Co(NH3)6]3+ nor [Cr(OH2)6]2+ has ligands suitable for bridging between metal centers. A complete list of exam questions as coded by the instructor is available in the Supporting Information.

In the bottom row of the blueprint, the points for each cognitive level were tallied to communicate to students how the points were distributed across the levels. In this course, most questions were at the analyze or apply levels. Fewer points were available at remember, understand, and evaluate levels, and no exam questions were posed at the highest create level.

After each exam, a "key" showing alignment among the question number, the learning objective, and the Bloom's taxonomy level was distributed to students with the graded exam (see Supporting Information). This key provided a useful starting point for debriefing the exam.
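As a concrete illustration of the bookkeeping described above, the sketch below (Python; the objective labels and point values are hypothetical placeholders rather than the actual blueprint in Table 1) shows one way to represent a blueprint matrix and to generate the two tallies, the total by learning objective and the total by cognitive level, so that both margins agree on the overall exam percentage.

```python
# Minimal sketch of an exam blueprint as a matrix of percentage points.
# Objective labels and point values are illustrative placeholders only.
LEVELS = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

blueprint = {
    "12.1": {"Apply": 4},                   # e.g., labile vs inert from d-electron count
    "12.2": {"Analyze": 8},                 # e.g., associative vs dissociative exchange
    "12.5": {"Analyze": 6, "Evaluate": 4},  # e.g., inner- vs outer-sphere electron transfer
    "12.6": {},                             # enrichment topic: listed on the blueprint but worth 0 points
}

def total_by_objective(bp):
    """Rightmost column of the blueprint: points per learning objective."""
    return {obj: sum(cells.values()) for obj, cells in bp.items()}

def total_by_level(bp):
    """Bottom row of the blueprint: points per cognitive level."""
    return {lvl: sum(cells.get(lvl, 0) for cells in bp.values()) for lvl in LEVELS}

if __name__ == "__main__":
    by_obj = total_by_objective(blueprint)
    by_lvl = total_by_level(blueprint)
    print("Total by learning objective:", by_obj)
    print("Total by cognitive level:", by_lvl)
    # Both margins must sum to the same overall percentage of the exam.
    assert sum(by_obj.values()) == sum(by_lvl.values())
```

A full blueprint would simply carry one such entry per course objective; the two tallies correspond to the last column and bottom row of Table 1.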



STUDY DESIGN

In order to gain a deeper understanding about student experiences, the effect of the distribution of exam blueprints on student perceptions of the course and on exam performance was studied in an inorganic chemistry course at Centre College. The details of the study were submitted to the College's Institutional Review Board (IRB) for approval and declared exempt.

Centre College is a residential liberal arts college with enrollment of about 1430 traditional students. The course content is best represented by the "Fundamentals and Selected Topics" cluster identified for foundational inorganic chemistry courses.19 The 4 credit hour course included three 60 min lectures and one 3 h lab session each week. Miessler, Fischer, and Tarr's Inorganic Chemistry was the required textbook.23 All 13 students in the study were senior-level (fourth-year) chemistry majors who had previously taken a course in quantum chemistry and at least one semester of organic chemistry. All students were traditional college students enrolled full time. Within one year of graduation, eight of these students were working as chemists in industry, two were in chemistry Ph.D. programs, two were in medical or dental school, and one was a K−12 educator. Owing to the small sample size, demographic data on racial and ethnic and gender identities were not collected.

The study of blueprints in this inorganic chemistry course was chosen for several reasons. First, the course content in this course relies on abstractions about bonding and chemical behavior. Students in previous cohorts had expressed difficulty understanding the relationships between the ideas, so we thought the introduction of an organizational tool might be helpful. Additionally, the population of this course is fairly homogeneous in terms of preparation and field of study; all students were chemistry majors who had taken similar previous coursework. We thought that studying blueprint use in this population would minimize variations caused by students' levels of interest in course content.

Box 1. Questions assessing learning objectives 12.1 and 12.5. In part a, objective 12.1 is assessed at the apply level. Parts b and c assess objective 12.5 at the analyze and evaluate levels, respectively.

In 1953, Henry Taube and co-workers published an elegant experiment supporting the existence of an inner-sphere mechanism for electron transfer between two transition metal ions. Taube mixed [Co(NH3)5Cl]Cl2 with [Cr(OH2)6]Cl2 and isolated [Cr(OH2)5Cl]Cl2 from the reaction mixture.

a. (4 points) Circle complexes that are inert.
[Co(NH3)5Cl]Cl2   [Cr(OH2)6]Cl2   [Cr(OH2)5Cl]Cl2   [Co(H2O)6]Cl2

b. (6 points) Explain how isolation of [Cr(OH2)5Cl]Cl2 strongly supports an inner-sphere mechanism.

c. (4 points) Would you expect [Co(NH3)6]3+ and [Cr(OH2)6]2+ to react by an inner-sphere mechanism? Why or why not?


However, we think that the distribution of the exam blueprints would be a useful tool in a variety of chemistry courses.

At the beginning of the term, the design and purpose of the blueprints were explained to students. The Revised Bloom's Taxonomy descriptors were explained during class when the first blueprint was distributed. A graphical representation of the six levels as a pyramid was projected for students with a brief description of each level. Examples for each level were described verbally, and students were allowed to ask questions for clarification. Approximately once per week, an in-class example was coded by cognitive level in order to reinforce students' understanding of the levels in context. Students occasionally asked how an in-class example would be coded or would ask for an example of a concept at a particular level.

One week before each in-class exam, blueprints were distributed to students as hardcopies and posted on the electronic course management site. No blueprint was given for the comprehensive final exam so that the performance on the final exam could be more directly compared to a previous cohort that did not have access to blueprints.

Immediately after each exam, students completed a two-question survey. Survey questions were designed to elicit both student perceptions of the fairness and congruence of the exam with expectations and to understand when and how students were using blueprints. The survey questions are shown in Box 2.

Box 2. Postexam survey questions. Surveys were administered immediately after each of the three in-class exams.

1. The questions on this exam were consistent with what I expected. (Circle one).
Strongly disagree   Disagree   Neither agree nor disagree   Agree   Strongly agree

2. How did you use the test blueprint while studying for this exam? (Circle all that apply).
Read it before studying   Read it while studying   Read it after studying   Did not use it   Other:

After the three in-class exams but before the final exam, students participated in individual interviews. These interviews took place during the 11th week of a 13-week semester so that the students' experience with the blueprints would be recent and so that evaluation of the blueprint as a tool would not be conflated with or affected by the end-of-term course evaluations. These interviews were conducted by the coauthors who were not the course instructor on the basis of the questions in Box 3. During the interview, each student was asked about how he or she used the blueprints as well as perceptions about performance or preparation for the class. The interviews lasted about 10 min and are characterized as "semi-structured" because the interviewers were permitted to ask follow-up questions in addition to those in Box 3 where appropriate. At the end of the interview, students were given an open-ended opportunity to provide any additional information. The instructor did not have access to study participation information, survey responses, or interview responses until after the end of the course.

Box 3. Questions for semi-structured interview. Individual interviews were conducted after the last exam but before the final exam.

1. Describe how you used the exam blueprints provided in the inorganic chemistry course.
   a. If the student used the blueprints,
      i. How did the point totals affect your study strategy?
      ii. How did the Bloom's taxonomy classifications affect your study strategy?
   b. If the student did not use the blueprints,
      i. How did you approach studying for the exam in this course?
2. To what extent do you think the blueprints affected your performance in inorganic chemistry?
3. How prepared did you feel before you took the exams in this course?
4. How well did the exam questions align with what you expected to be asked?
5. Do you think an instructor should provide exam blueprints to students? Why or why not?

HOW STUDENTS USED THE EXAM BLUEPRINTS

Both student survey and interview responses were used to understand how students employed the blueprint as part of the study process. On the surveys, all 13 students reported using the exam blueprints on all three in-class exams. The survey data indicated that blueprint use was relatively consistent over the term (Figure 1), though it dipped slightly in the middle of the term and increased slightly during the third exam.

Figure 1. Student responses to postexam survey question 2 (Box 2). Students were invited to choose all responses that applied. On 31% of surveys, two responses were marked, and three responses were recorded on 28% of surveys.


However, students reported during the interviews that they used the blueprint the least on the first exam and more on the subsequent exams as a way to improve in the course. As one student explained, "I didn't rely heavily [on the blueprint] on the first test, so I didn't do really well on that exam. For the following two exams, I followed the blueprint, and I did pretty well."

Our results indicate that students used the blueprints at several points in the study process. Students reported using the blueprint at two different study times (before, during, or after studying) on 31% of surveys and at all three points on 28% of surveys. When combined with our interview data, we found that students generally consulted the blueprints before or during their study process as an organizational tool and then during or after studying as a guide for metacognition.

Overall, 11 of the 13 students reported using the blueprints as an organizational tool before or during studying. As one student explained, "Before the first exam, I was overwhelmed with the range of things we had talked about... [The blueprint] helped to group and connect the information we were learning." Additionally, students used the blueprint to prioritize the most important topics. Several students referenced their experiences in other courses where "the instructor... says anything in the book or we talk about in class or on homework could be on the test, and I think that's unreasonable. Some of that information is less important." Another student felt that using the exam blueprints led to increased efficiency during study time. Given that our original motivation was to help students organize and prioritize abstract ideas in this course, we were encouraged to see that students reported the blueprints to be helpful.

In addition to promoting individual organization, the blueprints also promoted collaborative study in two ways. First, the blueprint provided a common structure for discussion among the students. One student explained that the blueprints helped because one student could "walk up to another student... [and] say 'Hey this right here, what do you think about this?' ... Without [the blueprint], it would have been hit or miss whether they knew what I was talking about." Additionally, the blueprint provided the focus of a long-term study group that three or four students formed spontaneously. The students in this group collaborated on a study guide that used the learning objectives as headings. They explained that they would then work toward grouping explanations, figures, and practice problems with the learning objectives. Once their study guide was compiled, the students reported that they were able to ask their peers or the instructor for more clarification.

After studying, students reported returning to the blueprint to ensure that their study process had encompassed all of the objectives. For example, a student explained how "I went through and actually marked off... which sections I felt like I could confidently answer a question about and which ones I wasn't sure about." Four of the 13 students reported considering each objective and then deciding whether they "felt" that they had sufficient understanding when accounting for the relative weight and, to a lesser extent, the Bloom's taxonomy level. One student described that "after studying, I would reread the blueprint and ask myself, 'Can I draw orbitals...?' and answer yes or no. If it was heavily weighted [by points], I would ask myself again."

Both the collaborative study guide and the individual reflection on each objective represent metacognitive strategies that are known to improve learning.11,15 For example, McGuire and co-workers have seen dramatic increases in student performance in general chemistry by emphasizing the shift from lower-order to higher-order thinking skills as part of a "study cycle". As part of the study cycle, students are encouraged to reflect on whether their study methods are effective and whether their understanding is sufficient to explain the ideas to others.11 Here, the blueprint provided a scaffold for metacognition by clearly defining the higher order thinking skills that would be required.

Seven of the 13 students reported during the interviews that they emphasized those objectives that were assigned more points on the exams. For some, the point value provided a clue about the depth of the question. One student explained, "If it's worth 9 points, then I need to make sure I understand even the tricky bits." Conversely, students deemphasized learning objectives worth 0 points, redirecting study time to objectives that would be tested. It is not surprising that students emphasized heavily weighted objectives and avoided objectives that were weighted 0 points. As long as the assessment tools require students to develop meaningful understanding, then we do not view the emphasis on points as problematic. Rather, we interpret the students' reports to indicate successful and transparent communication between instructor and students about what course materials are most important.

Five students specifically stated that the Bloom's taxonomy levels were a helpful part of their study process in answering Question 1.a.ii in Box 3, but nine students described using Bloom's taxonomy in some way during the course of the interview. We found it surprising that students were able to explain how they used the Bloom's levels but did not necessarily recognize that the levels were helpful. The remaining four students indicated that Bloom's taxonomy had no effect on their study process. The students who reported using Bloom's taxonomy were able to articulate the differences among the levels in terms of what was required of their study process, reinforcing the utility of blueprints as a tool for metacognition. For example, one student explained that for apply level questions, students would "need to take something outside or something we haven't seen before and apply something we have seen to that. So I knew I needed to know the process." By connecting the apply level with the idea of a "process", this student recognized the need to adapt study strategies to meet the learning expectations. Several students felt that their ability to use the levels improved over the course of the term so that they were better able to anticipate the form a learning objective and cognitive level would take on the exam. Similarly, a student considered the levels as a guide for assessing whether "I'm studying correctly," suggesting that the blueprint was a standard against which the student's own studying could be evaluated.

Several students reported sorting the cognitive levels into subgroups. At the extreme end, one student defined "remember equals repeat and then everything else." Similarly, several students grouped analyze and apply as questions that would require problem solving. This grouping suggests that the cognitive level was somewhat helpful for studying but that the distinction between analyze and apply either was not useful or would need to be explained in more detail to influence study patterns. One way of communicating more clearly with students might be to use a taxonomy with fewer levels. Toledo and Dubas scaffolded a general chemistry course using Marzano's taxonomy, which has only four levels: retrieval, comprehension, analysis, and knowledge utilization.24


As in our study, they found that students were able to adapt their learning strategies in order to reach proficiency with each outcome.

Students gave objectives with points in remember and explain categories less emphasis during study time. For example, one student reported, "It was nice to know we just had to remember that the precursor to coordination theory was chain theory and not explain how chain theory worked." In the same way that the exam blueprint can communicate higher order learning expectations to students, it can also be used to communicate when lower order skills are sufficient for mastery.

One student used optional practice problems in order to understand the distinctions between questions. The practice problems were not labeled by cognitive level, so the student made assumptions about the relationship between the cognitive level of the homework problems and that expected on the exam. One improvement in pedagogy using the exam blueprints would be to key practice problems to Bloom's taxonomy. Additionally, students could be prompted to identify example problems or to rank the cognitive complexity of a set of problems.18,25−27 Indeed, emphasizing student coding of cognitive complexity has been shown to encourage metacognition by helping students monitor their level of mastery.18 A further extension of this idea is for students to practice generating questions of different cognitive complexity; this problem manipulation strategy described by Parker Siburt et al. was shown to boost metacognition in general chemistry students.27

STUDENT PERCEPTIONS

Previous studies have shown that increasing transparency by communicating learning objectives and the rationale behind assignments can improve student learning, especially in introductory courses and among underrepresented and nontraditional students.3 In this study, student data related to how well the exams conformed to their expectations were interpreted as feedback on how exam blueprints can be used to increase transparency. In general, students reported in both the surveys and interviews that the exams conformed to their expectations (Figure 2). During the interviews, 10 of the 13 students reported that the exam questions aligned with their expectations based on the blueprint. Three students specifically mentioned that the exams were especially "fair" on the basis of the blueprints. As one student explained, "You never know what a professor is thinking, but with the blueprint, I knew... what she wanted me to know." Even though students report using the blueprints to focus on heavily weighted objectives, they did not seem to think the test was easy as a result. As one student explained, the blueprint "didn't give away the answers. I still had to do the work."

Figure 2. Student responses to postexam survey question 1 (Box 2). Most students reported immediately after the exam that it was congruent with their expectations.

The survey data indicate that the second exam was least congruent with student expectations. Two students indicated "strongly disagree" and one indicated "disagree" immediately after the second exam. Those students who did not think the questions aligned well commented in the interviews that the ideas were congruent but the form of the question was unexpected. The reasons why students found the second exam the least consistent are not clear. One explanation is that the students were unable to use the Bloom's taxonomy levels effectively, but two of the students who felt the exam questions were inconsistent with expectations also reported using the Bloom's taxonomy levels in their studying. Another explanation is that the content of the second exam was the most difficult for students to understand, so their metacognitive strategies might not have been sufficient. The small sample size in this study limits robust conclusions about why these students rated the exam questions as inconsistent with their expectations.

Importantly, 4 of the 13 students reported that the blueprint decreased feelings of anxiety or worry that they often experience before exams. In particular, the blueprints decreased anxiety about focusing on irrelevant material or missing important material. One student explained, "I'm an anxious test taker... But with these tests compared to other tests, I wasn't worried about whether I studied the wrong material because I knew that the material I studied would be on the test. The issue was whether I would be able to apply what I knew to the material on the test." Test anxiety is a complex issue for students that can result from either insufficient learning or study skills or habitual negative thoughts that students have during testing situations.28 The blueprints have the potential to help with both causes. Because they contain learning objectives and cognitive levels, the blueprints can help communicate the expectation for higher order skills. Furthermore, metacognitive exercises are known to help students cope with test anxiety.29−31 Because the exam blueprints provide a scaffold for students to reflect on their learning, they may also help students maintain a sense of confidence and competence that counters worries related to failure. Moreover, the blueprints are provided to all students, so they may be helpful even to students who have not sought help for test anxiety.

When asked whether instructors should provide exam blueprints as a study tool, most students reported that the blueprints were helpful in the context of this or similar science courses. Students had some difficulty imagining how this tool would be useful in the context of courses in other disciplines or at other levels. Our interview data do not provide a full understanding for why some students have difficulty transferring the idea of a blueprint to other contexts. The students generally reiterated the usefulness of the blueprints as a means for communicating the importance of certain course content. They also felt that the distribution of the blueprints was a show of respect for students' time, which led to a perception of the instructor as generally fair and respectful toward students. In this way, the blueprints represented a gesture of transparency that led in this case to a feeling of mutual trust and respect between students and instructor. In an organic chemistry course, the quality of the student−faculty relationship has been shown to correlate positively with the course grade.32 The distribution of an exam blueprint could be one method for faculty to enhance the instructor−student relationship.

STUDENT PERFORMANCE

The effect of blueprint publication on student performance was evaluated by comparing median exam scores to a previous cohort who lacked access to blueprints but were given substantively similar exams. The median scores and score ranges are shown in Figure 3. Even though students report trying to maximize points, the difference in median exam scores between the two cohorts is not significant by the Mann−Whitney U-test, which accounts for non-normal score distributions. We interpret this result to mean that the increased communication had no measurable effect on exam performance. One consideration is that the exam scores from the 2015 cohort were not particularly low, so it might have been difficult for the median to increase substantially. However, we recognize that statistical tests are unlikely to show significance for a sample of only 13 students.

Figure 3. Median exam scores for 2017 study cohort (dark gray) and 2015 cohort (light gray). The error bars depict the ranges of high and low scores. The two cohorts were given substantively similar exams, though only the 2017 cohort was given test blueprints. The difference in median score between the two cohorts is not significant.

When asked about how the blueprints affected their performance in the course, students made similar qualitative comments. For example, one student said, "I don't think that my grades were 10, 20, or 30 points higher because of [the blueprints]. I think they were zero to ten points higher." Similarly, a student explained that, "[o]verall, I probably turned out the same with or without [the blueprint]... I don't know that grade-wise it benefitted me much, but at a personal and mental level, I think it did benefit me." As both our qualitative and quantitative data indicate, the utility of distributing the blueprints cannot be ascribed to improving students' numerical performance but rather to the qualitative benefits that students describe with respect to test anxiety and transparent communication.

Similar results have been seen in the literature before. When Korte et al. implemented optional learning-to-learn resources including study guides developed using Bloom's taxonomy verbs in an introductory food science course (121 students), they found that hour exam scores increased by 2.6%, but the final exam scores were not significantly different.33 Similar results were also reported by McLaughlin et al. for the distribution of a test blueprint for a multiple-choice exam in a medical school course.34 In a much larger sample of medical students (237 students over five cohorts), no significant difference in exam scores was observed, though students in the cohorts with access to blueprints reported more congruence between the exam items and the course instruction as well as an increased sense of the fairness of the final exam.
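For readers who want to reproduce this kind of cohort comparison, the sketch below applies SciPy's Mann−Whitney U-test to two illustrative score lists; the numbers are hypothetical placeholders, not the 2015 and 2017 exam scores from this study.

```python
# Minimal sketch of the cohort comparison: a Mann-Whitney U-test on exam scores.
# The score lists are hypothetical placeholders, not the study's data.
from scipy.stats import mannwhitneyu

scores_2015 = [72, 78, 81, 84, 85, 88, 90, 91, 93, 95]  # cohort without blueprints
scores_2017 = [70, 76, 82, 83, 86, 87, 89, 92, 94, 96]  # cohort with blueprints

# Two-sided test; appropriate for small samples with non-normal score distributions.
stat, p_value = mannwhitneyu(scores_2017, scores_2015, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
# A p-value above the chosen significance threshold (e.g., 0.05) is consistent with
# the article's finding of no significant difference in median exam scores.
```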

CONCLUSIONS

We found the creation and distribution of exam blueprints to be useful for both the instructor and the students in a small inorganic chemistry course. For the instructor, the process of developing the blueprint was useful for aligning objectives and exam questions. Distributing the blueprints to students required very little class time and was compatible with the variety of other pedagogies used in the classroom. Generally, the students found the exams to be fair and consistent with their expectations based on the blueprints. Students used the blueprints to organize their study process, either individually or collaboratively. Students also used the blueprints to engage in metacognition by reflecting on their abilities to meet the learning objectives.

Helping students understand Bloom's taxonomy is a key component for using the blueprints to communicate learning expectations. Though the students in this study were introduced to the Bloom's taxonomy levels, the usefulness of the blueprint could be enhanced by including cognitive levels on formative assessments like practice problems as well. The introduction of the blueprints did not improve the exam scores, suggesting that providing additional information about the exam does not automatically lead to students earning more points. Instead, the exam blueprints were an important communication tool between the instructor and the students, used to increase transparency.



ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.8b01034. Exam blueprints, sample correlation key, and exam questions (PDF, DOCX)



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

ORCID
Karin J. Young: 0000-0003-3626-1854

Notes
The authors declare no competing financial interest.


ACKNOWLEDGMENTS

K.J.Y. wishes to thank the students of CHE 332 for participating in this study and Jennifer Muzyka for reading this manuscript. Preliminary versions of this work were presented at the 255th National Meeting of the American Chemical Society in New Orleans, LA, and at the 25th Biennial Conference on Chemical Education at the University of Notre Dame.

REFERENCES

(1) Wiggins, G.; McTighe, J. Understanding by Design, 2nd ed.; Merrill Education/ASCD College Textbook Series; Association for Supervision and Curriculum Development: Alexandria, VA, 2005.
(2) Towns, M. H. Developing Learning Objectives and Assessment Plans at a Variety of Institutions: Examples and Case Studies. J. Chem. Educ. 2010, 87 (1), 91−96.
(3) Winkelmes, M.-A. Transparency in Teaching: Faculty Share Data and Improve Students' Learning. Lib. Educ. 2013, 99 (2), 48−55.
(4) Duis, J. M.; Schafer, L. L.; Nussbaum, S.; Stewart, J. J. A Process for Developing Introductory Science Laboratory Learning Goals To Enhance Student Learning and Instructional Alignment. J. Chem. Educ. 2013, 90 (9), 1144−1150.
(5) Suskie, L. Assessing Student Learning: A Common Sense Guide, 2nd ed.; Jossey-Bass: San Francisco, CA, 2010.
(6) Auburn, P. Assessment for Active Learning. In Fall 2016 ConfChem: Select Presentations from CCCE Sponsored Symposia During the 2016 Biennial Conference on Chemical Education; DivCHED CCCE: Committee on Computers in Chemical Education, 2016.
(7) Dills, C. R. The Table of Specifications: A Tool for Instructional Design and Development. Educ. Technol. 1998, 38 (3), 44−51.
(8) Fives, H.; DiDonato-Barnes, N. Classroom Test Construction: The Power of a Table of Specifications. Pract. Assess. Res. Eval. 2013, 18 (3), 1−7.
(9) Coderre, S.; Woloschuk, W.; Mclaughlin, K. Twelve Tips for Blueprinting. Med. Teach. 2009, 31 (4), 322−324.
(10) Entwistle, N.; Hanley, M.; Hounsell, D. Identifying Distinctive Approaches to Studying. High. Educ. 1979, 8 (4), 365−380.
(11) Cook, E.; Kennedy, E.; McGuire, S. Y. Effect of Teaching Metacognitive Learning Strategies on Performance in General Chemistry Courses. J. Chem. Educ. 2013, 90 (8), 961−967.
(12) Dirks, C.; Wenderoth, M. P.; Withers, M. Assessment in the College Science Classroom; WH Freeman: New York, 2014.
(13) Francisco, J. S.; Nicoll, G.; Trautmann, M. Integrating Multiple Teaching Methods into a General Chemistry Classroom. J. Chem. Educ. 1998, 75 (2), 210.
(14) DeMeo, S. Constructing a Graphic Organizer in the Classroom: Introductory Students' Perception of Achievement Using a Decision Map To Solve Aqueous Acid-Base Equilibria Problems. J. Chem. Educ. 2007, 84 (3), 540.
(15) Rickey, D.; Stacy, A. M. The Role of Metacognition in Learning Chemistry. J. Chem. Educ. 2000, 77 (7), 915−920.
(16) Tsai, C. C. A Review and Discussion of Epistemological Commitments, Metacognition, and Critical Thinking with Suggestions on Their Enhancement in Internet-Assisted Chemistry Classrooms. J. Chem. Educ. 2001, 78 (7), 970−974.
(17) Anderson, L. W.; Krathwohl, D. R.; Airasian, P. W.; Cruikshank, K. A.; Mayer, R. E.; Pintrich, P. R.; Raths, J.; Wittrock, M. C. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Complete ed.; Pearson: New York, 2000.
(18) Crowe, A.; Dirks, C.; Wenderoth, M. P. Biology in Bloom: Implementing Bloom's Taxonomy to Enhance Student Learning in Biology. CBE Life Sci. Educ. 2008, 7 (4), 368−381.
(19) Raker, J. R.; Reisner, B. A.; Smith, S. R.; Stewart, J. L.; Crane, J. L.; Pesterfield, L.; Sobel, S. G. Foundation Coursework in Undergraduate Inorganic Chemistry: Results from a National Survey of Inorganic Chemistry Faculty. J. Chem. Educ. 2015, 92 (6), 973−979.
(20) Reisner, B. A.; Smith, S. R.; Stewart, J. L.; Raker, J. R.; Crane, J. L.; Sobel, S. G.; Pesterfield, L. L. Great Expectations: Using an Analysis of Current Practices To Propose a Framework for the Undergraduate Inorganic Curriculum. Inorg. Chem. 2015, 54 (18), 8859−8868.
(21) Marek, K. A.; Raker, J. R.; Holme, T. A.; Murphy, K. L. The ACS Exams Institute Undergraduate Chemistry Anchoring Concepts Content Map III: Inorganic Chemistry. J. Chem. Educ. 2018, 95 (2), 233−237.
(22) Taube, H.; Myers, H.; Rich, R. L. Observations on the Mechanism of Electron Transfer in Solution. J. Am. Chem. Soc. 1953, 75 (16), 4118−4119.
(23) Miessler, G. L.; Fischer, P. J.; Tarr, D. A. Inorganic Chemistry, 5th ed.; Pearson: Boston, MA, 2014.
(24) Toledo, S.; Dubas, J. M. Encouraging Higher-Order Thinking in General Chemistry by Scaffolding Student Learning Using Marzano's Taxonomy. J. Chem. Educ. 2016, 93 (1), 64−69.
(25) Ball, A. L.; Washburn, S. G. Teaching Students to Think: Practical Applications of Bloom's Taxonomy. Agric. Educ. Mag. 2001, 74 (3), 16.
(26) Athanassiou, N.; McNett, J. M.; Harvey, C. Critical Thinking in the Management Classroom: Bloom's Taxonomy as a Learning Tool. J. Manag. Educ. 2003, 27 (5), 533−555.
(27) Parker Siburt, C. J.; Bissell, A. N.; Macphail, R. A. Developing Metacognitive and Problem-Solving Skills through Problem Manipulation. J. Chem. Educ. 2011, 88 (11), 1489−1495.
(28) Mealey, D. L.; Host, T. R. Coping with Test Anxiety. Coll. Teach. 1992, 40 (4), 147−150.
(29) Ramirez, G.; Beilock, S. L. Writing About Testing Worries Boosts Exam Performance in the Classroom. Science 2011, 331 (6014), 211.
(30) Doherty, J. H.; Wenderoth, M. P. Implementing an Expressive Writing Intervention for Test Anxiety in a Large College Course. J. Microbiol. Biol. Educ. 2017, 18 (2), 1−3.
(31) Brady, S. T.; Hard, B. M.; Gross, J. J. Reappraising Test Anxiety Increases Academic Performance of First-Year College Students. J. Educ. Psychol. 2018, 110 (3), 395−406.
(32) Micari, M.; Pazos, P. Connecting to the Professor: Impact of the Student-Faculty Relationship in a Highly Challenging Course. Coll. Teach. 2012, 60 (2), 41−47.
(33) Korte, D.; Reitz, N.; Schmidt, S. J. Implementing Student-Centered Learning Practices in a Large Enrollment, Introductory Food Science and Human Nutrition Course. J. Food Sci. Educ. 2016, 15 (1), 23−33.
(34) McLaughlin, K.; Coderre, S.; Woloschuk, W.; Mandin, H. Does Blueprint Publication Affect Students' Perception of Validity of the Evaluation Process? Adv. Health Sci. Educ. 2005, 10 (1), 15−22.
