
Emphasizing Learning: Using Standards-Based Grading in a Large Nonmajors' General Chemistry Survey Course

Sarah B. Boesdorfer,* Emilee Baldwin, and Kyle A. Lieberum

Department of Chemistry, Illinois State University, Normal, Illinois 61790-4160, United States

Received: April 5, 2018. Revised: June 1, 2018.


ABSTRACT: Many students associate grades with the completion of course work rather than learning the course content. While research has heavily focused on teaching strategies and instructional tools to improve students' learning in general chemistry, less attention has been given to assessment as a tool to improve student learning. Standards-based grading (SBG) is a nontraditional assessment method that explicitly connects the course learning objectives with assessments and student grades. Provided with more than one opportunity to demonstrate their knowledge of course objectives, students are evaluated on their level of achievement of these objectives. This article describes the implementation, outcomes, and challenges of SBG in a large-enrollment nonmajors' general chemistry course. Evidence regarding the use of SBG in chemistry is presented from the evaluation of two semesters of students' assessment scores and a student opinion survey. Students appeared to use the structure of SBG to pass the course at higher rates than in previous semesters, demonstrated some metacognitive skills, and generally appreciated SBG as the assessment method in the course.

KEYWORDS: General Public, First-Year Undergraduate/General, Testing/Assessment

"What do I have to do to pass this course?" This question demonstrates the divide between students and faculty in their understanding of what a grade does or should mean; students are more likely than faculty to believe that grades reflect effort, while faculty focus on performance.1,2 Though faculty may want grades to reflect the level of chemistry understanding, many students believe doing all the work should constitute at least a C in a course.1 In addition, while summative assessments (evaluations at the end of a course or learning sequence) provide a grade, course assessments should also promote and improve student learning.3,4

Frequently, general chemistry courses are "service courses"5 for other STEM or STEM-related majors. As a result, success in general chemistry is connected to students' persistence and retention in nonchemistry STEM majors.6,7 In part because of high failure and dropout rates and a poor reputation for general chemistry courses,6 researchers continue to explore class structures and teaching strategies that help improve student learning in general chemistry classes so students can successfully continue with their chosen majors. For example, peer-led team learning (PLTL)8 and process oriented guided-inquiry learning (POGIL)9 are teaching strategies that have been shown to improve student learning in general chemistry courses. Changing the structure of the course by flipping the class, moving lecture to out-of-class time and problem work to class time, has also shown gains in student learning in general chemistry courses.10,11

There has been less focus on how the grading or course assessment methods used in general chemistry courses affect student learning and achievement,12 especially in large-enrollment courses. Assessment is an important part of teaching and learning3 because most people, not just students, are not good at knowing what they do not know.13 Formative assessments can help students and their instructors identify knowledge gaps or misunderstandings to focus learning and effort; for deep learning to occur, it is the students who must have the metacognitive skills to frequently reflect on and assess their learning.3,13 In describing how assessment can be part of effective pedagogy, Black and Atkin4 argue that informal summative assessments (i.e., an end-of-section exam or assignment, rather than formal summative assessments like final exams) are used formatively in effective classroom settings. However, if instructors and students view exams as only summative, rather than as potentially both formative and summative, then the focus shifts to completion of the work, doing the exam for the grade, rather than using it to identify strengths and weaknesses for continued learning. A standards-based grading approach to assessment provides multiple opportunities for students to show their level of mastery of a clearly communicated set of learning objectives. Because of the nature of the approach, standards-based grading (SBG) can help students perceive exams as tools to help them learn and achieve their final grade.14−17 It also helps communicate to students that their grade reflects the chemistry they learned and not the amount of work they did.14−17 This article describes one instructor's use of an SBG assessment method in a large one-semester general chemistry course for nonmajors and students' responses to it.


STANDARDS-BASED GRADING

Standards-based grading (SBG), sometimes also referred to as criteria-based grading, evaluates students based on their ability to meet a clear set of learning objectives or performance indicators.18−20 In an SBG assessment system, the purpose of class activities, assignments, and assessments is to learn the objectives and demonstrate understanding of them, not to gather points (there are no points to gather) to add together for a final evaluative grade.20 In practice, there are several methods for computing students' final overall grades with standards-based grading.16,21 However, no matter how the overall grade is determined in an SBG system, students are provided with more than one opportunity to demonstrate achievement of the course objectives, emphasizing the knowledge the student has gained by the end of the learning rather than early achievement or task completion during the course.12,15,19,20 With the emphasis on final achievement of course objectives, SBG emphasizes the use of formative assessments, both informal (ungraded) and formal (graded with the opportunity to improve), throughout a course.15,19,20 In addition, in an SBG system the results of assessments are connected explicitly to the course objectives for students (e.g., "meets expectations for objective #1").

The use of standards-based grading systems, in individual classes or as school-wide grading systems, has been growing in K−12 schools,17,21−23 but its use has been reported in only a few higher education courses.12,14,15,24,25 The studies on the use of SBG in higher education have shown that students in courses using SBG preferred the method to traditional methods of assessment24−26 and that students "shifted their thinking"24 in their approach to learning. Some students in these classes reported problems with "self-motivation"14 or with achieving the level of individual responsibility needed to be successful in an SBG system.26 The impact on student learning in SBG courses is not clear from these studies.

There has been some reporting of the use of SBG in higher education, but these examples are not in large (150+) enrollment general chemistry classes. The use of an evaluation method in a small general chemistry course, which was a mixture of specifications-based27 and standards-based grading, has been previously reported in this Journal.12 Aligning with an SBG system, Toledo and Dubas12 reported developing course objectives and performance descriptions, which were communicated to students and guided the development of assessments in the course, including homework assignments, quizzes, and exams. However, Toledo and Dubas report using a method more strongly aligned with a specifications-based grading system to determine grades for the course objectives and overall course grades. Each objective had a set of tasks or specifications (e.g., homework questions, quizzes, exams) for students to complete, and the students' level of completion and success on these tasks determined their scores. For some of the tasks, students had opportunities to reassess, but for others, like exams, they did not. The course described by Toledo and Dubas is thus an equal mix of the two systems of course evaluation. This article provides an example of using an SBG-only approach to grading in a large general chemistry lecture course for nonmajors. Students' responses to and engagement with the SBG system are explored.

IMPLEMENTATION OF STANDARDS-BASED GRADING

The Course

In two different semesters, a standards-based approach to assessment was used in a nonmajors' general chemistry survey course. Described in the course catalog as an "Introductory survey of fundamental concepts, laws, and theories of chemical science and their application to common chemical systems", this one-semester, 4 h lecture course met four times per week for 50 min lectures. It was a survey of the topics taught in a traditional two-semester general chemistry course, so the course content moved quickly. The course was also a prerequisite for a fundamental organic chemistry course for nonmajors. One hundred seventy-six students completed the course in Spring 2016 and 204 in Fall 2016; these numbers reflect typical enrollments in the course for each semester. A majority of students in the course were nursing, dietetics, or agriculture majors, though there was a wide variety of majors. Freshmen made up a majority of each class, with 53% freshmen, 25% sophomores, 15% juniors, and 7% seniors. Sixteen course learning objectives were written by the course instructor prior to the semester and were then clarified with specific student learning outcomes that represented full mastery of each course learning objective at a level appropriate for the students in the course. Figure 1 provides two examples of course learning objectives with the mastery-level student learning outcomes.

Figure 1. Two examples of course objectives. A score of 4 represents complete mastery of the objective for the course, while a 2 represents acceptable mastery.


The learning outcomes in the "4" column represented mastery of the course objective as demonstrated by the learning outcomes at the highest level. The outcome descriptions were then modified or scaled so that a 2 represented a competent level of understanding of the course objective and a 0 represented minimal or no understanding. Students were provided with all the course learning objectives and outcomes (0−4 descriptors) from the first day of class and had access to them throughout the course through the online learning management system (LMS). Within the LMS, course materials were labeled by course objective. Along with the master list of objectives, students were provided with a link to a page for each objective, which restated the objective and listed the course materials for that objective.

With the exception of the exam and grading structure described below, the rest of the course was very similar to what might be considered a traditional general chemistry course. Rather than providing students with materials or information by chapter or exam, though, the course was organized by objective. Materials provided for each objective included lecture notes (outlines that students had to complete during class), in-class handouts and activities, suggested online homework assignments,28 textbook readings, and additional practice problems with answer keys. For each objective, students were given numerous resources for practice problems and learning materials beyond what was done in class to help them study and learn the material, much like a traditional general chemistry class. As much as possible, active learning techniques were used in the large lecture hall in which the course was taught. In addition, whenever an objective was begun or ended during class, the objective descriptions were presented to the students within the notes to which they had access. Table 1 provides the week-by-week outline for the course as taught in Fall 2016; the objectives covered by each exam, described in the next section, are listed.

Table 1. Fall 2016 Course Schedule

Week | Day | Course Topic
1 | M/T | Obj. #1 Basic Language of Chemistry^a
1 | W/Th | Obj. #2 Measurement
2 | M/T | Obj. #2 Measurement
2 | W/Th | Obj. #3 Basic Atomic Structure
3 | M | Holiday
3 | T/W | Obj. #1 Language of Chemistry: Inorganic Nomenclature^a
3 | Th | Obj. #4 Chemical Reactions (see Figure 1 for the complete objective)
4 | M−W | Obj. #4 Chemical Reactions
4 | Th | Exam 1: Obj. #2−4
5 | M−Th | Obj. #5 Stoichiometry (see Figure 1 for the complete objective)
6 | M−W | Obj. #6 Gases
6 | Th | Obj. #7 Atomic Theory and Periodic Table
7 | M/T | Obj. #7 Atomic Theory and Periodic Table
7 | W | Obj. #8 Bonding
7 | Th | Exam 2: Obj. #2−7 (5−7 were new objectives for this exam)
8 | M−W | Obj. #8 Bonding
8 | Th | Exam 3: Obj. #2−7 (optional: no new objectives)
9 | M−W | Obj. #9 Intermolecular Forces
9 | Th | Obj. #10 Solutions
10 | M/T | Obj. #10 Solutions
10 | T−Th | Obj. #11 Acids and Bases
11 | M−W | Obj. #12 Equilibrium
11 | Th | Exam 4: Obj. #5−11 (8−11 were new objectives for this exam)
12 | M/T | Obj. #12 Equilibrium
12 | W/Th | Obj. #13 Redox and Electrochemistry
13 | M−Th | Obj. #13 Redox and Electrochemistry
14 | | Thanksgiving Break
15 | M−W | Obj. #14 Nuclear
15 | Th | Exam 5: Obj. #8−14 (12−14 were new)
16 | M−W | Obj. #1 Language of Chemistry: Simple Organic Nomenclature^a
16 | Th | Exam 6: Obj. #8−14 (optional: no new objectives)
Finals | | Final Exam, including an optional third opportunity for Obj. #12−14

^a Objective #1 does not appear on any exams because it was assessed through online quizzes. Objective #1 involved nomenclature, so repeated practice was asked of students rather than a section on the exam.

Assessment Method

To assess student learning of the course objectives, students completed a set of questions on an exam for each objective (see Table 1 for the exam schedule and content). Exam questions were short-answer or show-your-work questions. The Supporting Information includes a set of exam questions aligned with the objectives in Figure 1. Each set of questions was graded using the objective descriptions (Figure 1) to assign a 0−4 score. On the basis of the student learning outcomes for the objective, clear criteria for each score on a set of questions were established prior to grading; criteria for half scores, that is, 3.5 or 2.5, were also determined. The exam examples in the Supporting Information provide the grading criteria as well. Teaching assistants helped grade exams. No overall exam score was given; rather, a score was given for each of the objectives on the exam. For example, on an exam a student could receive three scores: one for Objective 2, one for Objective 3, and one for Objective 4. These scores were reported to students in the gradebook section of the course LMS, which contained a column for each course objective. The graded exams were also returned to students.

As described above, one important aspect of standards-based grading is that students are given multiple opportunities to demonstrate they have learned the required content.15,23 To achieve this goal in a manageable format for a large lecture course, students were given three opportunities to demonstrate their understanding of an objective.


A set of 4−6 questions for an objective appeared on three of the six 1 h exams given in the semester (see the Supporting Information for examples of two question sets). Question sets were clearly labeled by objective. Once the material for an objective had been taught in class, a question set for that objective appeared on the next three exams (see Table 1). Different question sets were used on each exam, though the topics and tasks for the questions were similar. During any exam, a student could choose to take or skip any of the objectives on that exam. If they skipped an objective, it was not scored. The most recent score on an objective was the one recorded in the gradebook. For example, if a student completed the Objective 3 questions on Exam 1 and scored a 3.5, the student could take the new Objective 3 questions on Exam 2 or could skip those questions if they were happy with their score of 3.5 or not yet ready to retest it. If the student took the Objective 3 questions on Exam 2, then that score, whether higher or lower, was recorded.


Table 2. Distribution of Spring 2016 Students Completing Each Objective Opportunity and Average Scores (N = 176)

Objective | Topic | 1st Opp. Completed, % | 1st Opp. Av. Score | 2nd Opp. Completed, % | 2nd Opp. Av. Score | 3rd Opp. Completed, % | 3rd Opp. Av. Score
2 | Measurement | 97.7 | 2.65 | 34.7 | 2.22 | 21.1 | 3.11^a
3 | Atomic theory | 97.7 | 2.35 | 37.5 | 2.17 | 30.7 | 2.65^a
4 | Periodic table^b | 97.7 | 2.55^a | 31.3 | 2.28 | 30.1 | 2.54^a
5 | Bonding | 96.6 | 2.03 | 37.5 | 2.53^a | 18.8 | 2.26
6 | Reactions | 96.6 | 1.74 | 52.3 | 2.58^a | 24.4 | 2.56^a
7 | Stoichiometry | 96.6 | 1.92 | 26.7 | 1.98^a | 26.7 | 1.85
8 | Solutions^b | 46.1 | 1.33 | 45.5 | 2.41^a | 43.8 | 1.37
9 | Gases | 71.1 | 2.46^a | 38.1 | 1.96 | 19.9 | 2.30
10 | IMF | 51.7 | 1.62 | 50.6 | 2.17^a | 27.8 | 2.06
11 | Energy | 61.9 | 2.15^a | 42.0 | 1.87 | 21.6 | 2.20^a
12 | Kinetics^b | 54.0 | 2.19 | 50.0 | 2.60 | 19.3 | 2.96^a
13 | Nuclear | 45.5 | 2.51^a | 41.5 | 2.45 | 20.5 | 1.90
14 | Equilibrium | 41.5 | 1.67 | 53.4 | 1.29 | 42.6 | 2.03^a
15 | Biochemistry | 33.0 | 2.01 | 46.6 | 2.25^a | 40.3 | 2.17

^a Highest average score for this objective. ^b Denotes where exam breaks occurred; objectives here and above would be new on the exam.

Table 3. Distribution of Fall 2016 Students Completing Each Objective Opportunity and Average Scores (N = 204)

Objective | Topic | 1st Opp. Completed, % | 1st Opp. Av. Score | 2nd Opp. Completed, % | 2nd Opp. Av. Score | 3rd Opp. Completed, % | 3rd Opp. Av. Score
2 | Measurement | 94.1 | 2.19 | 23.0 | 2.47 | 23.5 | 2.65^a
3 | Basic structure and energy | 95.6 | 2.27 | 17.6 | 2.46 | 17.6 | 2.61^a
4 | Reactions^b | 89.2 | 2.29 | 28.9 | 2.68 | 22.5 | 2.80^a
5 | Stoichiometry | 67.2 | 2.14^a | 29.9 | 1.87 | 25.4 | 1.49
6 | Gases | 61.3 | 2.38 | 29.9 | 2.51^a | 21.1 | 2.16
7 | Atomic theory^b | 64.7 | 2.09 | 38.7 | 2.77^a | 19.6 | 2.11
8 | Bonding | 66.2 | 2.61^a | 28.9 | 2.38 | 9.31 | 2.05
9 | IMF | 58.3 | 2.21^a | 32.4 | 1.70 | 20.6 | 1.75
10 | Solutions | 50.0 | 1.94 | 35.3 | 1.33 | 31.4 | 2.10^a
11 | Acid/base^b | 31.9 | 1.58 | 47.0 | 2.03 | 35.3 | 2.63^a
12 | Equilibrium | 42.2 | 2.10^a | 39.2 | 2.11^a | 22.1 | 1.84
13 | Redox | 35.3 | 2.21^a | 34.8 | 1.90 | 31.4 | 1.83
14 | Nuclear | 34.8 | 2.77^a | 26.5 | 2.01 | 29.4 | 1.75

^a Highest average score for this objective. ^b Denotes where exam breaks occurred; objectives here and above would be new on the exam.

As the students were informed, this policy was in place to ensure that students did not complete question sets merely in the hope of doing better than the first time by chance when they had not studied. In addition, a student did not have to complete the question set for an objective the first time it appeared on an exam. If, on Exam 1, a student did not feel prepared to take the Objective 3 questions, he or she could skip this section of questions and would not receive a score until the Objective 3 questions were completed. A zero was given if a student never completed a set of objective questions. Of the six 1 h exams in the semester, two were labeled optional because they had no new objectives on them (see Table 1); they provided students with an opportunity to catch up and reassess objectives already offered. It should be noted that, because of the 1 h time restriction, as more and more objectives appeared on the exams it was highly unlikely that a student would have time to complete all objectives on an exam.

At the end of the semester, the final recorded scores from all objectives were added together and averaged, similar to a GPA calculation. This score represented 90% of a student's grade. The other 10% of a student's grade came from the score on a cumulative final, which is expected to be given in all chemistry courses. The cumulative final included two questions from each objective, and the percentage chosen allowed most of a student's grade to come from the learning demonstrated during the semester, keeping the grade aligned with an SBG system while still meeting the course's final exam requirement.

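As a concrete illustration of the bookkeeping just described, the short sketch below (Python) shows one plausible reading of the scheme: the most recent attempt on an objective replaces the stored score, never-attempted objectives count as zero, the stored scores are averaged GPA-style, and that average is combined 90/10 with the cumulative final. The function names, the objective list, and the rescaling of the 0−4 average to a percentage are illustrative assumptions, not the instructor's actual gradebook code.

```python
# Illustrative sketch of the grade bookkeeping described above; not the
# instructor's actual code. Assumptions: unattempted objectives score 0, the
# 0-4 objective average is rescaled to a percentage, and the course grade is
# 90% objective average + 10% cumulative final.
from typing import Dict, Optional


def record_attempt(gradebook: Dict[str, Optional[float]], objective: str, score: float) -> None:
    """Store the most recent attempt: a retake replaces the old score, higher or lower."""
    gradebook[objective] = score


def final_grade(gradebook: Dict[str, Optional[float]], final_exam_percent: float) -> float:
    """Course grade on a 0-100 scale from the averaged objective scores and the final exam."""
    scores = [s if s is not None else 0.0 for s in gradebook.values()]
    objective_avg = sum(scores) / len(scores)        # GPA-style average on the 0-4 scale
    objective_percent = objective_avg / 4.0 * 100.0  # assumed rescaling to a percentage
    return 0.9 * objective_percent + 0.1 * final_exam_percent


# Hypothetical student: Objective 2 retaken (the 3.0 replaces the earlier 3.5)
gradebook: Dict[str, Optional[float]] = {f"Obj {n}": None for n in range(2, 16)}
record_attempt(gradebook, "Obj 2", 3.5)
record_attempt(gradebook, "Obj 2", 3.0)
record_attempt(gradebook, "Obj 3", 2.5)
print(round(final_grade(gradebook, final_exam_percent=75.0), 1))
```

Recording the most recent score, rather than the maximum, mirrors the policy described above: a retake only pays off if the student has actually relearned the material.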

Limitations

This work was the initial study on the use of an SBG system in a large lecture chemistry course and provides initial evidence of the potential benefits of this approach and of the effort involved in its design. This study assumed that the question sets written for the exams were valid assessments of content that were reliably graded. At the time of this work, and given the nature of the course, there was no appropriate externally validated assessment instrument available to assess the students' content knowledge. Therefore, claims are not made about students' content knowledge, only about their success on the instruments used in the course. Future research needs to clearly evaluate students' content learning when SBG is used in a course. Additionally, other factors could be affecting student performance in the class. The instructor never taught the course without using SBG, so there is no control for her teaching methods with and without SBG to compare. The use of short-answer questions rather than multiple-choice questions on the exams could affect student learning.


Table 4. Spring 2016 Students Completing Each Opportunity as Their First Attempt at the Objective and Their Average Score^a (N = 176)

Objective | 1st Opp. First Attempt, % | 1st Opp. Av. Score | 2nd Opp. First Attempt, % | 2nd Opp. Av. Score | 3rd Opp. First Attempt, % | 3rd Opp. Av. Score
2 | 97.7 | 2.65^b | 1.70 | 1.67 | 0.60 | 2.50
3 | 97.7 | 2.35^b | 1.70 | 2.17 | 0.60 | 2.00
4^c | 97.7 | 2.55^b | 1.70 | 1.04 | 0.60 | 0.00
5 | 96.6 | 2.03^b | 1.70 | 1.33 | 0.60 | 2.50
6 | 96.6 | 1.74^b | 1.70 | 1.33 | 0.60 | 3.00
7 | 96.6 | 1.92^b | 1.10 | 0.75 | 0.60 | 0.50
8^c | 46.1 | 1.33 | 25.0 | 1.97^b | 21.0 | 1.38
9 | 71.1 | 2.46^b | 20.7 | 1.60 | 5.20 | 2.15
10 | 51.7 | 1.62 | 31.0 | 2.07^b | 11.9 | 1.91
11 | 61.9 | 2.15^b | 27.6 | 1.87 | 6.30 | 2.27
12^c | 54.0 | 2.19 | 33.9 | 2.43^b | 6.90 | 2.71
13 | 45.5 | 2.51^b | 35.1 | 2.33 | 13.2 | 1.74
14 | 41.5 | 1.67^b | 36.2 | 1.13 | 14.9 | 1.54
15 | 33.0 | 2.01^b | 34.5 | 2.03^b | 21.8 | 1.97

^a Averages from 1.7% or less of students were ignored, as they resulted from three students or fewer and are not representative of the whole class. ^b Highest average score for this objective. ^c Denotes where exam breaks occurred; objectives here and above would be new on the exam.

Table 5. Fall 2016 Students Completing Each Opportunity as Their First Attempt at the Objective and Their Average Score^a (N = 204)

Objective | 1st Opp. First Attempt, % | 1st Opp. Av. Score | 2nd Opp. First Attempt, % | 2nd Opp. Av. Score | 3rd Opp. First Attempt, % | 3rd Opp. Av. Score
2 | 94.1 | 2.19^b | 2.00 | 2.38 | 2.00 | 3.00
3 | 95.6 | 2.27^b | 2.50 | 2.20 | 1.50 | 2.33
4^c | 89.2 | 2.29^b | 6.40 | 2.27 | 3.40 | 2.50
5 | 67.2 | 2.14^b | 13.2 | 1.65 | 11.7 | 1.04
6 | 61.3 | 2.38 | 15.7 | 2.47^b | 11.3 | 1.88
7^c | 64.7 | 2.09 | 15.7 | 2.41^b | 9.80 | 1.80
8 | 66.2 | 2.61^b | 21.6 | 2.16 | 3.40 | 2.29
9 | 58.3 | 2.21^b | 25.5 | 1.64 | 6.90 | 1.39
10 | 50.0 | 1.94^b | 27.0 | 1.10 | 11.3 | 1.83
11^c | 31.9 | 1.58 | 38.7 | 2.09 | 19.6 | 2.76^b
12 | 42.2 | 2.10^b | 27.9 | 1.96 | 15.7 | 1.88
13 | 35.3 | 2.21^b | 27.0 | 1.78 | 21.1 | 1.64
14 | 34.8 | 2.77^b | 23.5 | 1.95 | 23.0 | 1.71

^a Averages from 6.4% or less of students were ignored, as they resulted from 13 students or fewer and are not representative of the whole class. ^b Highest average score for this objective. ^c Denotes where exam breaks occurred; objectives here and above would be new on the exam.

The change in textbooks between semesters could have contributed to the change in student performance.

STUDENTS' USE OF SBG TO LEARN CHEMISTRY

With Institutional Review Board (IRB) approval, student scores (0−4) for objectives from each exam and their final grades in the class were collected. If a student did not take an objective on an exam, no score was recorded for the student. As described above, students had three opportunities to demonstrate knowledge of any objective but were not required to use all three. Tables 2 and 3 provide the percentage of the class that attempted (took) the set of questions for each objective on each of the three opportunities. As a reminder, Objective #1 was assessed with online quizzes (see Table 1) and thus is not in the tables. The highest average score for each objective is noted. Because of a shift between semesters in the course textbook and the updating or alteration of some of the course objectives, data for each semester are presented independently rather than combined.

After the first exam, which included only the first set of objectives, students had choices in terms of which set of objective questions to complete on each exam, as objective opportunities started to repeat on exams and new material appeared. No opportunity had a 100% completion rate because a few students were absent from every exam (they made up the work on future exams). Tables 2 and 3 show that, as the semester progressed, the percentage of students who completed the first opportunity for an objective declined, indicating that students started choosing not to complete the first opportunity. In addition, no more than about half of the class took advantage of the second or third opportunity to complete an objective, so grading never required scoring every student on every objective offered on an exam. However, it is also clear, by adding the percentages for any objective, that some students took the questions for an objective more than once. Data describing when students chose to attempt objectives for the first time and how often they repeated objectives are discussed below and presented in Tables 4 and 5.

From these results, it is hard to understand how students were using the SBG system to learn.


Figure 2. Average attempts and course grades. For each course letter grade, the percentage of students in each average-attempts-per-objective group is given. For example, of the five students who received an F in the course in Spring 2016, 80% averaged 0−0.92 attempts per objective.

Given the size of the class, the fact that fewer students took the objectives on the first opportunity could mean they were learning some metacognitive skills to identify when they were ready to take an objective, or it could mean that students who waited to take assessments were at times struggling in the class or procrastinating, as was indicated in a previous study on SBG.26 Because the most recent score for any objective was recorded as the student's score, the lower averages on second and third opportunities could also indicate that students were not using the structure to take the time to learn the content before attempting an objective but were simply trying again later in the hope of a higher score. Thus, more information is needed to understand how the students in this large general chemistry course were engaging with this SBG system.

Tables 4 and 5 provide, for each opportunity, the percentage of students taking the objective for the first time, along with the average scores of these first-time attempters. Again, the highest average for each objective is noted in the table unless the percentage of students taking the objective for the first time was less than 6.4%; averages from such small percentages were ignored because they resulted from 13 students or fewer (often 1 or 2) and should not be used to represent the whole class. Most students completed the first opportunity for an objective (tried it the first time it was offered), but these percentages decreased as the semester went along. Typically, the last objective taught in class just prior to an exam had the lowest number of students attempting it on the first opportunity. Since this material was often taught up until a day or two before the exam, this suggests that some students may not have felt they had learned the material well enough yet to complete this section and chose to focus their time and effort on other sections of the exam, possibly indicating a level of metacognition. This is also supported by the fact that the second opportunity for these objectives (just above the exam-break line in Tables 4 and 5) has a higher average score. For example, in Spring 2016 the average on the second opportunity for first attempters at Objective 8 (Table 4) was 1.97, compared with 1.33 on the first opportunity. In addition, the percentage of students using the second and third opportunities as their first attempt is always smaller than the total percentage of students completing that opportunity in Tables 2 and 3. Again, for Objective 8 in Spring 2016, 25.0% of students completed the Objective 8 questions for the first time on the second opportunity, while 45.5% of students overall completed the Objective 8 questions on the second opportunity. Thus, some students were using the second and third opportunities for their first attempt on objectives, while others were retaking them.

In terms of performance, students who waited until the second or third opportunity to complete the questions did not generally perform better than those whose first attempt was the first opportunity (see the shaded cells in Tables 4 and 5). In the spring semester, students who waited did better on four objectives; in the fall, only three. Some of these objectives align with the higher overall averages for the opportunities in Tables 2 and 3, but not in every case. The overall higher averages for the second or third opportunity therefore indicate that students were learning the content and performing better on the next exam; they were likely not a result of students who already knew the material waiting to take the objectives.

To better understand how often students were retaking objectives and whether retaking objectives improved their success in the course, the number of attempts for each objective for each student was counted. A student who never attempted an objective on an exam, including students who were assigned a grade for the course but stopped attending class and exams at some point, was assigned 0 attempts for that objective. Each student's total number of attempts across all objectives was divided by the number of objectives on the exams to calculate an average number of attempts per objective. Over both semesters, the average number of attempts on an objective ranged from 0.15 to 2.38 per objective. The very low averages came from students who stopped attending the course but did not withdraw; the 0.15 average, for example, was from a student who came only to Exam 1 and did not complete the entire exam. Over 75% of the students retook at least one objective over the course of the semester. In Spring 2016, more students retook objectives than in Fall 2016; the overall class average for Spring 2016 was 1.39 attempts per objective, while for Fall 2016 it was 1.16.


For students who completed an objective and then tried it again on another attempt, less than 10% of these attempts (9.78%) resulted in a lower score than the previous attempt. A large majority of these reduced scores occurred on retakes of the first few objectives of the semester, that is, Objectives 2−5, likely because the students had not yet read or understood the exam grading policy. For the large majority of students, when they retook an objective they performed better on it; they did not "give up" on the material after the exam as they might have in a traditionally graded course, but continued to study and learn the material. This is also reflected in the data when overall performance in the class is examined against the average number of retakes.

Students were grouped by their average number of attempts: (1) the student did not take all the objectives (average less than 1), (2) the student completed all the objectives once (1.0), (3) the student averaged reattempts on less than one-third of the objectives (1.08−1.31), (4) the student averaged reattempts on about one-third to two-thirds of the objectives (1.38−1.62), (5) the student averaged reattempts on two-thirds to almost all of the objectives (1.69−1.92), and (6) the student averaged reattempts on all of the objectives (2.00 or higher). See Figure 2 for the groupings. While an average could indicate, for example, that a student took one objective three times and another only once or not at all, it provides a basis for comparing how many of the objectives on exams students were completing. Figure 2 shows how students within each average-attempt group performed in the class overall. A chi-square test for independence showed that in each semester these groups differed in their performance in the course: Spring 2016, χ2 (20, N = 176) = 62.73, p < 0.001, and Fall 2016, χ2 (16, N = 204) = 109.17, p < 0.001. Students who averaged less than one attempt per objective received a D or F in the course. A few with an average of less than one attempt per objective received C's or B's, but these were students with averages near 1 (i.e., 0.92), indicating they must have done well on all the objectives they took and skipped only one or two objectives. No student received an A with an average of less than 1. In general, students who took advantage of the SBG system to reassess objectives received higher grades in the class.
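A short sketch (Python, with made-up counts) may help illustrate the two calculations described above: the average number of attempts per objective, the six attempt groups used in Figure 2, and a chi-square test of independence between attempt group and course grade. The group boundaries follow the ranges reported above; the attempt records and the contingency table are placeholders, not the study's data.

```python
# Illustrative sketch of the attempt analysis described above; the attempt
# records and the contingency table are invented placeholders, not study data.
import numpy as np
from scipy.stats import chi2_contingency


def average_attempts(attempts_per_objective: dict, num_objectives: int) -> float:
    """Total attempts across all objectives divided by the number of objectives on exams."""
    return sum(attempts_per_objective.values()) / num_objectives


def attempt_group(avg: float) -> int:
    """Six bins used in Figure 2, following the ranges reported in the text."""
    if avg < 1.0:
        return 1   # did not attempt every objective
    if avg == 1.0:
        return 2   # attempted each objective exactly once
    if avg <= 1.31:
        return 3   # reattempted up to about a third of the objectives
    if avg <= 1.62:
        return 4   # about a third to two-thirds
    if avg < 2.0:
        return 5   # two-thirds to almost all
    return 6       # reattempted essentially every objective


# Hypothetical student: 18 attempts spread over 13 objectives -> group 4
attempts = dict(zip(range(2, 15), [2, 1, 1, 2, 1, 2, 1, 1, 2, 1, 2, 1, 1]))
print(attempt_group(average_attempts(attempts, num_objectives=13)))

# Chi-square test of independence between attempt group (rows) and letter
# grade A-F (columns); a 6 x 5 table gives dof = 20, as in the Spring 2016 test.
counts = np.array([
    [0, 0, 1, 1, 4],
    [0, 1, 5, 3, 2],
    [2, 6, 9, 5, 1],
    [3, 10, 12, 6, 0],
    [4, 12, 10, 3, 0],
    [3, 8, 4, 1, 0],
])
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2({dof}, N = {counts.sum()}) = {chi2:.2f}, p = {p:.4f}")
```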

Finally, to understand how students used SBG to be successful in the course overall, students' final grades were collected and are presented in Table 6. Students successfully completed the course using the SBG system, with only 7.6% of students failing or withdrawing from the course in Spring 2016 and 6.6% in Fall 2016. Table 6 also provides students' final grade data from two prior semesters when the course was taught by different instructors who did not use an SBG grading system. The topics covered by the different instructors are the same, but the instructors are given freedom to emphasize different areas. For example, the instructor in Fall 2015 covered nuclear chemistry in a single lecture day, while the other instructors gave it more time, which resulted in different assessments by the instructors. As a result, only final grades in the course are compared.

More students passed the class when the SBG system was used than in previous semesters: 23.7% and 22.6% of students failed or withdrew from the course in the prior semesters, compared with less than 8% in the SBG semesters. However, fewer students received A's during the semesters in which SBG was used. Considering the opportunities to reassess in the SBG system, it is not surprising that more students passed the course. In a traditional grading system, a low F (e.g., 25%) on an early exam can make it difficult to pass the course because of the significant number of points no longer available for students to earn. An early failure making it difficult for students to succeed in a traditional grading system is one of the arguments often used in advocating for SBG or other assessment systems.15,16,18 For example, Toledo and Dubas12 also found a higher passing rate with their mixture of SBG and specifications-based grading than when the course was taught with a traditional assessment method.

Overall, there is evidence that students used the SBG system to be successful in this general chemistry survey course for nonmajors; the fact that they varied their use of the system suggests a level of student metacognition. After the first exam, students started making different choices as to which objectives they would complete on each exam, as evidenced by the percentages in Tables 2 and 3. Scores for objectives covered just before exams were higher when students' initial attempt was the second opportunity for the objective (Tables 4 and 5), suggesting students used the system to take the time to learn the material before attempting the assessment on it. Students learned the material between opportunities, doing better overall on second and third attempts, and thus passed the class at a higher rate than in previous years. Many successful students took advantage of the extra opportunities to do better in the course (Figure 2). Finally, students in Spring 2016 used the system better than Fall 2016 students: they retook objectives more often, fewer of them waited until the third opportunity to attempt an objective, and they achieved higher grades overall. There was no conscious change by the instructor in her explanations of or approach to the system from Spring 2016 to Fall 2016 that would explain this difference in effort between the groups.

Table 6. Comparison of Standards-Based Grading Overall Course Grades to Those of Previous Semesters Taught by Different Instructors

Grade | Spring 2016^a (N = 184)^b | Fall 2016^a (N = 210)^b | Fall 2015^c (N = 215) | Fall 2014^c (N = 195)
A, % | 12.0 | 11.4 | 13.5 | 13.3
B, % | 34.8 | 31.9 | 20.9 | 21.0
C, % | 35.9 | 33.3 | 28.4 | 28.7
D, % | 10.3 | 16.7 | 13.5 | 14.4
F, % | 4.3 | 3.8 | 20.5 | 15.4
WX, % | 3.3 | 2.8 | 3.2 | 7.2

^a Semesters with standards-based grading. ^b Data include students who withdrew from the course, who were not included in the preceding data; the N values here are therefore larger. ^c Standards-based grading was not used, and the course was taught by different instructors.

Students’ Opinions of SBG in the Course

At the end of both semesters, students completed an anonymous online survey about the course. The complete survey is provided in the Supporting Information. The survey included three open-ended questions about the aspects of the course the students found helpful and those that hindered their learning. The second page of the survey included 18 Likert-scale statements about the course and about learning chemistry generally, with which students indicated their level of agreement. The statements were adapted from the CLASS-Chem survey29 and modified to include references to the assessment method used in the course. A modified version of the CLASS-Chem survey was chosen because the course is a general education course.



Table 7. Qualitative Coding Categories, Definitions, and Sample Student Responses

Code | Description | Example Response
Assessment method | Objectives, grades, exams, or other aspects of assessment. | "The objectives are nice because you can do them at your own pace and if you are not ready for an objective you do not have to take it."
Class session | Routine classroom practices, like class practice problems, simulations used in class, group work, etc. | "[Instructor] uses lots of practice problems during class and walks us through them. The repetitive practice helped a lot. Also discussing with neighbors."
Class structure | Structural aspects of the course, size of the course, lecture room, class times, etc. | "What really helped me learn in this class were the tutoring sessions that she held [a weekly review session]."
Instructor | The instructor explicitly, her actions/behaviors. | "I'd say the enthusiasm of the teacher. Someone who is passionate about what they teach is always easier to listen to."
Course materials | Materials used in the course, for example, the textbook, learning management system, etc. | "[Online Homework system] is very helpful with practice problems as well as other learning modules they have to offer."

Table 8. Comparison of Students’ Survey Responses Reflecting on Their Learning in the Course Distribution of 237 Students’ Survey Responses by Answer Categorya Assessment Method

Class Session

Class Structure

Instructor

Materials

Survey Questions

%

N

%

N

%

N

%

N

%

N

What helped you learn? What hindered your learning? What is one thing would you change about the course?

21.80 7.10 15.10

52 17 36

29.70 14.20 9.20

71 120 22

12.00 50.20 55.60

31 120 133

12.60 7.50 2.10

30 18 5

51.50 14.60 13.40

123 35 32

a

Students responses could be coded into more than one category so percentages will not add to 100 for any question, nor will the number of responses add to 237.

Figure 3. Students' agreement with the Likert-scale statements on the online survey. The number of students is graphed by their level of agreement with each statement.

Several of the statements of the CLASS-Chem survey provide data for the university's general education standards (e.g., "Reasoning skills used to understand chemistry can be helpful to me in my everyday life.") and could be used for other reporting purposes. Only the modified statements from the survey relating to the assessment method are discussed here, though the students' responses to all the Likert-scale questions are provided in the Supporting Information with the survey. Two hundred thirty-seven students completed the final survey, a response rate of 61.9% for both semesters combined. The students could earn 2 extra credit points (out of 100) on their final exam (10% of their overall grade) if they completed the survey.

Descriptive statistics from the student scores and the Likert-scale questions were computed. Students' responses to the open-ended questions on the survey were read and coded using the qualitative analysis software NVivo. After reading a set of student responses, coding categories were developed by the researchers; they are presented in Table 7 along with a sample student response to the first open-ended question, "What helped you learn?". These coding categories were then used to code all the responses. Consensus codes were achieved for a sample of 30 student survey responses, and the coding was then completed by the third author. A student's response to a question could be coded into more than one category.
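The tally behind Table 8 can be illustrated with a short sketch (Python). Because a response can carry several codes, each category count is divided by the 237 respondents, so the percentages are not expected to sum to 100. The responses and counts below are illustrative only; the actual coding was done in NVivo as described above.

```python
# Illustrative sketch of tallying multi-coded open-ended responses; the coded
# responses below are invented, and the real coding was done in NVivo.
from collections import Counter

TOTAL_RESPONDENTS = 237
CATEGORIES = ["Assessment method", "Class session", "Class structure",
              "Instructor", "Course materials"]

# Each entry is the set of categories assigned to one student's answer to
# "What helped you learn?" (hypothetical examples).
coded_responses = [
    {"Assessment method", "Course materials"},
    {"Class session"},
    {"Course materials", "Instructor"},
]

counts = Counter(code for response in coded_responses for code in response)
for category in CATEGORIES:
    n = counts.get(category, 0)
    print(f"{category}: {n} responses ({100 * n / TOTAL_RESPONDENTS:.1f}% of 237 respondents)")
```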


As previously mentioned, 237 students across both semesters completed the same end-of-course survey, which allowed students to give their opinions on the course through open-ended questions and to rate statements on a Likert scale. Table 8 provides the results of the coding of the students' open-ended responses to the survey questions. When asked about the course, the assessment method was not a factor many of the students identified as either helpful or a hindrance to their learning, though 21.8% of students' comments did mention that the assessment method helped them learn in the course. The course materials and the design of the class sessions were more often mentioned by students as things that helped them learn. Not surprisingly, the structure of the class, for example its size, was identified as the biggest problem with the class; "The class was too big" and "Make the class smaller" are examples of common responses coded as Class Structure. When asked what they would change, most students again pointed to the class structure, since it was the biggest hindrance. However, a slightly larger percentage of students mentioned the assessment method as something they would change (15.1%) than as something that hindered their learning (7.1%). A closer examination of the responses coded as assessment method among the things students would change revealed that most of them (65.7%) suggested changing the format of the exams, for example, "give multiple choice tests because the ones in this class were very hard for me." Because exams relate to assessment, these responses were coded accordingly, but they concern test format rather than the SBG system. Only 11 responses (5%) suggested changing the SBG system itself, for example, "give percentages instead of out of 4".

Students were also asked about the assessment method directly through the Likert-scale statements. Figure 3 provides students' responses to these statements, with the statements included in the figure. Most students liked the assessment method or were neutral toward it; only 21.6% (51 out of 237) of students disagreed or strongly disagreed with the statement "I like the assessment method in this course". In addition, the students' responses to the other statements in Figure 3 suggest that the course and the assessment method may have helped them think about learning chemistry differently as well. Sixty-two percent (147 out of 237) agreed with the statement that the assessment method changed their approach to studying and learning, and only 20% (48 out of 237) agreed with the later, contradictory statement that it did not affect their learning. The students also indicated that they did not focus on memorization (17%, or 37 students, agreed) or recall (20%, or 48 students) to learn chemistry. This finding is also important for the SBG design and assessments: exams were returned to students after they were graded, and if students only had to memorize something from one exam to the next to do better on an objective, they would be expected to agree strongly with the statements about memorization and recall.

The instructor's reflections on the SBG system can be found in the Supporting Information. The instructor discusses the positive aspects observed anecdotally and the challenges encountered when implementing the SBG system during these first two attempts.

CONCLUSION

Students must pass general chemistry courses for a variety of STEM-related majors,6 and students often view passing as more about work completion and effort1,2 than about knowledge or skill acquisition. The instructor of the general chemistry course described here, like most faculty members,2 believes the grades students receive in her course represent knowledge of the chemistry content of the course. In this article, we presented the instructor's continuing attempts to use a standards-based grading method in a large general chemistry course for nonmajors to try to ensure that course grades do represent learning and to communicate that explicitly to students. Early observations from the study presented here support the idea that students used the SBG methods to be successful in the course. In addition, there is evidence of some metacognitive skills being used by the students, including their choices of when to attempt which objectives (i.e., waiting on the newest objectives until the second opportunity), their number of reattempts with improved scores, and even their opinions on learning chemistry from the survey. Results did not indicate that the assessment method hindered student learning, and both the students and the instructor (who continues to use SBG) had positive opinions about the use of SBG in the course. As with any method, there are challenges to overcome and improvements that can and will be made as the implementation of the assessment method continues to evolve. However, as we continue to strive to find methods to improve our instruction, improve students' chemistry learning, and improve their metacognitive skills to assess and improve their learning themselves, standards-based grading assessment methods provide an option for instructors to make progress on these fronts.



ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.8b00251.



Instructor’s reflections (PDF, DOCX) Sample exam questions with grading criteria (PDF) Survey instrument and complete results (PDF, DOCX)

AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

ORCID

Sarah B. Boesdorfer: 0000-0002-2948-4921

Notes

The authors declare no competing financial interest.



ACKNOWLEDGMENTS

We would like to acknowledge the students who have participated in the course and in the survey that supports its continued improvement. In addition, we would like to acknowledge Stacie Cler, who helped in the initial phases of this research study.



REFERENCES

(1) Tippin, G. K.; Lafreniere, K.; Page, S. Student Perception of Academic Grading: Personality, Academic Orientation, and Effort. Act. Learn. High. Educ. 2012, 13 (1), 51−61.
(2) Adams, J. B. What Makes the Grade? Faculty and Student Perceptions. Teach. Psychol. 2005, 32 (1), 21−24.
(3) How Students Learn: Science in the Classroom; Donovan, M. S., Bransford, J. D., Eds.; The National Academies Press: Washington, D.C., 2005.





(4) Black, P.; Atkin, J. M. The Central Role of Assessment in Pedagogy. In Handbook of Research on Science Education; Lederman, N. G., Abell, S. K., Eds.; Routledge: New York, 2014; Vol. II, pp 775−790.
(5) Cooper, M. M. The New MCAT: An Incentive for Reform or a Lost Opportunity? J. Chem. Educ. 2013, 90 (7), 820−822.
(6) Cooper, M. The Case for Reform of the Undergraduate General Chemistry Curriculum. J. Chem. Educ. 2010, 87 (3), 231−232.
(7) Gillespie, R. J. What Is Wrong with the General Chemistry Course? J. Chem. Educ. 1991, 68 (3), 192.
(8) Hockings, S. C.; DeAngelis, K. J.; Frey, R. F. Peer-Led Team Learning in General Chemistry: Implementation and Evaluation. J. Chem. Educ. 2008, 85 (7), 990.
(9) Perry, M. D.; Wight, R. D. Using the ACS General Chemistry Exam To Compare Traditional and POGIL Instruction. In Process Oriented Guided Inquiry Learning (POGIL); ACS Symposium Series; American Chemical Society: Washington, D.C., 2008; Vol. 994, pp 240−247.
(10) Ryan, M. D.; Reid, S. A. Impact of the Flipped Classroom on Student Performance and Retention: A Parallel Controlled Study in General Chemistry. J. Chem. Educ. 2016, 93 (1), 13−23.
(11) Weaver, G. C.; Sturtevant, H. G. Design, Implementation, and Evaluation of a Flipped Format General Chemistry Course. J. Chem. Educ. 2015, 92 (9), 1437−1448.
(12) Toledo, S.; Dubas, J. M. A Learner-Centered Grading Method Focused on Reaching Proficiency with Course Learning Outcomes. J. Chem. Educ. 2017, 94 (8), 1043−1050.
(13) Brown, P. C.; Roediger, H. L., III; McDaniel, M. A. Make It Stick: The Science of Successful Learning; The Belknap Press of Harvard University Press: Cambridge, MA, 2014.
(14) Beatty, I. D. Standards-Based Grading in Introductory University Physics. J. Scholarsh. Teach. Learn. 2013, 13 (2), 1−22.
(15) Iamarino, D. L. The Benefits of Standards-Based Grading: A Critical Evaluation of Modern Grading Practices. Curr. Issues Educ. 2014, 17 (2).
(16) Sadler, D. R. Three In-Course Assessment Reforms to Improve Higher Education Learning Outcomes. Assess. Eval. High. Educ. 2016, 41 (7), 1081−1099.
(17) Pollio, M.; Hochbein, C. The Association between Standards-Based Grading and Standardized Test Scores in a High School Reform Model. Teach. Coll. Rec. 2015, 117 (11), 1−28.
(18) Guskey, T. R. Grading and Reporting Student Learning. Educ. Leadersh. 2010, 67 (5), 31−35.
(19) Marzano, R. J. Formative Assessment and Standards-Based Grading; The Classroom Strategies Series; Marzano Research Laboratories: Bloomington, IN, 2010.
(20) Nagel, D. Effective Grading Practices for Secondary Teachers: Practical Strategies To Prevent Failure, Recover Credits, and Increase Standards-Based/Referenced Grading; Corwin: Thousand Oaks, CA, 2015.
(21) Steward, L. Standards-Based Grading in the Chemistry Classroom; American Association of Chemistry Teachers, 2017. https://teachchemistry.org/professional-development/webinars/standards-based-grading-in-the-chemistry-classroom (accessed May 2018).
(22) Muñoz, M. A.; Guskey, T. R. Standards-Based Grading and Reporting Will Improve Education. Phi Delta Kappan 2015, 96 (7), 64−68.
(23) Van Hook, M. S. Standards-Based Grading; MSV Education Network, 2014. https://www.msveducationalnetwork.com/aulas/artigos-e-recursos-para-educadores/ (accessed May 2018).
(24) Buckmiller, T.; Peters, R.; Kruse, J. Questioning Points and Percentages: Standards-Based Grading (SBG) in Higher Education. Coll. Teach. 2017, 65 (4), 151−157.
(25) Post, S. L. Standards-Based Grading in a Thermodynamics Course. Int. J. Eng. Ped. 2017, 7 (1), 173.
(26) Stange, K. E. Standards Based Grading in a First Proofs Course; University of Colorado, 2016. http://math.colorado.edu/~kstange/papers/Stange-standards-discrete.pdf (accessed May 2018).

(27) Nilson, L. B. Specifications Grading: Restoring Rigor, Motivating Students, and Saving Faculty Time; Stylus Publishing, LLC: Sterling, VA, 2015.
(28) On the basis of student feedback during Spring 2016 requesting that the instructor "make them do" the recommended homework, a new objective was added to the course in Fall 2016: "Students use methods for effectively learning chemistry." If students completed 90% of the homework assignments by the due dates, they received a 4. Since this constituted 1 out of 18 objectives, or 1/18th of the grade, it was an acceptable compromise: it honored the students' request while still having their course grade largely reflect their knowledge of the course material.
(29) Adams, W. K.; Wieman, C. E.; Perkins, K. K.; Barbera, J. Modifying and Validating the Colorado Learning Attitudes about Science Survey for Use in Chemistry. J. Chem. Educ. 2008, 85 (10), 1435.
