Article pubs.acs.org/jchemeduc
Using Structured Chemistry Examinations (SChemEs) As an Assessment Method To Improve Undergraduate Students’ Generic, Practical, and Laboratory-Based Skills
Stewart B. Kirton,* Abdullah Al-Ahmad, and Suzanne Fergus
Department of Pharmacy, University of Hertfordshire, Hatfield, Hertfordshire AL10 9AB, United Kingdom
Supporting Information
ABSTRACT: The increase in tuition fees means there will be renewed pressure on universities to provide “value for money” courses that offer extensive training in both subject-specific and generic skills. For graduates of chemistry this includes embedding the generic, practical, and laboratory-based skills associated with industrial research as an integral part of undergraduate training. Acknowledging the perception from industrial employers that the laboratory skills of high-achieving graduates in chemistry do not match their academic ability, we present SChemEs (structured chemistry examinations), a novel method of authentic assessment that focuses on developing and rewarding competency in the laboratory. Emphasizing the importance of these skills for future employment, and thus embedding them in an undergraduate’s skills portfolio, will enhance graduate employability. This article outlines the methodological development of SChemEs (which was inspired by the objective structured clinical examinations used in clinical programs), provides an overview of how a SChemEs assessment runs, gives examples and grading criteria used in the exercise, and presents data from a pilot study on attainment and student viewpoint regarding SChemEs.
KEYWORDS: Laboratory Instruction, Testing/Assessment, First-Year Undergraduate/General
■
INTRODUCTION
In May 2011, the United Kingdom Business Secretary, Vince Cable, warned universities in England and Wales that the “biggest mistake” they could make with respect to the rise in undergraduate tuition fees would be to “underestimate” the expectations of the consumers. Before paying up to £9000 a year, many prospective students will want to know how a specific university course will enhance their employability. A recent study found that 89% of students rated future employability and salaries as major factors when deciding which university, and which degree pathway, to apply for,1 and in direct response to the increase in tuition fees, universities in England and Wales are now required to produce Key Information Sets (KIS) that provide information to potential students on aspects of their courses, including the amount of contact time, student satisfaction, employment rates, and average salary of graduates.2 As such, there is an onus on universities in England and Wales to develop curricula that embed the employability skills coveted by graduate employers, especially amid the growing disquiet that courses with increased fees have not provided value for money.3 Underlining this need for “work-ready” graduates is new research showing that recent chemistry graduates would have preferred courses that allocated more time to developing the soft skills used heavily in the workplace, such as time management and presentation skills, alongside subject-specific skills and knowledge.4

It is therefore ironic that at a time when students are expecting courses from universities that will enhance their employability, industry itself is still bemoaning the lack of workplace skills at the disposal of top graduates. Although the major concerns from employers are universal, namely a lack of appropriate levels of literacy and numeracy,5 a drop in the number of students studying in science programs coupled with a shortage of “laboratory-ready” graduates in the STEM disciplines is of particular concern to the scientific industry. The perceived shortage of “competent” graduates in STEM disciplines has been well documented,6 and several hypotheses have been offered to explain why this perception prevails. For chemistry students it is clear that the amount of time the average student spends in laboratory classes has decreased significantly over the last 50 years. Reid and Shah7 showed that a Level-4 chemistry student in the 2000s spends only 44% of the time in the laboratory compared with their 1960s contemporaries and argue that it is this relative lack of exposure to practical work and the laboratory environment, rather than a lack of underlying ability, that contributes to the perception that STEM graduates lack basic competencies. Reid and Shah suggest that this lack of familiarity, rather than a lack of ability, places present-day graduates at a distinct disadvantage when entering the world of work.7 They also postulate that the situation is exacerbated because employers who have qualifications in STEM subjects are likely to have completed degree programs with significantly higher practical content than the graduates they employ, and as a consequence employers’ expectations of the laboratory training graduates have received could be unrealistic.7 However, with financial resources in universities becoming increasingly strained, it is unlikely that there will be moves to increase the number of comparatively costly practical classes within any given program. Hence, alternative and innovative solutions to embedding essential laboratory, practical, and generic skills are required.

Another possible contributing factor to the perceived reduction in competency of STEM graduates could be that undergraduate students are socialized by their earlier educational experience to value grades rather than learning,8 and hold the notion that the skills they are using in the laboratory are of reduced importance compared to the laboratory report produced at the end. If we accept this premise as true, we can assume that the majority of students will place greater emphasis on the results of an exercise than on the processes employed in achieving them. Given this, the academic and industrial communities should not be surprised that high-achieving graduates initially struggle with the trouble-shooting ethos of business, as it is the solution to the problem they are focused on, not the methodology or skills required to achieve it. This is not a new concept. Clow and Garratt9 have previously highlighted the dangers of students mechanically following laboratory schedules in order to achieve a “correct” result and, as a consequence, being unable to deal with or explain a result that deviates from the ideal.

Additionally, if the “soft skills” employed in arriving at the solution to a problem, such as time management and experimental design, are not specifically rewarded in a mark scheme, it can be argued that they will be perceived as relatively unimportant by the student. Therefore, a pragmatic approach to ensuring that students value the importance of being able to demonstrate competencies in practical, laboratory, and generic soft skills, and to embedding these skills in order to enhance graduate employability, is to design assessments that reward the ability to demonstrate competency in such skills.

It is interesting to note that chemistry is not alone when considering the poor correlation between academic ability and performance in work-related situations. Research carried out by The Royal Pharmaceutical Society of Great Britain (RPSGB) in conjunction with the University of East Anglia10 established that good performance in an academic environment was not necessarily an indicator of success in the clinic. The discrepancy that exists between competency and academic achievement for both pharmacy and chemistry students can be succinctly summarized by considering Miller’s pyramid of competence.11 Industry asserts that scientific graduates predominantly exhibit competencies at the lower two levels (“knows” and “knows how”), when ideally employers wish to recruit graduates demonstrating skills congruent with the highest tiers (“shows how” and “does”).

Clinical disciplines have addressed the need for advancement and assessment of competencies in addition to knowledge by developing objective structured clinical examinations (OSCEs). OSCEs were first introduced in the 1970s as training tools for medical students and nurses as a way of assessing the practical skills demanded of these professions.12 It is intended that OSCEs will assess whether or not a student is competent as a practicing professional. This is achieved via the use of multiple OSCE stations. Each station details a different scenario designed to test a range of clinical competencies and takes 5−15 min to complete. Recent studies have shown that the use of OSCEs enhances assessment by addressing the appraisal of skills that may be difficult to measure by traditional examinations.13

The notion of developing and devising authentic assessments for the chemistry laboratory is not new. Several innovations have been reported that use multiple-station-style examinations in order to probe deep understanding of chemical concepts. Examples include those proposed by Silberman and coworkers,14 who looked at multistage assessments testing practical skills in conjunction with recognition, recall, and interpretation of data in order to gain an insight into student ability. Neeland’s15 use of problem-based learning under time-constrained conditions in the laboratory also looks to assess student understanding of chemical concepts, as does the multiple-station assessment proposed by Rhodes16 for first-year high school students. There are obvious similarities between the protocols outlined above and the methodology we propose for SChemEs. However, we feel that SChemEs expand on the previous work by taking the focus of the assessment away from the outcomes achieved and placing it on rewarding the processes involved in achieving the outcome, hence emphasizing the importance of process, as well as outcome, in the laboratory.

As such, this article centers on exploiting the concept of OSCEs to develop a series of assessments designed to appraise the practical, generic employability, and laboratory-based competencies of undergraduate students in chemistry and chemistry-related disciplines. Five key-skill areas are considered: basic techniques, information management, interpretative exercises, apparatus assembly and handling, and numeracy. Collectively these assessments are known as SChemEs, (Objective) Structured Chemistry Examinations, and they are designed to facilitate retention of practical and laboratory-based skills and embed the importance of generic employability skills. The exercise is in its initial stages of development and at the moment is aimed only at Year 1 students in pharmacy and the life sciences at the University of Hertfordshire.

© 2014 American Chemical Society and Division of Chemical Education, Inc. Published: March 28, 2014
dx.doi.org/10.1021/ed300491c | J. Chem. Educ. 2014, 91, 648−654
■
METHODOLOGICAL DEVELOPMENT
Molecular Structure and Reactivity (MSR) is the compulsory 30-credit Year 1 chemistry module offered to students in the pharmacy and life sciences programs at the University of Hertfordshire. Although there are some lectures introducing basic principles of thermodynamics, atomic orbital theory, and chemical structure, the majority of the course focuses on developing an understanding of organic chemistry nomenclature and reactivity in preparation for the more detailed medicinal chemistry modules that follow. Consequently, the following analysis is biased in favor of organic chemistry. The summatively assessed practical components of MSR consist of four four-hour laboratory sessions. In recognition of the importance of contextualization when teaching chemistry,17 all practical sessions are centered on aspirin. In addition, students also complete two independent study assignments, one in numeracy and the other in the use of computers for retrieval of chemical information. Each assessment was reviewed to identify the range of skills each exercise was designed to develop. A thematic analysis of the results of the review, using the 20 skills categories defined by Hanson and Overton,4 was carried out. However, the scope of these categories proved too great for this investigation, and five novel umbrella headings were defined: basic techniques, information management, interpretative exercises, apparatus assembly and handling, and numeracy (Figure 1).

Figure 1. Diagram showing the umbrella headings (red) under which sit the individual assessments comprising the SChemEs.

■
OVERVIEW OF THE SCHEMES ASSESSMENT PROCEDURE
SChemEs take place in the final laboratory session at the end of the MSR module, after all other practical classes have been completed. Prior to the SChemEs assessment students are given the opportunity to work through and receive feedback on example SChemEs stations provided via a virtual learning environment. It is important to note that each SChemEs station is directly related to a technique the students have encountered and used in order to complete the preceding summatively assessed practical classes. On the day of the assessment, students, in groups of 20, are asked to report to the laboratory 10 min prior to the beginning of their scheduled test. Instructions as to what is required for the assessment in terms of equipment (pen, pencil, calculator, ruler) and standard of dress (appropriate shoes, protective lab coat, hair tied back, use of flame-retardant headwear, etc.) are made available via the electronic learning environment. Students are assessed according to their level of preparedness, in line with the instructions given. Candidates are penalized if they are late, inappropriately dressed, or inadequately equipped.

This grade for preparedness constitutes one-sixth of the overall SChemEs mark, rewarding and emphasizing the important generic skills of information retrieval and time management. Once in the laboratory, each student is required to complete five stations: one station corresponding to each of the umbrella headings previously defined (and shown in Figure 1). Each station has a written set of instructions regarding how the candidate should complete it. Students have 5 min to complete each station; they are told when they have 1 min remaining. The 2 min rest period between stations allows assessors to reset the assessment in preparation for the next candidate. The stations remain at fixed locations within the laboratory, and students move from station to station. Each station is staffed by an appropriate assessor, and the whole process is coordinated by a session leader who is responsible for timing each stage of the assessment. Each station contributes one-sixth to the overall SChemEs mark, meaning it is possible to perform poorly on one or more stations and still achieve a passing grade (≥40%). The assessor at each station observes candidates as they complete the task, awarding marks for demonstrated competency according to a predefined mark scheme. A positive approach to assessment is taken, whereby students are rewarded for demonstrating appropriate skills rather than penalized for inaccuracies or transgressions. The overall mark for the SChemEs is obtained by adding together the scores for preparedness and the individual stations. It is not a prerequisite to achieve a passing grade in the SChemEs exercise (≥40%) in order to pass the Year 1 chemistry module.

This is because there is significant evidence to show that competency does not necessarily relate to the ability to perform in an examination (e.g., the Cambridge Model18), and it was deemed draconian to prevent progression on a degree program for failure to demonstrate competency in one element of coursework during the first year of study. In addition, each learning outcome associated with laboratory skills is assessed at least twice during the academic year (in the laboratory practical and in the SChemEs session), so it is possible for students to demonstrate competency in a skill without that competence being explicitly demonstrated in the SChemEs assessment. Also, SChemEs test skills that will be developed as part of the program, and the SChemEs assessment is a useful instrument at Year 1 for providing feedback to students with respect to their laboratory competencies. By reflecting critically on their performance, they are able to identify any limitations and construct plans for addressing these
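To make the weighting concrete, the aggregation of the preparedness grade and the five station scores described in this section can be sketched as below. The 0−5 scale per component is inferred from the mark ranges reported in the pilot study, and the function name is our own illustration rather than part of the published scheme.

```python
# Sketch of the SChemEs mark aggregation: the preparedness grade and the
# five station scores each contribute one-sixth of the overall mark.
# The 0-5 scale per component is inferred from the reported mark ranges.

def schemes_overall_mark(preparedness: float, station_scores: list[float],
                         max_per_component: float = 5.0) -> float:
    """Return the overall SChemEs mark as a percentage."""
    if len(station_scores) != 5:
        raise ValueError("expected scores for exactly five stations")
    components = [preparedness] + station_scores
    return 100.0 * sum(components) / (max_per_component * len(components))

# A candidate can score poorly on one station and still pass (>=40%):
print(round(schemes_overall_mark(4, [5, 4, 0, 3, 4]), 1))  # 66.7
```

Because each component carries equal weight, a zero on a single station costs at most one-sixth of the available marks, which is what allows a failing station to coexist with a passing overall grade.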
Table 1. Distribution of Subject-Specific Skills Tested by the SChemEs Stations

                                       SChemEs Stations
Skills a,b                             Basic        Numeracy   Apparatus      Interpretative   Information
                                       Techniques              Assembly and   Exercises        Management
                                                               Handling
Chemical terminology                                X          X              X                X
Fundamental chemical principles                     X                         X
Organic compounds and reactions                                               X
Analytical techniques                                                         X
Safe handling of chemical materials    X
Manipulative practical skills          X                       X
Skills with chemical instrumentation   X

a Skills as defined by Hanson and Overton.4 b A number of areas (principles of thermodynamics; kinetics of chemical change; inorganic compounds and reactions) are not covered by Year 1 SChemEs; these will need to be developed by SChemEs and other methodologies as a student progresses through the program.
Table 2. Distribution of General Skills Tested by the SChemEs Stations

                                           SChemEs Stations
Skills a,b                                 Preparedness   Basic        Numeracy   Apparatus      Interpretative   Information
                                                          Techniques              Assembly and   Exercises        Management
                                                                                  Handling
Experiment planning/design                                                        X
Interpreting experimental data                                                                   X
Numeracy/computational skills                                          X                                          X
Information retrieval skills               X              X            X          X              X                X
Problem-solving skills                                                 X          X              X                X
Time management and organizational skills  X              X            X          X              X                X

a Skills as defined by Hanson and Overton.4 b A number of areas (report writing skills; oral presentation skills; teamworking skills; independent learning ability required for continuing professional development) are not covered by Year 1 SChemEs; these will need to be developed by SChemEs and other methodologies as a student progresses through the program.
as they progress through the program. This aligns with the University of Hertfordshire policy regarding assessment of coursework at Year 1. However, MSR itself must be passed in order for a student to progress to Year 2, and the importance of competency in these skills is reflected in the relative contribution the exercise makes when determining the grade for the module: 15% of the overall grade. A pilot study involving 217 students was carried out in May 2013. The preliminary results from this study are presented below. They will be analyzed thoroughly and used to inform the design of SChemEs assessments at Levels 5, 6, and 7 of the pharmacy and life science programs.

■
DEFINING THE SCHEMES CATEGORIES
What follows is a general description of the different types of stations that comprise a first-year undergraduate SChemEs assessment. Specific examples of each station are provided as part of the Supporting Information accompanying this manuscript.

Station 1A: Basic Techniques

For Year 1 students we defined competency in basic techniques as the ability to correctly identify and use a standard piece of laboratory equipment in the completion of a routine task and to handle any chemicals used in conjunction with that equipment with an appropriate level of care. Mapping each SChemEs activity onto the skills categories defined by Hanson and Overton (Tables 1 and 2) assisted in the development of the exercise.4 Hence, a SChemEs station assessing basic laboratory techniques will generally require students to demonstrate competencies in the generic skills of time management and organization and information retrieval. The subject-specific skills of using chemical instrumentation, manipulative practical skills, and safe handling of chemical materials are also examined. An outline of a Basic Techniques station, and the relevant marking criteria, centered on assessing whether students are able to accurately weigh a chemical compound is given in the Supporting Information (Station 1A).

Station 1B: Numeracy

We defined competency in numeracy at Year 1 as the ability to carry out simple multistage chemical calculations quickly and accurately, the ability to lay out an answer so that the reasoning at each stage of the calculation is clear and easy to follow, and the ability to employ an appropriate use of units. A numeracy station will therefore test the generic skills of time management and organization, information retrieval, problem solving, and numeracy and computational skills. The chemical nature of the numeracy tasks set will also require the candidate to demonstrate proficiency with subject-specific skills, namely, an understanding of chemical terminology and the application of fundamental chemical principles. An example of a numeracy station and associated assessment criteria is given in the Supporting Information (Station 1B). This station deals with the ability of a student to calculate the amount of a compound needed in order to make a solution of 1 M concentration.

Station 1C: Apparatus Assembly and Handling

Competency in apparatus assembly and handling at Year 1 comprises the ability to select equipment appropriate to the task at hand from a selection of standard laboratory glassware and to manipulate the glassware to assemble a safe and secure apparatus conducive to solving the problem. An apparatus assembly and handling station will test candidates’ generic skills, including the ability to manage their time effectively, their ability to solve problems, information retrieval skills, and their ability to plan and design experiments. A successful candidate will also need to use subject-specific skills to complete the assessment, specifically an understanding of chemical terminology and demonstrable manipulative practical skills. The example presented for the apparatus assembly and handling station in the Supporting Information (Station 1C) requires the student to assemble the equipment required to carry out a vacuum filtration and outlines the grading criteria associated with this exercise.

Station 1D: Interpretative Exercise

A SChemEs station associated with interpretative exercises requires candidates to examine a typical result from an experiment and perform a series of tasks to explain and contextualize that result. Competency in this area at Year 1 is represented by the ability of the candidate to apply simple chemical concepts to solve a multistep chemistry problem. The assessment may be segmented at this level to help guide the candidate in a stepwise manner toward the overall solution. As for all SChemEs stations, the generic skills of time management, problem solving, and information retrieval are implicitly examined. In addition, a successful candidate will also need to demonstrate the ability to interpret experimental data accurately. Subject-specific skills examined by this station include a grasp of chemical terminology, understanding of fundamental chemical principles, and in some cases an understanding of analytical techniques and organic compounds and their reactions. The SChemEs station presented as an example of an interpretative exercise in the Supporting Information (Station 1D) outlines the procedure and grading criteria for students who are asked to analyze and interpret the results they observe from a TLC analysis.

Station 1E: Information Management

A SChemEs station focused on information management requires candidates to demonstrate their ability to use literature and computational resources to accurately retrieve and record simple chemical and physical data and reference material. Competency at Year 1 is defined by the ability to locate and accurately report data from a primary literature source (electronic or paper-based). The information management stations are primarily concerned with assessing generic skills, once again requiring successful candidates to demonstrate competencies with respect to time management, information retrieval, and problem-solving skills. In some cases, computational skills are also tested and developed. With respect to subject-specific skills, a grasp of chemical terminology is required. The example given in the Supporting Information (Station 1E) is concerned with candidates using the Internet to retrieve and accurately record information on a journal article. The information provided outlines the process and the grading criteria associated with this exercise.
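The Station 1B numeracy calculation described earlier, finding the amount of compound needed for a solution of 1 M concentration, reduces to mass = concentration × volume × molar mass. A minimal sketch, using aspirin (the compound around which the practical sessions are centered) purely as an illustrative example, since the article does not specify the station compound:

```python
# Mass of solute required for a solution of a given molar concentration:
#   mass (g) = concentration (mol/L) x volume (L) x molar mass (g/mol)

def mass_required(concentration_mol_per_l: float, volume_l: float,
                  molar_mass_g_per_mol: float) -> float:
    """Return the mass in grams needed to prepare the solution."""
    return concentration_mol_per_l * volume_l * molar_mass_g_per_mol

# 250 mL of a 1 M aspirin solution (molar mass of aspirin ~180.16 g/mol):
print(round(mass_required(1.0, 0.250, 180.16), 2))  # 45.04
```

Laying out the units at each step, as the Station 1B marking criteria reward, is exactly what the formula comment above makes explicit: mol/L × L cancels to mol, and mol × g/mol cancels to g.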
■
PILOT STUDY
In May 2013 a pilot study of the SChemEs assessment was carried out at the University of Hertfordshire involving 217 students (130 Year 1 pharmacy students and 87 Year 1 students in the life sciences, i.e., biochemistry, pharmacology, and pharmaceutical sciences). Immediately after finishing the assessment the students were asked to complete a questionnaire (see the Supporting Information, End of Assessment Questionnaire) in an effort to gauge their reactions and opinions. The questionnaire has 12 questions and uses a five-point Likert scale to categorize responses. Participants were also invited to provide free-response comments on any aspect of the assessment as part of their feedback. All questions used were piloted and validated by a group of academics and postgraduate students prior to being presented to the undergraduates; subsequently no significant changes were deemed necessary to the pilot questionnaire.

■
RESULTS
Student Performance in SChemEs

The mean mark for the SChemEs assessment in this pilot was 56.40% ± 12.6, which was substantially lower than the average overall coursework mark for the cohort (66.68% ± 6.4) but comparable to the average mark obtained for the midterm multiple-choice examination (57.01% ± 12.7). This is unsurprising given that the SChemEs assessment and midterm examination both involve students being assessed under time-constrained conditions, and it is often observed that students score more highly in coursework than in time-controlled tests.19 When comparing the students ranked in the top 10% of the SChemEs assessment to those ranked in the bottom 10% (Figure 2), we see that, with the exception of the mark awarded for preparedness, the stronger students outperform the weaker students on every station, which lends support to the assessment being discriminative. The range of marks for each category of station is 5 (i.e., at least one student scored 0/5, and at least one student scored 5/5 for each task) with the exception of preparedness, where the range is 2 (all students scored between 3 and 5). The average marks for each station, also shown in Figure 2, highlight that this cohort of students found the interpretative exercises and the information management stations most challenging.

Figure 2. Chart showing the relative performance of the top 10% (blue) and bottom 10% (red) of students for each of the SChemEs stations in comparison to the average marks (green) for the cohort.

When examining the cohort as a whole, there is no meaningful correlation between the rank of a student in the SChemEs assessment and that student’s rank in overall coursework performance (Spearman rank correlation coefficient = 0.29), which indicates that students are being tested on skills in the SChemEs assessment that are different from the skills being assessed in other coursework elements. There is a modest correlation (Spearman coefficient = 0.61) between performance in SChemEs and performance in the midterm examination, but overall the statistics suggest that SChemEs are examining skills that are not being captured in other assessments on the module and, hence, are a valid and valuable addition as an assessment tool.

Student Attitudes and Opinions on the SChemEs Assessment

A preliminary analysis of the student questionnaire shows that the majority (58%) of students agreed that the SChemEs assessment was fair, and the vast majority (78%) believed it assessed their competence in the chemistry laboratory. As such, the assessment appears to be fit for purpose. Students were generally aware that the SChemEs assessment would test skills developed in the preceding chemistry laboratories (67%), and participants also reported that they thought it was important that they were able to complete routine tasks in the laboratory within short periods of time (87%). Students were satisfied that the tasks they were given to complete at their SChemEs stations were similar to tasks they had seen previously as part of the laboratory classes (93%), but did not appear to view these laboratories as adequate preparation, given that 88% “strongly agreed” or “agreed” that there should be a practice SChemEs session prior to the summative assessment. Resource implications mean that running a formative SChemEs session in addition to all other laboratories is not feasible, but as a consequence of this feedback we have taken steps to further signpost the important skills tested by SChemEs in the student practical schedule, and have directed students to short bespoke and public-domain videos that demonstrate examinable skills and techniques. It was heartening to see that only 14% of students did not believe that the skills being tested in the lab were important to their future careers, as this is an indicator that students believe the assessment as a whole is valuable to their development. It was also apparent from preliminary studies that the workload during laboratory sessions had generally been shared equally between lab partners, although a significant number of students (31%) admitted shirking their responsibilities and allowing their partners to do significantly more work in the lab than they did. Unfortunately, because the questionnaires were completed anonymously, any potential correlation between students who had taken a backseat in laboratories and poor performance in SChemEs could not be established. Relatively few students (

student education, and therefore help us move toward the initial goal of embedding the general and subject-specific skills desired by employers.
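The Spearman rank correlations reported in the Results can be reproduced with a short calculation. A minimal sketch follows, using invented scores for illustration; it omits tie handling, which real score data would require (e.g., via average ranks):

```python
# Spearman rank correlation via the classic formula
#   rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
# where d_i is the difference between a student's two ranks.
# No tie handling: tied scores would need average ranks instead.

def spearman(x: list[float], y: list[float]) -> float:
    n = len(x)

    def ranks(values: list[float]) -> list[int]:
        # Rank 1 = lowest score; rank n = highest score.
        order = sorted(range(n), key=lambda i: values[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Invented SChemEs vs midterm scores for five students:
print(round(spearman([55, 70, 62, 48, 80], [50, 72, 60, 55, 78]), 2))  # 0.9
```

Because the coefficient compares ranks rather than raw marks, it is exactly the statistic needed here: a low value (0.29 against coursework) means the orderings of students disagree, i.e., the assessments reward different skills.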