
Bridges to the Future: Toward Future Ready Graduates in Chemistry Laboratories

Fun Man Fung*,†,‡ and Simon Francis Watts*,†,‡,§

† Department of Chemistry, National University of Singapore, 3 Science Drive 3, Singapore 117543, Singapore
‡ Institute for Application of Learning Sciences and Educational Technology (ALSET), University Hall, Lee Kong Chian Wing UHL #05-01D, 21 Lower Kent Ridge Road, Singapore 119077, Singapore
§ Brighton Observatory of Environment and Economics, New Brighton 8061, New Zealand



Supporting Information

ABSTRACT: The redesign of an advanced analytical and physical chemistry practical module for senior undergraduates is described in this report. The underpinning rationale for the changes is to use cooperative learning groups, placed in situations beyond their previous experience, to increase the student experience of Kolb's Learning Cycle to a level greater than is usual for undergraduates. The students are required to make decisions in the context of ambiguity and unknown unknowns and hence gain experience that prepares them for research or post-university life. The assessment has been aligned with the Intended Learning Outcomes; authentic group-based laboratories with Socratic questioning have been implemented for four years. Both direct (quantitative) and indirect (qualitative) measures of the effect of the changes are reported. The effectiveness of the scaffolding of the new exercises has been measured using mark outcomes, staff observations, and student thoughts about what they have learned. Evidence indicates that although marks have only increased marginally (by about 1.5 marks), the differences between the old and redesigned modules are statistically significant. Similarly, a Dixon's Q analysis revealed no outliers at p = 0.05. Deep learning by students is occurring, and there seems to be a much higher level of engagement than previously. There are also statistically significant differences across the two sides of the module, with perceived deeper learning on the more "strictly" Socratic (analytical) side of the module.

KEYWORDS: Upper-Division Undergraduate, Analytical Chemistry, Curriculum, Inquiry-Based, Socratic Questioning, Discovery Learning, Communication, Writing, Laboratory Management, Laboratory Instruction



INTRODUCTION


Since the late 19th century, laboratory work has been a central component of all serious undergraduate chemistry degree courses; for example, the Chemistry Department at Harvard opened its first undergraduate chemistry laboratory in 1850,1 and Oxford did similarly in 1860,2 with its laboratory uniquely called "The Abbot's Kitchen". Laboratory work provided not only practitioner/researcher training; at that time, fairly basic laboratories were also the engine growing our small, fledgling subject.3 One of the reasons for the inclusion of laboratory work in these courses was the exposure of students to (what we now describe as) Kolb's Learning Cycle, which underlies much chemical thought. Two centuries later, chemistry has grown, and the chemistry education landscape has changed with it. Now, all too often the focus is on teaching rather than learning,4 and undergraduate chemistry students mistake attending classes and memorization for understanding.5 Typically, students have very limited exposure to Kolb's Learning Cycle. The laboratories can have more in common with gym workouts than running an Olympian race of learning and discovery, and it is difficult to see the relationship of this current activity to the original inspiration for including laboratory work in undergraduate chemistry degrees.

The Singapore Context

In Singapore, student competition is often intense, the work ethic strong, and our students seem to have heavy workloads. In the past, the course required students to write up large numbers of familiar experiments, each carrying little assessment weight and all too often unchanged from year to year. It could be argued that by allowing this situation,6 along with the adoption of "overly pedantic" assessment regimes, faculty themselves incentivise plagiarism.7 In a student system where coursework was readily available for sale from Web sites and an active senior system passed good experimental write-ups down the years to juniors, it is perhaps unsurprising that some students less focused on their own learning were tempted into plagiarism. Although not the major motivation for the work reported below, it was a factor in the redesign of this and other courses.

Rationale for Modern Undergraduate Laboratory Work

Preparation of students for the wider world after university, including practitioner training, is key to good higher education. Although some chemistry graduates are employed for their laboratory skills and others engage in research, only about half are employed within the subject at all.8 Instead, they are in demand in areas like finance and accountancy.9 These areas require the many transferable, analytical, and abstract manipulative skills that are learned in the study of chemistry, but not the chemical context in which they were learned. While it is beyond doubt that modern undergraduate chemistry laboratories still provide laboratory practitioner training, in particular psychomotor skills,10 the evidence for them either enlarging/growing chemistry or even supporting the acquisition of other chemical knowledge is rather sparse.11,12 When this is coupled with the fact that laboratory work is extremely expensive to provide and demanding of student and staff time, it is no surprise that there is an emerging debate about the opportunity costs of including laboratory work in undergraduate chemistry degrees.13,14 Logically, it could be argued that a general course in the first year would be required for all, but beyond that, the large proportion of time spent in laboratories in subsequent years (up to 20% in some courses) only makes sense for those wanting to do honors research projects or seeing a lab-based/research career destination. Notwithstanding, most chemistry professional and validating bodies require large laboratory components, and it is our responsibility to get the most out of these activities. Theoretically, for meaningful learning in undergraduate chemistry laboratories to occur, the cognitive (thinking), affective (feeling), and psychomotor (doing) domains need to be integrated.10,15 In this situation, laboratories begin not only to provide improved practitioner training for future ready graduates but also to support the learning of chemistry. A recent increasing focus on what happens after graduation, and its implications for the structure and delivery of teaching, has led to greater emphasis on authentic learning that prepares graduates for their professional futures.16,17

Figure 1. Kolb’s learning cycle as experienced in the undergraduate chemistry teaching laboratories.

Figure 1 puts undergraduate laboratory learning in the context of Kolb's experiential learning cycle, where the focus is on the learning process, not simply the outcomes.18 More recent work has considered the application of this model to laboratory learning19 and also to how learning from experience is a driver of personality development.20 Kolb's Learning Cycle is experienced progressively and additively throughout a person's life. Optimistically, if laboratory studies are part of that life, then apart from the growth of a science mind with science-useful traits, if the lab situation mirrors real world problems and processes, we are also forming adaptable, flexible, and future ready graduates. However, although optimism is desirable, realistically, the laboratory experience "input" into the learning cycle is likely to need enhancing to have the desired impact. In this work, that enhancement is delivered by using cooperative learning groups21 as part of the strategy. In a student environment, interactions among students in formal teaching situations are usually competitive or minimal.21 The third (unusual) choice is cooperation, working together to realize shared objectives. Cooperative learning, where group members work together to maximize the learning of all, sharing a belief that better individual performance produces better group performance, is probably the most effective way of promoting learning22 and critical thinking.23 The theoretical framework for this is based in constructivism, where individuals connect new experiences with their existing knowledge to increase and grow their understanding.24 By making groups (which are of themselves dynamic entities) part of the learning process, the social interaction increases motivation and gives the group members mutual support.22 Sociocultural perspectives on this situation identify interdependence of social and cognitive processes within the groups, i.e. a relationship between cognitive development and social interaction.25 Learning occurs in this environment when those groups are given problems beyond their current developmental knowledge.26 So, cooperative learning is supported by positive group interdependence in a zone of proximal development.



BACKGROUND

Intended Learning Outcomes

At the National University of Singapore (NUS), students who enroll in the chemistry degree program must read the compulsory Year III analytical/physical chemistry practical module (CM3292) to fulfill the graduation requirement (Figure 2). The module CM3292, particularly the analytical half, has been designed in terms of mode, activities, and assessment to be effective (i.e., to implement Novak's ideas).27 Intended Learning Outcomes (ILOs) have been aligned with assessment,27,28 and the whole gives progression in the degree (the formal module descriptions of this module, syllabuses of the prerequisites, and a recent copy of the module booklet are included in the Supporting Information).

Strategy and Structure

It is a single-semester, 13-session course that runs over 13 weeks, 6 h per session. Students work in groups, and before each session students need to have met and planned how they will proceed, completed a risk assessment, etc.



Figure 2. Degree compulsory laboratory module structure (green) with structural and pedagogic relationships between CM3292 and its prerequisites. Nonlaboratory CM3292 prerequisites are in straw color; blue comprises expectations before and after CM3292. Notes: CM2142 no longer runs. C is compulsory module; O is optional module; and A&P is analytical and physical.

For one-half of the semester, students do physical chemistry; for the other half, they do analytical chemistry (see the Supporting Information). Although most are probably unaware of their internal learning processes, the module extends the student experience of Kolb's Learning Cycle by requiring students to identify the aims and research questions for their exercises. In other words, this module focuses on all four parts of the cycle, introducing environmental sampling (a new lab experience for them) as well as using Socratic questioning as the main interface throughout. Whereas Years I and II are characterized by "known knowns" and "known unknowns", this module adds "unknown unknowns", i.e. the students do not know what they do not know.29 Many of these unknown unknowns are nested in the environmental sampling of one of either soils, waters, or atmosphere. These (new) exercises extend over some weeks and include the chemical and intellectual analysis of samples and obtained results, respectively. The full experimental script for one of these "project practical" experiments is included in the Supporting Information. The students work in pairs and groups and are subjected to time pressure to identify safe and productive potential sites for their sampling.30 Of course, all work they do subsequent to the sampling is anchored to the assumptions made in the planning and execution of that sampling (whether the students initially realize it or not), and this is one of the lessons that comes out of these exercises. The course is also supported by three streams of eLectures, covering:
1. the chemical bases of the techniques the students will be using, i.e. how it works;
2. how to operate the equipment they will be using, including the environmental sampling they will be doing; and
3. transferable skills like advanced presentation skills, how to ask good questions, how to work in a group, how to perform a risk assessment, etc.

The experiments themselves are not covered in the eLecture material. Given that students in a laboratory setting will be in the psychomotor domain, to engage and integrate the other two domains (affective and cognitive), the approach taken here is to:
• employ group and collaborative working with time pressure (cements the groups, allows wider subject engagement, engages the affective domain);31
• create ownership of the experiment, allowing students to form experiments (generates "buy-in");32,33
• allow students to form their own research questions and increase their view of Kolb's Learning Cycle (engages the cognitive domain);
• use a cognate but not expert subject domain to create enthusiasm for learning and applying what they already know;34 and
• improve the working skill-base (future readiness) by including the real world environment.35
Consistent with the ILOs, the main assessment is not based on repetitive experimental write-ups but rather on the keeping of a functional laboratory notebook. This task includes keeping the laboratory record as well as freeform work-up, assessment of results, and reliability of experimental data. There is a clear understanding that although "good" experimental results are expected, it is what students do with the results they have (i.e., introspective work-up and analysis of results and implications for the "next" experiment) that is more important. Therefore, the number of practical write-ups required has been reduced, and appropriate weight for the notebook has been trialed. Other assessment includes a single full write-up of one experiment to a required format, a presentation, a viva vocé examination, and an unseen examination. Over the three years, the pedagogic research questions addressed have included the effects of:
• lowering the number of write-ups required;
• changing the laboratory notebook assessment weight; and
• changing the test assessment weight.


Table 1. Descriptive Data for the Main Semester Offerings of CM3292 since Academic Year 2013/14 S2. The Write-Ups, Notebook, Presentation, Test, Viva Vocé, and Practical Test columns give the weight of each course component in %.

Year and Semester  N(b)  Average Mark  SD   No. of Write-Ups  Write-Ups  Notebook  Presentation  Test  Viva Vocé  Practical Test  Weight per Practical
2013/14 S2(a)       95      70.0       8.7        11             40         0           0         20      10            30                3.6
2014/15 S1          90      71.2       6.3         8             40        20          10         10      20             0                5.0
2014/15 S2          63      75.2       5.3         8             40        20          10         10      20             0                5.0
2015/16 S1          84      69.3       6.1         4             25        35          10         20      10             0                6.3
2015/16 S2          89      72.3       5.7         4             25        35          10         20      10             0                6.3
2016/17 S1          86      71.0       4.3         1             25        30          10         25      10             0               25.0
2016/17 S2          68      74.1       6.0         1             25        30          10         25      10             0               25.0
2017/18 S1          77      71.6       6.3         1             15        17.5        10         30      10             0               30.0

(a) The previous form of the module was last offered in 2013/14 S2. (b) The students are split between equal classes on 2 days, so actual "in the room" class size ranged between 15 (2013/14) and 27 (2014/15).
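To make the weighting scheme concrete, the sketch below (Python) computes a final module mark from the Table 1 component weights for 2016/17; the individual component scores used here are hypothetical and for illustration only.

# A minimal sketch of how a final module mark follows from the Table 1
# component weights (2016/17 offering); the component scores are hypothetical.
WEIGHTS_2016_17 = {"write_ups": 25, "notebook": 30, "presentation": 10,
                   "test": 25, "viva_voce": 10}          # percent, sums to 100

scores = {"write_ups": 72, "notebook": 78, "presentation": 80,
          "test": 65, "viva_voce": 70}                   # hypothetical marks /100

module_mark = sum(WEIGHTS_2016_17[c] * scores[c] for c in scores) / 100
print(module_mark)  # 72.65, i.e. a B+ on the Table 4 bands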

Figure 3. Annual average marks with number of write-ups required and weight per write-up. Note the transition year data, when the average annual mark was 72.9 and the unseen test weight dropped by 50% (not plotted). The exponential fit is chosen only to guide the reader's eye; see Table 1 for numbers of students.

A final point concerns how grades are assigned to the assessments in this module. Historically, much of the assessment (including laboratory work) used in Singapore owed a great deal to the Chinese tradition.36 It included what might be perceived as overassessment, assessment regimes requiring incredible (nonvital) detail,7 and over-reliance on certain assessment types (e.g., unseen tests). Moving away from this approach, and in common with other related initiatives,17,37 we assessed skills and content together to measure the achievement of ILOs. Therefore, the criteria for the determination of a grade are clearly different for each ILO being assessed (see the module booklet in the Supporting Information).




METHODS OF ASSESSING OUTCOMES

Outcomes of the new structure were assessed using both direct (quantitative) and indirect (qualitative) methods, understanding that these two methods have very different constructs and measure different things.38 Although it is relatively straightforward to use qualitative (subjective) data to identify what people think they have learned, it is often more difficult to get quantitative (objective) data to show what skills and abilities have actually been learned.39 In this module, we attempted this by modifying component weights to test the mark response alongside the other qualitative and quantitative measures. This module has now been conducted in this format for more than three years (Table 1). On average, there are usually about 100 students per semester, with a staff team of 4 academics and 12 demonstrators spread over 4 laboratories on 4 separate laboratory days, each with about 15−27 students. Demonstrators and staff underwent training in Socratic questioning techniques. The component weights have been changed systematically over the three years of this module. 2013/14 was the last year of the previous "more traditional" module form (Table 1), with relatively unchanged experiments and widespread use of scripts from seniors. The experiment started in S1 of 2014/15, when new experiments were introduced and laboratory notebooks and presentations were implemented. Although the practical test was dropped, the viva vocé exam continued but was enhanced to include the use of a laboratory instrument.



FINDINGS AND DISCUSSIONS

Class Size

Class sizes ranged from about 15 to 27 across the time scale of the study (Table 1). It is generally accepted that students do less well in larger classes.40 Pilot work included a check on whether the changing student numbers were correlated with the overall average mark. This revealed a significant (p = 0.01) linear relationship:


N = −4.6525M + 415.72, r² = 68%

where N is the number of students and M is the average mark. Typically, in a (United Kingdom) university context, a student might on average display a mark variability of about 7% around their average across their modules/courses.41 However, increasing class size (an affective parameter) has been shown to reduce student marks by about 12.5% of their average variability across the class size differences involved in this study.41 At grade B+ (70−75%), this translates to an average mark loss of about 0.6%, which is smaller than the other changes we are studying and hence will be ignored.
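As a check, the fitted line can be reproduced directly from the Table 1 semester averages and class sizes; the short Python/SciPy sketch below does this (small differences from the quoted coefficients reflect rounding of the published averages).

# Reproducing the class-size regression from the Table 1 data.
from scipy.stats import linregress

marks = [70.0, 71.2, 75.2, 69.3, 72.3, 71.0, 74.1, 71.6]  # average mark, M
sizes = [95, 90, 63, 84, 89, 86, 68, 77]                  # students, N

fit = linregress(marks, sizes)
print(f"N = {fit.slope:.4f}M + {fit.intercept:.2f}, r^2 = {fit.rvalue**2:.0%}")
# -> N = -4.6545M + 415.87, r^2 = 68% (cf. the fit quoted above)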

Component Weights

As discussed earlier, the aim has been to align the assessment with the ILOs, so the number of practical write-ups required has been reduced, and appropriate weight for the notebook has been trialed. In any one academic year, the assessment weights are the same in both terms. Unfortunately, changing assessment weights means that more than one thing changes at the same time. The initial change in marks between 2013/14 and 2014/15 seems drastic. The increased marks for both of the runs following 2013/14 are statistically significantly different (p = 0.05) by t-test from those of 2013/14. However, taken over the 7 means spanning the 3 years, despite the changes, the average mark has remained unchanged (ANOVA, p = 0.05). Similarly, a Dixon's Q analysis revealed no outliers at p = 0.05. This looks to be due to the differences in average marks between the pairs of semesters in any year: the intra-annual variation is similar to the interannual variation. This is not helped by the fact that different staff led the module each semester. Over the last three years, the main change is the reduction of write-up load. Over that time, the annual average mark has risen by about 1.5 marks, but this is statistically not significant due to the size of the intra-annual variability referred to above. Generally, there are 2 factors in play when changing the number of pieces of assessed work for a given weight, in this case from 11 write-ups down to 1 (Figure 3). Giving students more time to complete work would be expected to increase the average mark, and generally, this is shown in Figure 3: as the number of write-ups required reduces, the marks increase. However, we also increased the weight on each write-up, and the higher relative weight is likely to incentivise effort.42 Evidently, as the weight per practical goes up, so too does the average mark (Figure 3). However, reducing the number of submitted experiments reduces the safety net for weaker students28 and potentially drives marks down by simple economics: not all marks are equal; it is easier to achieve the 15 marks between 45 and 60 (here, a D+ and B− grade) than those between 70 and 85 (here, a B+ and A+ grade). Stronger students have more time and are able to access those marks, but this is more difficult for weaker students. Given the slight increase in marks over the four years, the implication is that students are gaining marks from the Notebook component, and this is obvious in the raw module marks when adjusted for the different weights. Increasing the test ("exam") weight might be expected to reduce the marks, as continuous assessment is known to yield higher marks than examination.43 This is likely to be one of the factors in the increase in the average annual mark on halving the weight of this component (after 2013/14 S2), and again when it was doubled (after 2014/15 S2) and the marks decreased. However, recently, as the test weight has twice increased to even higher levels, any trend is less obvious. This may be due to staffing changes. Although not statistically significant, for the more aligned module, the marks (the quantitative measures of learning) have risen slightly.
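The two significance checks above can be sketched from the published summary statistics alone, assuming a standard two-sample t-test on marks and the usual Dixon's Q statistic on the semester means; the critical value used, 0.468 for n = 8 at 95% confidence, is the commonly tabulated one.

from scipy.stats import ttest_ind_from_stats

# t-test reconstructed from Table 1 summary statistics:
# 2013/14 S2 (old form) vs 2014/15 S2 (redesigned)
t, p = ttest_ind_from_stats(mean1=70.0, std1=8.7, nobs1=95,
                            mean2=75.2, std2=5.3, nobs2=63)
print(f"t = {t:.2f}, p = {p:.4f}")  # |t| ~ 4.2, p well below 0.05: a significant rise

# Dixon's Q on the eight semester means from Table 1
means = sorted([70.0, 71.2, 75.2, 69.3, 72.3, 71.0, 74.1, 71.6])
spread = means[-1] - means[0]
q_low = (means[1] - means[0]) / spread     # ~0.12
q_high = (means[-1] - means[-2]) / spread  # ~0.19
print(q_low < 0.468 and q_high < 0.468)    # True: no outlying semester mean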

Staff View of Learning

The results of the anonymous staff observation questionnaire, based on what staff observed of students in the laboratory in 2016/17 S2, appear in Table 2 and Figure 4. Note that IRB codes B15-299 and S17-027E cover this data collection.

Table 2. Questionnaire Statements and Results of the Anonymous Survey of Lab Tutor Observations


Q3. Initially, students seem to hesitate at the unfamiliar less structured approach to laboratory work. (mean 3.0, SD 0.0)
Q4. Most students engage productively with their groups. (mean 4.0, SD 0.0)
Q5. There seemed to be real curiosity and enquiry in the project practicals. (mean 4.0, SD 0.0)
Q6. The student groups develop robust research questions within their groups. (mean 3.7, SD 0.6)
Q7. Student analysis of their results from the project practicals can be extremely rigorous. (mean 3.7, SD 0.6)
Q8. The use of Lab Notebooks seems to support the continuation of the group discussions and analysis of their results. (mean 3.0, SD 1.0)
Q9. When they have to design it themselves, students encounter real issues in bedrock material like calibration. (mean 3.3, SD 0.6)
Q10. Students seem more aware of issues like units and precision/accuracy. (mean 3.0, SD 0.0)
Q11. The group working seems to encourage students to push their results and their interpretation harder. (mean 3.7, SD 0.6)
Q12. The groups seem to build confidence of the students in their own work. (mean 3.0, SD 0.0)
Q13. Students display ownership of their results and ideas. (mean 4.0, SD 0.0)
Q14. Students keep their notebooks both in and outside the laboratories. (mean 3.7, SD 0.6)
Q15. Students are curious about the pedagogy being employed. (mean 2.7, SD 0.6)
Q16. Students are less perturbed when they do not get the results they expected. (mean 1.7, SD 0.6)
Q17. Students seem more 'self-propelled' and organize themselves and their lab slots accordingly. (mean 3.7, SD 0.6)
Q18. The majority of students seem not to have understood the underlying aims of what we are doing. (mean 2.0, SD 1.0)
Q19. Many students are seen in the lab discussing their results with others (i.e., using time well). (mean 3.3, SD 0.6)
Q20. I have had a more than usual number of very interesting 'science' conversations with students about their results and issues that flow from them. (mean 3.7, SD 0.6)

Notes: Q1 and Q2 identify that staff consent to their anonymous data being used; half of the module staff were involved (see Figure 4). The Likert-type scale for statement responses has a range of 1−4, with 4 being strongly agree and 1 being strongly disagree; N = 5 respondents.

The staff team is relatively small. Of the ten (non-GTA) laboratory tutors who taught the module, five responded; hence, statistical analysis is limited. Table 2 contains the questions, and Figure 4 contains a bar chart of the staff responses to those questions. A four-option Likert scale was used for staff to increase the contrast of the data by removing the midpoint of a five-point scale. The first observation is that the results from each side of the module (analytical and physical) are very different. Whereas the analytical side observed very engaged student behavior, it seems that the physical staff observed less. t-tests indicate a statistically significant difference (p = 0.05) between the two halves and also confirm that the analytical scores are statistically significantly higher than the physical scores (p = 0.05). The questionnaire is mostly phrased positively; responses to the two less positive questions indicate that staff felt the (physical) students did not seem to have understood the underlying pedagogy (Q18) but also that all recognized the new mode of operating (Q3). On the positively phrased questions, overall averages are at "agree" or "strongly agree" but strongly skewed toward analytical. Certainly on that side of the module, this evidence seems to indicate that the strategy of the module is yielding the hoped-for increase in engagement and motivation, which would indicate deeper or effective learning.44,45 These data are direct (quantitative); they are separated from the learners, and display of the behaviors associated with deeper learning is good evidence that the learning has actually occurred.46 Because all students do both halves of the module, it is difficult to test the staff observations with marks.
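The per-respondent scores split by module half are not published, so the following sketch is purely illustrative of the comparison described above; the example arrays are hypothetical, chosen only because responses such as (4, 4, 3) reproduce a reported item mean of 3.7 with SD 0.6 on the four-point scale.

from scipy.stats import ttest_ind

# Hypothetical tutor responses (1-4 Likert) to one item, split by module half
analytical = [4, 4, 3]  # mean 3.7, SD 0.6, matching a typical Table 2 row
physical   = [3, 2, 2]

t, p = ttest_ind(analytical, physical)
print(f"t = {t:.2f}, p = {p:.3f}")  # here p ~ 0.047; p < 0.05 mirrors the reported split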


Figure 4. Summary of CM3292 staff observations of student activities. Questions (see Table 2) used a four-option Likert scale with 4 being strongly agree and 1 being strongly disagree. Q1 and Q2 identify that staff consent to their anonymous data being used; half of the module staff were involved.

Student View of Learning

An anonymous questionnaire was administered to students weekly during the first run of the new module (2014/15 S1) and at the end of 2016/17 S2. The collection of data was approved by the Institutional Review Board (IRB code no. B15-299). The questions and statistics of the latter survey (concurrent with the staff survey) appear in Table 3. In total, 29/68 students responded. The point at which this survey was done means that all students had done both sides of the module:
• Q3 and 4 indicate that students agree that their lab tutor is helping them to think and improving their ability to learn (∼87% of respondents agreed or strongly agreed) but are neutral on whether the Graduate Teaching Assistants (GTAs) are doing the same (p = 0.05, t-test);
• Q6 and 7 indicate that students only slightly agreed that the lab teaching was growing their science (∼73% agreed or strongly agreed) but disagreed that it was valuable (p = 0.05, t-test);
• Q8 and 9 indicate that students agreed that the two halves of the module were different (∼83% agreed or strongly agreed) but were neutral on whether the physical side was harder (p = 0.05, t-test).

Table 3. Questionnaire Statements and Results of the Anonymous Student Survey
Q3. The GTAs(d) are helping me to think and are improving my ability to learn. (mean 2.3, SD 0.8)
Q4. The Prof is helping me to think and is improving my ability to learn. (mean 1.9, SD 0.8)
Q5. I feel it is just the same as the weeks progress. (mean 3.1, SD 0.8)
Q6. The module is of high quality and adding to the value of my degree course. (mean 2.4, SD 0.9)
Q7. The lab teaching is of high quality and is helping me grow as a scientist. (mean 2.2, SD 0.8)
Q8. The two halves of the module are very different from each other. (mean 1.8, SD 0.8)
Q9. I found the Physical side of the module the more intellectually challenging. (mean 2.5, SD 0.9)

Notes: (a) Q1 and Q2 identify which day and which half of the module (analytical or physical) the respondent is in. (b) The Likert-type scale for statement responses has a range of 1−5, with 1 being strongly agree and 5 being strongly disagree. (c) N = 30; about 50% of students responded to the survey. (d) GTA is a Graduate Teaching Assistant.



Overall, although students feel that their lab tutor is helping them to learn, they are less certain that the GTAs are also doing this. This might be around the issue that although staff are secure enough (in the staff−student relationship) to use Socratic questioning with students, the GTAs are often less secure or experienced in this. There is also separate evidence that GTAs may be concerned that students might give them bad feedback for not answering student questions directly. Although this is a terminal survey, the evolution of student opinion from the larger weekly survey on this question is also interesting; see Figure 5 for data from the 2014/15 S1 weekly survey results for Q4, "The Prof is helping me to think and is improving my ability to learn". There are four teaching staff involved here, so the same lab tutor covers the same day and same half (analytical or physical) over all weeks. The difference between the analytical and physical sides, although consistent with the terminal survey above, is not statistically significant (p = 0.05): the student view of the effectiveness of the lab staff seems generally higher for the analytical than the physical side of the module. There does not seem to be a statistically significant (p = 0.05) staff (personal teaching style) signal47 within the data. This may in part be due to the influence of the Socratic technique, which may help to "even out" personal teaching differences; however, it also affects student perceptions of staff and pedagogic effectiveness.48,49 In summary, there is consistency between the staff and student views on the two halves of the module. It seems that the analytical and physical sides are seen as different from each other, and it also seems that the roles of the GTAs need more tuning. These results are also not inconsistent with the larger similar surveys taken in 2014/15. It is worth noting that with only a 50% response, it is likely that selectivity may be compromising the random nature of the sample,50 and of course, student perceptions of their learning are not the same as evidence of their learning.39 With any questionnaire like this, the response could also be evidence of personality (student satisfaction) as well as of effective learning.51−53



Figure 5. Weekly student response to Q4, "The Prof is helping me to think and is improving my ability to learn". One hundred and thirty-three responses were received over the semester (analytical and physical, Monday and Thursday combined). Questions used a five-option Likert scale with 5 being strongly agree. (a) All data plotted; error bars are standard error of the mean. (b) All data with summary statistics. In total, about 150 students participated, with a mean weekly number of about 30 ± 17. Note: an IT error resulted in data loss for the MAn weeks.

Student Attitudes

We were also interested to understand something of the general attitudes of our students. Figures 6 and 7 show some of the results from the cover sheets that students submit with their experiments. This sheet asks for specific information about the time spent on writing up a practical and the grade they perceive they will attain (see the Supporting Information). These data relate to 2014/15 (the first year of the "reformed" module) and comprise about 1000 cover sheets. Students were still submitting eight write-ups over the course of the semester while also keeping laboratory notebooks, so at this stage, the week-to-week workload was high. Figure 8 contains a filled cover sheet that has been made anonymous.
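A hedged sketch of how the cover-sheet data could be summarized (Python/pandas): the file name and column names are hypothetical stand-ins, while the capping rule (the seven declared values above 80 h reset to 37 h) and the grouping by awarded grade follow the analysis described below.

import pandas as pd

# Hypothetical export of the ~1000 cover sheets: one row per submission
sheets = pd.read_csv("cover_sheets_2014_15.csv")  # columns assumed below

# Cap implausible self-reports: declared hours above 80 h are reset to 37 h
sheets.loc[sheets["declared_hours"] > 80, "declared_hours"] = 37

# Mean and spread of declared hours per awarded grade (cf. Figure 6)
summary = sheets.groupby("awarded_grade")["declared_hours"].agg(["mean", "std", "count"])
print(summary)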




Figure 6. Student declared hours for CM3292 practical write-ups as a function of the grade the write-up was awarded. The highest grade, A+ (85%+), is 10, whereas the lowest pass grade, D (40−44.9%), is 1. The total number of data points for combined A grades is 78, for combined B grades is 125, and for combined C grades is 6. The error bars represent ±1 standard deviation.

Figure 7. Grade awarded for CM3292 practical write-ups as a function of the student declared index (expected grade minus actual grade awarded). See Table 4 for the grade point system.

Figure 8. A sample cover sheet (anonymized here, with staff feedback) that students hand in when they submit an experiment; it asks for specific information about the time spent on writing up a practical and the grade they predict they will attain.




The highest grade, A+ (85% and higher), is worth 10 points, whereas the lowest pass grade, D (40−44.9%), is worth 1 point. Although no statistical statement is possible, Figure 6 shows that the student declared hours spent on experiments that earn the lowest grades (B−, C+, and C) are not very different from those spent on experiments that receive higher grades (A−, A, and A+): 19.0 and 21.1 h, respectively (or for 2015/16, 18.3 and 14.8 h, respectively). The lower standard deviations at the lower end of the scale reflect low sample numbers, the lowest grade shown here being C. This is not inconsistent with other results containing many thousands of responses.54 It is important to realize that these are student declared hours, which can reflect the level of student stress and how they interpreted the question. Student reports of hours in excess of 80 h (7 data points) were reset to 37 h, as anything longer seemed unreasonable and not practically possible.

The first point concerns the number of declared hours, which is on average somewhere around 20. Here, the nominal weekly load per module is 10 h, with 4−5 modules per semester. So, standard practical modules, which have 6 h in the lab and maybe an hour of preparation, nominally leave only 3 h for data treatment and write-up. The usual practical write-up load was ∼10 experiments per semester per practical module. If these data are representative, they provide strong motivation for educators to reduce the number of experiments they expect to be written up, because the sheer magnitude of the task will disincentivize deep learning and incentivize plagiarism.55−57 Data from more recent runs of this module show not dissimilar results but in a lower stress environment, because writing up fewer experiments (this year just one) reduces the workload, allowing students to focus on other learning.

Another piece of information requested on the cover sheets is the student predicted grade for their work. Although not statistically significant, Figure 7 shows that those students whose assignments get the higher grades underpredict their grades, whereas the opposite is true for those who achieve the lower grades. This is observed beyond this module.54 Typically, those awarded A and A+ grades underestimate their results by between 1 and 2 grades, whereas B+ students overestimate by about one grade, and the lower grades overestimate by between 2 and 3 grades. Table 4 shows the grading system adopted and the coding used for the translation of grades to points.

Table 4. Grade Point System(a)

Grade Percentage   Letter Grade   Corresponding Points
85                 A+             10
80                 A               9
75                 A−              8
70                 B+              7
65                 B               6
60                 B−              5
55                 C               4
50                 C−              3
45                 D+              2
40                 D               1
35                 F               0
0                  FF              0

(a) The highest grade, A+ (85% and higher), is worth 10 points, whereas the lowest pass grade, D (40−44.9%), is worth 1 point.
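A minimal sketch (Python) of the Table 4 coding and the index plotted in Figure 7 (expected grade minus awarded grade, in grade points); the example calls are illustrative only.

# Table 4 grade-to-points coding
GRADE_POINTS = {"A+": 10, "A": 9, "A-": 8, "B+": 7, "B": 6, "B-": 5,
                "C": 4, "C-": 3, "D+": 2, "D": 1, "F": 0, "FF": 0}

def declared_index(expected: str, awarded: str) -> int:
    """Figure 7 index: positive = overestimate, negative = underestimate."""
    return GRADE_POINTS[expected] - GRADE_POINTS[awarded]

print(declared_index("B+", "A"))  # a stronger student underpredicting: -2
print(declared_index("B", "C"))   # a weaker student overpredicting: +2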

CONCLUSIONS

The structural alignment and scaffolding implemented in this module were analyzed using both indirect (qualitative) and direct (quantitative) methods to establish whether these changes are likely to have increased effective student learning and achievement of the module ILOs. We concluded from our direct (quantitative) data the following: (1) organized changes to the assessment regime have produced corresponding changes in student achievement, which are consistent with the literature; (2) changes to the assessment items required, particularly lab notebooks rather than practical write-ups, seem to have resulted in higher marks where these are tested, hence deeper learning; and (3) scaffolding exercises have resulted in an increasing focus on both the fundamental and real world aspects of analytical chemistry, hence fulfilling ILOs within an authentic learning environment (from staff interpretation of observed student behaviors).

In addition, we concluded from our indirect (qualitative) data the following: First, student feedback indicates an overall perception of lab supervisors being more deeply involved in their learning on the analytical (Socratic) side of the module than on the physical side. Further, student derived data indicate that across the medium and higher grades, students think they spend reasonably similar amounts of time, and that the amount of time is only weakly related to their achieved grades. However, their perception of how effective their work is (the grade it is worth) varies systematically, so that stronger students underestimate their grades and weaker students overestimate. The weaker the student, the greater the overestimation, to a maximum of 2−3 grades.

ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.8b00771.
• Lab practical cover sheet and title page (PDF)
• CM3292 Advanced Experiments in Analytical and Physical Chemistry module booklet (PDF)
• Experiment 1a, Measurement of Atmospheric NO2, analytical project practical v9 (PDF)
• NUS Department of Chemistry module details for CM2101 (PDF)
• NUS Department of Chemistry module details for CM2142 (PDF)




AUTHOR INFORMATION

Corresponding Authors

*E-mail: [email protected].
*E-mail: chmff[email protected].

ORCID

Fun Man Fung: 0000-0003-4106-3174
Simon Francis Watts: 0000-0002-7420-4730

Notes

The authors declare no competing financial interest.



ACKNOWLEDGMENTS

The authors would like to thank the Department of Chemistry teaching team, particularly Emelyn Tan, Hairuo Xu, Wee Boon Tan, Maw Lin Foo, Claire Taylor, Linda Sellou, Michael Yudistira Patuwo, Francis Yuan Yi Chong, and Saradha Thyagarajan. We also gratefully acknowledge the support of Yulin Lam and the superb technical support of April Ong Bee Hoon, Adeline Chia Hwee Cheng, and Livonne Ng Voon Kunn, without whom this work would not have been possible. This work is supported by an NUS Centre for Development of Teaching and Learning (CDTL) Teaching Enhancement Grant (TEG): "Laboratory Work: Through the Asian Looking Glass". There are two Institutional Review Board codes for this work: B-15-299 and S-17-027E. The abstract graphic was partly contributed by Choo Wen Xin and is used with permission.







REFERENCES

(1) Lamb, A. B. History of Harvard Chemistry Recounted in Recent Article. https://www.thecrimson.com/article/1929/5/13/history-of-harvard-chemistry-recounted-in/ (accessed March 17, 2019).
(2) University of Oxford. Department of Chemistry. http://www.chem.ox.ac.uk/history/ (accessed March 17, 2019).
(3) Carnduff, J.; Reid, N. Enhancing Undergraduate Chemistry Laboratories: Pre-Laboratory and Post-Laboratory Exercises; Royal Society of Chemistry: Cambridge, UK, 2003.
(4) Ackoff, R. L.; Greenberg, D. A. Turning Learning Right Side Up: Putting Education Back on Track, 1st ed.; Prentice Hall: Upper Saddle River, NJ, 2008.
(5) Wirth, K. R.; Perkins, D. Learning to Learn, v16; University of Dakota and Macalester College, 2008. http://web.archive.org/web/20180310005012/https://www.macalester.edu/academics/geology/wirth/learning.pdf (accessed April 10, 2019).
(6) Stanley, J. Fixing Chemistry Education - Honors Program. https://www.northeastern.edu/honors/2017/05/fixing-chemistry-education/#.WRjgUeuGM7Y (accessed May 15, 2017).
(7) Richardson, J.; Farrell, B.; Lee, A.; Schirmer, A. Some Problems with Plagiarism. CDTL Brief 2008, 11 (2), 1−6.
(8) Hanson, S.; Overton, T. Skills Required by New Chemistry Graduates and Their Development in Degree Programmes; Hull, 2010.
(9) CityJobs. Why Do Banks Want Physics and Maths Grads? City Blog. http://www.cityjobs.com/cityblog/2015/05/06/banks-physics-maths-grads/ (accessed May 15, 2017).
(10) Galloway, K. R.; Bretz, S. L. Development of an Assessment Tool to Measure Students' Meaningful Learning in the Undergraduate Chemistry Laboratory. J. Chem. Educ. 2015, 92 (7), 1149−1158.
(11) Elliott, M. J.; Stewart, K. K.; Lagowski, J. J. The Role of the Laboratory in Chemistry Instruction. J. Chem. Educ. 2008, 85 (1), 145.
(12) Hofstein, A.; Lunetta, V. N. The Role of the Laboratory in Science Teaching: Neglected Aspects of Research. Rev. Educ. Res. 1982, 52 (2), 201−217.
(13) Mewis, R. Staff and Student Opinions of the Inclusion of Practical Work in Higher Education Chemistry Courses in England: What Are the Perceived Objectives and Outcomes? New Dir. Teach. Phys. Sci. 2016, 36−44.
(14) Hart, C.; Mulhall, P.; Berry, A.; Loughran, J.; Gunstone, R. C. What Is the Purpose of This Experiment? Or Can Students Learn Something from Doing Experiments? J. Res. Sci. Teach. 2000, 37 (7), 655−675.
(15) Novak, J. D. Learning, Creating, and Using Knowledge: Concept Maps as Facilitative Tools in Schools and Corporations. J. E-Learning Knowl. Soc. 2010, 6 (3), 21−30.
(16) Yasin, N. Y. B. M.; Yueying, O. Evaluating the Relevance of the Chemistry Curriculum to the Workplace: Keeping Tertiary Education Relevant. J. Chem. Educ. 2017, 94 (10), 1443−1449.
(17) Marteel-Parrish, A. E.; Lipchock, J. M. Preparing Chemistry Majors for the 21st Century through a Comprehensive One-Semester Course Focused on Professional Preparation, Contemporary Issues, Scientific Communication, and Research Skills. J. Chem. Educ. 2018, 95 (1), 68−75.
(18) Kolb, D. A. Experiential Learning: Experience as the Source of Learning and Development; Prentice Hall: Englewood Cliffs, NJ, 1984; pp 20−38.
(19) ALTC. Moderation of Assessment in Higher Education: A Literature Review; Canberra, 2012.
(20) Chee, Y. S. Embodiment, Embeddedness, and Experience: Game-Based Learning and the Construction of Identity. Res. Pract. Technol. Enhanc. Learn. 2007, 2 (1), 3−30.
(21) Smith, K. A. Cooperative Learning: Making "Groupwork" Work. In New Directions for Teaching and Learning; Jossey-Bass: Minnesota, 1996; pp 71−82.
(22) Johnson, D. W.; Johnson, R. T.; Johnson-Holubec, E. Cooperation in the Classroom, 8th ed.; Interaction Book Company: Edina, MN, 2008.
(23) Gokhale, A. A. Collaborative Learning Enhances Critical Thinking. J. Technol. Educ. 1995, 7 (1). DOI: 10.21061/jte.v7i1.a.2.
(24) Fun Man, F. Exploring Technology-Enhanced Learning Using Google Glass To Offer Students a Unique Instructor's Point of View Live Laboratory Demonstration. J. Chem. Educ. 2016, 93 (12), 2117−2122.
(25) Johnson, D. W.; Johnson, R. T.; Smith, K. A. Active Learning: Cooperation in the College Classroom; Interaction Book Company: Edina, MN, 1998.
(26) Davidson, N.; Major, C. H. Boundary Crossings: Cooperative Learning, Collaborative Learning, and Problem-Based Learning. J. Excell. Coll. Teach. 2014, 25 (4), 7−55.
(27) Biggs, J. Constructive Alignment in University Teaching. HERDSA Rev. High. Educ. 2014, 1, 5−22.
(28) Gibbs, G. Using Assessment to Support Student Learning; University of East Anglia: Norwich, 2010.
(29) Rumsfeld, D., as US Secretary of Defense, in a Department of Defense news briefing, 12 February 2002, on the lack of evidence linking the then government of Iraq with the supply of weapons of mass destruction to terrorist groups.
(30) Fung, F. M.; Watts, S. F. Bird's-Eye View of Sampling Sites: Using Unmanned Aerial Vehicles to Make Chemistry Fieldwork Videos. J. Chem. Educ. 2017, 94 (10), 1557−1561.
(31) Soller, A.; Goodman, B.; Linton, F.; Gaimari, R. Promoting Effective Peer Interaction in an Intelligent Collaborative Learning System; Springer: Berlin, Heidelberg, 1998; pp 186−195.
(32) Brindley, J. E.; Walti, C.; Blaschke, L. M. Creating Effective Collaborative Learning Groups in an Online Environment. Int. Rev. Res. Open Distrib. Learn. 2009, 10 (3). DOI: 10.19173/irrodl.v10i3.675.
(33) Israel, J.; Aiken, R. Supporting Collaborative Learning With An Intelligent Web-Based System. Int. J. Artif. Intell. Educ. 2007, 38.
(34) Soller, A.; Lesgold, A. Knowledge Acquisition for Adaptive Collaborative Learning Environments (FS-00-02); Pittsburgh, 2000.
(35) Lombardi, M. M. Authentic Learning for the 21st Century: An Overview. In EDUCAUSE Learning Initiative; Oblinger, D., Ed.; ELI: Washington, DC, 2007; p 12.
(36) Biggs, J. Institute of Curriculum Studies, South China Normal University. Private communication, 20 August 1997. Int. J. Educ. Res. 1998, 29, 723−738.
(37) Collison, C. G.; Kim, T.; Cody, J.; Anderson, J.; Edelbach, B.; Marmor, W.; Kipsang, R.; Ayotte, C.; Saviola, D.; Niziol, J. Transforming the Organic Chemistry Lab Experience: Design, Implementation, and Evaluation of Reformed Experimental Activities - REActivities. J. Chem. Educ. 2018, 95, 55−61.
(38) Sitzmann, T.; Ely, K.; Brown, K. G.; Bauer, K. N. Self-Assessment of Knowledge: A Cognitive Learning or Affective Measure? Acad. Manag. Learn. Educ. 2010, 9 (2), 169−191.
(39) Bacon, D. R. Reporting Actual and Perceived Student Learning in Education Research. J. Mark. Educ. 2016, 38 (1), 3−6.


(40) Angrist, J. D.; Lavy, V. Using Maimonides' Rule to Estimate the Effect of Class Size on Scholastic Achievement. Q. J. Econ. 1999, 114 (2), 533−575.
(41) Bandiera, O.; Larcinese, V.; Rasul, I. Heterogeneous Class Size Effects: New Evidence from a Panel of University Students. Econ. J. 2010, 120, 1365−1398.
(42) Gibbs, G.; Simpson, C. Does Your Assessment Support Your Students' Learning? Learn. Teach. High. Educ. 2004, 1 (1), 3−31.
(43) Bridges, P.; Cooper, A. Coursework Marks High, Examination Marks Low: Discuss. Assess. Eval. High. Educ. 2002, 27 (1), 35−48.
(44) Urdan, T.; Schoenfelder, E. Classroom Effects on Student Motivation: Goal Structures, Social Relationships, and Competence Beliefs. J. Sch. Psychol. 2006, 44 (5), 331−349.
(45) Sambell, K.; McDowell, L.; Montgomery, C. Assessment for Learning in Higher Education; Routledge: London, 2012.
(46) Biggs, J. What the Student Does: Teaching for Enhanced Learning. High. Educ. Res. Dev. 2012, 31 (1), 39−55.
(47) Tucker, P. D.; Stronge, J. H. Linking Teacher Evaluation and Student Learning; Association for Supervision and Curriculum Development: Alexandria, VA, 2005.
(48) Rios, J. N. The Effects of Different Teaching Methods on Student Attitude and Achievement in Calculus Recitations; University of New Mexico, 2017.
(49) Lake, D. A. Student Performance and Perceptions of a Lecture-Based Course Compared With the Same Course Utilizing Group Discussion. Phys. Ther. 2001, 81 (3), 896−902.
(50) Reisenwitz, T. H. Student Evaluation of Teaching: An Investigation of Nonresponse Bias in an Online Context. J. Mark. Educ. 2016, 38 (1), 7−17.
(51) McColl-Kennedy, J. R.; Schneider, U. Measuring Customer Satisfaction: What, Where and How? Total Qual. Manag. 2000, 11 (7), 883−896.
(52) Goos, M.; Salomons, A. Measuring Teaching Quality in Higher Education: Assessing Selection Bias in Course Evaluations. Res. High. Educ. 2017, 58, 1−24.
(53) Lipnevich, A. A.; Smith, J. K. Response to Assessment Feedback: The Effects of Grades, Praise, and Source of Information (ETS-RR-0830); Princeton, NJ, 2008.
(54) Tan, E.; Watts, S.; Hairuo, X. A Relational Approach to "Wet Chemistry" Laboratory Learning: A Cultural Transformation Tool. In International Conference on Teaching and Learning in Higher Education (TLHE) 2014; Ragpathi, K., Ed.; NUS: Singapore, 2014; p 4.
(55) Ismail, I. R.; Zulkifli, N.; Pauzi, S. F. M.; Hadi, K. A. A.; Najid, N. A. Effects of Student Workload and Academic Procrastination on Attitude to Plagiarize: A Partial Least Squares Application. In Proceedings of the International Conference on Science, Technology and Social Sciences (ICSTSS) 2012; Springer: Singapore, 2014; pp 375−381.
(56) Bailey, J. Are Teachers At Fault for Plagiarism? Plagiarism Today. https://www.plagiarismtoday.com/2011/12/01/are-teachers-at-fault-for-plagiarism/ (accessed June 17, 2017).
(57) Yardley, J.; Rodríguez, M. D.; Bates, S. C.; Nelson, J. True Confessions?: Alumni's Retrospective Reports on Undergraduate Cheating Behaviors. Ethics Behav. 2009, 19 (1), 1−14.
