Information • Textbooks • Media • Resources

Computer Bulletin Board

WebMark—A Fully Automated Method of Submission, Assessment, Grading, and Commentary for Laboratory Practical Scripts George W. J. Olivier,* Katie Herson, and Michael H. Sosabowski School of Pharmacy and Biomolecular Sciences, University of Brighton, Moulsecoomb, Brighton BN2 4GJ, UK; *[email protected]

Background

Information technology (IT) is becoming an agent of quality enhancement in the learning and teaching of chemistry (1). There is precedent to show that IT developments are of benefit in the teaching and learning of specific elements of chemistry syllabi, such as stereochemistry (2) and molecular visualization (3, 4). Other workers have shown that IT improves subject delivery and reception by providing (i) online internal management systems for improved course administration (5), (ii) an interface for hypermedia tools for better comprehension (6), and (iii) courses that deal with online literature searching techniques (7). Entire chemistry courses may be delivered over the Internet (8), and "Webware" may be downloaded "off the shelf" to supplement existing chemistry courses (9). We have reported that the use of Intranet-based resources can enhance course delivery and the quality of the students' learning experience (10). We previously reported the success of a limited prototype of this system, which allowed students to enter numerical results via a Web-based form; the data were imported into an MS Excel spreadsheet for manual processing. The prototype shell of the system has since been adapted to other practical experiments, indicating the general applicability of this type of system (11).

Assessing Laboratory Practical Scripts—The Traditional Approach

In the traditional method of assessing laboratory practical exercises, the student records data during a laboratory class, completes the calculations, interprets and writes up the results after the class, and submits a hard copy to the educator. Depending upon turnaround time, the students may receive their corrected scripts some weeks after submission. WebMark allows the students to submit their results to a server via a Web-based form from any networked computer and receive their grade, feedback, and an indication of where mistakes may have been made within seconds.
Students may be confident that their work has been marked 100% equitably and under exactly the same conditions as the work of all other students, and can approach the educator for assistance with corrections while the subject is still fresh in their mind. WebMark is a flexible, sophisticated system for the assessment of tasks requiring numerical manipulation and calculations. It can be used to assess complex logical processes that bring together conclusions from different sets of data and also more conventional multiple-choice-type questions.

The conclusions a student makes might need to be drawn from different parts of a problem, and it may be appropriate that different students draw very different conclusions depending on their individually determined data. The nature of the system allows for this situation.

Process of Establishment

The School Intranet provides a link to a Web-based form that has fields for personal details, security, and all responses required for assessment of the practical as per the hard-copy schedule (Fig. 1). The security password field (PIN, Personal Identification Number) ensures that data submitted are coming from the named student. This allows students to remain confident that their submitted results are ultimately identifiable as theirs in the event of any inquiry or dispute concerning the origin of the data. The students' revelation of their password would confirm or deny their ownership of the data, since the passwords are collected with the data. Students are also required to submit a paper copy of their results for quality assurance purposes; ten percent of the paper copies are checked for consistency with the database record. Students sign a register for each practical session and have the results of their experiments countersigned and dated by the supervising educator to guard against academic dishonesty. Students' paper scripts are also perused routinely during the course of the session.

The data submitted are divided into two categories: raw and derived. Raw data are results that were measured in the course of the exercise, such as weights and titers. Derived data are obtained by manipulation of the raw data, for example by calculating volumes delivered and determining amount or concentration. Raw and derived data are entered as numeric values into the appropriate fields on the form. There are also pull-down menus for yes/no and pass/fail data and buttons for multiple-choice-type responses.
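The paper does not reproduce the underlying database schema, so purely as a rough illustration, a single submission of the kind described above (personal details, a PIN, and separate raw and derived numeric fields) might be modeled as follows; all names here are hypothetical, not taken from the actual Filemaker Pro database.

```python
from dataclasses import dataclass, field

# Hypothetical model of one WebMark submission. Field names are
# illustrative only; the real system stores these in Filemaker Pro.
@dataclass
class Submission:
    student_name: str
    email: str
    pin: str                                     # student-chosen password, stored with the data
    raw: dict = field(default_factory=dict)      # measured values, e.g. titers and weights
    derived: dict = field(default_factory=dict)  # values the student calculated from the raw data

sub = Submission(
    student_name="A. Student",
    email="[email protected]",
    pin="1234",
    raw={"titer_1": 25.75, "titer_2": 25.85},
    derived={"average_titer": 25.80, "naoh_molarity": 0.5397},
)
```

Keeping raw and derived values in separate groups mirrors the system's design: the raw values are what the marking logic recomputes from, and the derived values are what it checks.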
The Web-based form is linked to a Filemaker Pro database, and the data are entered into Filemaker Pro after all the required responses are complete and the students have entered their email address and clicked on the "save your results" button. Within the database, calculations are performed using the students' raw data to establish their "right" answers. These data are held in fields parallel to those holding the students' own derived data. The derived data are then compared with the database-calculated data, and marks or flags are assigned to additional fields within the database. These marks and flags form the basis by which grading and feedback for the work are achieved. Additional calculations based on the raw data and held in additional parallel fields can be used to establish the nature of some common errors. The use of such flags and additional fields is illustrated in the simple example below. The students' grade, commentary, and feedback are then sent to them by email to the specified address.

Illustration of the Marking and Feedback Mechanism

Consider the data sets for two students shown in Table 1, which illustrate the levels of assessment that are possible. The example relates to a practical exercise on the analysis of a sample of aspirin, which involves a back-titration of sodium hydroxide with hydrochloric acid. In the course of the exercise, students are required to accurately establish the concentration of the sodium hydroxide solution. They titrate 25 cm3 of sodium hydroxide solution in duplicate with 0.5230 M HCl solution.

They then calculate the molarity of the sodium hydroxide solution. The replicate titers should agree within 0.25 cm3, and failure to achieve this degree of precision is penalized. In this example both students correctly calculated the molarity of the sodium hydroxide solution, but one of them has titer values of unacceptable precision. By using logical functions within Filemaker Pro it is possible to distinguish between a data set that is correctly calculated and of acceptable precision and one that is also correctly calculated but is based on data of unacceptable precision. In this example, the student whose data were less precise but who completed the calculations correctly scores some but not all of the marks available for that section. The penalty is restricted to the marks awarded for that particular aspect of the exercise.

A number of students make the error of using the approximate concentration of the sodium hydroxide solution in the calculations for the aspirin assay instead of the accurately determined concentration. Using logical functions and parallel calculations it is possible to distinguish students who made this error from those whose molarity coincidentally worked out to exactly 0.5000 M, and to provide appropriate marks and feedback for this situation.

Nonsystematic calculation errors are, by their nature, difficult to predict. In a long calculation, should a student make an error at an early stage, WebMark will penalize that error, but the student will receive credit for subsequent correctly executed calculations, despite the now-incorrect derived data. For example, if students incorrectly calculate the molarity of NaOH and then use that value in subsequent correctly executed calculations, they are not penalized twice for the single initial error.

A potential problem at the calculation-checking stage is caused by the use of different rounding methods. This is overcome by stipulating an acceptable range of error for each field.
The extent of this range depends on the precision acceptable for a particular field. Student feedback based on the students' reported answers, the Filemaker Pro-calculated answers, and the divergence for a particular section of the experiment is presented in an appropriately worded manner. This allows students to reflect upon their work and any errors that they made, and provides formative value to the process.

Table 1. Illustrative Data Sets for Two Students

Student 1
  Item                         Raw Data   Calculated Data   Marks/Flags
  Titer 1                      25.75
  Titer 2                      25.85
  Average titer                25.80      25.80             1 mark
  Molarity of NaOH solution    0.5397     0.5397            4 marks
  Feedback: Calculation of molarity of NaOH solution is accurate.

Student 2
  Item                         Raw Data   Calculated Data   Marks/Flags
  Titer 1                      25.65
  Titer 2                      25.95
  Average titer                25.80      25.80             1 mark
  Molarity of NaOH solution    0.5397     0.5397            4 marks; precision penalty = −2 marks
  Feedback: Your titers in standardizing the NaOH solution should be closer together. Calculation of molarity of NaOH solution is accurate.

Figure 1. Aspirin Web data-entry form.
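The comparison logic illustrated by Table 1 can be sketched in ordinary code. The actual system implements this with Filemaker Pro logical functions in parallel fields; the following is a hypothetical reconstruction in which the 0.25 cm3 precision limit and the mark values come from the example, while the tolerances on the student's reported values are illustrative assumptions.

```python
HCL_MOLARITY = 0.5230   # mol/L, from the exercise
NAOH_VOLUME = 25.00     # cm3 of NaOH titrated
PRECISION_LIMIT = 0.25  # cm3, maximum allowed spread between replicate titers

def mark_titration(titer_1, titer_2, student_average, student_molarity):
    """Recompute the 'right' answers from the raw titers, then compare
    the student's derived values against them within set tolerances."""
    marks, feedback = 0, []
    average = (titer_1 + titer_2) / 2
    if abs(student_average - average) <= 0.005:       # tolerance absorbs rounding differences
        marks += 1
    molarity = HCL_MOLARITY * average / NAOH_VOLUME   # back-calculated NaOH molarity
    if abs(student_molarity - molarity) <= 0.0005:
        marks += 4
        feedback.append("Calculation of molarity of NaOH solution is accurate.")
    if abs(titer_1 - titer_2) > PRECISION_LIMIT:      # precision penalty flag
        marks -= 2
        feedback.append("Your titers in standardizing the NaOH solution "
                        "should be closer together.")
    return marks, feedback

# Student 1: precise titers, correct calculations  -> 5 marks
# Student 2: imprecise titers, correct calculations -> 3 marks
```

Note how the precision penalty is applied as a separate flag rather than by rejecting the molarity calculation, so the penalty stays restricted to that one aspect of the exercise.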


The grade for the quantitative section of the write-up depends upon both the correctness of the answers (compliance with monograph specifications) and the correctness of the manipulation of the data (calculations). The text from a typical feedback email is shown in Box 1.

Box 1. Text from a Typical Feedback E-mail

From: [email protected]
Sent: 24 February 2001, 16:03
To: [email protected]
Subject: aspirin_recorded

Student, A. Your results have been received. Here is your feedback.
Average yield from your synthesis.
Poor melting point determination for aspirin.
Good melting point determination for Test A precipitate.
Good result for loss on drying.
Calculation of molarity of NaOH solution is accurate.
Error in calculating weight of aspirin in determination 2.
Your 1st determination aspirin assay is slightly outside the monograph specification.
Your 2nd determination aspirin assay is slightly outside the monograph specification.
and your grade is: B

Student Evaluation and Feedback

Box 2. Student Evaluation Questionnaire

1. To which age bracket do you belong?
2. Do you have access to a personal computer at your term-time address?
3. Please indicate where you inputted the results for the four practicals.
4. How easy/difficult did you find the Web Marking forms to use?
5. For each occasion you submitted your results, did you wait by the computer for your result to arrive or did you go away and come back later?
6. If you waited by the computer, how long did it take for your results to arrive?
7. Overall, how confident are you that your work has been marked fairly?
8. Overall, how confident are you that your work has been marked accurately?
9. Overall, do you prefer this method of marking to the method used in previous practicals (i.e., manual marking of a handwritten script)? Please explain your answer.
10. Did you find the feedback valuable? Please explain your answer.
11. Did the speed of feedback help your understanding compared with waiting until the end of term, as you may have done with previous practical exercises?
12. We have identified certain issues that have been a hindrance to this method of assessment, including access to computers and network failures. Please list any other problems you have experienced.
13. Please tell us of any ideas you have for improving the use of Web Marking for practical assessments. Please be as candid and constructive as possible.
14. Rank eight statements about this method of assessment, some positive and some negative.

Table 2. Time to Receipt of E-mailed Response for Each Practical Session

  Time               Number   Percent
  Immediately        115      47
  10 min             28       11
  20 min             4        2
  30 min             7        3
  >30 min            6        2
  Came back later    87       35

WebMark Questionnaire

The 1999–2000 MPharm level-3 students who used WebMark for four practical experiments in the PY301 Medicinal Chemistry module were given a questionnaire to solicit their opinions of and experience with the WebMark process (Box 2). Sixty-nine students completed the questionnaire. The following results aggregate the responses across all four practical experiments. The percentages shown are based on completed responses only. Some questions (e.g., Question 1) require only one answer; others (e.g., Question 5) require four answers, one for each practical session completed. The results relevant to the WebMark system in general are discussed below.

Question 4 asked how easy or difficult the students found the WebMark forms to use (1 = difficult, 5 = very easy). Seventy-eight percent said that WebMark was very easy or easy to use (score of 4 or 5), 10% indicated neutrality (score 3), and 9% indicated some level of difficulty (score 1 or 2). A few students struggled, but a clear majority found the WebMark forms easy to use.

Questions 5 and 6 concerned how long the students waited for their results. The results (Table 2) are surprising, as the feedback email was sent almost instantaneously and we did not expect any results in the categories of 10 minutes or longer. This suggests that students took some time to log into their email, or perhaps network difficulties caused delays. More than a third did not wait for the results and left the computer once the results had been entered. In the future, we hope to see more students reading their results immediately, while the practical is still fresh in their minds.

Questions 7 and 8 asked how confident the students were that their work was marked fairly and accurately (1 = not confident, 5 = very confident). The opinions regarding the two aspects were similar; most students had a mid-to-high confidence level regarding the marking of their work (Fig. 2).
Questions 9, 10, and 11 gathered opinions about aspects of this system and its feedback compared to previous methods. Most students preferred this system, valued the feedback, and appreciated the speed of feedback (Table 3). The qualitative answers from the questionnaire did not reveal any issues the development team was unaware of.

The final question asked the students to rank eight statements concerning positive and negative aspects of WebMark, from 1 (most important) to 8 (least important). The statements, presented on the questionnaire in random order to minimize bias, were:

1. I like the speed of grading compared to a potential wait of several weeks with the conventional marking method.
2. I like the flexibility of submitting data from any networked location.
3. I like the speed of feedback compared to a potential wait of several weeks with the conventional marking method.
4. I am happy with the consistent accuracy of marking: each student is marked in exactly the same way.
5. I am not confident that the work received the attention it deserved, since the results came back very quickly.
6. I dislike the fact that I have to depend on the reliability of a computer network system.
7. I dislike the inconvenience of having to enter the data into a computer after having recorded it on paper.
8. I dislike the inconvenience of having to find an available computer.

Figure 3 shows the results. The first four statements (positive) were all ranked relatively highly; the last four statements (negative) mainly received lower rankings. This indicates that the students found the advantages of the system more important than the drawbacks.

Figure 2. Evaluation of whether scripts had been marked fairly and accurately (1 = not confident, 5 = very confident), shown by counts for each rating.

Figure 3. Relative statement rankings, shown by percentage count for each rank.

Table 3. Responses to Selected Questions

Question 9: Overall, do you prefer this method of marking to the method used in previous practicals (i.e., manual marking of a handwritten script)?
    Yes: 42 (65%)    No: 23 (35%)
Question 10: Did you find the feedback valuable?
    Yes: 44 (66%)    No: 23 (34%)
Question 11: Did the speed of feedback help your understanding compared with waiting until the end of term, as you may have done with previous practical exercises?
    Yes: 47 (73%)    No: 17 (27%)

General Module Evaluation Questionnaire

The PY301 module has both laboratory-based exercises and a lecture component. It is standard practice within this institution to undertake a student assessment of the general module, using a separate module-evaluation form. This assessment allows for qualitative, unsolicited comments under three general headings: the best things about the module, the worst things, and any other comments. Twelve students mentioned the WebMark system as one of the best aspects of the module; no students mentioned WebMark in a negative context. Box 3 shows the commentary with regard to the submission of results via WebMark.

Box 3. Module-Evaluation Form Commentary Regarding Submission of Results via WebMark

Instant feedback from intranet marking
Good idea for computer submission of practicals, liked the immediate feedback
Feedback
Submission of labs on intranet
Prompt feedback from practicals
Good feedback on practical performance
it's good to get instant feedback

Discussion

We have identified the following benefits of this approach. Students benefit because they receive detailed feedback on their work almost instantaneously. This is a particular advantage, as the work will still be fresh in their minds and they will be able to reflect on where they have gone wrong (if they have). They also receive details of their grades quickly. Members of academic staff find that significantly more students approach staff for help in areas where they have gone wrong than is the case with conventional hand-marked assessments.

Academic staff benefit because very time-intensive, repetitive assessment is minimized. Staff spend more quality time with students discussing their work than they do when work has a longer marking turnaround, and tedium is avoided. Manual assessment of this exercise requires a large number of manually performed calculations, and multiple calculations done by hand increase the probability of error; the automated method applies exactly the same checks to every script, ensuring objectivity in checking and grading. All scripts can be checked simultaneously, and turnaround time can be measured in seconds.

The provision of a Web-based interface links the experiment to the School Intranet. This makes learning resources more available and provides each module with its own Intranet Web page. The time and date that the results are entered are automatically recorded to ensure student compliance with deadlines, and the use of student-chosen passwords, together with the parallel submission of the laboratory script, allows us to be confident that the submitter of results was indeed the named student. Students may also be confident that their results are recorded, as they are provided with an on-screen, time- and date-stamped receipt, which they may print for their own records.

The following issues have been identified and addressed. Conceptualizing the Filemaker Pro procedure represented a large setup time. This, coupled with the actual time for program design, required a substantial amount of educators' time.


This time is recoverable in later semesters as the concept is expanded to other laboratory exercises, because the system is easily adapted with minimal time-cost implications. A subsequent assessment setup was completed within four hours of educators' time, indicating that a steep experience curve makes further practical setups more efficient. Concerns have been raised that the setup and maintenance of the databases is overspecialized and represents a potential threat should the incumbent educator leave; however, all the software described is proprietary, well supported, and intuitive to use.

Occasional use by students of inappropriate or unsolicited units, such as grams (g) in a "weight" field, was problematic in the earlier application, which used MS Excel for processing. A feature of Filemaker Pro is that the format of a particular field can be designated as numeric, and the application will then ignore any alphabetical characters included in that field. For example, students can enter the weight of a sample as 0.999 g and the application will recognize only the numerical characters for calculation purposes.

There is a training issue inasmuch as the system assumes prior knowledge of the Internet and School Intranet. This is a common requirement in such instances (12). In our case, the students receive training in the use of the Internet and other computer-based skills elsewhere in the course, and overview training in the particular application was all that was required. Less than 3% of the students who submitted data into this system chose not to have their results emailed. This figure reflects the general level of computer literacy within the current student body; experience of using Web browsers and the Internet is now virtually universal among these students. However, for the few who did experience difficulty, hands-on training was provided as required.
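Filemaker Pro provides this filtering natively once a field is designated numeric. Purely to illustrate the idea, and not the actual mechanism, an entry such as "0.999 g" can be reduced to its numeric part before any calculation; the function below is a hypothetical approximation that extracts the first number in the field.

```python
import re

def numeric_value(field_text):
    """Extract the numeric part of an entry such as '0.999 g',
    roughly mimicking a numeric-typed field that ignores letters."""
    match = re.search(r"[-+]?\d*\.?\d+", field_text)
    return float(match.group()) if match else None

numeric_value("0.999 g")    # -> 0.999
numeric_value("25.80 cm3")  # -> 25.8
```

This guards the downstream calculations against unit suffixes without requiring students to re-enter their data.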
We recognize the need for students to acquire skills in the competent recording of laboratory data and the writing of good-quality scientific reports. It is not proposed that WebMark be seen as a method for assessing all laboratory reports, but rather as a method of efficiently assessing those reports with a high proportion of quantitative answers. Approximately half of the reports in a typical MPharm chemistry practical module may be mainly quantitative and suitable for WebMark assessment.

Potential Developments

This innovation has proved beneficial and cost effective when judged against the parameters set out above. The benefits for both students and academic staff are proven, and the use of the WebMark system will be continued in future semesters. There is potential for expansion to address any issues that have arisen. Other quantitative practical exercises will be checked and then graded in this manner. However, no system should remain static; technology continually changes, and such systems can be continually refined. Moreover, we propose that no such checking system should be totally without some element of human input; total automation may not be in the students' or educators' best interests. With the above in mind, other possible developments are summarized below.

The system that has been developed is a generic one and can be applied to other areas of study. The system is currently used as an aid to student assessment. Using a slightly different approach, it is also being used as a

computer-aided learning (CAL) package—for example, in gaining proficiency in calculations based on the Henderson–Hasselbalch equation. Students log on, request a set of data, take the data away with them, manipulate them and draw whatever conclusions are appropriate, enter their results, and receive feedback. If mistakes are made, then on the basis of the detailed feedback provided by WebMark, students can recalculate their data, resubmit the results, and check that they have handled the task correctly. They can then request a fresh data set for the same problem and demonstrate to themselves that they fully understand how to complete the task. Such activity, if hand-marked by staff, would require a considerable staff resource commitment. WebMark therefore has considerable potential to become a learning aid with very high formative value.

We have established in principle a procedure for checking for instances of academic dishonesty. In the example used in this paper, six fields containing raw data (titers, accurate weighings) are copied into buffer fields and compared with all the other records currently in the database. Any coincident values are then flagged so the educator can undertake the appropriate investigation. The method can also be applied to previous years' data sets. Work is currently underway to refine this concept.

Discussions are in progress with colleagues within the School of Pharmacy and Biomolecular Sciences to extend the system as a learning aid for mathematics, organic chemistry, and clinical pharmacy. It has been reported (13) that preregistration students and newly qualified pharmacists often lack confidence when tackling complex pharmaceutical calculations. This system could easily be made available over the Internet for access by graduates, with benefit to both pharmacists and the patients under their care.
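The dishonesty check described above, copying raw-data fields into buffer fields and flagging coincident values across records, might be approximated as follows. This is a hypothetical reconstruction, with illustrative data, not the Filemaker Pro implementation.

```python
from collections import defaultdict

def flag_coincident_data(records):
    """Group submissions that share an identical tuple of raw values
    (e.g. six titers and accurate weighings) for manual follow-up."""
    seen = defaultdict(list)
    for student, raw_values in records.items():
        seen[tuple(raw_values)].append(student)
    # Any raw-data tuple shared by more than one student is flagged
    # so the educator can undertake the appropriate investigation.
    return [students for students in seen.values() if len(students) > 1]

records = {
    "student_a": [25.75, 25.85, 0.5012, 0.5020, 0.4998, 0.5005],
    "student_b": [25.65, 25.95, 0.5012, 0.5021, 0.4997, 0.5003],
    "student_c": [25.75, 25.85, 0.5012, 0.5020, 0.4998, 0.5005],  # matches student_a
}
flag_coincident_data(records)  # -> [["student_a", "student_c"]]
```

Because independently measured raw data rarely coincide across all six fields at once, a flag here indicates only that investigation is warranted, not proof of copying; the same comparison can be run against previous years' data sets.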
Acknowledgments

Two of the authors (MHS and GWJO) gratefully acknowledge the generous support of the University of Brighton Education Research Strategy Group.

Literature Cited

1. Brimberry, W. M.; Riffee, W. H. Am. J. Pharm. Educ. 1995, 59, 1–7.
2. Harrold, M. W. Am. J. Pharm. Educ. 1995, 59, 20.
3. Davis, P. J. Am. J. Pharm. Educ. 1994, 58 (Suppl.), 99S.
4. Kerwin, S. M. Am. J. Pharm. Educ. 1994, 58 (Suppl.), 99S.
5. Smith, S.; Stoval, I. J. Chem. Educ. 1996, 73, 911–915.
6. Tissue, B. M.; Earp, R. L.; Yip, C.-W.; Anderson, M. R. J. Chem. Educ. 1996, 73, 446.
7. Matthews, F. J. J. Chem. Educ. 1997, 74, 1011–1014.
8. Liu, D.; Walter, L. J.; Brooks, D. W. J. Chem. Educ. 1998, 75, 123.
9. Judd, C. S. J. Chem. Educ. 1998, 75, 1073.
10. Sosabowski, M. H.; Herson, K.; Lloyd, A. W. Am. J. Pharm. Educ. 1998, 62, 302–306.
11. Sosabowski, M. H.; Herson, K.; Olivier, G. W. J.; Martincigh, B. S.; Kindness, A. S. Afr. J. Chem. 2000, 53 (1), 28–32.
12. Chandra, A.; Holt, G. A. Am. J. Pharm. Educ. 1996, 60, 297–303.
13. McAteer, S. Pharm. J. 1999, 262 (7039), 477.

JChemEd.chem.wisc.edu • Vol. 78 No. 12 December 2001 • Journal of Chemical Education
