Using Software Tools To Provide Students in Large Classes with Individualized Formative Feedback

Sebastian Hedtrich and Nicole Graulich*

Institute of Chemistry Education, Justus Liebig University Giessen, Giessen 35392, Germany



ABSTRACT: Students at the university level spend more and more time in learning management systems (LMSs), which support current teaching by offering online tutorials or units. These LMSs allow students, especially in large classes, to prepare individually. However, students' learning in these online systems is often not supported by detailed formative feedback, as the options for this in current LMSs are quite limited. Formative feedback should be connected to the learning objectives or competencies of the course, but formative feedback in an LMS is limited to single tasks or tests and does not allow a focus on competencies across tasks. We have developed two easy-to-use software tools that enable teachers to use data from the LMS to quickly create automated formative feedback that can be sent to students. First evaluations of the average final exam scores of different classes show that this new type of formative feedback seems to have a medium effect, in Cohen's terms, on students' final exam scores.

KEYWORDS: First-Year Undergraduate/General, Second-Year Undergraduate, Laboratory Instruction, Internet/Web-Based Learning, Nonmajor Courses



FORMATIVE FEEDBACK AND E-LEARNING

Transitioning from high school to university is often a challenge for students, as there is considerably less formative feedback in introductory classes. Undergraduates especially struggle in this rather anonymous learning situation, foremost in large-scale class settings.1 However, as shown by Hattie, feedback is a strong supporting factor that influences the learning process.2 In this regard, formative feedback is defined by Shute3 as information communicated to the learner that is intended to modify his or her behavior or thinking for the purpose of improving learning. Appropriate formative feedback, in addition to the summative feedback given by the final exam grade, can help students adapt their preparation and the learning needed to pass the final exam before they take it.4 Otherwise, they risk overestimating their abilities and therefore spending insufficient effort on preparation.5,6

Classes at universities have made increasing use of web-based learning elements to extend traditional teaching.7 Flipped teaching, for instance, has been applied very successfully in chemistry lectures, both in organic chemistry8 and in prelab activities for undergraduates.9 Students seem to appreciate these flipped- or blended-teaching formats, which result in slightly better exam scores and lower dropout rates.10−12 One central part of distance teaching is the learning management system (LMS), a type of content management system specialized in offering learning opportunities.13 Many LMSs have been released in recent years, for example, Blackboard, ILIAS, Moodle, and OpenOLAT.

Although learning supported by an LMS is increasing at the university level, the opportunities for formative feedback in these systems are often limited; therefore, students may not receive the support they actually need while working within an LMS.14 Current LMSs, such as Blackboard, ILIAS, or Moodle, offer only two types of feedback: direct corrective feedback on a wrong answer and summative feedback on the total test performance;15 the latter type of feedback seems to induce superficial preparation.16 Additionally, the feedback options in current LMSs do not enable educators to provide formative feedback on the mastery of learning objectives that span several tasks or several tests within the LMS.17 Even with the necessary insight into students' learning in the LMS, it remains quite challenging to deliver individual, formative feedback to each student in large introductory classes.



NEW SOFTWARE CAN DELIVER INDIVIDUAL FORMATIVE FEEDBACK TO LARGE CLASSES

The LMS Analyzation Kit (LMSA Kit) and the Easy Snippet Feedback Edit (ESF Edit) are two new software tools that can be used to deliver formative feedback. They allow insight into students' learning progress within the LMS and allow teachers to generate automated, individual feedback for each learner.18

Figure 1. The LMSA Kit uses various matchmaking algorithms to estimate students' abilities. The criteria on the left-hand side can be analyzed by different algorithms. The list contains tasks that were solved in the LMS and that belong to one criterion. On the right-hand side, the table shows the results. The first section contains the difficulty scores of the different tasks in the criterion, whereas the second section shows the students' ability scores. The names are blurred for privacy reasons.

Both software tools were developed at our institute to overcome the deficits of the current feedback options in LMSs. The LMSA Kit works with the typical data export functions that almost every LMS offers; hence, it is not tied to a specific LMS. In the LMSA Kit, all tasks addressing the same learning objective, competency, or topic can be grouped into criteria. The LMSA Kit then analyzes students' performance in these criteria using matchmaking algorithms. This approach is used to overcome the shortcomings of item-response models, which are generally not applicable to data exported from LMSs.19 The ability scores generated by the LMSA Kit reflect students' learning progress in each criterion; such scores are typically not accessible to educators in LMSs. All steps of data import and analysis can be carried out within the graphical user interface of the LMSA Kit (cf. Figure 1). All basic functions are available directly in a menu bar, and the advanced functions open wizards that guide the user through the analysis. A detailed description of the handling of the LMSA Kit and the ESF Edit can be found in the Supporting Information and on our homepage.20

In addition to the LMSA Kit, we developed the ESF Edit, a second software tool that can be used together with the LMSA Kit. It takes the ability scores from the LMSA Kit and creates individual formative feedback almost automatically. Hence, it is possible to reuse the assessments students have already taken over the course of the class to offer them additional feedback before the final exam. Students who struggle to estimate their progress toward a learning objective across different tasks in different tests may benefit from such an automated formative feedback system.21 The task of writing individual feedback is automated by the software and can be managed by a single person in a reasonable amount of time.22 An experienced user, for instance, can set up feedback generation for about 14 learning objectives in 2 days. The time required varies with the experience of the teacher, but it is reduced to less than an hour when a programmed feedback is reused for exams of the same class in following years.
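This report does not specify which matchmaking algorithm the LMSA Kit implements, but the cited Elo rating approach19 gives a sense of how per-criterion ability scores can be estimated from exported task results. The following Python sketch is only an illustration under that assumption; the function name, the update constant k, and the data layout are hypothetical and not part of the LMSA Kit.

```python
from collections import defaultdict

def elo_ability_scores(responses, k=0.4, start=0.0):
    """Estimate per-criterion ability scores from binary task results.

    responses: iterable of (student, criterion, task, correct) tuples in the
    order the tasks were answered; correct is True or False.
    Returns ({(student, criterion): ability}, {(criterion, task): difficulty}).
    """
    ability = defaultdict(lambda: start)      # ability per student and criterion
    difficulty = defaultdict(lambda: start)   # difficulty per task

    for student, criterion, task, correct in responses:
        theta = ability[(student, criterion)]
        b = difficulty[(criterion, task)]
        # Logistic expectation that the student solves the task (Elo/Rasch form).
        expected = 1.0 / (1.0 + 10 ** (b - theta))
        # Move ability and difficulty in opposite directions by the surprise.
        ability[(student, criterion)] = theta + k * (float(correct) - expected)
        difficulty[(criterion, task)] = b - k * (float(correct) - expected)

    return dict(ability), dict(difficulty)


# Hypothetical usage with a few LMS-exported task results:
log = [
    ("alice", "acid-base", "task_01", True),
    ("alice", "acid-base", "task_02", False),
    ("bob", "acid-base", "task_01", True),
]
abilities, difficulties = elo_ability_scores(log)
```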

Figure 2. Example of a feedback generation process within the ESF Edit.

The generation of feedback within the ESF Edit was designed in a visual programming language (cf. Figure 2). Instead of learning a difficult and less intuitive textual programming language, teachers arrange graphical elements, so-called snippets, in a specific logical order in the ESF Edit. The visual programming language is very similar to flowcharts or concept maps, and the arrangement of the snippets describes the process of feedback generation (cf. Figure 2).
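The snippet format of the ESF Edit is not documented in this report; the sketch below merely illustrates the underlying idea that a flowchart-like arrangement of text and branching snippets defines how a feedback message is assembled from ability scores. All class names, thresholds, and texts are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TextSnippet:
    """Emits a fixed piece of feedback text."""
    text: str
    def run(self, scores: dict, parts: List[str]) -> None:
        parts.append(self.text)

@dataclass
class BranchSnippet:
    """Chooses between two snippet chains based on one criterion score."""
    criterion: str
    threshold: float
    if_low: List    # snippets used when the score is below the threshold
    if_high: List
    def run(self, scores: dict, parts: List[str]) -> None:
        chain = self.if_low if scores.get(self.criterion, 0.0) < self.threshold else self.if_high
        for snippet in chain:
            snippet.run(scores, parts)

def generate_feedback(flow: List, scores: dict) -> str:
    """Walks the arranged snippets in order and joins the emitted text."""
    parts: List[str] = []
    for snippet in flow:
        snippet.run(scores, parts)
    return "\n".join(parts)

# Hypothetical flow: one branch per criterion, mirroring a flowchart arrangement.
flow = [
    TextSnippet("Topic: acid-base chemistry (e.g., titrations, pH of buffer solutions)."),
    BranchSnippet(
        criterion="acid-base", threshold=0.0,
        if_low=[TextSnippet("You have not yet reached this objective; revisit the buffer tasks in the LMS.")],
        if_high=[TextSnippet("You are on track for this objective; keep practicing with the exam-level tasks.")],
    ),
]
print(generate_feedback(flow, {"acid-base": -0.3}))
```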



SENDING FORMATIVE INDIVIDUAL FEEDBACK TO STUDENTS

At our university, the introductory chemistry courses for chemistry minors and student teachers are structured similarly. Students in these courses attend a lecture and a lab section. These lab sections offer web-based learning in the LMS (in our case ILIAS) to support students' lab preparation and to review relevant topics. During this web-based training, students are required to pass several online tests. These tests are the data basis for the LMSA Kit to estimate students' abilities in different topics, for example, the acid−base concept, redox reactions and balancing chemical equations, electrochemistry, basic organic chemistry, nomenclature, and organic mechanisms. Most of the topics are further divided into subtopics; consequently, the information gained is detailed, and the generated feedback can address students' abilities more specifically. All the tasks that students solve in the web-based training are grouped into these predefined topic criteria in the LMSA Kit. Afterward, the LMSA Kit estimates the students' ability scores in these criteria, and these ability scores are then used in the ESF Edit for the generation of formative feedback.
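As a rough illustration of this grouping step, assuming a flat CSV export with one 0/1 column per task (the actual export format and the graphical grouping dialog of the LMSA Kit may differ), tasks can be mapped to topic criteria and summarized per student as follows; the column names and criteria are invented for the example.

```python
import csv
from collections import defaultdict

# Hypothetical mapping of exported LMS task columns to topic criteria;
# in the LMSA Kit this grouping is done in the graphical interface.
CRITERIA = {
    "acid-base": ["test1_task3", "test2_task1", "test4_task2"],
    "redox and equations": ["test1_task5", "test3_task4"],
    "electrochemistry": ["test3_task6", "test4_task5"],
}

def per_criterion_fraction(export_csv: str) -> dict:
    """Average fraction of correctly solved tasks per student and criterion.

    Assumes a CSV export with a 'student' column and one 0/1 column per task.
    """
    totals = defaultdict(lambda: [0, 0])   # (student, criterion) -> [correct, attempted]
    with open(export_csv, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            for criterion, tasks in CRITERIA.items():
                for task in tasks:
                    if row.get(task, "") != "":
                        key = (row["student"], criterion)
                        totals[key][0] += int(row[task])
                        totals[key][1] += 1
    return {key: correct / attempted for key, (correct, attempted) in totals.items()}
```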

Figure 3. Template of a feedback block (left) and example of a generated feedback e-mail (right).

Block-Wise Design Template for Formative Feedback

The feedback that is generated in the ESF Edit consists of text blocks, which are set up identically for every topic (cf. Figure 3). The first paragraph names the topic and gives several examples to make it more accessible; sample tasks help students understand which aspects are currently being considered. The next paragraph explains the current performance and the individual development over the course. Additionally, the required performance in a criterion is explained, either by stating the learning objectives or by showing an exemplary task that a learner should be able to solve once the addressed competency has been acquired. It has been shown that feedback consisting solely of a grade induces superficial learning,3,16 so we avoided grade-like statements in all explanations and offered many declarative examples; instead of a grade, students are reminded of the required learning objective and of the steps necessary to fill their gaps. The last part of each feedback block informs students about additional learning opportunities that can specifically help with the topic, consisting of literature and tasks for additional training. In this way, the feedback not only informs students about their performance in the LMS but also provides support to improve their learning progress. In contrast to the feedback mechanisms already embedded in current LMSs, this new feedback monitors the progression toward mastery of the class's learning objectives or the acquisition of competencies. The e-mail that is sent contains a detailed description of the progress made toward mastery of the class's objectives based on all tasks solved within the LMS; this extends the feedback possibilities of common LMSs and goes beyond the correctness of a single task or the score in a single test. We used our chemistry lab classes for student teachers to develop and test the software and for qualitative evaluations prior to its use in large nonmajor courses. Results from the qualitative interviews and surveys (N = 30) revealed that students rated the feedback as beneficial and helpful for their exam preparation. On the basis of these qualitative evaluations, we added recommendations for further preparation to the feedback (cf. Figure 3) and provided the feedback e-mail earlier, that is, 2 weeks before the final exam.18,22
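The following sketch mimics the block-wise template described above and sends the assembled text by e-mail with Python's standard library. It is not the ESF Edit's actual implementation; the topic texts, the threshold, and the mail-server details are placeholders.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical per-topic material; in practice this content comes from the feedback project.
TOPIC_BLOCKS = {
    "acid-base": {
        "intro": "Acid-base chemistry, e.g., calculating the pH of a buffer solution.",
        "goal": "You should be able to choose a suitable buffer system for a given pH.",
        "resources": "See the buffer chapter of the course text and the extra tasks in the LMS.",
    },
}

def feedback_block(topic: str, ability: float) -> str:
    """Builds one block: topic with examples, performance, objective, recommendations."""
    block = TOPIC_BLOCKS[topic]
    performance = (
        "Your work in the LMS indicates that you have mastered this objective."
        if ability >= 0.0
        else "Your work in the LMS indicates remaining gaps in this objective."
    )
    return "\n".join([block["intro"], performance, block["goal"], block["resources"]])

def send_feedback(address: str, scores: dict, host: str, user: str, password: str) -> None:
    """Assembles all topic blocks for one student and sends them as one e-mail."""
    body = "\n\n".join(feedback_block(topic, score) for topic, score in scores.items())
    msg = EmailMessage()
    msg["Subject"] = "Your individual feedback before the final exam"
    msg["From"] = user
    msg["To"] = address
    msg.set_content(body)
    with smtplib.SMTP(host, 587) as server:   # placeholder mail server
        server.starttls()
        server.login(user, password)
        server.send_message(msg)
```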

Does Automatically Generated Formative Feedback Make a Difference?

As students in our courses perceived the automatically generated feedback as useful, we implemented it in two large-scale introductory chemistry laboratory classes with different majors: one large-cohort lab for majors in human, dental, and veterinary medicine, and the other for biology majors. The structure of these introductory classes is similar for both cohorts and did not change in the observed period. Additionally, the combination of a blended-learning lab and a lecture class is similar to other classes taught at our institute. The general structure of the class consisted of a laboratory, typically containing basic experiments in organic and inorganic chemistry, and an additional lecture teaching the theoretical background. For instance, before students conducted experiments on the effect of buffer solutions, the corresponding theory was taught in the lecture. The students attended this class in their first semester and were required to take a final exam on the topics of the lecture and the laboratory. The lecture was taught by the same professor, whereas the laboratory was supervised by different teaching assistants. The students were required to pass electronic tests in the LMS before they were allowed to enter the laboratory.

Table 1. Statistics of Final Exams with and without Automatically Generated Feedback in Introductory Chemistry

Group                                     Received Feedback   Semester     Enrolled Students   Average Score (%)   Passed (%)   Failed (%)
Human, Dental, and Veterinary Medicine    Yes                 Spring 17    381                 64.9                78           22
Human, Dental, and Veterinary Medicine    No                  Spring 16    354                 60.8                74           26
Human, Dental, and Veterinary Medicine    Yes                 Spring 15    182                 68.2                81           19
Human, Dental, and Veterinary Medicine    No                  Spring 14    177                 50.2                51           49
Human, Dental, and Veterinary Medicine    No                  Spring 13    173                 61.2                75           25
Biology                                   Yes                 Fall 16/17   114                 60.3                71           29
Biology                                   No                  Fall 15/16   133                 50.6                50           50

These tests could be redone for training purposes, and the tasks were similar to the tasks on the final exam. All students in the class received individual formative feedback, generated by the ESF Edit, 10 to 14 days before the final exam. Analyzing the statistics of the final exam scores in these introductory chemistry classes (cf. Table 1) showed promising results, as an improvement was observable when automatically generated feedback was offered to the students. There was a noticeable improvement between the spring semesters of 2016 and 2017 as well as between the spring semesters of 2013 and 2015. The success rates in these courses normally remain constant in the spring semesters when no formative feedback is provided; the spring semester of 2014 was a negative outlier because the lecture was canceled several times due to illness. The increased pass rates corresponded to Cohen's d values of 0.21 and 0.3. The improvement was notably higher in the biology lab, with an effect size of about d = 0.51, which can be interpreted as a medium effect according to Cohen23 and falls within the range of desired effects described by Hattie.24 This can be seen as a first tentative indication that the success rate improves when students receive automated formative feedback before the final exam through our software tools. In contrast, when comparing classes without any changes, for instance the spring semesters of 2013 and 2016, there was no measurable effect (d = 0.01). However, as we have not explicitly asked students to comment on their use of the feedback e-mail, we cannot rule out that the increase in performance is due to other factors. Further research is needed to verify to what extent the feedback e-mail influenced students' exam preparation and whether an increase in exam scores may be further supported by sending multiple feedback e-mails over the semester.
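For reference, the reported effect sizes follow the usual Cohen's d definition with a pooled standard deviation. The cohorts' raw exam scores are not reproduced here, so the sketch below uses placeholder values and is only meant to show the calculation.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d using the pooled standard deviation of two independent groups."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = ((n_a - 1) * stdev(group_a) ** 2 + (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Placeholder exam scores for a cohort with and a cohort without feedback.
with_feedback = [72, 65, 80, 58, 69]
without_feedback = [61, 55, 74, 50, 66]
print(f"d = {cohens_d(with_feedback, without_feedback):.2f}")
```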

The first setup of a feedback system for a course might be tedious when teachers are not experienced with the software tools. The LMSA Kit needs to be configured for the specific class and for the feedback information that should be delivered to the students; during configuration, all criteria need to be defined and tested. According to these criteria, each part of the feedback message is then edited within the ESF Edit. Configuring the LMSA Kit and editing the feedback texts may take time, especially when teachers are unfamiliar with the software. However, reusing the feedback system in a similar class can be done with a few clicks, so the necessary amount of time is reduced to a minimum after a rather time-consuming implementation phase.



LIMITATIONS

The insight given here into the effect of this new mechanism for delivering formative feedback to students is only a first evaluation. The statistical results may suggest that the increase in performance is due to the feedback, as this was the only major change in the courses. Nevertheless, further support for the usefulness of automated feedback is required, for example, through qualitative evaluations or surveys that capture students' use of the feedback in large classes. Students' grades from midterm exams might also help to examine the progression with and without feedback; unfortunately, midterm exams are not used in our courses. While the similar structure of both classes was important for this first evaluation with a large cohort, we are currently working on changes to the structure of the class. Initial and midterm assessments will be possible in coming classes to inspect the progression within different groups of students.


CONCLUSION AND OUTLOOK

The combination of our two software tools is a promising approach to supporting students in large university classes by offering individual formative feedback. Students have the opportunity to gain information on their level of competency and to receive individual learning suggestions for their problems. The results of the last years make us confident that this software represents a great opportunity to support our students, as individual support becomes feasible where it is usually not possible; the software tools make it easier to reach each student in large classes. The software is available online, and its use is free for educational purposes.20 If any problems occur, we are always pleased to receive feedback to improve the software and the instructional material.



ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.8b00173.

Guide for how to use the LMSA Kit and ESF Edit to deliver feedback to students (PDF)

AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

ORCID

Nicole Graulich: 0000-0002-0444-8609

Notes

The authors declare no competing financial interest.



REFERENCES

(1) Seery, M. K.; Donnelly, R. The implementation of pre-lecture resources to reduce in-class cognitive load. Br. J. Educ. Technol. 2012, 43 (4), 667−677.

(2) Hattie, J.; Timperley, H. The Power of Feedback. Rev. Educ. Res. 2007, 77 (1), 81−112.
(3) Shute, V. J. Focus on Formative Feedback. Rev. Educ. Res. 2008, 78 (1), 153−189.
(4) Hattie, J. Calibration and confidence: Where to next? Learn. Instr. 2013, 24, 62−66.
(5) de Bruin, A. B. H.; Kok, E. M.; Lobbestael, J.; de Grip, A. The Impact of an Online Tool for Monitoring and Regulating Learning at University: Overconfidence, Learning Strategy, and Personality. Metacognition Learning 2017, 12 (1), 21−43.
(6) Kruger, J.; Dunning, D. Unskilled and unaware of it. J. Pers. Soc. Psychol. 1999, 77 (6), 1121−1134.
(7) Costa, D. S. J.; Mullan, B. A.; Kothe, E. J.; Butow, P. A web-based formative assessment tool for Masters students. Comput. Educ. 2010, 54 (4), 1248−1253.
(8) Fautch, J. M. The flipped classroom for teaching organic chemistry in small classes. Chem. Educ. Res. Pract. 2015, 16 (1), 179−186.
(9) Teo, T. W.; Tan, K. C. D.; Yan, Y. K.; Teo, Y. C.; Yeo, L. W. How flip teaching supports undergraduate chemistry laboratory learning. Chem. Educ. Res. Pract. 2014, 15 (4), 550−567.
(10) Seery, M. K. Flipped learning in higher education chemistry. Chem. Educ. Res. Pract. 2015, 16 (4), 758−768.
(11) He, W.; Holton, A.; Farkas, G.; Warschauer, M. The effects of flipped instruction on out-of-class study time, exam performance, and student perceptions. Learn. Instr. 2016, 45, 61−71.
(12) Flynn, A. B. Structure and evaluation of flipped chemistry courses: organic & spectroscopy, large and small, first to third year, English and French. Chem. Educ. Res. Pract. 2015, 16 (2), 198−211.
(13) Ifenthaler, D. Learning Management System. In Encyclopedia of the Sciences of Learning; Seel, N. M., Ed.; Springer: New York, 2012; pp 1925−1927.
(14) Baker, R. S.; Lindrum, D.; Lindrum, M. J.; Perkowski, D. Analyzing Early At-Risk Factors in Higher Education eLearning Courses. In Proceedings of the International Conference on Educational Data Mining (EDM) (8th, Madrid, Spain, June 26−29, 2015); Santos, O. C., Boticario, J. G., Romero, C., Pechenizkiy, M., Merceron, A., Mitros, P., Luna, J. M., Mihaescu, C., Moreno, P., Hershkovitz, A., Ventura, S., Desmarais, M., Eds.; International Educational Data Mining Society, 2015; pp 150−155.
(15) Espasa, A.; Meneses, J. Analysing feedback processes in an online teaching and learning environment. High. Educ. 2010, 59 (3), 277−292.
(16) Gikandi, J. W.; Morrow, D.; Davis, N. E. Online formative assessment in higher education. Comput. Educ. 2011, 57 (4), 2333−2351.
(17) Zorrilla, M.; Menasalvas, E.; Marín, D.; Mora, E.; Segovia, J. Web Usage Mining Project for Improving Web-Based Learning Sites. In Computer Aided Systems Theory − EUROCAST 2005: 10th International Conference on Computer Aided Systems Theory, Las Palmas de Gran Canaria, Spain, February 7−11, 2005, Revised Selected Papers; Moreno Díaz, R., Pichler, F., Quesada Arencibia, A., Eds.; Springer: Berlin, Heidelberg, 2005; pp 205−210.
(18) Hedtrich, S.; Graulich, N. Crossing Boundaries in Electronic Learning: Combining Fragmented Test Data for a New Perspective on Students' Learning. In Computer-Aided Data Analysis in Chemical Education Research (CADACER): Advances and Avenues; Gupta, T., Ed.; ACS Symposium Series; Oxford University Press: Washington, D.C., 2017; Vol. 1260, pp 21−28.
(19) Pelánek, R. Applications of the Elo rating system in adaptive educational systems. Comput. Educ. 2016, 98, 169−179.
(20) More information about the LMS Analyzation Kit and Easy Snippet Feedback Edit software, how the algorithms work, and how they are configured can be found on the Education Software homepage: http://educ.science-teaching.de. There you can also find additional material and download the software.
(21) Bol, L.; Hacker, D. J.; O'Shea, P.; Allen, D. The Influence of Overt Practice, Achievement Level, and Explanatory Style on Calibration Accuracy and Performance. J. Exp. Educ. 2005, 73 (4), 269−290.
(22) Hedtrich, S.; Graulich, N. Translating Numbers into Feedback: Providing Students with Automatically Generated Feedback. Am. J. Educ. Res. 2018, 6 (2), 108−116.
(23) Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; L. Erlbaum Associates: Hillsdale, NJ, 1988.
(24) Hattie, J. A. C. Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement, reprinted; Routledge: London, 2010.
