Implementation of Online Lecture Videos in Introductory Chemistry

Chapter 6


Molly Goldwasser,1 Pamela L. Mosley,2 and Dorian A. Canelas*,2

1Office of the Vice Provost of Academic Affairs, Duke University, Durham, North Carolina 27708, United States
2Department of Chemistry, Duke University, Durham, North Carolina 27708, United States
*E-mail: [email protected]

We describe a case study involving the preparation of an extensive set of online videos to web-enhance a campus-based introductory chemistry class. Student performance and perceptions were compared for two groups: an experimental group, who could freely access the videos during the semester, and a control group, who did not have access to the videos. No statistically significant difference in performance was observed on a common final exam for these two groups. Students in the control group gave statistically significantly higher ratings for “overall quality of instruction” and “workload; amount of effort/work” on the end-of-semester formal course evaluations. Qualitative sentiment analysis revealed more positive sentiment than neutral or negative sentiment in the free response comments of both groups. Implications of differences in student perception and valuation of instructor effort in traditional live lectures versus courses that employ pre-recorded lecture videos are discussed.

© 2016 American Chemical Society


Introduction

Student attention span and mind wandering during traditional live lectures constitute substantial barriers to retention of new information (1). Videos and recorded webcasts have been shown to be particularly well suited to the lecture format because these tools can present information in an attractive manner and, more importantly, allow learners to review the lecture content at their own pace. In some contexts this leads to more successful learning outcomes (2, 3) and promotes self-efficacy (4). Advocates have suggested that using online technologies can increase student motivation and engagement and improve information processing (5–7). These are correlated with increases in conceptual learning gains (5, 6); this has been found to be especially true for relatively low-achieving students (7). To this end, the addition of online, self-paced tools, such as videos, screencasts, or podcasts, fosters a learner-centered approach for lecture-based courses.

A rapidly growing number of instructors in higher education have employed lecture videos or webcasts to supplement their classroom environments. A search of the literature reveals that, in many settings, these tools appear to have a positive, constructive impact on the academic environment and are perceived to be beneficial (4, 8–10). DeGrazia et al. observed that students supplied with optional video lectures came to class much better prepared than when they had been given textbook readings (11). He et al. examined the use of videos as a supplement to learning in an undergraduate analytical chemistry course by creating and uploading tutorial video clips about particular concepts and problems that students identified as difficult (2). Based on students’ feedback and exam performance, the researchers concluded that online tutorials are a valuable, flexible, and cost-effective tool for “improving student mastery of chemistry problem solving” (2). In experiments in which webcasts of live lectures were subsequently made available to a randomly selected subset of students, those learners reported “positive learning experiences and benefits from using webcasts,” and “more webcast viewing was associated with higher performance” (12). Finally, in cases where some students are absent for legitimate reasons, the availability of lecture webcasts or videos enables those students to “improve their course grades by viewing the lectures online” (13).

On the other hand, some studies highlight emerging insights into the challenges and disadvantages of employing video lecture technology. A few studies offered a counterpoint to Traphagan’s reports of a positive correlation between webcast viewing and grade outcomes. For example, Owston et al. found that more viewing was not necessarily associated with higher performance, and Leadbeater et al. found that lecture recordings do not have a significant impact on academic performance (7, 14). Indeed, student use of videos outside of the classroom might not always be as efficient, efficacious, and well received as instructors intend.
Through qualitative interviews with students, Cilesiz found that undergraduate learners in on-campus classes that relied heavily on recorded video lectures for content delivery moved through four stages: “ignorance, disillusionment, crisis, and coping” (15). Supplying video lectures to students can also cause concern among instructors because some believe this will negatively impact class attendance even in cases where the student has no legitimate conflict with attending


class, and some evidence exists that this does happen in some settings (12, 14). One study demonstrated that a subset of students benefitted more from attending live, in-class lectures than from viewing video lectures; the authors hypothesized that this was because of better concentration, classroom interactivity, and the viewing of in-person demonstrations (16).

After weighing the pros and cons of online technology, most modern educators agree that video lectures have the potential to be a valuable course component and to significantly change students’ college experience. However, research on best practices for video or webcast format and on the impact of various video formats on student learning and performance is still in its infancy. Researchers and course designers have begun to identify production-level decisions that impact the effectiveness of learning from online videos, such as the finding that a series of shorter videos is preferred over longer ones (17, 18). Educational psychologists have explored the effects of video format choices on important learning parameters such as cognitive load and perceived social presence (19, 20).

The study presented herein was designed as a case study investigation of the implementation of online video lectures in a campus-based introductory chemistry classroom environment. This study explored how undergraduate students perceived the lecture video resources and examined whether or not the availability of these resources affected student performance on summative assessments.

Methods

As faculty ramp up the quality of their online content and online course offerings, many seek to understand how traditional students in their on-campus, brick-and-mortar courses can also benefit from these resources. In the case described in this chapter, the instructor, who had created online videos for a series of open online courses, made these videos available to the on-campus undergraduate students enrolled in Chemistry 99D (Introduction to Chemistry and Chemical Problem Solving) at Duke University beginning in the Fall 2014 semester. These videos were part of the normal pedagogy of the course that term and were available to all enrolled students. We sought to investigate the extent to which the availability of this resource impacted not only student grades but also students’ perceptions of the class and the content.

Description of the Course and Videos

The campus course, Introduction to Chemistry and Chemical Problem Solving, has been previously described (21, 22). Since its inception in 2009, the classroom structure has combined live lectures, chemical demonstrations, and student-centered group activities. Creation of the course as part of a larger departmental curriculum revision has led to substantially higher performance and retention of students who matriculate with SAT or ACT math scores in the lowest quartile for their class (21).


In 2014, more than 80 video lectures were recorded and edited to include periodic in-video interactive questions. These videos were placed online as part of a series of short courses on the Coursera platform. Videos varied in length from under five minutes to slightly over twenty minutes, could be played with all audiovisuals at student-modulated speeds from 0.75x to 2x regular speed, and contained two types of pausing options because the ability to pause has been shown to both increase learning and increase the “likability” of videos (3). The pausing types included free pause with the ability to rewind, controlled completely by the students, and interpolated automatic pausing with interactive application questions, which has been shown to reduce mind wandering (1). Students in the Fall 2014 campus-based class were the first cohort of undergraduate students to have access to these videos, which were highlighted in the course’s unit plans on the course website underneath listings of the desired learning outcomes. Live demonstrations, in addition to those available via video, were still used in the on-campus class, and the live lectures previously employed were converted to more interactive class discussions of problems. Other aspects of the course remained as previously described.

Description of Sample

The participants in this experiment were undergraduates at a large private university in the southeastern United States. All participants were enrolled in Introduction to Chemistry and Chemical Problem Solving (Chem 99D) during the Fall 2012 (N = 69) or Fall 2014 (N = 52) semesters. No recruitment materials were used to incentivize participation. Investigators did not recruit subjects for this work because it was initially conceived as an internal assessment project, and the team did not want to introduce any selection bias. Researchers collected all final course grades, final exam scores, and the institution’s formal course evaluations for analysis.

Experimental Protocol

Researchers compared data collected from two groups of students. The control group comprised students who enrolled in Chem 99D in Fall 2012, when there were no videos corresponding to the course. The experimental group comprised students who enrolled in Chem 99D in Fall 2014, when videos corresponding to the course were freely available and listed as part of the unit plans provided to students. The same instructor taught both the 2012 and the 2014 sections of this course. Other than the addition of the supplemental videos, the syllabus for the course remained constant over time.

Researchers conducted t-tests to compare mean final exam grades between the two groups. Individual midterm exam scores were excluded from the analysis because different midterm exams were given each semester, but the final exam remained constant. Researchers also conducted t-tests to compare mean quantitative feedback from Likert-scale questions on student course evaluations. The university’s standard course evaluation items address the overall quality of the course, the overall quality of instruction, the workload/amount of effort and work


required, the difficulty of the subject matter/course, and self-reported time spent on content outside of class. To triangulate the final exam scores and the quantitative course evaluation data, researchers also analyzed open-ended questions from the course evaluations for each of these course sections to seek information about student sentiments regarding the class. Specifically, researchers conducted a qualitative sentiment analysis of the comments provided in the course evaluations to determine the extent to which comments about the videos were positive, negative, or neutral. A 5-point Likert scale was used to rate the comments, ranging from 1 (very negative) through 2 (negative), 3 (neutral), and 4 (positive) to 5 (very positive). Two impartial raters reviewed the course comments and compared their codes for each comment to increase concordance. Raters looked for key words associated with each level of the Likert scale to code responses; the raters counted key words in the response and mapped them to the appropriate rating using the coding schema.
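To make the keyword-counting procedure concrete, the sketch below shows one way such a coding schema could be implemented. The keyword lists and the neutral-default rule are hypothetical placeholders for illustration, not the raters’ actual schema, which is not reproduced in this chapter.

```python
# Hypothetical sketch of keyword-based sentiment coding on a 5-point Likert
# scale. The keyword lists below are invented examples, not the study's schema.
from collections import Counter

SCHEMA = {
    1: ["terrible", "hated", "useless"],      # very negative (assumed keywords)
    2: ["confusing", "boring", "unhelpful"],  # negative
    3: ["okay", "average", "fine"],           # neutral
    4: ["helpful", "liked", "useful"],        # positive
    5: ["excellent", "loved", "fantastic"],   # very positive
}

def code_comment(comment: str) -> int:
    """Return the Likert rating whose keywords appear most often in the
    comment, defaulting to 3 (neutral) when no keywords match at all."""
    words = comment.lower().split()
    counts = Counter({rating: sum(words.count(k) for k in keywords)
                      for rating, keywords in SCHEMA.items()})
    rating, hits = counts.most_common(1)[0]
    return rating if hits > 0 else 3

print(code_comment("The videos were helpful and I liked the pacing"))  # -> 4
```

In the study itself, two raters applied the schema independently and compared codes to increase concordance; an automated count like this would at most approximate that human process.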

Limitations

Before beginning the discussion of results, a few limitations of this work should be noted. This study was relatively small in scope and sample size (N = 69 for the control group and N = 52 for the experimental group) and was conducted in an American educational context. The undergraduate student population studied herein was fairly homogeneous in terms of age; >90% of enrolled undergraduate students were of traditional college age (under 24 years old). The students enrolled in the course were either first-year or second-year undergraduates. Due to the characteristics of the populations studied herein, caution must be exercised in attempting to extrapolate the findings to populations of more advanced undergraduate students or graduate or professional students. To protect individual confidentiality, and due to the small number of students in some groups, the data were not disaggregated by demographics.

Results and Discussion

The results of the comparison of final exam grades are depicted in Table 1. There was no statistically significant difference (p = 0.227) between the mean final exam scores of the two groups. The students who had access to the supplemental video resources did not perform, on average, any differently from the students who did not have access to the supplemental videos. These results support similar findings in other academic subjects (7, 14).

Quantitative Comparison of Course Evaluation Items

The overall quality of the course, as measured by the mean response value of the student course evaluation item, was not statistically significantly different between the two course sections (Table 1). There was also no statistically significant difference between the mean evaluation scores for difficulty of subject matter and self-reported time spent outside of class. These results could be

indicative of the consistency of the content in the syllabus and on the final summative exams. If students dedicate similar amounts of time studying similarly rigorous content, it seems reasonable that they will score similarly on the final exam, as we see with these samples.
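For readers who wish to reproduce this style of comparison, the sketch below runs a two-sample t-test of the kind summarized in Table 1. Because the study’s raw scores are not published, the arrays are simulated placeholders whose means match Table 1; the spread (scale=12.0) and the use of a pooled-variance test are assumptions for illustration only.

```python
# Minimal sketch of the two-sample t-test comparison reported in Table 1.
# The score arrays are simulated placeholders, not the study's actual data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
# Placeholder final exam scores with group means taken from Table 1
# (83.77 vs 80.89); the standard deviation of 12.0 is an assumption.
control = rng.normal(loc=83.77, scale=12.0, size=69)       # Fall 2012, N = 69
experimental = rng.normal(loc=80.89, scale=12.0, size=52)  # Fall 2014, N = 52

# Independent two-sample t-test (two-tailed, pooled variance by default).
t_stat, p_value = stats.ttest_ind(control, experimental)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# A p-value above 0.05 (the chapter reports p = 0.2270 for the real data)
# indicates no statistically significant difference in mean exam scores.
```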

Table 1. Comparison of Final Exam Scores and Student Responses on End-of-Semester Formal Course Evaluations

Item                                        Control   Experiment   p-value
Mean final exam score                       83.77     80.89        0.2270
Mean course evaluation responses:
  Overall quality of course                 4.56      4.46         0.4183
  Overall quality of instruction            4.64      4.29         0.0134 a
  Workload; amount of effort/work           4.16      3.62         0.0024 b
  Difficulty of subject matter/course       3.76      3.67         0.5973
  Time spent on content outside of class    2.93      3.14         0.1816

a Indicates significance at p < 0.05. b Indicates significance at p < 0.01.