
Chapter 9

Modern “Homework” in General Chemistry: An Extensive Review of the Cognitive Science Principles, Design, and Impact of Current Online Learning Systems

Erin E. Wilson¹ and Sarah A. Kennedy*,²

¹Chemistry Department, Westminster College, 319 South Market Street, New Wilmington, Pennsylvania 16172, United States
²Chemistry Department, Radford University, 801 East Main Street, Radford, Virginia 24142, United States
*E-mail: [email protected]

Use of online learning systems in place of traditional paper-and-pencil homework in general chemistry courses has expanded tremendously over the past decade. These systems, like paper-and-pencil homework, serve the purpose of providing students with practice in essential problem-solving skills. However, due to advances in both technology and cognitive science, online learning systems have become substantially different from paper-and-pencil homework in pedagogically important ways. In this review, we discuss the range of features of online learning systems, such as immediate feedback, multiple attempts, links to resources, adaptive technology, and mastery requirements, in light of modern cognitive science principles, to reveal both the potential for improvements in student learning and possible pitfalls. Online learning systems can be categorized as responsive, mastery-based, and adaptive. We provide a guide to this categorization and to the features built into current online learning systems for general chemistry, and we review the nascent body of literature on the impact of online learning systems on student outcomes in general chemistry and other courses. Finally, we discuss some new developments that may soon arrive on the scene for online learning systems.

© 2017 American Chemical Society. Sörensen and Canelas; Online Approaches to Chemical Education; ACS Symposium Series; American Chemical Society: Washington, DC, 2017; doi: 10.1021/bk-2017-1261.ch009.


Introduction

Early reports of the impact of computer-based homework began appearing in the chemistry education literature in 1996 (1). By 2003, online learning systems were in limited use (2). These learning systems were developed to enhance the student learning experience and to provide instructors with an alternative to paper-and-pencil homework. Online learning systems in chemistry originally served as an online alternative format for textbook questions, but they have since evolved toward designs based on cognitive science that improve learning for students and provide useful performance statistics for instructors. In recent years, a combination of cutting-edge computer science and a synthesis of the cognitive science of learning has led to significant evolution of the online learning system landscape in chemistry. Online learning systems are now designed to adapt to individual learner needs, encourage engagement with concepts over surface analysis, and help students develop self-regulation and metacognitive skills. Here, we focus on current-generation responsive online learning systems as well as the new mastery-based and adaptive learning programs available for general chemistry. Research on the cognitive science behind online learning systems is discussed, as well as the so-far limited reports of the impact of these systems and best practices for integrating these tools into the chemistry classroom.

Why Use Online Learning Systems?

Online learning systems for chemistry have been developed to provide an online platform for active engagement, or time-on-task, of students with chemistry content. While these learning systems vary widely in approach and implementation, they all share a core set of features aimed at providing the benefits of graded homework to student learning (3, 4). Many studies of online learning systems in introductory, general, and organic chemistry demonstrate that online systems provide the same time-on-task benefits as more traditional forms of engagement such as paper-based homework or quizzes (2, 5–15). Measures of performance (final exam, ACS exam, course grade, or proportion of passing students) have been shown to correlate with online learning system use and scores. In one rigorously analyzed example, higher scores on an online learning system were significantly correlated with higher final exam scores in an introductory general chemistry course, even after correcting for initial placement exam scores (prior knowledge) (5). Others report that online learning systems are as effective as (2, 6–8), or more effective than (9, 10), traditional paper-based homework. The sections below detail features of online learning systems that benefit students and instructors. Pedagogical aspects of these features are addressed in a subsequent section, "Pedagogical Underpinnings and Considerations of Online Learning Systems."

Benefits for Students
One of the major impacts of online learning systems is that students receive immediate feedback on their work, rather than waiting until an instructor or teaching assistant grades and returns it, often up to a week later. Such rapid feedback is considered one of the strengths of online learning (2, 6–9, 11–15). Feedback is sometimes limited to correct or incorrect, but most online homework systems now provide more elaborative, or explanatory, feedback. Feedback can occur as hints while a student works on a problem, as responses after a correct or incorrect answer has been submitted, or through revealing the solution after a student works on a problem or assignment.

Another common feature of online learning systems is the ability to retry problems (16). This is significantly different from paper homework, in which students complete a single attempt at each problem. In online systems, follow-up attempts can occur at the level of individual problems, topics or learning outcomes, or whole assignments. In some cases, follow-up attempts carry no grade penalty, while in other cases small or moderate penalties can be levied, at the instructor's discretion, for requiring multiple attempts to successfully complete a problem.

Many online learning systems link homework problems with learning resources. In some systems, problems include a reference to a textbook section, a hyperlink directly to the relevant section of an e-book, or a hyperlink to PowerPoint slides containing information related to the topic. Some newer online learning systems have gone one step further, integrating the textbook with the questions that students answer online. This provides students with access to just-in-time content to be reviewed as it is needed to solve homework problems. Other systems include links to step-by-step solutions, interactive tutorials, or video presentations of how to work similar problems.

More sophisticated online learning systems offer more comprehensive benefits to students at the course or student-development level. For example, several online learning systems now include diagnostic analysis of student performance. This type of analysis helps students understand which content areas need further study for mastery. Likewise, a few online learning systems have begun to incorporate a metacognitive component, providing feedback to students about the accuracy of their own perceptions of their learning state. By comparing students' pre-assessment and/or post-assessment predictions of performance with actual performance, students become more aware of discrepancies between what they have mastered and what they believe they have mastered.

Benefits for Instructors

In addition to saving instructors time otherwise spent grading homework assignments by hand, online learning systems offer the ability to analyze student performance at both the course and individual student level. Assignment statistics provide a way to determine which topics are generally understood or not understood by the students, as well as which topics required the most time to complete. Typical statistics include the fraction of students successfully completing a problem, the average time spent on a problem, and the number of attempts required to complete a problem. Some systems provide sophisticated graphics or compile lists of the most challenging topics. These features allow instructors to target classroom or recitation time to those topics.


Instructors may also use online learning systems to get an in-depth view of individual students. In most online learning systems, instructors can view each answer a student has given on a particular problem or topic, which may provide a window into the student's process that enables the instructor to target individual intervention and identify common misconceptions occurring in their classes. A few systems can allow or require students to show their problem-solving process as well as provide a solution; more often, learning systems show only the student's submitted answers.

On a broader level, the statistics reported by online learning systems can facilitate curriculum assessment. In addition to using data from standardized exams and other traditional assessment instruments, instructors can mine the data captured by online learning systems to determine strengths and weaknesses in their courses. In many online learning systems, problems can be or already are linked with specific learning outcomes, so that analysis can be performed using the learning outcomes for the course or program. Curriculum designers can analyze student performance on their learning outcomes to "close the loop" between teaching, assessment, and curriculum revision to make courses more effective (17). As course revisions are made, progress in improving student learning outcomes can be monitored through online learning systems to assess the effectiveness of curriculum changes. In addition, some learning systems provide comparison data drawn from all students who have used the learning system. Instructors can compare their classes' performance with that of thousands of students, in a manner similar to a national standardized exam but often more customized for the course.

Generalized Types of Online Learning Systems for General Chemistry

In this chapter, online learning systems are categorized as responsive, mastery-based, and/or adaptive. All of the online learning systems discussed here are responsive, because they provide immediate feedback to students on each problem, or set of problems, attempted. Increasingly, online learning systems are emerging that incorporate more sophisticated structures based on cognitive science, such as the idea of concept mastery or the ability to adapt to the learning state of individual students to provide a customized learning experience. Mastery-based learning systems have as a central feature some form of assessment for mastery. Students who do not demonstrate mastery are provided with feedback and an opportunity for instruction before attempting the assessment again. Adaptive learning systems tailor content to the individual student based on their knowledge state, which must be assessed before activities are presented. Ideally, in adaptive systems students work on material that they have not yet mastered but for which they have the necessary prerequisite skills. Some of the online learning systems require adoption of a specific textbook, others are independent of a textbook, and for some the textbook is optional. A few systems even integrate the textbook into the entire online learning experience for the student. Table 1 lists many of the most commonly used online learning systems developed for general chemistry, along with their learning system categorizations and textbook options.

Table 1. Examples of Online Learning Systems in General Chemistry Currently Available; Sorted by Publisher


| Publisher      | Learning system     | Responsive | Mastery-based | Adaptive     | Linked to textbook? |
|----------------|---------------------|------------|---------------|--------------|---------------------|
| McGraw-Hill    | Connect             | yes        |               |              | yes                 |
|                | LearnSmart          | yes        | yes           | yes          | integrated          |
|                | ALEKS               | yes        | yes           | yes          | no                  |
| Wiley          | WileyPlus           | yes        |               |              | yes                 |
|                | Orion               | yes        | yes           | yes          | integrated          |
|                | Catalyst (a)        | yes        | yes           |              | yes                 |
| Cengage        | MindTap             | yes        | yes           | yes          | integrated          |
|                | OWLv2               | yes        | yes           | optional (c) | optional            |
|                | WebAssign (b)       | yes        |               |              | optional            |
| MacMillan      | Sapling             | yes        |               |              | optional            |
| Pearson        | Mastering Chemistry | yes        |               | optional (c) | yes                 |
| Norton         | SmartWork           | yes        |               | optional (c) | yes                 |
| Madra Learning | Madra Learning      | yes        |               |              | optional            |

(a) Catalyst is being phased out, but currently remains part of WileyPlus. (b) Cengage promotes MindTap and OWLv2 for general chemistry, whereas WebAssign is recommended for math and physics. (c) "Optional" in the Adaptive column indicates that adaptivity is at the assignment level only and is optional for the instructor to assign.

Responsive Learning Systems

Responsive learning systems were originally designed to provide an online platform for assigning and grading traditional-style homework assignments or quizzes. They take advantage of the capabilities of computers to incorporate the features described above in "Benefits for Students": immediate elaborative feedback, multiple attempts to solve problems, and connections from questions to specific learning resources. These programs are responsive in that they provide immediate feedback for each question that a student answers and allow the student either to retry the problem or to attempt a similar one.


Later, in the discussion of pedagogical underpinnings of online learning systems, Table 2 lists the types of feedback provided to students in the various online learning systems, as well as the availability of multiple attempts. Strictly responsive online learning systems tend to be tied closely to a textbook (Table 1) and often use questions from the end-of-chapter material of that textbook. They may also include a library of additional problems authored by content experts that instructors may use, and/or allow instructors to author (and sometimes share) questions.

Most online learning systems rely heavily on problems very similar to those found in traditional homework. The great potential for using online learning systems to redesign practice content to reflect current cognitive science and chemistry education research is only beginning to be realized. Such redesign has started to appear in many systems in the form of multi-part interactive questions, interactive simulations or experiments, and guided tutorials. The Catalyst system, described below in "The Impact of Online Learning Systems," is designed to bridge student understanding across the molecular, symbolic, and macroscopic realms, reflecting recent chemistry education research. Another example is beSocratic, which emphasizes questions that probe deep understanding of topics by creating activities based on student-generated visualizations (graphs, chemical structures, etc.) (18).

Some online learning systems provide students with information about their learning in the context of the whole body of course material (or some smaller section of material) in addition to showing their grades on assignments. Systems like ALEKS, LearnSmart, Orion, and Madra Learning include graphical and other representations of a student's progress on learning outcomes and/or topics or book sections. This information can help students track their progress in a course, as well as set assignments into the broader context of the course material.

Responsive learning systems provide a course of action for a student based on the way he or she answers a question and may point to additional textbook reading, examples, or online content. These learning systems resemble traditional paper-based homework, however, in that they tend to be lists of problems to work regardless of the readiness of a student to address those problems. Purely responsive online learning systems, in other words, neither tailor the work presented to a student's current knowledge state nor require a student to demonstrate mastery of a topic.

Mastery-Based Learning Systems

Several of the online learning systems available for general chemistry utilize mastery learning concepts in an attempt to ensure that the student retains the knowledge learned in the program. Mastery learning, a term coined by Benjamin S. Bloom in 1971, requires instruction to include formative assessments after initial learning so that corrective exercises can be implemented for the learner (19). After a student works more problems in their deficient areas, a second assessment again gauges mastery.


At first glance, responsive online learning systems may all appear to be mastery-based because they combine feedback with multiple attempts. However, the low penalty for incorrect attempts can cause students to adopt a guess-and-check approach (13, 20, 21). In addition, seeing the same problem multiple times (perhaps with variables changed) allows students to find an approach based on surface features that yields the right answer without engaging with the concepts underlying the problem (22). Students might thus process the material in a way inappropriate to the desired learning (see the "Pedagogical Underpinnings and Considerations of Online Learning Systems" section below).

Mastery-based systems have two basic features that distinguish them from other responsive learning systems: they require the student to correctly respond to a minimum fraction of a linked set of problems about a topic for credit, and the individual problems in a set are presented differently. Together, these features impose a penalty on a guess-and-check method and discourage an approach based on surface features. When too many problems in a set are answered incorrectly, the student is prompted to review the material prior to attempting a new set. Students must meet the required threshold on a set of problems before they receive credit for the assignment. This severely penalizes a guess-and-check approach by vastly increasing the time cost of incorrect responses (23). Furthermore, varying the style of problems changes their surface features, making it less likely that a problem-solving strategy based on the surface features of one particular problem presentation will succeed on other versions, and forcing students to grapple with the content in a deeper, more expert-like way (24, 25).

Several of the online learning systems reviewed in this chapter are mastery-based according to one or both of these criteria. The Catalyst mastery assignments in WileyPlus present groups of questions based on a topic, and students must successfully answer a specified percentage for full credit. If the student does not reach the specified level of successful completion, they may begin again with a new set of questions until they run out of attempts at the group. The questions all center on the same topic but can differ in format. Similarly, OWLv2 presents topics as groups of related questions with significant differences in format to decrease pattern-based memorization of solution strategies and encourage conceptual understanding. Students must complete a minimum number of the questions from the group to achieve mastery. Failure to complete that number of questions results in the presentation of a fresh set of questions for another attempt. MindTap presents mastery checks after several textbook sections with embedded problems have been completed. Students must give correct responses to a specified number of sets of grouped problems, usually with at least one extra set available, to earn credit for mastery of topics. In this case, each set of grouped problems may be attempted multiple times, but the number of sets a student may attempt is fixed. ALEKS, another learning system that incorporates features of mastery, approaches this somewhat differently, focusing on retention over time. Students are credited with having "learned" a topic if they complete a series of similar questions on that topic. The exact number of questions they must successfully complete is adjusted based on the pattern of correct and incorrect answers submitted; thus a student demonstrating proficiency with a skill will complete fewer problems than one who is not yet proficient.


Every 10–14 days, students complete a learning assessment to determine their overall mastery of the course material. Based on the results of the assessment, they may be required to revisit previously completed topics that they have not retained over this longer time frame. This is intended to help students retain knowledge and skills over time, as well as to discourage a "cram and forget" strategy in the course.

Widely available mastery-based online learning systems are a recent development in online learning in chemistry, and relatively few independently published studies have examined their impact on student performance. One study in the closely related field of physics compared immediate-feedback homework with unlimited attempts to a mastery-style online program; students in the mastery group outperformed the immediate-feedback group on every question in a delayed post-test (23).
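To make the mastery mechanics described above concrete, the following is a minimal sketch of the threshold-and-fresh-set logic shared by systems such as Catalyst, OWLv2, and MindTap. All names (the question-bank structure, draw_set, run_mastery_assignment) and the threshold and set sizes are illustrative assumptions, not any vendor's actual API or defaults.

```python
import random

def draw_set(bank, topic, size, rng):
    """Draw a fresh set of questions on one topic, varying the format."""
    return rng.sample(bank[topic], size)

def run_mastery_assignment(bank, topic, answer_fn,
                           set_size=5, threshold=0.8, max_sets=3):
    """Credit the topic only if one set meets the mastery threshold.

    A failed set costs time rather than partial credit: the student
    is routed back to review and then receives a *new* set of
    questions, which penalizes guess-and-check strategies.
    """
    rng = random.Random()
    for attempt in range(1, max_sets + 1):
        questions = draw_set(bank, topic, set_size, rng)
        correct = sum(answer_fn(q) for q in questions)  # answer_fn -> bool
        if correct / set_size >= threshold:
            return {"topic": topic, "mastered": True, "sets_used": attempt}
        # Below threshold: prompt review, then present a fresh set.
    return {"topic": topic, "mastered": False, "sets_used": max_sets}
```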

Adaptive Learning Systems

Another relatively new development in online learning is the availability of systems capable of detecting and adapting to the knowledge state of the individual learner. Most online learning systems present students with pre-built assignments relevant to the topics covered in a course; all students therefore work the same problems, regardless of their background and current knowledge state. However, it has long been recognized in chemistry that a student's prior knowledge should be taken into account in instruction (26). Adaptive systems use initial and ongoing formative assessment of each student to determine their current knowledge state, either within the global framework of all course content (ALEKS) or within a subset of the content (Orion, LearnSmart, and some optional assignments in OWLv2, Mastering Chemistry, and SmartWork). Based on assessment data, these learning systems respond uniquely to each student, tailoring the problems a student receives and the materials that student is prompted to review based upon specific areas of individual strength and weakness. These systems prevent students from moving to topics for which they are not yet ready, and allow them to progress faster if they have already mastered some of the content. Thus, the programs designed with adaptive learning at the core (LearnSmart, ALEKS, Orion, and MindTap) can also be considered programs that require mastery.

According to Oxman and Wong, adaptive learning systems must have a content model, a learner model, and an instructional model (27). The content model is a map of the content, including what specific content is encompassed and how content is interrelated; a simple example would be that understanding the mole concept should occur prior to attempting to learn molarity. The learner model is the model of the student's knowledge state (expressed in terms of the content model), based on the data collected by the learning system from initial assessments and ongoing work in the system. The content model and learner model together place the student within the framework of the content, and the instructional model uses both to formulate an individualized response to the student.
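As a rough illustration of how these three models interact, the sketch below encodes a content model as a prerequisite map (using the mole-before-molarity example above), a learner model as a set of mastered topics, and an instructional model as a rule that selects the topics a student is ready to learn. The names and data structures are assumptions for illustration, not the design of any specific system.

```python
# Content model: each topic maps to the set of its prerequisite topics.
PREREQS = {
    "mole concept": set(),
    "molarity": {"mole concept"},   # mole concept must come first
    "dilution": {"molarity"},
}

def ready_topics(prereqs, mastered):
    """Instructional model: topics not yet mastered whose prerequisites
    are all satisfied; topics with unmet prerequisites stay locked."""
    return [t for t, reqs in prereqs.items()
            if t not in mastered and reqs <= mastered]

# Learner model after an initial assessment.
mastered = {"mole concept"}
print(ready_topics(PREREQS, mastered))   # -> ['molarity']
```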


Knowledge Space Theory is the mathematical theory employed by ALEKS (28–32). Initial assessment of each learner maps their existing knowledge against the knowledge space. Based on the assessment, ALEKS selects topics for which that particular student has the prerequisite knowledge but has not yet demonstrated mastery. Topics for which a student has not demonstrated knowledge of the prerequisites are locked until the prerequisites are completed. Perhaps the most distinctive and powerful aspect of ALEKS is the periodic reassessment of the student's knowledge, which provides a unique focus on retention of knowledge over time. Topics that a student is unable to successfully complete in these reassessments are removed from their knowledge state and must be readdressed by the student.

Orion and LearnSmart are adaptive online study tools that can be assigned by instructors or used independently by students. In both systems, the learner model includes not only the record of questions correctly and incorrectly answered by a student, but also the total time spent answering questions and the student's confidence in their knowledge and skills for each question. After determining the knowledge state of a student within a content area, typically a chapter or chapter section from a textbook, the system makes specific recommendations of materials for the student to review. It also tailors the questions that students see after reviewing content to match the topics for which they were not proficient. LearnSmart incorporates prompted reviews of older material to encourage retention, based on what material was initially more difficult for the student and on the topics for which the student was least confident.

A similar capability is built into OWLv2 as well, in the "Adaptive Study Plan" assignment option. Students take an initial assessment quiz on a chapter or chapter sections and are guided to study materials based on their performance on the quiz. Follow-up quizzes focus on the sections on which the student was not previously proficient. Mastering Chemistry and SmartWork give the instructor the capability to add extra assignments through an adaptive learning feature. These assignments are generated based on topics from common assignments on which students demonstrated deficiencies. While this can be considered adaptive, such assignments are created solely at the discretion of the instructor and may be limited by the number of available textbook problems. These optional adaptive assignments do not provide course-level adaptivity like that of the ALEKS, Orion, and LearnSmart systems.

Two studies to date have investigated the difference in learning gains between responsive online learning systems and the adaptive system ALEKS. In both studies, the learning gains were higher using the adaptive program, a gain attributed to the ability to map the student's knowledge space and provide needed review for topics a student has not mastered (33, 34). More peer-reviewed studies are needed to investigate the effects of mastery-based and adaptive programs on learning outcomes in general chemistry.
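The periodic-reassessment idea behind ALEKS-style retention checks can be sketched as pruning the learner model on a schedule: topics a student can no longer demonstrate are removed and must be relearned. The 12-day interval (within the 10–14 day window described above) and all names here are illustrative assumptions; the actual ALEKS algorithm, based on Knowledge Space Theory, is considerably more sophisticated.

```python
import datetime as dt

REASSESS_EVERY = dt.timedelta(days=12)  # within the 10-14 day cadence

def reassess(mastered, check_results, last_check, now):
    """Prune topics the student failed to retain on a scheduled check."""
    if now - last_check < REASSESS_EVERY:
        return mastered, last_check          # not due yet; no change
    retained = {t for t in mastered if check_results.get(t, False)}
    return retained, now

mastered = {"mole concept", "molarity"}
results = {"mole concept": True, "molarity": False}  # failed on recheck
mastered, last = reassess(mastered, results,
                          dt.datetime(2017, 1, 1), dt.datetime(2017, 1, 20))
print(sorted(mastered))                      # -> ['mole concept']
```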



The Impact of Online Learning Systems

The impact of online learning systems on student learning in non-chemistry STEM fields is generally neutral or positive. In math and physics, some evidence suggests improved student performance when using online homework rather than paper-based homework. For example, a year-long study of college algebra students showed increased class retention and higher exam scores in the student group using online homework rather than paper-based homework (35). In this study, the paper-based homework was graded for completeness and correctness with comments for the students, while the online homework allowed multiple attempts with immediate feedback and more extensive online help. In an engineering course, Liberatore reports student learning gains when an online homework program (Sapling) was used instead of online Blackboard quizzes (36). Student groups were assigned textbook homework and either the additional online Sapling homework or comprehension quizzes. Over 90% of the students using the online homework system earned at least a C in the course, whereas only 72% of the quiz control group did the same, indicating a positive effect of online homework (36). In contrast, a study by Bonham suggests that rigorously reviewed hand-written homework had an impact equal to that of online homework in an introductory calculus-based physics course (37).

Reports of outcomes from online learning system use in chemistry are equally ambiguous. Cole and Todd reported no difference in outcome for students in general chemistry using a course management system-based online delivery of homework vs. a paper-based delivery of the same homework (2). This was true even though the written homework was ungraded (vs. immediate feedback online), although students from the paper-based groups may have accessed the online system. Likewise, Fynewever reported no significant differences in gains between paper-based homework (80.4%) and course management system-based assignments (86.3%) in general chemistry using a pre- and post-test (6). Both of these studies used the WebCT (now Blackboard) course management system to adapt problems to an online format. Long-term retention of first-semester general chemistry knowledge between course sections that did and did not use online learning systems was investigated by Gebru et al. (8). In this study, sections using WebAssign or OWL scored 2% higher on the first-semester portion of the ACS Exam, a statistically insignificant difference. On the other hand, Eichler and Peeples found significant improvement in final exam scores for students who used Mastering Chemistry or ALEKS in a general chemistry course (33). However, in this case students opted into or out of use of the online learning systems, suggesting that more motivated students may have disproportionately populated the online learning group. In contrast, Arasasingham et al. compared course sections using graded textbook problems and an independently designed version of an online learning system (now Catalyst) for general chemistry. The section using the online learning system outperformed the textbook section on two out of three exams, and further analysis suggested that they had especially strong gains in conceptual understanding compared to their counterparts (5). A similar outcome was observed by Malik et al. in organic chemistry using Connect, where the online learning system group significantly outperformed the graded paper-based homework group on the ACS Organic Chemistry Exam (9). It is worth noting that more of the studies reporting positive effects of online learning systems involve more modern systems, which may reflect improvements in online learning system efficacy.

Several of the published reports on the impact of online learning systems have noted that there is a sub-population of students who benefit more from online homework. One study of a college algebra course found that, while the performance of all students using online homework and traditional homework was not statistically different, lower-skilled students using online homework showed greater learning gains than lower-skilled students who used traditional homework (38). Likewise, online learning systems have been reported to increase retention or success rates in chemistry courses even when overall average exam scores do not change (8, 14). In one case this was seen as a shift from D and F grades to B and C grades, while the proportion of A grades did not change (14). The overall success rate in the course, measured by grades above a D, increased from 71% to 90% in this case. The author concluded that online learning systems particularly helped students who struggled to succeed in the course. This effect exists in the affective domain as well. Cole and Todd reported that students scoring lower on the Group Assessment of Logical Thinking (GALT) reported higher satisfaction with and preference for an online learning system over paper-based homework, while the opposite was true for students scoring higher on the GALT (2). The authors speculated that the multimedia presentation of the assignments may have facilitated learning for the low-GALT students, while the high-GALT students found navigating the technology to be an unnecessary complication. There are also multiple reports in the literature that implementing online learning systems spurred changes in student behavior, such as more regular engagement with material and using feedback to learn from mistakes (20, 21).

The discipline of chemistry presents unique learning challenges to students. Students must comprehend, connect, and translate between macroscopic, particulate, and symbolic representations of phenomena (28, 39–49). Understanding each level, and especially making connections between levels, develops slowly in novice chemistry learners (28, 40, 46, 47, 50–53). Online learning systems can take advantage of multimedia capabilities to help students develop mental models in targeted ways. One online learning system, currently known as Catalyst, was deliberately designed to address the molecular, symbolic, and macroscopic representations and to help students build connections between them (10). Students using an early form of this online learning system not only showed greater overall learning gains than students doing traditional homework, but also learned differently, with early gains in conceptual areas compared to their counterparts (10).

Finally, one area that has been identified as a weakness of online learning systems is the loss of important motor-memory learning that comes from handwriting structures. Students performed better in organic chemistry when they reported consistently doing homework first by hand before entering answers into an online learning system (54).
Although this is currently a limitation of online learning systems, it may be anticipated that these systems will soon allow hand-drawn structures to be entered via touchscreens (18, 55, 56).


Challenges to Student Learning in General Chemistry

Several different challenges to student learning in chemistry have been identified and described by chemistry education scholars. These barriers to learning, such as the multiple-domain nature of the discipline and persistent misconceptions, hinder the acquisition of knowledge and skills in the discipline. In addition, cognitive science has identified more generalized challenges to learning, such as cognitive overload and underdeveloped metacognitive knowledge monitoring. Just as instructional and assessment strategies must be designed to address these challenges and facilitate student learning, so must online learning systems.

In chemistry, multiple domains of understanding have been described: the symbolic, particulate, and macroscopic domains (39, 42–45). An expert in chemistry transitions easily between these domains, and the goal of chemical education is to move novice learners toward this multidomain understanding. A challenge in chemistry education, however, is that typical formative and summative assessments still tend to focus on the symbolic/numeric domain, with an underlying assumption that success on these assessment items reflects understanding of the other domains. Yet it is well documented that the ability to solve macroscopic domain problems does not indicate understanding in the other domains (28, 40, 46, 48, 49, 57). Instead, it is possible to build algorithmic strategies that solve macroscopic problems without the deeper understanding (24, 25, 40, 46, 53, 58, 59). This is dependent on instruction, as instruction deliberately designed to emphasize conceptual understanding has been shown to be effective (10, 40, 60). Instruction and practice therefore must deliberately target all domains of understanding.

Another persistent challenge to building deep understanding in chemistry is misconceptions, or alternate conceptions, many of which have been well studied (47, 61–63). According to constructivist learning theory, learners construct knowledge by integrating new information with existing knowledge and experience (64). However, it is possible for learners to construct a flawed but entirely self-consistent and believable framework. These incorrect frameworks can be resistant to correction; confrontation with a contradictory piece of information is usually necessary (9). This creates so-called cognitive dissonance and causes the learner to seek to resolve the contradiction by reconstructing a new understanding. One role of well-designed homework is to address common misconceptions by creating such cognitive dissonance; specific feedback can then provide new information to help build an improved understanding.

Another facet of cognitive science that should be considered in the design of online learning systems for general chemistry is cognitive load (39, 42–45, 65). It is widely accepted that learners process and synthesize information in a limited working memory. The working memory takes in and processes information (visual, auditory) from the environment as well as retrieving and storing information in long-term memory, much as a computer processor handles user input and retrieves and stores information on the hard drive. It is in the working memory that information and ideas are synthesized into knowledge that a person then learns in a meaningful way.


Working memory is very limited in its capacity to hold information; only a handful of different items can be stored and manipulated at once (42). The number of items that must be juggled in working memory during a task is known as the cognitive load. This limitation creates difficulties for learners attempting to construct knowledge from many separate or interacting elements. Intermediate or expert learners of a subject form what are called schema, frameworks of organized knowledge that allow multiple items to be handled by the working memory as a single item (65–67). The development and use of schema reduces the cognitive load required to perform a task. Novice learners have not yet formed schema, and so are particularly vulnerable to cognitive overload, in which there is too much information for the working memory to both hold and process. Cognitive overload impairs learning, so avoiding it is an important part of the design of any task given to students, including online learning systems (68, 69). Presentation of content, question design, and the guidance or feedback provided can each either help to reduce cognitive load or contribute to cognitive overload.

A sometimes overlooked contributor to student success in learning chemistry is metacognition. Working to mastery is something that expert learners self-regulate; these learners use opportunities, including homework assignments, to measure themselves against an achievement standard and self-correct deficiencies in their learning until they meet the standard (70). This process of continually comparing one's knowledge to an appropriate standard is known as metacognitive knowledge monitoring (MKM) (70). Novice learners often lack the skills to (a) set an appropriate learning goal at the required depth of understanding, (b) evaluate their learning accurately against a standard, and (c) determine and carry out appropriate corrective action. Accurate MKM allows learners to apply learning strategies, time, and effort in areas in which their performance does not meet their goals and, conversely, not to waste effort continuing to study in areas for which the desired level of mastery has been reached. Such learners are also likely to predict their performance on assessments accurately. A learner who underestimates the level of understanding required, or overestimates their own understanding, does not continue to practice and study, because they believe they have achieved a higher degree of mastery than they have. It has been consistently observed across disciplines that weaker students overestimate their knowledge and skills, a phenomenon called the Dunning-Kruger effect (71–77). Often this kind of misconception of self must be confronted with evidence for students to begin to develop MKM skills and judge their knowledge state more accurately. Interventions in which students are asked to compare their own predictions of performance against actual performance have been found to help students develop their metacognitive (MKM) skills (78).

Pedagogical Underpinnings and Considerations of Online Learning Systems

Online learning systems originally aimed to improve student learning by providing the benefits of practice with content exercises generally similar to traditional homework.


Leinhardt et al. found in 2007 that homework and self-directed study accounted for nearly 50% of achievement in a college-level chemistry course, making this out-of-class activity one of the most important for student learning in chemistry (4). Not only does homework improve knowledge and skills related to the content, but well-designed homework has been found to foster multidomain understanding (10), confront misconceptions (when feedback is provided), and even help develop self-regulated learning skills (79). Pedagogically sound design of homework systems, and effective use of them by instructors and students, is therefore potentially a significant factor in student success in general chemistry.

The basic model of online learning systems for general chemistry is the attempt-feedback-reattempt loop described by Zerr in 2007 in the context of an online learning system for calculus (16). This design was inspired by the pattern of classroom activity in which students attempt a skill, receive feedback from the instructor, and then re-attempt the skill until they are able to complete it successfully. It is broadly based on Bloom's mastery-based learning, in which formative assessment is followed by additional feedback and correction as necessary before further assessment (19, 80). In this section, pedagogical considerations for the timing and type of feedback, multiple attempts, and content presentation are discussed. Then, attempts to design online learning systems to improve student self-regulation and to harness multimedia for learning are presented. Table 2 indicates several of these pedagogical features for specific online learning systems.
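The attempt-feedback-reattempt loop can be expressed schematically as follows. This is a generic sketch with hypothetical callbacks (grade checks an answer, feedback returns an elaborative hint, similar_problem regenerates the task with new surface features), not the design of any particular system.

```python
def attempt_feedback_reattempt(problem, get_answer, grade,
                               feedback, similar_problem, max_tries=3):
    """Loop until the problem is solved or attempts run out,
    giving elaborative feedback after each incorrect attempt."""
    for attempt in range(1, max_tries + 1):
        answer = get_answer(problem)
        if grade(problem, answer):
            return {"solved": True, "attempts": attempt}
        print(feedback(problem, answer))    # elaborative, not just wrong/right
        problem = similar_problem(problem)  # fresh variant for the reattempt
    return {"solved": False, "attempts": max_tries}
```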

Timing and Type of Feedback

Formative feedback is broadly regarded as an effective and important component of the learning process. In her seminal work on feedback in learning, Shute describes three ways that feedback can facilitate learning (81). First, feedback informs the learner that there is a gap between their performance and the desired knowledge state (MKM), increasing motivation to bridge the gap. Second, well-designed feedback can reduce cognitive load by picking out and organizing the relevant information for the learner, freeing working memory for processing information. Third, and most obviously, feedback can create useful cognitive dissonance by confronting student misconceptions and errors as well as providing correction.

Feedback on online learning tasks can take many different forms and be provided at different times (82). Although a significant body of research exists on the effects of different types and timing of feedback, mixed results have hindered consensus on best practices (81). Feedback interacts with many other factors, such as the motivation of the learner, the complexity and nature of the task, and the stage of learning (novice or more advanced). Some of the factors relevant to the provision of feedback in online learning systems are discussed below.


Table 2. Feedback Types and Method of Incorporating Multiple Attempts Used by Responsive Online Learning Systems (a)

| Learning system     | Correct response shown     | Multiple attempts type | Optional hints | Step-by-step tutorials | Diagnostic feedback | Metacognition feedback | Learning state feedback |
|---------------------|----------------------------|------------------------|----------------|------------------------|---------------------|------------------------|-------------------------|
| Sapling             | yes, with explanation given | same problem          | yes            | yes                    | yes                 | no                     | no                      |
| Mastering Chemistry | yes                        | same problem           | yes            | yes                    | yes                 | no                     | no                      |
| Connect             | yes, with explanation given | same problem          | yes            | no                     | yes                 | no                     | no                      |
| WebAssign           | yes                        | same problem           | no             | yes                    | yes                 | no                     | yes                     |
| MindTap             | yes, with explanation given | new set of problems   | no             | yes                    | no                  | no                     | yes                     |
| WileyPlus           | yes, with explanation given | same problem          | yes            | yes                    | yes                 | no                     | yes                     |
| Orion               | no                         | same problem, delayed  | yes            | yes                    | no                  | yes                    | yes                     |
| Catalyst            | no                         | new set of problems    | no             | yes                    | no                  | no                     | yes                     |
| SmartWork           | yes, with explanation given | same problem          | yes            | yes                    | yes                 | no                     | no                      |
| LearnSmart          | yes, with explanation given | same problem, delayed | no             | no                     | yes                 | yes                    | yes                     |
| OWLv2               | yes, with explanation given | similar problem       | no             | yes                    | yes                 | no                     | yes                     |
| ALEKS               | yes, with explanation given | similar problem       | no             | no                     | very limited        | no                     | yes                     |
| Madra Learning      | yes                        | same problem           | no             | no                     | no                  | yes                    | yes                     |

(a) Some features are dependent on content adoption within each online learning system; the highest level of feature availability is reflected in these entries. Entries are based on author evaluation of online learning systems from fall 2016 through summer 2017.


The form of feedback in most online learning systems is elaborative; that is, more information is provided than simply whether a response is correct or incorrect. There is consensus across many studies that elaborative feedback is more effective in promoting learning than simple correct/incorrect feedback (83–85). The exact form of this feedback is highly variable; Table 2 shows the types of feedback used in the online learning systems for general chemistry, including optional hints, diagnostic feedback, step-by-step tutorials, and full explanations or worked examples. These forms of feedback exist along a continuum of specificity, with hints, diagnostic feedback, and step-by-step tutorials providing more support by pointing out errors or key features, and full explanations or worked examples relying on students to process and pick out key features themselves (86).

All of these types of formative feedback have been investigated, and a few general trends have been observed (81). Novice learners in a content area benefit from more explicit feedback, such as exposure to expert solutions to problems, scaffolding, or diagnostic feedback (85, 87, 88). More advanced learners may benefit more from less explicit feedback, even to the point of a simple correct/incorrect, as they begin to develop their own diagnostic skills and organize conceptual information into a scaffold (65, 89). Thus, different online learning systems may be more appropriate depending on the population of students. Most online learning systems in general chemistry provide multiple forms of elaborative feedback through hints, guided tutorials, diagnostic feedback after incorrect answers, and/or full explanations or worked solutions (Table 2). Among the systems reviewed, only Madra Learning shows the correct answer without elaboration. Some systems show diagnostic feedback upon submission of an answer (Sapling, Mastering Chemistry, SmartWork, LearnSmart, Connect, WebAssign, WileyPlus). Some offer hints, step-by-step tutorials, or full solutions on demand (ALEKS, Mastering Chemistry, Sapling, SmartWork, Connect, WileyPlus, Orion), which gives students control over the feedback they use. Others offer step-by-step tutorials as assigned problems (e.g., Mastering Chemistry, Sapling). These levels of elaborative feedback are appropriate to the novice level of beginning chemistry learners.

The timing of feedback in most online learning systems is immediate, occurring after an attempt at a problem is completed. The literature on feedback timing suggests that this is likely the most appropriate timing for novice chemistry students doing homework exercises, which largely involve procedural skills (81). Immediate feedback can relieve frustration and facilitate the learning of process skills in novice learners (85). Such students can often benefit from hints as they are attempting a problem, or from viewing a detailed solution before reattempting a problem they failed to answer correctly (65). Zerr claims that the combination of immediate feedback and multiple opportunities for practice with similar problems (the attempt-feedback-reattempt loop) mimics typical student-instructor interaction (16). Delayed feedback, in this case feedback that is provided after a set of questions rather than after each one, may confer different benefits, such as transfer skills (81).
Multiple studies have found that this kind of delayed feedback results in higher retention of information and skills than immediate feedback, even though immediate feedback produces better initial performance (90, 91). While most of the online learning systems reviewed in this work provide immediate elaborative feedback, some provide delayed elaborative feedback after a group of problems, or provide feedback such as worked examples only on demand; this is particularly the case for mastery-based online learning systems.


It should be mentioned that feedback has the potential downside of short-circuiting a learner's developing MKM skills. Pascarella noted this effect in a study of an online physics learning system in which student problem-solving behavior was characterized (13). Compared with students completing paper-based homework, students using the online system were less likely to evaluate their final answer, instead letting the computer tell them whether the answer was correct. Reduction in metacognition has generally been recognized as a weakness of feedback, particularly feedback provided immediately after answering a question (81). However, feedback may be deliberately designed to increase MKM skills (92) (see "Use of Online Learning Systems To Improve Metacognition" below).

Multiple Attempts

All the online learning systems reviewed in this chapter incorporate, in some fashion, multiple attempts to solve similar tasks. At face value this is a useful feature; it is generally accepted that more practice with content and skills enhances student recall and proficiency. However, it has long been recognized in the science-teaching community that, unlike expert problem-solvers, who categorize problems according to the concepts involved in order to devise solution strategies, novices often employ approaches that rely only on the surface features of a problem and that do not promote increased proficiency with the concepts of the discipline (24, 25). The availability of multiple attempts at the same or very similar problems in online learning systems might exacerbate that problem by reinforcing the use of surface-feature-based methods, particularly when combined with correct/incorrect feedback rather than feedback that guides student efforts. One study of student usage patterns of an online physics learning system showed that some students switched to a more novice-like approach to problem solving after transitioning from paper-based to online homework for which multiple attempts were allowed and correct/incorrect feedback was provided (13). This phenomenon was also noted in a study of the impact of an online learning system on student behavior in organic chemistry (21). In that case, 39% of students reported using multiple attempts to guess rather than analyzing their initial incorrect answer for increased understanding.

Despite the potential problem of de-incentivizing a systematic, concept-based approach to problem solving, multiple attempts give students the chance to correct mistakes made on initial attempts or to try a strategy presented in diagnostic feedback or a worked solution (the attempt-feedback-reattempt loop) (16). This is in contrast to traditional paper-based homework, for which even the best feedback may not be attended to, because no incentive for further practice is provided beyond the intrinsic motivation of the student.

Downloaded by COLUMBIA UNIV on October 31, 2017 | http://pubs.acs.org Publication Date (Web): October 26, 2017 | doi: 10.1021/bk-2017-1261.ch009

One thing that online learning systems almost universally do is provide elaborative feedback to students after submission of initial answers. Thus, even if students guess or pursue a novice-like strategy for the initial answer, well-designed feedback can guide their thinking (81). One potential drawback is overreliance on such help, which can leave students overconfident (81). A related strategy is to provide access to expert-like solutions between submissions, with little feedback while a problem is being attempted. The presentation of expert-like solution strategies has been found to support novice learners (87), and this approach may allow a student to check their understanding of the solution while attempting a problem independently.

Another simple way to discourage a guess-and-check strategy involves careful design of online learning items. Questions with limited numbers of possible answers, such as multiple choice, lend themselves to a successful guess-and-check strategy to a greater extent than numerical problems. Another way to accomplish the same thing is to generate a new problem after each unsuccessful attempt, preventing students from using a process of elimination. A grade penalty based on the number of attempts can also discourage students from using multiple attempts to guess-and-check answers. Many current-generation online learning systems allow the instructor to set homework policies, including the number of attempts allowed and the amount of credit lost for each incorrect response. If additional attempts are penalized, students have an incentive to answer correctly on the initial attempt. However, this approach may also lead to student frustration and disengagement, and it may send the message that mistakes are not a normal or acceptable component of learning.

Finally, some online learning systems have re-envisioned the homework assignment with an emphasis on students demonstrating mastery of material. This is usually manifested as a requirement for a student to correctly respond, to some minimum standard, to several problems bundled within an assignment or topic. Incorrect responses on too many questions result in the student having to “start over” with a new bundle of questions. This approach penalizes guessing while still allowing students to use feedback between attempts to improve their performance. In a study of a system using this approach in physics, students were initially slower to learn material due to the delay of feedback, but demonstrated improved retention of the material on a post-test (23). Learning systems that employ this strategy include Catalyst, OWLv2, and ALEKS.
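To make these grading policies concrete, the following Python sketch models a per-attempt credit penalty and a mastery-bundle check of the kind described above. It is a minimal illustration under assumed parameters (a 10% penalty per extra attempt and a 4-of-5 mastery threshold); none of the named systems publishes its internal scoring code, and every function name here is hypothetical.

# Illustrative sketch, not any vendor's implementation, of two common
# homework policies: a per-attempt grade penalty and a mastery "bundle"
# that restarts when too many answers are wrong.

def attempt_credit(attempts_used, penalty_per_attempt=0.10):
    """Credit for a correct answer after `attempts_used` tries.

    The first attempt earns full credit; each later attempt deducts a
    fixed fraction, never dropping below zero.
    """
    return max(0.0, 1.0 - penalty_per_attempt * (attempts_used - 1))

def bundle_mastered(results, min_correct=4):
    """Mastery check over one bundle of problems.

    `results` holds correct/incorrect flags; if fewer than `min_correct`
    are right, the student restarts with a fresh bundle of similar,
    newly generated problems.
    """
    return sum(results) >= min_correct

# A correct answer on the third attempt earns 80% credit, and a bundle
# with only 3 of 5 correct must be retried.
print(attempt_credit(3))                                   # 0.8
print(bundle_mastered([True, True, True, False, False]))   # False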

Use of Online Learning Systems To Improve Metacognition

Three of the learning systems reviewed here incorporate MKM feedback by asking students to judge their confidence on a problem as well as provide an answer to the problem. Wiley’s Orion study tool and McGraw-Hill’s LearnSmart both have students rate on a 4-point scale how confident they feel about each question. After a student completes a group of questions, both systems generate a report in which the student receives feedback about how well their confidence matches their knowledge (that is, the ability to answer the question correctly). A student with good MKM skills will be confident most often when their answers are correct and not confident when their answers are incorrect. Madra Learning asks students both to rate confidence and to predict performance prior to an assessment. Madra also asks students to postdict their performance after taking the assessment but prior to seeing the score. Students receive a report showing their pre- and postdiction versus their actual performance. This confronts students with information demonstrating any gap between their perceived state of knowledge and their actual state of knowledge for that type of assessment. Such metacognitive feedback can inform students if they are underconfident (often unsure but correct) or overconfident (confident but incorrect), helping them develop this key metacognitive skill. All three of these online learning systems also provide a breakdown of performance on different topics in the assessment or course and links to integrated or external study tools. This provides support in the area of taking appropriate corrective action, one of the steps in self-regulated learning (70). A recent Madra Learning white paper reports the use of Madra pre-tests to improve MKM skills (93). Over the course of four pre-tests for the first exam, students improved from a 13-17% overestimation of their performance on the first pre-test to <5% overestimation on the fourth pre-test. In addition, students also took pre-tests for the final exam, on which they ranged from a 2% overestimation of their performance to an almost 5% underestimation. Although this result has the potential limitation of having been obtained using very similar assessments, and therefore does not investigate transfer of learning to new settings, it is strong evidence for improvement in MKM skills, at least for this type of assessment.
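At bottom, this calibration feedback is a comparison of predicted and actual performance. The Python sketch below shows one way such a report could be computed; the ±5-point band for "well calibrated" is an assumed cutoff chosen for illustration, not a threshold taken from Orion, LearnSmart, or Madra Learning.

# Minimal sketch of MKM calibration feedback: compare a student's
# predicted score with actual performance. Data format and the 5-point
# band are assumptions for illustration only.

def calibration_bias(predicted_pct, actual_pct):
    """Positive values indicate overconfidence; negative, underconfidence."""
    return predicted_pct - actual_pct

def calibration_report(predicted_pct, actual_pct):
    """Generate the kind of plain-language feedback described in the text."""
    bias = calibration_bias(predicted_pct, actual_pct)
    if bias > 5:
        return f"Overconfident by {bias:.0f} points: review further before testing."
    if bias < -5:
        return f"Underconfident by {-bias:.0f} points: trust your preparation."
    return "Well calibrated: your prediction matches your performance."

# Example: a student predicts 85% but scores 70%, an overestimation of 15 points.
print(calibration_report(85, 70))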

Multimedia and Interactive Computer Capabilities

Computers can assist student learning in multiple ways. It is well known that multimedia presentation can help students conceptualize in chemistry, where much of the content is not directly observable (94–97). Well-designed illustrations, animations, and simulations can also help students connect the macroscopic with the microscopic (e.g., PhET simulations) (98). In addition, multimedia presentation can help to reduce cognitive load as students grapple with multiple ideas in limited working memory (68, 69). It can do this, for example, by presenting information in both visual and auditory forms. Since a limited amount of information can be taken in through each sensory channel, presenting information through more than one sensory channel allows more total information to be taken in and processed in working memory (69). Computers can also automate responses to student input, allowing for instantaneous and tailored feedback (85). This greatly expands the availability of feedback to students compared to the days of paper-and-pencil homework. Thus, ever-increasing computing capability provides an opportunity to assist student learning through deliberate, research-supported design.

The capabilities of computers to automatically respond to input and display sophisticated multimedia content can lead to improved pedagogical design by providing immediate feedback. Some online learning systems include guided tutorials that break problems down into simpler steps to assist students in learning to solve a type of problem (Table 2). The student does not move on until they have successfully completed the previous step (a sketch of this step-gating logic appears at the end of this section). This is made possible by the ability to program appropriate responses to student input. Other examples of ways that online learning systems are beginning to utilize the expanded capabilities of computers include a so-far small selection of animated problems (e.g., Connect), interactive figures (Sapling, MindTap), and guided interactive multimedia tutorials (Orion).

The supplemental materials provided with the learning systems (often as part of a textbook or e-book purchase) have steadily progressed in sophistication, taking advantage of the modern capacity of computers. Students can access relevant sections of an e-book using direct hyperlinks from online learning problems. LearnSmart even highlights passages based on learning outcomes chosen by instructors to help students identify the most relevant parts of the text. Students also have access to video solutions, interactive tutorials, interactive simulations, animations, and video demonstrations. Well-designed presentations may also help to reduce cognitive load by using images, narration, and animation effectively, allowing students a clearer path to understanding the key concepts (99). The various online learning systems provide these resources to different extents, from video solutions and interactive tutorials on many topics (Orion) to simply listing the sections of the textbook relevant to working a particular problem (SmartWork).

One other recent innovation is the development of assessments integrated with the textbook. MindTap and LearnSmart each integrate questions with textbook reading, helping students find information relevant to the topic and reducing the number of things they must mentally juggle at once. MindTap, in particular, has students work the problems that commonly appear in textbook sections after worked examples as they read the book. The problems appear on the same webpage as the text, a feature students reported preferring to a hyperlink to the textbook (15). This integrated, at-your-fingertips access to content presentation reduces cognitive load by reducing the need to search for and/or select relevant information (65).

The use of computers has, in the past, limited the formats of problems that can be presented to students. Multiple choice, multiple answer, fill-in-the-blank, and numerical problems are straightforward, and some systems include matching and sorting problem styles. More difficult has been drawing, both of chemical structures and other types, and graphing, but great progress has been made in expanding the tasks that can be performed and computer-graded through online learning systems. Most online learning systems have structure-drawing tools, including Mastering Chemistry, WileyPlus, Sapling, OWLv2, Connect, ALEKS, WebAssign, and SmartWork. More recently, efforts to allow drawing of chemical structures with a stylus on a touchscreen have been reported (18, 55). Graphing interfaces also exist in systems such as SmartWork, Mastering Chemistry, WileyPlus, WebAssign, and Sapling. The ease-of-use and flexibility of these tools varies.
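As a concrete illustration of the step-gating used by guided tutorials, the short Python sketch below walks a student through a two-step stoichiometry problem and refuses to advance until the current step is answered within a numeric tolerance (the same tolerance idea underlies the grading of numerical problems mentioned above). The prompts, stored answers, simulated responses, and 2% tolerance are illustrative choices, not the behavior of any particular system.

# Sketch of a step-gated tutorial: the learner cannot advance until the
# current sub-step is answered correctly. Detection of "correct" uses a
# relative tolerance, as numeric graders commonly do.

steps = [
    ("Convert 2.5 g of NaCl to moles (molar mass 58.44 g/mol).", 0.0428),
    ("Convert moles of NaCl to formula units (multiply by 6.022e23).", 2.58e22),
]

def check(response, answer, rel_tol=0.02):
    """Numeric grading: accept any response within a relative tolerance."""
    return abs(response - answer) <= rel_tol * abs(answer)

def run_tutorial(steps, responses):
    """Advance through steps only after a correct answer (step gating)."""
    for prompt, answer in steps:
        while True:
            response = responses.pop(0)  # stands in for live student input
            if check(response, answer):
                print(f"{prompt} -> {response}: correct, next step unlocked.")
                break
            print(f"{prompt} -> {response}: incorrect, try again.")

# Simulated student: wrong on step 1 at first, then correct on both steps.
run_tutorial(steps, responses=[0.25, 0.0428, 2.58e22])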

Future Landscape of Adaptive Learning

Development of responsive systems for general chemistry has progressed from basic online versions of textbook question banks to the inclusion of features such as hints, redirection to text sections, and video or tutorial help. Additionally, programs have added adaptive features that map a student’s learning state and remediate when the student lacks mastery of particular content. An article in Educational Psychologist comparing human tutoring and intelligent tutoring systems concluded that both had essentially the same effectiveness (100). The responsive and adaptive features of the general chemistry online learning programs essentially act as immediate tutors for students as they answer questions. Many publishers of these programs appear to be leaning further toward adaptive features, which help to personalize learning. In a white paper reviewing adaptive learning systems, the authors state, “We are certainly at the point where every major publisher producing online material for higher education will need to have some form of adaptive learning system as part of their offering” (27).

The landscape of adaptive programs is developing rapidly. A review of twenty adaptive technology products by Tyton Partners provides overviews of the learner profiles and faculty customization levels (101). These programs vary in their ability to provide supplemental or whole-course offerings, and their platforms are categorized as either off-the-shelf or authoring. Most often in general chemistry, online programs are chosen to supplement an instructor-driven course (i.e., replacing homework). Many companies, such as Cerego, require a partnership to develop course content rather than having the subject material (i.e., general chemistry) readily available for course delivery. Other adaptive programs, including ALEKS and McGraw-Hill LearnSmart, have ready-to-use online content for general chemistry. Given the published advantages of adaptive programs, growth of these types of online programs is predicted. Although not a peer-reviewed source, Tyton’s white paper “Learning to Adapt: A Case for Accelerating Adaptive Learning in Higher Education” reported several instances of positive outcomes for colleges and universities that adopted adaptive learning solutions, including increases in course passing rates and in student retention rates at state universities and community colleges (102). As more adaptive learning platforms for chemistry are introduced to the market, their competitiveness will depend upon their published successes and how they differentiate themselves from other platforms.

New strategies for evaluating adaptive learning and innovative ways to track student performance are being integrated into adaptive learning technologies. One example is a study of primary school children using ‘brain training’ software that monitored the children’s nonverbal cues while they solved math problems with a range of difficulty. The program was able to use the nonverbal cues to correctly predict the difficulty of the problem for each student 71% of the time (103). This study indicates that adaptive learning programs may be able to learn from student behavior and use these data to better tailor individual learning practice to the student.
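Although vendors do not disclose their algorithms, the learner-state mapping described above can be illustrated with a standard model from the intelligent-tutoring literature: Bayesian knowledge tracing, which updates an estimate of the probability that a student has mastered a skill after each response. The Python sketch below uses assumed slip, guess, and learning parameters; it is a generic illustration, not the method used by ALEKS (which is built on knowledge space theory) or by any other system named here.

# Bayesian knowledge tracing (BKT): one well-known way an adaptive
# system can maintain a per-skill learner model. Parameter values are
# illustrative assumptions.

def bkt_update(p_mastery, correct, slip=0.10, guess=0.20, learn=0.15):
    """Return the updated mastery probability after one observed response."""
    if correct:
        evidence = p_mastery * (1 - slip)
        posterior = evidence / (evidence + (1 - p_mastery) * guess)
    else:
        evidence = p_mastery * slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - guess))
    # Allow for learning between practice opportunities.
    return posterior + (1 - posterior) * learn

# Example: the mastery estimate rises with correct answers and falls
# with misses; remediation could trigger whenever it drops too low.
p = 0.30
for outcome in [True, True, False, True]:
    p = bkt_update(p, outcome)
    print(f"P(mastered) = {p:.2f}")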
In a review of emotions and personality in adaptive e-learning systems, Santos discusses the data sources used for monitoring emotions and the affective interventions used by educational programs (104). Cameras, screens, questionnaires, keystrokes, mouse movements, eye tracking, heart rate, skin conductance, and pressure-sensitive mice and chairs have all been utilized in studies to determine the emotional state of a student in a learning situation (104–108). With these data, programs can intervene in an affective manner through modes such as hints, emotional feedback, emotional support messages, motivational comments, task sequencing, redirecting, or a sensory signal to promote calm. Interactive computer capabilities are likely to continually enhance online learning programs.
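Schematically, an affect-aware intervention loop pairs a detector over these data streams with a table of responses like those just listed. The Python sketch below stubs the detection step with a toy heuristic; the feature names, thresholds, and state labels are hypothetical, and the systems in the cited studies use trained classifiers rather than hand-set rules.

# Hedged sketch of an affect-to-intervention mapping. The intervention
# table mirrors the modes listed in the text; the detector is a toy
# heuristic standing in for a trained affect classifier.

AFFECT_INTERVENTIONS = {
    "frustrated": "Offer a hint and an encouraging message.",
    "bored": "Re-sequence tasks toward more novel problems.",
    "anxious": "Send a calming signal and a motivational comment.",
    "engaged": "Continue the current task sequence unchanged.",
}

def detect_affect(keystroke_rate, mouse_idle_s, error_streak):
    """Toy heuristic over interaction features (all thresholds assumed)."""
    if error_streak >= 3 and keystroke_rate > 5.0:
        return "frustrated"
    if mouse_idle_s > 60:
        return "bored"
    return "engaged"

state = detect_affect(keystroke_rate=6.2, mouse_idle_s=4, error_streak=3)
print(AFFECT_INTERVENTIONS[state])  # -> hint plus encouragement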

Best Practices in Using Responsive or Adaptive Programs in General Chemistry

The effectiveness of responsive and adaptive learning systems is continually being evaluated, but preliminary research shows that programs that adapt to the individual learning state or incorporate mastery-based learning are most effective in increasing learning gains (33, 109). Implementation of any system will lead to the most student success if both faculty and students clearly understand why a particular program is being utilized (110). Ideally, the program will be tied intentionally to the learning outcomes of the class and not presented as simply an “add-on” for the course. Users of online learning systems find it crucial to include a grade for the work done in the learning system within the course grade structure; extra-credit incentives have also proven useful (7). Incorporating concepts or questions from the online system into the classroom and onto exams can further signal the value placed on the online learning program. Continual discussion with students, individually and collectively, about their progress in the program and its usefulness can encourage students to invest time in using it (21).

With the analytics provided by the program, instructors can identify trouble areas for individuals so that one-on-one instruction can be targeted to each student’s exact needs. Additionally, instructors can review the progress of the entire class and make changes in the classroom based on misunderstandings revealed in the online learning assignments. Describing the program in favorable terms to students will help them understand the true value of using these online learning tools. Many publishers provide documentation on ‘best practices’ for their online learning programs that includes similar advice for instructors. With hints and videos, many of the programs can be described as a virtual tutor aiding students as they work problems. Immediate feedback allows students to remediate their understanding instantaneously. Additionally, programs that integrate e-text reading can help students combine reading and problem solving in a multi-faceted approach to learning.


Conclusion

Responsive, mastery-based, and adaptive online programs for general chemistry can be extremely useful pedagogical tools with a multitude of functionality for both the student and the instructor. The main features and categories of most currently available platforms that include a general chemistry offering are outlined in the tables above and can guide instructors in choosing the right kind of online tool for their teaching purposes. In the literature on the effectiveness of online homework and learning systems, most research indicates that these online tools are as effective as, or more effective than, paper-and-pencil homework. While some programs have evolved to include an ever-increasing number of bells and whistles for responsive learning, other programs have focused on mastery learning or adaptive learning as their standard for effective student learning. Although the concepts of adaptive learning, responsiveness, and mastery learning have been researched, there remains a clear gap in knowledge about the effectiveness of these programs in the general chemistry setting.

References

1. Spain, J. D. Electronic Homework: Computer-Interactive Problem Sets for General Chemistry. J. Chem. Educ. 1996, 73, 222.
2. Cole, R. S.; Todd, J. B. Effects of Web-Based Multimedia Homework with Immediate Rich Feedback on Student Learning in General Chemistry. J. Chem. Educ. 2003, 80, 1338.
3. Cooper, H.; Robinson, J. C.; Patall, E. A. Does Homework Improve Academic Achievement? A Synthesis of Research, 1987-2003. Rev. Educ. Res. 2006, 76, 1–62.
4. Leinhardt, G.; Cuadros, J.; Yaron, D. “One Firm Spot”: The Role of Homework as Lever in Acquiring Conceptual and Performance Competence in College Chemistry. J. Chem. Educ. 2007, 84, 1047.
5. Arasasingham, R. D.; Martorell, I.; McIntire, T. M. Online Homework and Student Achievement in a Large Enrollment Introductory Science Course. J. Coll. Sci. Teach. 2011, 40, 70.
6. Fynewever, H. A Comparison of the Effectiveness of Web-Based and Paper-Based Homework for General Chemistry. Chem. Educ. 2008, 13, 264–269.
7. Parker, L. L.; Loudon, G. M. Case Study Using Online Homework in Undergraduate Organic Chemistry: Results and Student Attitudes. J. Chem. Educ. 2012, 90, 37–44.
8. Gebru, M. T.; Phelps, A. J.; Wulfsberg, G. Effect of Clickers versus Online Homework on Students’ Long-Term Retention of General Chemistry Course Material. Chem. Educ. Res. Pract. 2012, 13, 325–329.
9. Malik, K.; Martinez, N.; Romero, J.; Schubel, S.; Janowicz, P. A. Mixed-Methods Study of Online and Written Organic Chemistry Homework. J. Chem. Educ. 2014, 91, 1804–1809.
10. Arasasingham, R. D.; Taagepera, M.; Potter, F.; Martorell, I.; Lonjers, S. Assessing the Effect of Web-Based Learning Tools on Student Understanding of Stoichiometry Using Knowledge Space Theory. J. Chem. Educ. 2005, 82, 1251.
11. Epstein, M. L.; Brosvic, G. M. Students Prefer the Immediate Feedback Assessment Technique. Psychol. Rep. 2002, 90, 1136–1138.
12. Mavrikis, M.; Maciocia, A. Incorporating Assessment into an Interactive Learning Environment for Mathematics. LTSN MSOR Maths CAA Ser. 2003, 1–17.
13. Pascarella, A. M. The Influence of Web-Based Homework on Quantitative Problem-Solving in a University Physics Class. In Proc. NARST Annual Meeting; 2004.
14. Revell, K. D. A Comparison of the Usage of Tablet PC, Lecture Capture, and Online Homework in an Introductory Chemistry Course. J. Chem. Educ. 2014, 91, 48–51.
15. Zumalt, C. J.; Williamson, V. M. Does the Arrangement of Embedded Text Versus Linked Text in Homework Systems Make a Difference in Students’ Impressions, Attitudes, and Perceived Learning? J. Sci. Educ. Technol. 2016, 25, 704–714.
16. Zerr, R. A Quantitative and Qualitative Analysis of the Effectiveness of Online Homework in First-Semester Calculus. JCMST 2007, 26, 55.
17. Mattingly, K. D.; Rice, M. C.; Berge, Z. L. Learning Analytics as a Tool for Closing the Assessment Loop in Higher Education. KM&EL 2012, 4, 236–247.
18. Bryfczynski, S.; Pargas, R. P.; Cooper, M. M.; Klymkowsky, M. BeSocratic: Graphically-Assessing Student Knowledge. In Proceedings of the IADIS International Conference Mobile Learning; IADIS, 2012; pp 3–10.
19. Block, J. H.; Airasian, P. W.; Bloom, B. S.; Carroll, J. B. Mastery Learning: Theory and Practice; Holt, Rinehart and Winston: New York, 1971.
20. Richards-Babb, M.; Drelick, J.; Henry, Z.; Robertson-Honecker, J. Online Homework, Help or Hindrance? What Students Think and How They Perform. J. Coll. Sci. Teach. 2011, 40, 81.
21. Richards-Babb, M.; Curtis, R.; Georgieva, Z.; Penn, J. H. Student Perceptions of Online Homework Use for Formative Assessment of Learning in Organic Chemistry. J. Chem. Educ. 2015, 92, 1813–1819.
22. Roediger, H. L., III; Gallo, D. A.; Geraci, L. Processing Approaches to Cognition: The Impetus from the Levels-of-Processing Framework. Memory 2002, 10, 319–332.
23. Gladding, G.; Gutmann, B.; Schroeder, N.; Stelzer, T. Clinical Study of Student Learning Using Mastery Style versus Immediate Feedback Online Activities. Phys. Rev. ST Phys. Educ. Res. 2015, 11, 010114.
24. Hardiman, P. T.; Dufresne, R.; Mestre, J. P. The Relation between Problem Categorization and Problem Solving among Experts and Novices. Mem. Cognit. 1989, 17, 627–638.
25. Chi, M. T.; Feltovich, P. J.; Glaser, R. Categorization and Representation of Physics Problems by Experts and Novices. Cognit. Sci. 1981, 5, 121–152.
26. Pavlinic, S.; Wright, A. H.; Buckley, P. D. Students Using Chemistry Courseware-Insights from a Qualitative Study. J. Chem. Educ. 2000, 77, 231.
27. Oxman, S.; Wong, W. White Paper: Adaptive Learning Systems; 2014.
28. Arasasingham, R. D.; Taagepera, M.; Potter, F.; Lonjers, S. Using Knowledge Space Theory to Assess Student Understanding of Stoichiometry. J. Chem. Educ. 2004, 81, 1517.
29. Falmagne, J.-C.; Cosyn, E.; Doignon, J.-P.; Thiéry, N. The Assessment of Knowledge, in Theory and in Practice. In Formal Concept Analysis; Springer: Berlin, Heidelberg, 2006; pp 61–79.
30. Taagepera, M.; Arasasingham, R.; Potter, F.; Soroudi, A.; Lam, G. Following the Development of the Bonding Concept Using Knowledge Space Theory. J. Chem. Educ. 2002, 79, 756.
31. Taagepera, M.; Noori, S. Mapping Students’ Thinking Patterns in Learning Organic Chemistry by the Use of Knowledge Space Theory. J. Chem. Educ. 2000, 77, 1224.
32. Taagepera, M.; Potter, F.; Miller, G. E.; Lakshminarayan, K. Mapping Students’ Thinking Patterns by the Use of the Knowledge Space Theory. Int. J. Sci. Educ. 1997, 19, 283–302.
33. Eichler, J. F.; Peeples, J. Online Homework Put to the Test: A Report on the Impact of Two Online Learning Systems on Student Performance in General Chemistry. J. Chem. Educ. 2013, 90, 1137–1143.
34. Saiki, D.; Gebauer, A. Online Homework and Student Success in Preparatory Chemistry. Chem. Educ. 2013, 18, 74–79.
35. Burch, K. J.; Kuo, Y.-J. Traditional vs. Online Homework in College Algebra. Math. Comput. Educ. 2010, 44, 53.
36. Liberatore, M. W. Improved Student Achievement Using Personalized Online Homework for a Course in Material and Energy Balances. Chem. Eng. Educ. 2011, 45, 184–190.
37. Bonham, S.; Beichner, R.; Deardorff, D. Online Homework: Does It Make a Difference? Phys. Teach. 2001, 39, 293–296.
38. Brewer, D. S.; Becker, K. Online Homework Effectiveness for Underprepared and Repeating College Algebra Students. JCMST 2010, 29, 353–371.
39. Johnstone, A. H. Macro- and Micro-Chemistry. Sch. Sci. Rev. 1982, 64, 377–379.
40. Gabel, D. L. Use of the Particle Nature of Matter in Developing Conceptual Understanding. J. Chem. Educ. 1993, 70, 193.
41. Gabel, D. L.; Samuel, K.; Hunn, D. Understanding the Particulate Nature of Matter. J. Chem. Educ. 1987, 64, 695.
42. Johnstone, A. H. You Can’t Get There from Here. J. Chem. Educ. 2010, 87, 22–29.
43. Johnstone, A. H. Chemical Education Research in Glasgow in Perspective. Chem. Educ. Res. Pract. 2006, 7, 49–63.
44. Johnstone, A. H. The Development of Chemistry Teaching: A Changing Response to Changing Demand. J. Chem. Educ. 1993, 70, 701.
45. Johnstone, A. H. Why Is Science Difficult to Learn? Things Are Seldom What They Seem. J. Comput. Assist. Learn. 1991, 7, 75–83.
46. Kozma, R. B.; Russell, J. Multimedia and Understanding: Expert and Novice Responses to Different Representations of Chemical Phenomena. J. Res. Sci. Teach. 1997, 34, 949–968.
47. Nakhleh, M. B. Why Some Students Don’t Learn Chemistry: Chemical Misconceptions. J. Chem. Educ. 1992, 69, 191.
48. Pickering, M. Further Studies on Concept Learning versus Problem Solving. J. Chem. Educ. 1990, 67, 254.
49. Sawrey, B. A. Concept Learning versus Problem Solving: Revisited. J. Chem. Educ. 1990, 67, 253.
50. Davidowitz, B.; Chittleborough, G.; Murray, E. Student-Generated Submicro Diagrams: A Useful Tool for Teaching and Learning Chemical Equations and Stoichiometry. Chem. Educ. Res. Pract. 2010, 11, 154–164.
51. Kelly, R. M.; Barrera, J. H.; Mohamed, S. C. An Analysis of Undergraduate General Chemistry Students’ Misconceptions of the Submicroscopic Level of Precipitation Reactions. J. Chem. Educ. 2009, 87, 113–118.
52. Kern, A. L.; Wood, N. B.; Roehrig, G. H.; Nyachwaya, J. A Qualitative Report of the Ways High School Chemistry Students Attempt to Represent a Chemical Reaction at the Atomic/Molecular Level. Chem. Educ. Res. Pract. 2010, 11, 165–172.
53. Nyachwaya, J. M.; Warfa, A.-R. M.; Roehrig, G. H.; Schneider, J. L. College Chemistry Students’ Use of Memorized Algorithms in Chemical Reactions. Chem. Educ. Res. Pract. 2014, 15, 81–93.
54. Smithrud, D. B.; Pinhas, A. R. Pencil-Paper Learning Should Be Combined with Online Homework Software. J. Chem. Educ. 2015, 92, 1965–1970.
55. Ouyang, T. Y.; Davis, R. Recognition of Hand Drawn Chemical Diagrams. In AAAI; 2007; Vol. 7, pp 846–851.
56. Bryfczynski, S.; Pargas, R. P.; Cooper, M. M.; Klymkowsky, M.; Hester, J.; Grove, N. P. Classroom Uses for BeSocratic. In The Impact of Pen and Touch Technology on Education; Springer, 2015; pp 127–136.
57. Nurrenbern, S. C.; Pickering, M. Concept Learning versus Problem Solving: Is There a Difference? J. Chem. Educ. 1987, 64, 508.
58. Holme, T. A.; Luxford, C. J.; Brandriet, A. Defining Conceptual Understanding in General Chemistry. J. Chem. Educ. 2015, 92, 1477–1483.
59. Salta, K.; Tzougraki, C. Conceptual versus Algorithmic Problem-Solving: Focusing on Problems Dealing with Conservation of Matter in Chemistry. Res. Sci. Educ. 2011, 41, 587–609.
60. Noh, T.; Scharmann, L. C. Instructional Influence of a Molecular-Level Pictorial Presentation of Matter on Students’ Conceptions and Problem-Solving Ability. J. Res. Sci. Teach. 1997, 34, 199–217.
61. Bain, K.; Moon, A.; Mack, M. R.; Towns, M. H. A Review of Research on the Teaching and Learning of Thermodynamics at the University Level. Chem. Educ. Res. Pract. 2014, 15, 320–335.
62. Raviolo, A.; Garritz, A. Analogies in the Teaching of Chemical Equilibrium: A Synthesis/Analysis of the Literature. Chem. Educ. Res. Pract. 2009, 10, 5–13.
63. Taskin, V.; Bernholt, S. Students’ Understanding of Chemical Formulae: A Review of Empirical Research. Int. J. Sci. Educ. 2014, 36, 157–185.
64. Cakir, M. Constructivist Approaches to Learning in Science and Their Implications for Science Pedagogy: A Literature Review. Int. J. Env. Sci. Educ. 2008, 3, 193–206.
65. Paas, F.; Renkl, A.; Sweller, J. Cognitive Load Theory and Instructional Design: Recent Developments. Educ. Psychol. 2003, 38, 1–4.
66. Bodner, G. M. Strengthening Conceptual Connections in Introductory Chemistry Courses. Chem. Educ. Res. Pract. 2007, 8, 93–100.
67. Talanquer, V. Threshold Concepts in Chemistry: The Critical Role of Implicit Schemas. J. Chem. Educ. 2014, 92, 3–9.
68. Mayer, R. E. Multimedia Learning, 2nd ed.; Cambridge University Press: New York, 2009.
69. Mayer, R. E.; Moreno, R. Nine Ways to Reduce Cognitive Load in Multimedia Learning. Educ. Psychol. 2003, 38, 43–52.
70. Isaacson, R. M.; Fujita, F. Metacognitive Knowledge Monitoring and Self-Regulated Learning: Academic Success and Reflections on Learning. JoSoTL 2006, 6, 39–55.
71. Bell, P.; Volckmann, D. Knowledge Surveys in General Chemistry: Confidence, Overconfidence, and Performance. J. Chem. Educ. 2011, 88, 1469–1476.
72. Grimes, P. W. The Overconfident Principles of Economics Student: An Examination of a Metacognitive Skill. J. Econ. Educ. 2002, 33, 15–30.
73. Hacker, D. J.; Bol, L.; Horgan, D. D.; Rakow, E. A. Test Prediction and Performance in a Classroom Context. J. Educ. Psychol. 2000, 92, 160.
74. Miller, T. M.; Geraci, L. Unskilled but Aware: Reinterpreting Overconfidence in Low-Performing Students. J. Exp. Psychol.: Learn. Mem. Cognit. 2011, 37, 502.
75. Pazicni, S.; Bauer, C. F. Characterizing Illusions of Competence in Introductory Chemistry Students. Chem. Educ. Res. Pract. 2014, 15, 24–34.
76. Potgieter, M.; Ackermann, M.; Fletcher, L. Inaccuracy of Self-Evaluation as Additional Variable for Prediction of Students at Risk of Failing First-Year Chemistry. Chem. Educ. Res. Pract. 2010, 11, 17–24.
77. Serra, M. J.; DeMarree, K. G. Unskilled and Unaware in the Classroom: College Students’ Desired Grades Predict Their Biased Grade Predictions. Mem. Cognit. 2016, 44, 1127–1137.
78. Miller, T. M.; Geraci, L. Training Metacognition in the Classroom: The Influence of Incentives and Feedback on Exam Predictions. Metacogn. Learn. 2011, 6, 303–314.
79. Ramdass, D.; Zimmerman, B. J. Developing Self-Regulation Skills: The Important Role of Homework. J. Adv. Acad. 2011, 22, 194–218.
80. Guskey, T. R. Rethinking Mastery Learning Reconsidered. Rev. Educ. Res. 1987, 57, 225–229.
81. Shute, V. J. Focus on Formative Feedback. Rev. Educ. Res. 2008, 78, 153–189.
82. Atkinson, R. C.; Shiffrin, R. M. Human Memory: A Proposed System and Its Control Processes. Psychol. Learn. Motiv. 1968, 2, 89–195.
83. Bangert-Drowns, R. L.; Kulik, C.-L. C.; Kulik, J. A.; Morgan, M. The Instructional Effect of Feedback in Test-like Events. Rev. Educ. Res. 1991, 61, 213–238.
84. Mason, B. J.; Bruning, R. Providing Feedback in Computer-Based Instruction: What the Research Tells Us; CLASS Research Report No. 9; Center for Instructional Innovation, University of Nebraska-Lincoln, 2001.
85. Moreno, R. Decreasing Cognitive Load for Novice Students: Effects of Explanatory versus Corrective Feedback in Discovery-Based Multimedia. Instr. Sci. 2004, 32, 99–113.
86. Black, P.; Wiliam, D. Assessment and Classroom Learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74.
87. Chandler, P.; Sweller, J. Cognitive Load Theory and the Format of Instruction. Cognit. Instr. 1991, 8, 293–332.
88. Graesser, A. C.; McNamara, D. S.; VanLehn, K. Scaffolding Deep Comprehension Strategies through Point&Query, AutoTutor, and iSTART. Educ. Psychol. 2005, 40, 225–234.
89. Kalyuga, S.; Ayres, P.; Chandler, P.; Sweller, J. The Expertise Reversal Effect. Educ. Psychol. 2003, 38, 23–31.
90. Schroth, M. L. The Effects of Delay of Feedback on a Delayed Concept Formation Transfer Task. Contemp. Educ. Psychol. 1992, 17, 78–82.
91. Kulhavy, R. W.; White, M. T.; Topp, B. W.; Chan, A. L.; Adams, J. Feedback Complexity and Corrective Efficiency. Contemp. Educ. Psychol. 1985, 10, 285–291.
92. Nicol, D. J.; Macfarlane-Dick, D. Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice. Stud. High. Educ. 2006, 31, 199–218.
93. Madra Learning. Self-Tuning Student Performance Expectations: Improving Underperforming Student Test Scores at the University of Utah; 2017.
94. Aldahmash, A. H.; Abraham, M. R. Kinetic versus Static Visuals for Facilitating College Students’ Understanding of Organic Reaction Mechanisms in Chemistry. J. Chem. Educ. 2009, 86, 1442.
95. Falvo, D. Animations and Simulations for Teaching and Learning Molecular Chemistry. Int. J. Technol. Teach. Learn. 2008, 4, 68–77.
96. Tasker, R. In Chemists’ Guide to Effective Teaching; Prentice Hall: Upper Saddle River, NJ, 2005; pp 195–211.
97. Stieff, M.; Wilensky, U. Connected Chemistry—Incorporating Interactive Simulations into the Chemistry Classroom. J. Sci. Educ. Technol. 2003, 12, 285–302.
98. Moore, E. B.; Chamberlain, J. M.; Parson, R.; Perkins, K. K. PhET Interactive Simulations: Transformative Tools for Teaching Chemistry. J. Chem. Educ. 2014, 91, 1191–1197.
99. Ward, M.; Sweller, J. Structuring Effective Worked Examples. Cognit. Instr. 1990, 7, 1–39.
100. VanLehn, K. The Relative Effectiveness of Human Tutoring, Intelligent Tutoring Systems, and Other Tutoring Systems. Educ. Psychol. 2011, 46, 197–221.
101. Tyton Partners. Learning to Adapt 2.0: The Evolution of Adaptive Learning in Higher Education; 2016.
102. Tyton Partners. Learning to Adapt: A Case for Accelerating Adaptive Learning in Higher Education; 2013.
103. Van Amelsvoort, M.; Joosten, B.; Krahmer, E.; Postma, E. Using Non-Verbal Cues to (Automatically) Assess Children’s Performance Difficulties with Arithmetic Problems. Comput. Hum. Behav. 2013, 29, 654–664.
104. Santos, O. C. Emotions and Personality in Adaptive E-Learning Systems: An Affective Computing Perspective. In Emotions and Personality in Personalized Services; Springer International Publishing: Switzerland, 2016; pp 263–285.
105. Felipe, D. A. M.; Gutierrez, K. I. N.; Quiros, E. C. M.; Vea, L. A. Towards the Development of Intelligent Agent for Novice C/C++ Programmers through Affective Analysis of Event Logs. In Proc. Int. MultiConference Eng. Comput. Sci.; 2012; Vol. 1.
106. Khan, I. A.; Brinkman, W.-P.; Hierons, R. Towards Estimating Computer Users’ Mood from Interaction Behaviour with Keyboard and Mouse. Front. Comput. Sci. 2013, 7, 943–954.
107. Santos, O. C.; Salmeron-Majadas, S.; Boticario, J. G. Emotions Detection from Math Exercises by Combining Several Data Sources. In International Conference on Artificial Intelligence in Education; 2013; pp 742–745.
108. Woolf, B. P.; Arroyo, I.; Cooper, D.; Burleson, W.; Muldner, K. Affective Tutors: Automatic Detection of and Response to Student Emotion. In Advances in Intelligent Tutoring Systems; Studies in Computational Intelligence; Springer: Berlin, Heidelberg, 2010; Vol. 308, pp 207–227.
109. Guskey, T. R. Closing Achievement Gaps: Revisiting Benjamin S. Bloom’s “Learning for Mastery”. J. Adv. Acad. 2007, 19, 8–31.
110. Hauk, S.; Segalla, A. Student Perceptions of the Web-Based Homework Program WeBWorK in Moderate Enrollment College Algebra Classes. JCMST 2005, 24, 229.