Downloaded by UNIV OF GEORGIA on December 6, 2016 | http://pubs.acs.org Publication Date (Web): November 22, 2016 | doi: 10.1021/bk-2016-1235.ch005

Chapter 5

Evaluating the Use of LearnSmart and Connect in Introductory General Chemistry Classes: The Pros and Cons of an Online Teaching and Learning System

Rashmi Venkateswaran*

*E-mail: [email protected]

When asked if they like chemistry, first-year university students often groan and say it is the hardest course they take. At the University of Ottawa, part of this assessment stems from the heavy course load, which includes a lecture component, a lab component and many tutorial sessions; another major reason is that students often have difficulty integrating the conceptual and problem-based aspects of the course. Chemistry requires students to understand ideas that range from the microscopic to the macroscopic, to read large bodies of text and understand how that text can be converted into chemical equations or visual images, and then to integrate all these concepts and apply them to the solution of mathematical problems. It takes a great deal of maturity, introspection and time to develop these skills, and first-year students often simply do not have the time required, given that chemistry is only one of the many courses they take. Many textbook publishers have begun pairing their textbooks with online homework programs with a view to helping students practice and apply the knowledge they are learning from the textbook in an interactive manner. McGraw-Hill Education has taken steps to help students acquire the necessary skills using online technology that builds a basis for their knowledge and then allows them to apply that same knowledge. McGraw-Hill Education Connect®, the online teaching and learning program that is available with McGraw-Hill Education textbooks, comes paired with SmartBook® with LearnSmart® in many cases. As the instructors of general chemistry at the University of Ottawa chose the McGraw-Hill Education text, this chapter presents a preliminary evaluation, along with some of the advantages and disadvantages of the SmartBook vis-à-vis a standard textbook or e-book. It also examines whether the integration of SmartBook helps students learn to determine what is important in a text, how to reinforce that knowledge and gain an understanding of the underlying concepts, and how to apply that knowledge to the solution of problems. Did pairing the conceptual knowledge gained through SmartBook with concrete problems provide a holistic approach to the study of chemistry, and what were the advantages and disadvantages, given some of the student limitations mentioned above? While any homework takes time, whether on paper or online, this chapter presents some information about SmartBook and an informal evaluation of whether use of this system was able to provide a successful learning outcome for students.

© 2016 American Chemical Society. Schultz et al.; Technology and Assessment Strategies for Improving Student Learning in Chemistry; ACS Symposium Series; American Chemical Society: Washington, DC, 2016.

Introduction

One of the more challenging courses taken by first-year science students is the Introductory General Chemistry course (1). Evaluations obtained from students who have taken this course at the University of Ottawa over several years have consistently demonstrated that they perceive the workload to be very heavy. It is true that the course contains several components, including at least two lectures, a lab, course-related tutorial or problem sessions, and in some cases lab-related tutorials for assistance in completing lab reports. In order to succeed in the course, students must learn the material, study for midterms and exams, complete homework assignments, perform experiments, and submit lab reports. Informal comments and discussions with students, in addition to responses to a direct question regarding workload on student evaluations, indicate that students consider chemistry to have a heavier workload than most of their other courses. An actual comparison of science courses, however, indicates that a similar workload is expected in other first-year science courses; it appears that students may incorrectly perceive introductory general chemistry as having a greater workload. Anecdotal evidence also indicates that students have difficulty relating the chemical equations and scenarios explained in lecture to the experiments they perform in the lab, suggesting a disconnect between the microscale (chemical nomenclature, chemical equations, atomic/molecular depictions) and the macroscale (real-world connections, experimental observations, lecture demonstrations) (2–6). Most general chemistry instructors expend considerable effort to offer a variety of tools to assist students in overcoming these learning challenges.
Some instructors focus on helping students to understand the basic concepts or to improve their conceptual understanding in chemistry (7–10). Other instructors work on correcting misconceptions in order to help students learn chemistry (11–13). Many instructors struggle with the need to address concepts while ensuring students are capable of solving problems (14, 15). Financial constraints and ease of use, however, lead many instructors to turn to online resources (16, 17), some created by the instructor (18, 19) and some provided by textbook publishers (20–22).


Online Resources

Most textbooks today come with a great deal of electronic support, much of which can be found online. This support generally consists of an electronic version of the textbook, online homework programs, access to videos and/or animations, and various other accessories such as flash cards and concept inventories. The dwindling physical resources available to most instructors, coupled with larger class sizes, make it increasingly difficult to assign homework that can be corrected by either a teaching assistant or the instructor. Online homework is a convenient alternative that allows instructors to assign homework in a timely manner without requiring an assistant to take the time to make corrections. Further, it allows students to practice chemistry and get timely feedback. In order to have a system in which the correction can be completed without the intervention of an instructor or teaching assistant (the feature that makes online homework most attractive), the majority of the questions are of the fill-in-the-blank or multiple-choice variety. While such questions are not ideal from a teaching and learning point of view, students nonetheless have the opportunity to practice terminology and conceptual chemistry as well as the chance to apply their understanding in the form of problems. Often, students who do not see their response anywhere in the answer list realize that they must have made an error somewhere. Well-constructed multiple-choice questions use common student errors as false responses, thus requiring students to actually understand the underlying theory in order to obtain the correct response. Students, however, have developed a variety of methods to deal with multiple-choice questions. They are often capable of using logic to eliminate certain responses. Other times, they will guess with no real understanding of why they are choosing a certain response.

Possibly the most pernicious problem of any online homework is that some students share answers on social media; if students are only interested in receiving points, this becomes a very difficult issue to resolve. Some of the commonly used online homework systems include Sapling, Mastering Chemistry used by Pearson, OWL by Cengage Learning and used with Nelson Education texts, WileyPLUS used by Wiley, and Connect used by McGraw-Hill. Mastering Chemistry, which has been used previously at the University of Ottawa, contains tutorial-type questions with a feedback system that allows students who are experiencing difficulty to request a hint. The hint then guides the student towards the correct response by helping the student arrive at one of the values necessary to obtain it. Sometimes there are hints for each step of a multi-step calculation. The other systems named above have varying approaches that provide similar support to students.

All the publishers mentioned above provide an electronic version of the textbook (e-book) along with the online homework system. The textbook is still considered a primary resource, but most general chemistry textbooks are quite large and include more information than is customarily taught in any given course. Thus many students, when reading, find it difficult to distinguish what content is truly important and what content can be studied more leisurely. Often, students will read every word of a textbook, treating all parts of the text equally. This leads to an inability to judge what content is most likely to be tested, resulting in students finding that what they studied was not on the test. In meetings with students after tests or exams over a number of years, when asked why they felt the content on the test did not reflect what they studied, many students confessed that they read the text in its entirety without focusing on the topics specifically targeted in class.

What Is SmartBook?

Most McGraw-Hill Education textbooks are supported by Connect and SmartBook. SmartBook, the adaptive online textbook, differs from the e-books offered by other publishers in several ways that instructors can highlight. Some of these differences, which have been used at the University of Ottawa to help students approach their chemistry learning in new ways, are explained below with examples. The first difference is that SmartBook helps students prioritize information as they read the text. In any particular chapter, important information is highlighted and information that is provided for interest, background or context is lightly greyed, as shown in Figure 1 (23). This does not mean that the information in the greyed text is unimportant; it simply indicates to the student that this information can be referred to once the student has grasped the key concepts, or if the student wishes additional information or context for the material. This is helpful because students often have difficulty determining what information they must know as opposed to what information it would be nice for them to know; SmartBook helps them identify the former. The decision regarding what text is highlighted and what text is greyed out is made not by the textbook authors, but by subject experts who are also specialists in cognitive learning. Once students have read through a certain number of sections, they are encouraged to put into practice what they have read. SmartBook integrates LearnSmart questions, offering a metacognitive assessment that allows students to determine whether they have correctly understood the terminology, the concepts and simple applications of the material they have read to that point.



Figure 1. Example of text as it appears in SmartBook. (Reproduced with permission from reference (23). Copyright 2013 McGraw-Hill Education.)

LearnSmart and Metacognition

What sets LearnSmart questions apart is the metacognitive component. The LearnSmart questions were designed by a team of experts in adaptive learning technology who are also subject experts, and the questions are designed to guide student learning based on student responses. Metacognition has been assigned a number of different definitions, one of which includes the ability to assess whether a particular response is correct or incorrect (24). The adaptive technology takes students' responses and, based on them, designs a pathway of questions. The assignment comes with a metacognitive component that asks a chemistry question and then requires students to respond to the question "Do you know the answer?" with one of the following choices: "I know it", "Think so", "Unsure", or "No idea", as shown in Figure 2 (23).



Figure 2. Example of a LearnSmart question showing the metacognitive component. (Reproduced with permission from reference (23). Copyright 2013 McGraw-Hill Education.)

Subsequently, students are asked to enter their response to the chemistry question, as shown in Figure 3 (23).

Figure 3. Example of a LearnSmart question. (Reproduced with permission from reference (23). Copyright 2013 McGraw-Hill Education.)


LearnSmart adapts the next question based on the student's answer and their metacognitive response. Thus a student who answered "I know it" and gave the correct response to the first question, as shown in Figure 4 (23), would receive a different second question than a student who responded "Think so" and also answered correctly.

Figure 4. Example of a correct response to a LearnSmart question. Note the "Challenge" button appears whenever an answer is submitted. Students can challenge the system when they believe an answer is correct but is marked as incorrect. (Reproduced with permission from reference (23). Copyright 2013 McGraw-Hill Education.)

If a student consistently responds metacognitively with "I know it" but gives the incorrect response to the question, LearnSmart will suggest that the student go back and read the text material to reinforce their understanding, as shown in Figure 5 (23). While a particular assignment may contain 20 questions, a student can answer anywhere from 20 to 40 questions depending on their responses. The assignment is only complete when the student has responded to the original 20 questions with a reasonable degree of certainty. If there are incorrect or very uncertain responses, the LearnSmart questions continue, returning to the areas of uncertainty or error and asking different questions until the student can respond with assurance. Further, SmartBook notes where the student displayed uncertainty and reinforces these areas by selectively highlighting the text that addresses the student's doubts.
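The branching just described, in which the follow-up depends on both the answer's correctness and the student's stated confidence, can be sketched as a simple decision rule. This is an illustrative assumption about how such routing might work, not McGraw-Hill's actual algorithm; the function and action names are invented for the sketch.

```python
# Illustrative sketch of confidence-aware branching (NOT the actual
# LearnSmart algorithm): the next step depends on both the answer's
# correctness and the student's self-reported confidence.

CONFIDENCE = {"I know it": 3, "Think so": 2, "Unsure": 1, "No idea": 0}

def next_step(correct: bool, confidence: str) -> str:
    """Pick a follow-up action from correctness + metacognitive response."""
    level = CONFIDENCE[confidence]
    if correct and level == 3:
        return "advance"            # mastered: move on to new material
    if correct:
        return "reinforce_soon"     # right but unsure: revisit the topic later
    if level == 3:
        return "reread_text"        # confidently wrong: suggest rereading
    return "retry_variant"          # wrong and unsure: ask a similar question

# A student who answers correctly but only "Think so" is routed differently
# from one who answers correctly with "I know it".
assert next_step(True, "I know it") == "advance"
assert next_step(True, "Think so") == "reinforce_soon"
assert next_step(False, "I know it") == "reread_text"
```

Under this kind of rule, two students who both answer correctly can still follow different question pathways, which matches the behaviour students reported.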


Figure 5. Suggestion by SmartBook that a student should read more before testing. (Reproduced with permission from reference (23). Copyright 2013 McGraw-Hill Education.)

Students are also encouraged to "recharge," that is, to practice again at another time; previous areas of weakness are then tested again to reinforce that knowledge, with the goal that the student acquires sufficient confidence in and understanding of the subject matter to retain the information effectively. SmartBook draws on a data set consisting of results from tens of thousands of students and over five billion probes to identify patterns for when students are likely to forget, as well as the frequency and sequence in which content needs to be reinforced for long-term memory. The algorithm selects the appropriate probe types and frequency based on this data set and then optimizes based on individual student results: essentially a massive data set for probability guidance combined with student-specific input. SmartBook, when used regularly with the practice assignments in LearnSmart, is designed to help students move information from short-term memory to long-term memory.
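The reinforcement scheduling this paragraph describes can be illustrated with a minimal spaced-repetition rule: items answered incorrectly or with low confidence come back sooner, while confident correct answers push the next probe further out. The intervals and the doubling factor below are assumptions chosen for the sketch; SmartBook's actual probe scheduling is proprietary and data-driven.

```python
# Minimal spaced-repetition sketch (intervals in days are illustrative only):
# missed items are probed again soon, confident correct answers space out.

def next_review_interval(prev_interval_days: float, correct: bool,
                         confident: bool) -> float:
    """Return days until the item should be probed again."""
    if correct and confident:
        return max(1.0, prev_interval_days) * 2.0   # consolidate: space out
    if correct:
        return max(1.0, prev_interval_days)          # right but unsure: hold
    return 0.5                                       # missed: probe again soon

interval = 1.0
for _ in range(3):               # three confident correct answers in a row
    interval = next_review_interval(interval, True, True)
print(interval)                  # 8.0
```

The general principle, probing again just before the predicted point of forgetting, is what moves material from short-term to long-term memory.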

Connect

McGraw-Hill Education Connect is the online teaching and learning program available with most McGraw-Hill textbooks. SmartBook can be accessed through Connect; in addition, Connect has assignable end-of-chapter static and algorithmic problems that students can complete as homework, practice or assessment. There are true/false, fill-in-the-blank, and multiple-choice questions. There are also numeric problems that require the student to enter a numeric response, as shown in Figure 6 (23).


Figure 6. Example of an algorithmic problem in Connect. (Reproduced with permission from reference (23). Copyright 2013 McGraw-Hill Education.)

An assignment can be constructed using any one of the question types or a selection of them. Many questions can be made algorithmic; in an algorithmic question, different parameters within the question are treated as variables. In effect, although a single question is assigned, a student who repeats the question will see a similar question with one or more of the quantities having a different value. This allows students to practice a problem without memorizing a rote solution and, ideally, lets them see that problems that appear different on the surface can be approached in much the same way. As with most programs, there are hints at certain points in the response, allowing students who are unable to solve the problem to get guidance on how to approach it.
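The idea of an algorithmic question, in which the wording is fixed but the numeric parameters vary on each attempt, can be sketched as follows. The molarity template and parameter ranges are invented for illustration and are not drawn from Connect's actual question bank.

```python
# Sketch of an "algorithmic" question: the template is fixed but the
# numeric parameters are randomized per attempt, so repeating the question
# yields the same problem type with different values.
import random

def molarity_question(seed=None):
    rng = random.Random(seed)
    mass_g = round(rng.uniform(1.0, 20.0), 2)      # mass of NaCl dissolved
    volume_l = round(rng.uniform(0.100, 2.000), 3) # solution volume
    molar_mass = 58.44                             # g/mol for NaCl
    answer_m = mass_g / molar_mass / volume_l
    prompt = (f"What is the molarity of a solution made by dissolving "
              f"{mass_g} g of NaCl in {volume_l} L of water?")
    return prompt, round(answer_m, 3)

# Two attempts generate the same question type with different numbers,
# while the same seed reproduces the same variant.
q1, a1 = molarity_question(seed=1)
q2, a2 = molarity_question(seed=2)
assert q1 != q2
```

Because the grader recomputes the answer from the drawn parameters, every variant can be marked automatically without a rote answer key.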

Preliminary Evaluation of SmartBook in a General Chemistry Class

I used the online support provided by McGraw-Hill Education, along with the textbook Chemistry: The Molecular Nature of Matter and Change, Canadian Edition, by Silberberg, Lavieri and Venkateswaran (23), to teach a group of approximately 85 students taking introductory general chemistry at the University of Ottawa, in Ottawa, Canada. These students were in a section of the course intended mainly to support students who either had not taken the prerequisite chemistry course in high school, had taken the course and not done well in it, or were mature students returning to university or to a program in science. The course was structured to provide three 1.5-hour classes each week, a laboratory component (3-hour, biweekly), and a strongly recommended (but not mandatory) course tutorial and lab tutorial. Students were assigned a LearnSmart assignment for each chapter and were required to complete it by the beginning of class. They were also assigned a set of problems in Connect, due by the date of the quiz for that chapter. Students who purchased the textbook package were automatically given access to Connect with SmartBook; students also had the option of purchasing access to the online components separately.

How Online Homework Components Were Weighted

The LearnSmart assignments were assigned a weight of 5% and the Connect assignments a similar weight, giving an overall homework component of 10%. Students were informed during the first class that they could choose to opt out of the online homework, in which case the 10% weight would be transferred to their final exam. Of the 85 students, 5 chose to opt out, 2 of them for financial reasons. Questions from the homework were not reviewed in class unless students explicitly requested that a question be explained; in general, students did not ask any questions regarding the LearnSmart homework. Students who had issues with the software were encouraged to contact McGraw-Hill Education support directly. The majority of issues concerned altering due dates, since the course schedule was fluid and largely based on the students' ability to learn the key concepts and principles.
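The weighting scheme just described (5% LearnSmart plus 5% Connect, with the 10% shifted onto the final exam for students who opt out) can be written out as a short calculation. Only those homework weights and the opt-out transfer come from this chapter; the 60%/30% split assumed for the remaining components is for illustration only.

```python
# Sketch of the grading scheme described above. Only the 5% + 5% homework
# weight and the opt-out transfer are from the chapter; the 60% "other"
# (quizzes, labs, etc.) and 30% final-exam weights are assumed.

def course_grade(learnsmart, connect, other, final_exam, opted_out=False):
    """All inputs are percentages (0-100); returns the final course grade."""
    if opted_out:
        # The 10% homework weight is transferred onto the final exam.
        return 0.60 * other + 0.40 * final_exam
    return (0.05 * learnsmart + 0.05 * connect
            + 0.60 * other + 0.30 * final_exam)

# Same quiz/lab and exam performance, with and without the homework:
print(round(course_grade(90, 80, 70, 75), 2))                # 73.0
print(round(course_grade(0, 0, 70, 75, opted_out=True), 2))  # 72.0
```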

Groupwork

Students occasionally requested guidance or help with some of the more difficult problems from Connect (usually corresponding to one or two end-of-chapter questions in the more difficult chapters). These questions were then given as group work during class so that groups of 2-4 students could work collaboratively to find a solution, with occasional assistance from other groups or from me if needed. Additionally, students worked in groups in class and in tutorials to solve more complex, integrated problems from the end-of-chapter questions in the textbook, to write new problems, or to identify areas of difficulty. Each chapter was tested separately, with the exception of the introductory chapters (background to gas laws), equilibrium (gas phase, acid/base and solubility) and atomic/molecular structure (the nature of the quantum atom, electron configurations and periodicity, Lewis structures and VSEPR), for which multiple chapters were included in a single test.


How Did Students Do?

Although no statistical or formal research studies were carried out, students who actively participated in the online homework (completing the LearnSmart fully, the Connect fully, or both to at least 70%) did obtain better final grades in the course. A comparison of students' quiz grades for each chapter with their homework grades showed that students who already understood the material tended to get higher marks on the homework component. A significant portion of the class (as much as 25%) had difficulty both with understanding the concepts and with applying the theory to solving problems. These students, who might well have benefited significantly from doing the online homework, were the ones with the lowest completion percentages for the homework exercises. Whether this was because they were overwhelmed with the workload of the course, or because they were unable to spend the time required to complete the exercises, is uncertain. The students who benefited most were those in between, that is, neither the best nor the worst; these students completed an average of 60-100% of the homework assignments, and there was a clear increase in their final exam grades as compared to their quiz grades. With regard to the metacognitive component of the LearnSmart questions, it may appear at first glance that the question is simply asking how confident the student is in the response, which may not seem to be an aspect of metacognition. One broad definition of metacognition, however, is reflection on one's own learning or understanding (25, 26). Ideally, for a student to respond to the question "Do you know the answer?" at all, the student would have had to think about whether they really do know the answer and, in so doing, analyze why they think they do or do not know it. That process constitutes a metacognitive prompt that none of the other online homework programs contains.
Whether students actually go through this process, or whether they simply click randomly on one of the four choices, is much more difficult to determine, and no questions regarding this particular process were asked of the students in the class.

Student Feedback

Many students offered anecdotal evidence of their experience with the online homework. The majority of students felt there was too much work required for the course and that the homework was just one more thing for them to do, offering comments along the lines of "overwhelming" or "so much homework" in response to the question "How could the course and/or the teaching be improved?". Nevertheless, a large number (more than 50%) felt that the homework did help them to improve their understanding of the concepts, the language used, the chemical theory connecting the microscopic with the macroscopic, and the ability to apply this theory to solve problems (comments similar to "Learnsmart helped", in response to the question "What did you like about the course and/or the teaching?"). Most notably, many students strongly expressed that SmartBook helped them learn how to read the textbook. Previously, they would read every word on every page, assigning equal importance to all the text. SmartBook helped them determine which parts of the text were more relevant and guided their focus. Students frequently expressed surprise that content they had thought was important turned out not to be, and conversely, that content they did not think they needed to know was highlighted as an area of focus. Once they started using SmartBook and became accustomed to it, students who used to say that the content of a quiz took them completely by surprise, and that they were not prepared even though they had studied, began saying that the test was reasonable but long. In other words, although they had trouble completing the material within the specified time, they were expecting to see the content that was on the test. This was one of the most positive outcomes of using SmartBook. Students who used SmartBook felt that their understanding of the terminology improved. They were able to differentiate between words that they had previously felt meant the same thing (for example, heat and temperature, or frequency and wavelength). They were also able to define the terminology more clearly and with less ambiguity, and they were more comfortable with the units associated with quantities. One feature of the LearnSmart questions that they appreciated was the "Challenge" button, which appears after an answer has been submitted, as shown previously in Figure 4. If they entered a response that they felt was correct but were told it was incorrect, they had the option to challenge the grading. The challenge was sent directly to those who had written the questions, and a response was received within 48 hours. If several students challenged the same question, the question was modified and instructors were notified.
Students felt empowered by the opportunity to challenge what they felt was an incorrect grading and appreciated being told that their response was correct or being informed as to why their response was incorrect. They also found it interesting that they did not necessarily get the same questions as their friends even if they did the assignment at the same time, since the branching of the questions depended on their responses as well as their confidence level. It did take a while for students to understand that an assignment with 20 questions sometimes took a very long time to complete if they were providing incorrect responses to multiple questions. Many students found that frustrating at first, but soon realized that reading the text before doing the assignments made it easier, faster and more interesting to do the assignment. Room for Improvement Those students who found the experience frustrating spent much more time on the LearnSmart assignment (as indicated by reports showing how much time was spent by individual students and also which questions were incorrect). These students had difficulty with the concepts and terminology from the beginning of the course (based on quiz grades), and felt that the time it was taking them to complete the LearnSmart exercise was not really helping them to learn (based on end-of-semester evaluations). This was definitely one of the negative aspects of LearnSmart. However, as with any homework system, the time spent on doing the homework does have a direct impact on student understanding (27).Overall, 94 Schultz et al.; Technology and Assessment Strategies for Improving Student Learning in Chemistry ACS Symposium Series; American Chemical Society: Washington, DC, 2016.

Downloaded by UNIV OF GEORGIA on December 6, 2016 | http://pubs.acs.org Publication Date (Web): November 22, 2016 | doi: 10.1021/bk-2016-1235.ch005

students stated that they felt that LearnSmart was useful and helped them to improve their grade. Connect received more mixed reviews than either LearnSmart or SmartBook. The mixed review was likely due to the selection of problems on the assignments. Problems of higher order thinking and of an integrated nature were selected for the assignments so as to provide an opportunity for students to practice problems of the type that they would typically see on a test or exam. However, students reported that they felt unprepared to approach such problems as the only background they had was with the concepts and ideas taught through LearnSmart and the example type problems done when needed in class. Students felt that they were spending too much time trying to find a way to approach the problems and rarely actually got to solve them. Student Suggestions for the Future Several students suggested that a better approach would be to use Connect to teach the individual steps required to solve a problem (for example, how to convert mass to amount, how to use Hess’ law to find enthalpy of reaction, or how to use an ICE table) and then to have the students work in groups to solve a more complex, integrated problem using these individual steps. This modification would only necessitate a different selection of questions for the assignment and this approach will be tried in the next iteration of the course. Fewer questions in each assignment was also a suggestion that was endorsed by many of the students. With the exception of these two issues, students found the homework forced them to stay current with the material and did lead them to ask more questions in class, which had a net positive effect. When informally asked whether they felt more engaged in class, several students commented that they found that having to complete the LearnSmart exercises before class meant they had to read the chapter in advance. 
They could then pose questions in class regarding content they did not understand, or content that interested them, which kept them engaged in the course. Overall, with this particular group of students, the combination of Connect and SmartBook had a very positive effect on student learning, on quiz grades, on the final exam grade and, overall, on final course grades.

McGraw-Hill Education Canada Case Studies

In general, in other courses where Connect and SmartBook are used, the outcome has been similarly positive (28, 29). Results from a survey conducted by McGraw-Hill Education Canada in 2016 among 2100 Canadian college and university students indicated that 81% of respondents felt that SmartBook helped to make the course more interesting and engaging. When compared with e-books, more than 83% of students consistently indicated that SmartBook helped them retain information for longer periods of time, kept them engaged, helped them use their time more effectively, helped them prepare for class and helped them improve their grades. Over 75% of students felt similarly about SmartBook when compared to a standard print textbook.

Written comments echoed these sentiments about SmartBook: students repeatedly offered remarks such as "fantastic!", "gave me confidence", "interesting and useful", "loved it", and "felt very prepared". Most of the negative comments associated with SmartBook related to the cost of the product, such as "Overpriced". Most interestingly, when asked whether they would choose to purchase Connect with SmartBook for a future course even if their instructor chose not to assign grades for it, over 62% of students said that they would.


Instructor Point of View

While the utility of SmartBook to students has been discussed to this point, it is essential to examine SmartBook from the point of view of instructors as well. Instructors can obtain statistics on student responses to the LearnSmart questions that include the metacognitive component. The area of least concern is students who think they know the correct answer and who also submit the correct response. The area of greatest concern is students who think they know the correct answer and yet submit an incorrect response. These students are usually overconfident and mistakenly believe that their understanding is complete; this group often encounters the greatest difficulty on tests and exams. SmartBook allows such students to be identified in advance so that their learning gaps can be addressed. Students also have access to these reports, so they can use them to understand and address their own knowledge deficiencies.

While less troublesome, the other group that merits concern is students who indicate that they are unsure of, or do not know, the correct response but still respond correctly. Generally, such students lack confidence: although they have the knowledge, they are hesitant to apply it and thus may perform poorly on tests and exams. Once identified, these students can be met with and encouraged to be more confident in their knowledge. One key benefit of SmartBook lies in its ability to help instructors determine where students are having difficulty and to guide them, in the most helpful and appropriate way, towards a better understanding of the course content so that the greatest number of students can succeed in the course. Many other reports can be generated that give specific information about particular students, specific information for all students, and average or general values for all students.
Once a particular session of a course is complete, these reports can help instructors assess which teaching strategies were useful to students and which techniques may need to be reassessed. The major disadvantage of SmartBook is the frustration of students who resent spending more time on an assignment than it is supposed to take. When an instructor sets an assignment, there is an indication of the number of questions and how long the assignment should take to complete. However, if students have difficulty with the questions, or if their metacognitive response and question response are not aligned, the adaptive learning software creates additional questions to reinforce learning. As a result, in an assignment with 20 questions, a student may end up answering 40 questions and taking significantly more time

to complete the assignment than expected. This frustration and resentment on the part of the student are often conveyed to the instructor. One way to address this issue is to openly inform students of the purpose and value of the adaptive learning system and the reason assignments may take longer than expected. This does not resolve the issue, but it may help students deal with the frustration more purposefully.


Conclusion

Despite the many tools, both physical and online, and the multiple in-class strategies used to help them, students continue to struggle with the introductory general chemistry course. The McGraw-Hill Education online supports, Connect and SmartBook, provide a powerful advantage to both students and instructors. Targeted, adaptive techniques that include metacognitive prompts allow students to read with focus, practice what they have read with understanding and apply what they have learned with confidence, leading to deeper learning over a longer period of time. Instructors are able to see where students are having difficulty, either in their actual learning or in their perception of their learning, and apply constructive support to correct misconceptions, address learning gaps and bolster or temper confidence, depending on the individual student's needs.

The online tools, paired with in-class activities such as think-pair-share exercises or active group work, provide a holistic approach to helping students grasp the essential concepts and terminology in the course and then apply these ideas to solve more complex, real-world, or open-ended problems. Student feedback on how best to apply these tools must be taken into consideration to provide the best learning experience possible. Building a pathway of learning using stepwise support is a logical plan, and one that agrees with student comments in evaluations. Explaining at the beginning of the course how this system differs from other homework systems, and the value of thinking metacognitively, would also help students understand the purpose of the homework. Hopefully, this will also encourage students to think about their answer to the metacognitive part of the response (which encourages metacognition on multiple levels) and help them learn more efficiently.
Based on student feedback and instructor experience with SmartBook, future studies could include a comparison of how well SmartBook helps students learn relative to other publishers' e-books, a study of whether SmartBook makes students reliant on being told what text to read or actually helps them learn to read a text independently, and an investigation of whether students actually use a metacognitive process to answer the question "Do you know the answer?". With SmartBook providing strong support for the underlying ideas and chemical concepts, and Connect used to give students the opportunity to see the different ways in which these ideas and concepts can be applied to solve rudimentary chemical problems, instructors can use valuable in-class time to encourage students to work together to solve problems that require students to make connections between different ideas in chemistry, that need students to

think beyond the normal parameters to which they are accustomed and to work collegially and combine their individual strengths to arrive at a solution.

References

1. Laidler, K. J. Too much to know. J. Chem. Educ. 1974, 51, 696–700, DOI: 10.1021/ed051p696.
2. Carter, C. S.; Brickhouse, N. W. What makes chemistry difficult? Alternate perceptions. J. Chem. Educ. 1989, 66, 223–225, DOI: 10.1021/ed066p223.
3. Herron, J. D. Using research in chemical education to improve my teaching. J. Chem. Educ. 1984, 61, 850–854, DOI: 10.1021/ed061p850.
4. Bodner, G. M. Constructivism: A theory of knowledge. J. Chem. Educ. 1986, 63, 873–878, DOI: 10.1021/ed063p873.
5. Champagne, A. B.; Klopfer, L. E.; Gunstone, R. F. Cognitive research and the design of science instruction. Educational Psychologist 1982, 17, 31–53, DOI: 10.1080/00461528209529242.
6. Chiu, M. H.; Chou, C. C.; Liu, C. J. Dynamic processes of conceptual change: Analysis of constructing mental models of chemical equilibrium. J. Res. Sci. Teach. 2002, 39, 688–712, DOI: 10.1002/tea.10041.
7. Bergquist, W.; Heikkinen, H. Student ideas regarding chemical equilibrium. J. Chem. Educ. 1990, 67, 1000–1003, DOI: 10.1021/ed067p1000.
8. Nakhleh, M. B. Students' models of matter in the context of acid-base chemistry. J. Chem. Educ. 1994, 71, 495–499, DOI: 10.1021/ed071p495.
9. Krishnan, S. R.; Howe, A. C. The mole concept: Developing an instrument to assess conceptual understanding. J. Chem. Educ. 1994, 71, 653–655, DOI: 10.1021/ed071p653.
10. Furio, C.; Azcona, R.; Guisasola, J.; Ratcliffe, M. Difficulties in teaching the concept of amount of substance and mole. Int. J. Sci. Educ. 2000, 22, 1285–1304, DOI: 10.1080/095006900750036262.
11. Hackling, M. W.; Garnett, P. J. Misconceptions of chemical equilibrium. Eur. J. Sci. Educ. 1985, 7, 205–214, DOI: 10.1080/0140528850070211.
12. Nakhleh, M. B. Why some students don't learn chemistry: Chemical misconceptions. J. Chem. Educ. 1992, 69, 191–196, DOI: 10.1021/ed069p191.
13. Gorodetsky, M.; Gussarsky, E. Misconceptualization of the chemical equilibrium concept as revealed by different evaluation methods. Eur. J. Sci. Educ. 1986, 8, 427–441, DOI: 10.1080/0140528860080409.
14. Nakhleh, M. B.; Mitchell, R. C. Concept learning versus problem solving: There is a difference. J. Chem. Educ. 1993, 70, 190–192, DOI: 10.1021/ed070p190.
15. Nakhleh, M. B. Are our students conceptual thinkers or algorithmic problem solvers? Identifying conceptual students in general chemistry. J. Chem. Educ. 1993, 70, 52–55.
16. Diener, L. Selected online resources for teaching about alternative energy. J. Chem. Educ. 2012, 89, 950–952, DOI: 10.1021/ed200068y.
17. Kirchhoff, M. Online resources for teachers and students from the American Chemical Society. J. Chem. Educ. 2009, 86, 127, DOI: 10.1021/ed086p127.
18. Donovan, W. J.; Nakhleh, M. B. Students' use of web-based tutorial materials and their understanding of chemistry concepts. J. Chem. Educ. 2001, 78, 975–980, DOI: 10.1021/ed078p975.
19. Shields, S. P.; Hogrebe, M. C.; Spees, W. M.; Handlin, L. B.; Noelken, G. P.; Riley, J. M.; Frey, R. F. A transition program for underprepared students in general chemistry: Diagnosis, implementation and evaluation. J. Chem. Educ. 2012, 89, 995–1000, DOI: 10.1021/ed100410j.
20. Eichler, J. F.; Peeples, J. Online homework put to the test: A report on the impact of two online learning systems on student performance in general chemistry. J. Chem. Educ. 2013, 90, 1137–1143, DOI: 10.1021/ed3006264.
21. Richards-Babb, M.; Curtis, R.; Georgieva, Z.; Penn, J. H. Student perceptions of online homework use for formative assessment of learning in organic chemistry. J. Chem. Educ. 2015, 92, 1813–1819, DOI: 10.1021/acs.jchemed.5b00294.
22. Evans, J. A. OWL (Online Web-Based Learning) (published by Cengage-Brooks/Cole). J. Chem. Educ. 2009, 86, 695–696, DOI: 10.1021/ed086p695.
23. Silberberg, M. S.; Lavieri, S.; Venkateswaran, R. Chemistry: The Molecular Nature of Matter and Change, 1st CE; McGraw-Hill Education: Canada, 2013.
24. Dunning, D.; Johnson, K.; Ehrlinger, J.; Kruger, J. Why people fail to recognize their own incompetence. Current Directions in Psychological Science 2003, 12, 83–87, DOI: 10.1111/1467-8721.01235.
25. Rickey, D.; Stacy, A. M. The role of metacognition in learning chemistry. J. Chem. Educ. 2000, 77, 915–920, DOI: 10.1021/ed077p915.
26. Cooper, M. M.; Sandi-Urena, S. Design and validation of an instrument to assess metacognitive skillfulness in chemistry problem solving. J. Chem. Educ. 2009, 86, 240–245, DOI: 10.1021/ed086p240.
27. Leinhardt, G.; Cuadros, J.; Yaron, D. "One firm spot": The role of homework as lever in acquiring conceptual and performance competence in college chemistry. J. Chem. Educ. 2007, 84, 1047–1052, DOI: 10.1021/ed084p1047.
28. Welch, D. Case Study; Franklin University, Columbus, OH; McGraw-Hill, 2010.
29. Independent case study of over 700 students studying Anatomy and Physiology I at six distinct institutions; McGraw-Hill, 2012.
