Using Text Messages To Encourage Meaningful Self-Assessment Outside of the Classroom

Deborah G. Herrington*,† and Ryan D. Sweeder‡

†Chemistry Department, Grand Valley State University, Allendale, Michigan 49401, United States
‡Lyman Briggs College, Michigan State University, East Lansing, Michigan 48825, United States




ABSTRACT: Recent articles in this Journal have advocated for focusing instruction on core ideas of the discipline and moving assessment beyond content knowledge to also elicit evidence about what students can do with their knowledge. Yet the deep, connected learning supported by this kind of instruction and assessment requires students to engage in meaningful learning activities both within and outside of the classroom. We can structure learning activities within our classes that require students to use and apply their knowledge, construct explanations and arguments, and engage in discussion with others; however, we have less control over what students do outside of the classroom. Further, meaningful engagement of students with these types of assessments outside the classroom requires a means for closing the assessment loop, using student responses to inform instruction and support learning and development of these skills. This paper describes a free, easy-to-use system, combining Remind and Google Forms, that allows instructors to engage students in the types of assessment questions called for in the literature by leveraging the method of communication students use most frequently: text messaging. Through this pilot study at two different institutions, we have identified key issues that instructors should consider when implementing the system as a means for incorporating student self-assessment and instructor formative assessment into their classes. Overall, student feedback on the use of the Remind system as described in this paper was positive. Students indicated that it helped them assess their own understanding and prompted them to work on chemistry more frequently. Additionally, students indicated a preference for afternoon or early-evening messages and a particular dislike for questions requiring them to draw and upload pictures.

KEYWORDS: High School/Introductory Chemistry, First-Year Undergraduate/General, Curriculum, Testing/Assessment



BACKGROUND


Goal of Assessment

A 2015 commentary in this Journal by Cooper1 argued that if a “goal of science is to develop explanatory theories to help us organize our understanding and make predictions about the natural world”, then a goal of science education should be to “help students construct causal, mechanistic explanations of phenomena”. Because the way in which we assess students tells them what we value and strongly influences their study practices,2−6 if a goal is for students to construct explanations of phenomena, then our assessments should reflect this. However, assessment practices, particularly in introductory courses where breadth of content coverage is often favored over depth of conceptual understanding, rarely require students to construct such causal, mechanistic explanations; rather, they largely target disaggregated knowledge fragments. More recently, in this and other journals, authors have advocated for moving beyond doing calculations to requiring students to explain the results of their calculations at the particulate (submicroscopic) level,7 and have provided suggestions for adapting traditional assessment questions to require students to integrate chemistry core ideas with scientific practices and crosscutting concepts.8 Yet the ability to connect macroscopic, symbolic, or mathematical representations to the particulate level and construct causal mechanistic explanations of phenomena is not intuitive.9 It is a skill and way of thinking that students must be taught and have an opportunity to practice, both inside and outside of the classroom. Further, development of the deep, connected understanding that allows for the construction of explanations requires student self-assessment and self-regulated learning, something that many students struggle with.

Self-Assessment and Self-Regulated Learning

The use of self-assessment or self-regulated learning (SRL) strategies is strongly correlated with student success and is necessary for students to apply chemistry content knowledge and construct causal mechanistic explanations of phenomena. Unfortunately, most students have little or no experience or training in how to self-assess or study effectively.10 This generally results in students employing ineffective learning strategies, such as rereading passages in the text multiple times or repeated practice of the same skill, while eschewing more effective techniques such as self-testing or interleaved practice.11,12 Instructors can structure their classes in such a way as to promote and support meaningful learning activities that engage students in SRL strategies, but they have less control over what students do outside of the classroom. These challenges have led to suggestions for improving student performance by providing intentional training in chemistry courses to help students focus on how to learn. Cook et al.13 have reported in this Journal the very positive impacts of introducing metacognition training into their general chemistry classes. Further, their in-depth description of The Study Cycle has helped propagate similar practices in many introductory chemistry classrooms. A student who demonstrates strong metacognitive skills can identify his or her own content weaknesses and can then employ deliberate practice methods to address those deficiencies. However, this is a skill that often requires practice, and students therefore can benefit from opportunities that help them identify their own content weaknesses.




Homework

Instructors suggest or require homework to encourage students to identify their own content and procedural weaknesses, and students who complete assigned homework tend to be more successful in general chemistry.14 In large general chemistry courses, this largely means online homework, which allows instructors to provide graded assignments to large numbers of students, often with randomized questions so that each student receives a unique assignment, and with immediate, sometimes customized, feedback.15 Unfortunately, the questions used by these systems are generally limited to multiple-choice or numerical/short-answer formats, as these can be easily graded by such programs.16 They do not provide students with practice applying their knowledge and constructing explanations. Further, students often view online homework as a way to get points and focus on getting the correct answer rather than using it as a self-assessment or learning tool.17 This is doubly problematic, as the development of the deep, connected understanding that allows for the construction of explanations requires students to assess what they know and what they do not know.

Leveraging Students’ Preferred Technology

Though many chemistry instructors may be most comfortable with their computers and email, most of our students are more comfortable with their phones, apps, and text messaging. Ye et al. have taken advantage of students’ preference for text-message communication as a means to engage with students outside of class to monitor their study skills.18 Though understanding differences in student study habits can help explain differences in student performance, and knowing what study habits students are using can help instructors suggest strategies for improvement, it would be even more useful if students’ constant connection to their phones could be used to prompt them to apply chemistry knowledge to construct causal mechanistic explanations. Here we describe a free, easy-to-use, outside-of-class assessment method that leverages the communication method used most frequently by students, text messaging, to ask students low-stakes questions that require them to apply chemistry content knowledge from recent classes to explain chemical phenomena. Such questions, when structured properly, can serve as student self-assessment, provide students with practice constructing causal mechanistic explanations, and provide information about student understanding that can inform subsequent class instruction.

REMIND ASSESSMENT SYSTEM DESIGN

The ultimate goal in designing this system was to find a way to help students self-assess and to provide them with practice connecting knowledge fragments in a useful manner outside of the classroom. This necessitated engaging students with questions requiring application of content knowledge, construction of arguments, and formulation of explanations of phenomena. Though this can involve some multiple-choice or short-answer questions, it frequently also requires students to provide written explanations or to draw and upload pictures. Given their ability to provide immediate feedback, online homework systems are well suited to helping students gain mastery of routine calculations or answer simple low-level content questions; thus, we view the Remind assessment system as complementary to existing online homework systems. A secondary goal was for the system to serve as a formative assessment to inform subsequent instruction. Accordingly, the overall Remind assessment system, illustrated in Figure 1, was designed on the basis of the following criteria:
(1) Leverage the communication technology that students use most frequently to encourage them to engage with meaningful chemistry questions routinely, rather than all at once or just before the exam.
(2) Engage students with questions that require them to apply chemistry knowledge and construct causal, mechanistic explanations.
(3) Provide information about student understanding that can inform subsequent instruction and close the assessment loop.
(4) Develop a low-cost system that is easy to use for both student and instructor.

Figure 1. Remind assessment system design.

Informed by these overall goals and criteria, the general Remind assessment process starts with students receiving a text message with a link 1−2 days after a class period. The link takes them to a question, or often a series of related questions, that requires them to apply or explain core chemistry concepts.



As the questions relate to content covered most recently in class, they provide a means for students to assess their understanding of the material right away, as opposed to waiting until right before the test or, worse, finding out after the test that they did not understand the concepts. Students are expected to respond to the question before the beginning of the next class. Prior to the next class meeting, the instructor views student responses to obtain a quick snapshot of student understanding of the core concepts from the previous class and uses this information to close the assessment loop and inform instruction in the subsequent class. The Remind assessment system itself is composed of two programs that are both free and easy to use for students and instructors. First, to leverage students’ constant connection to their phones and use of text messaging, we chose Remind,19 which is free for both student and faculty use. Once an instructor sets up an account on Remind, he or she can easily set up classes, each of which generates a set of easy-to-follow instructions and a code that can be used to invite students to join the class. Students can choose to enter their cell phone number and get text messages, get messages through the Remind app, or receive emails. An example of what a Remind message looks like in the app can be seen in Figure 2.

Figure 2. Example of a Remind message.

To accommodate a variety of question formats that allow students to do things like make predictions, perform calculations, construct explanations, evaluate data, and draw pictures to illustrate concepts, Google Forms20 was chosen as the platform for developing the assessment questions. This platform is free, allows for a variety of question formats including file upload, and provides a mechanism for systematic collection of student responses with graphical displays, offering a quick snapshot of how students are answering the questions that can easily be incorporated into subsequent lectures. Though it is possible to post a link to the Google Form on a course management site or send it via email, sending a message through Remind that links to the Google Form meets our students where they are most comfortable. The link takes students directly to the question, which they can answer on their phones or, if they prefer, on a computer. Further, Remind messages can easily be scheduled to go out on a particular day at a particular time, so instructors can schedule questions to go out throughout the week at regular intervals to encourage students to be continually working on chemistry. Though it is possible to use the Remind system to ask more traditional questions or to use it for other purposes, such as surveying students, here we describe the use of this system as a means to meet the call for assessment reform and focus on questions that go beyond typical multiple choice or calculations. Questions were designed to elicit evidence about what students know and can do with that knowledge, as well as to provide students with an opportunity to self-assess their understanding and practice using their chemistry knowledge to construct explanations. A sample question is provided in Box 1.
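The graphical summary just described lives in the Google Forms interface, but instructors who prefer the raw data can export the linked response spreadsheet and tally answers themselves. The sketch below is our illustration rather than part of the published system; it assumes a hypothetical export named responses.csv with columns such as "Radiation type" and "Explanation" (real column names come from the form's question titles).

```python
# Minimal sketch (assumed file and column names): tally one night's
# Google Form responses exported from the linked response spreadsheet.
import pandas as pd

df = pd.read_csv("responses.csv")
print(f"{len(df)} responses received")

# Distribution of a multiple-choice item, as a quick pre-class snapshot.
print(df["Radiation type"].value_counts(normalize=True).round(2))

# Pull a few free-text explanations to project (anonymously) in class;
# assumes at least three non-empty responses exist.
for answer in df["Explanation"].dropna().sample(3, random_state=0):
    print("-", answer)
```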

As they were not the traditional types of assessment questions found in test banks or at the end of textbook chapters, all questions used were developed by the authors. Cooper and co-workers1,8 have previously pointed out that the development of these types of questions takes careful thought. Though we provide some examples here, we refer readers to that work for other examples and suggestions for developing such questions. Once a question was developed, we found that it took about 10−15 min to code the question into Google Forms and set up the Remind message to deploy, depending on the question length and whether an appropriate graphic available through a Creative Commons license needed to be identified and included. Although it is possible for an instructor to personally message individual students and provide them with specific feedback through Remind, we did not do this given how time-consuming it would be for a large class. Rather, we chose to use the student responses to provide data for subsequent class discussion. An example of how the student response data were used in class is illustrated in Box 2.
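Because, as noted above, Remind messages can be scheduled in advance, an instructor might generate the semester's send times once and enter them into Remind's scheduler. The following is a small convenience sketch under assumptions of ours (a Tuesday/Thursday rhythm and the late-afternoon window students preferred; see below); it is not a feature of Remind or Google Forms.

```python
# Sketch (assumed cadence and dates): generate late-afternoon send
# times, two per week, to enter into Remind's scheduler by hand.
from datetime import date, datetime, time, timedelta

SEMESTER_START = date(2018, 8, 28)  # hypothetical first day of classes
WEEKS = 15
SEND_DAYS = {1, 3}                  # Tuesday and Thursday (Monday = 0)
SEND_TIME = time(17, 0)             # 5:00 p.m., per student preference

day = SEMESTER_START
schedule = []
while day < SEMESTER_START + timedelta(weeks=WEEKS):
    if day.weekday() in SEND_DAYS:
        schedule.append(datetime.combine(day, SEND_TIME))
    day += timedelta(days=1)

for when in schedule[:4]:           # preview the first two weeks
    print(when.strftime("%a %b %d, %I:%M %p"))
```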



CLASSROOM IMPLEMENTATION

This pilot study was reviewed and approved as exempt by our Institutional Review Boards (GVSU ref. no. 18-027-H; MSU x17-1192e). Our initial implementation of this system involved voluntary student participation in three general chemistry courses. Two courses were 80-person sections of primarily freshmen that met for 75 min twice per week. The third was a 25-person off-sequence course, primarily for sophomore science majors, that met for 50 min three times per week. All classes met in a typical college lecture hall, and although the classes regularly employed some lecturing, students in all classes were assigned to groups of 3−4 in which they worked daily.
Students received approximately two Remind questions per week. Given that we had a direct way to communicate with students, the Remind system was also used to provide students with study tips, such as encouraging them to engage in self-testing or to visit office hours, and to prompt them to reflect on how different exam preparation approaches influenced their performance, both important elements of SRL. These other types of messages were sent approximately once per week. Routinely, we explicitly emphasized to students that the purpose of the Remind questions was to help them assess their own understanding of the material and that consistently engaging with these questions would ultimately help them perform better on quizzes and tests. However, we did not assign points for participation, largely because we wanted to test the technology and did not want to deal with 185 stressed students if we were assigning points for participation and the technology did not work as intended.

A sample question used early in the first-semester general chemistry course, when discussing electromagnetic radiation, is shown in Box 1. This question goes beyond asking students to perform a routine calculation by asking them to use the result of that calculation to construct an explanation.
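Box 1 itself is not reproduced in the text, but the expected student work is easy to sketch. Assuming, for illustration only, that the prompt supplied a wavelength of 270 nm (our assumption; the actual values in Box 1 may differ), the calculation step would be

```latex
\nu = \frac{c}{\lambda}
    = \frac{3.00 \times 10^{8}\ \mathrm{m\ s^{-1}}}{270 \times 10^{-9}\ \mathrm{m}}
    \approx 1.1 \times 10^{15}\ \mathrm{Hz}
```

and the interpretive step is to recognize that this frequency falls in the ultraviolet region. As discussed below, many students completed the first step but not the second.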
Box 2 shows how the data from student responses to this question were used in subsequent instruction. A graph of the student answers was used to highlight that a fair percentage of the students (40%) were not able to perform the calculation correctly, which indicated that many people still needed to practice this skill, and specific sources of these types of questions could be identified for students (e.g., similar online homework or textbook questions). Further, it should be noted that though 60% of the students (n = 58) who answered this question could do the calculation correctly, only half of those students were able to correctly identify that a frequency of 1.1 × 10^15 Hz corresponds to UV radiation. This illustrates the importance of going beyond just asking students to perform calculations: though we want them to be able to perform the calculation correctly, we also want them to be able to interpret the number and do something with it. It was also notable that another 9% of students who answered the question did not do the calculation correctly but still identified the type of radiation as UV. Moreover, a quick reading of the explanations indicated that students were not able to construct a coherent explanation, nor were many students consistent between the type of radiation they identified and what they discussed in their explanations. This is not surprising, as many students have little experience constructing such explanations and often do not think about being consistent in their answers from one part of a question to the next. Though some explanations were better than others, at this point in the semester all explanations were missing key pieces, despite the fact that the question prompt indicated to students what key pieces should be included. To support explanation construction, example anonymous student responses were provided to the class, and students discussed each response in groups of 3−4, trying to identify the claim, evidence, and reasoning in each. After about 2 min of small-group discussion, the whole class decided that each answer was missing key pieces; for example, students indicated that option A was just a claim with no supporting evidence or reasoning. With this, as a class, we were able to construct a better explanation that contained all of the required pieces. Further, examining a single complete student response allowed us to explicitly highlight the issue of inconsistencies within a response: for the case in Box 2, the student identified the radiation type as IR but then proceeded to talk about UV radiation in the subsequent part of the question. Though as instructors we find this problematic, many students do not unless it is explicitly pointed out to them.

In another example, about gas laws (Box 3), students were asked to draw a representation of the molecules inside a volleyball and to depict how it would change if the temperature dramatically increased or decreased. Students used their phones to take a picture of their drawing and then uploaded the photo. Several common themes were present in the drawings, and exemplars were identified to display in class. These anonymous drawings were critiqued in groups to identify strengths and weaknesses and thereby help students develop a more robust understanding. Most of the students focused on a single relationship (between temperature and volume, or temperature and pressure) for both the temperature increase and the decrease.

However, this real-world scenario is perhaps best answered by recognizing that, with increasing temperature, the pressure increases largely because the volleyball’s construction physically prevents an increase in volume, whereas with decreasing temperature, the volume likely decreases so that the internal and external pressures remain equal.
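One compact way to frame these two regimes for students, assuming the air in the ball behaves ideally (our framing; Box 3 itself asks only for drawings), is

```latex
PV = nRT \quad\Rightarrow\quad
\begin{cases}
\text{taut ball (heating): } V \approx \text{const}, & \dfrac{P_2}{P_1} = \dfrac{T_2}{T_1},\\[4pt]
\text{slack ball (cooling): } P \approx P_{\text{ext}}, & \dfrac{V_2}{V_1} = \dfrac{T_2}{T_1}.
\end{cases}
```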


This is illustrated nicely by the second student answer in Box 3. In contrast, the first student’s answer in Box 3 provides a good example of focusing solely on how temperature affects the volume. Comparing and contrasting these two anonymized drawings provided a great lead-in to a class discussion about how ball construction actually affects what happens. A comparison could be made to taking a ball that was already full, pumping more air into it, and asking students whether the ball actually got bigger. By seeing and critiquing the drawings, students gained a recognition of how to better apply gas law concepts to explain macroscopic observations. This was demonstrated on an exam question for which students needed to draw and explain how a weather balloon changed as it increased in elevation; 88% of students produced accurate drawings and explanations.

The authors did not find that the incorporation of this new technology had a notable impact on the amount of content covered in class; rather, it represented a shifting of time. The approximately 5 min usually spent at the beginning of a class period on a Remind question replaced the review of the prior day’s content or the warm-up questions that would typically start a class. We also found that it was not uncommon to be able to drop one in-class example with the knowledge that students would be asked to practice the material on their own outside of class. In this way, the Remind system allowed students to engage with an active-learning problem without dedicating in-class time to students actually working; instead, class time was dedicated exclusively to discussion of the problem after all students had spent as much time on it as they were willing to spend. Moreover, this format encourages students to answer the question on their own, rather than relying on their group members to propose an answer with which they may simply agree.

STUDENT USE

As previously indicated, in this pilot phase student responses to the Remind questions were voluntary, with the exception of the one question marked with an asterisk in Figure 3. As can be seen in the combined data from the two 75 min classes (Figure 3), student participation prior to Exam 1 was moderate, but it dropped off considerably thereafter. Informal discussions with students suggested that students looked at the questions when they were sent but did not necessarily answer them. A quick midsemester survey gave additional insight into how students were using these questions and why they were or were not responding. Survey responses (77% response rate) were consistent with informal conversations, indicating that most students were always or almost always looking at the questions when they were sent (86%) but that substantially fewer students were answering the questions with this frequency (48%) (Figure 4).

Figure 3. Response rate to Remind questions.
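For instructors who want to track participation across questions, as in Figure 3, the counting can be automated. Below is a sketch under assumptions: hypothetical per-question exports q01.csv, q02.csv, ..., and an assumed combined enrollment; neither the file naming nor the class size comes from the study.

```python
# Sketch (assumed files and enrollment): per-question response rates
# of the kind plotted in Figure 3.
from pathlib import Path
import pandas as pd

CLASS_SIZE = 160  # hypothetical combined enrollment of the two sections

rates = {}
for csv in sorted(Path(".").glob("q*.csv")):
    responses = pd.read_csv(csv)
    # One row per submission; resubmissions are not deduplicated here.
    rates[csv.stem] = len(responses) / CLASS_SIZE

for question, rate in rates.items():
    print(f"{question}: {rate:.0%}")
```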

Figure 4. Frequency of question engagement.

Further, from Figure 5 it appears that students were largely not answering the questions because they were not making them a priority; they would look at them, not answer right away, and then forget. Other impediments to responding that students shared were the time at which the question was sent and the type of question. Initially, we tried posting questions at different times of day to see how this affected the response rate. Students overwhelmingly indicated that they preferred questions posted in the late afternoon or early evening.



Figure 5. Reasons for noncompletion of questions.

They did not like questions posted in the morning or late at night (10 or 11 p.m.), and they did not like questions posted on Friday night or Saturday. Further, students generally did not like questions for which they had to upload a picture. Because pictures can provide important information about student understanding of chemistry concepts, we consider questions that require students to draw particulate diagrams or show their work, and then upload pictures of their diagrams or work, to be important; we therefore did include some of these questions but tried to keep them to a minimum for this pilot given students’ dislike of them (in Figure 3 these were Q1, Q3, Q12, and Q13).



STUDENT RESPONSE

The overall student response to the Remind system was positive. Several students made unsolicited comments about the value of the Remind questions in both midsemester and end-of-course evaluations. In particular, students valued the self-assessment aspect. Sample student comments include the following:
• Doing the remind questions is a very helpful way to see what you know vs what you do not (midsemester)
• Keep with the reminds, they were helpful and reminded me to work on chem (end of semester)
Students also indicated that they found starting class from the Remind questions useful:
• Reviewing the Remind questions, because it clearly explains things that we may have had trouble (midsemester)
The most common critique of the Remind questions was that students did not get credit for completing them:
• I think the remind questions are helpful but because there really is not any incentive for doing them grade wise so that is why many of us do not do them (midsemester)



CONCLUSIONS AND FUTURE WORK

The Remind/Google Forms system provides a free, easy-to-use mechanism for instructors to meet students when and where they are, providing students with questions that require application of chemistry concepts and that help them assess their own understanding of course content. The text-messaging format allows instructors to send questions spread throughout the week to encourage students to think about chemistry continually, as opposed to trying to cram everything in right before an exam. With the Google Forms data summary, instructors can quickly gain an overall perspective on how well students understand the content, and summary graphs or anonymous individual student responses can easily be incorporated to guide discussion in subsequent classes, thus closing the assessment loop. Further, student data indicated that though students might be able to answer a typical recall or calculation type question, far fewer could interpret their calculation or use the result to construct a coherent explanation. This supports the need for systems like this one, which allow us to engage students with questions beyond those available in typical online homework systems.

Though students generally reported that they found the Remind questions useful, both for helping them self-assess and for reminding them to work on chemistry, without some form of immediate external motivation, such as awarding points for completing the questions, overall student participation was low. Students reported frequently looking at the questions but not making time to actually answer them unless rewarded with points. Future implementations will therefore provide students with points for completing the questions. Another barrier to student participation was the timing of the posts. We often think of our students as working late at night; however, students overwhelmingly did not like questions posted late at night. They much preferred late-afternoon to early-evening posting of questions.

Now that we have found this system to be easy to use on both the student and instructor sides, and have indications that it provides a means for student self-assessment as well as formative assessment of instruction, we plan to conduct a study that examines the relationship between the level of student participation in answering these questions and performance on summative course assessment measures (exams and final grades). Further, we have some qualitative evidence that students’ abilities to write coherent explanations improved over the course of the semester. Accordingly, we plan to develop a rubric to assess the quality of student explanations and to examine whether there is any relationship between student participation in answering the Remind questions and changes in the quality of their explanations.

AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected].

ORCID
Deborah G. Herrington: 0000-0001-6682-8466

Notes
The authors declare no competing financial interest.

ACKNOWLEDGMENTS

The authors thank the reviewers for their thoughtful comments and our students for providing valuable feedback and allowing us to use examples of their work.



REFERENCES

(1) Cooper, M. M. Why Ask Why? J. Chem. Educ. 2015, 92 (8), 1273−1279.
(2) Momsen, J.; Offerdahl, E.; Kryjevskaia, M.; Montplaisir, L.; Anderson, E.; Grosz, N. Using Assessments to Investigate and Compare the Nature of Learning in Undergraduate Science Courses. CBE-Life Sci. Educ. 2013, 12 (2), 239−249.
(3) Scouller, K. The Influence of Assessment Method on Students’ Learning Approaches: Multiple Choice Question Examination versus Assignment Essay. High. Educ. 1998, 35 (4), 453−472.
(4) Scouller, K.; Prosser, M. Students’ Experiences in Studying for Multiple Choice Question Examinations. Stud. High. Educ. 1994, 19 (3), 267−279.


(5) Snyder, B. The Hidden Curriculum; The MIT Press: Cambridge, MA, 1973.
(6) Crooks, T. J. The Impact of Classroom Evaluation Practices on Students. Rev. Educ. Res. 1988, 58 (4), 438−481.
(7) Seery, M. Take It Easy on the Equations. Let Me Explain.... Educ. Chem. 2018, May 2018.
(8) Underwood, S. M.; Posey, L. A.; Herrington, D. G.; Carmel, J. H.; Cooper, M. M. Adapting Assessment Tasks To Support Three-Dimensional Learning. J. Chem. Educ. 2018, 95 (2), 207−217.
(9) Johnstone, A. H. Why Is Science Difficult to Learn? Things Are Seldom What They Seem. J. Comput. Assist. Learn. 1991, 7 (2), 75−83.
(10) Hartwig, M. K.; Dunlosky, J. Study Strategies of College Students: Are Self-Testing and Scheduling Related to Achievement? Psychon. Bull. Rev. 2012, 19 (1), 126−134.
(11) Bjork, R. A.; Dunlosky, J.; Kornell, N. Self-Regulated Learning: Beliefs, Techniques, and Illusions. Annu. Rev. Psychol. 2013, 64 (1), 417−444.
(12) Hora, M. T.; Oleson, A. K. Examining Study Habits in Undergraduate STEM Courses from a Situative Perspective. Int. J. STEM Educ. 2017, 4 (1). DOI: 10.1186/s40594-017-0055-6.
(13) Cook, E.; Kennedy, E.; McGuire, S. Y. Effect of Teaching Metacognitive Learning Strategies on Performance in General Chemistry Courses. J. Chem. Educ. 2013, 90 (8), 961−967.
(14) Leinhardt, G.; Cuadros, J.; Yaron, D. “One Firm Spot”: The Role of Homework as Lever in Acquiring Conceptual and Performance Competence in College Chemistry. J. Chem. Educ. 2007, 84 (6), 1047.
(15) Eichler, J. F.; Peeples, J. Online Homework Put to the Test: A Report on the Impact of Two Online Learning Systems on Student Performance in General Chemistry. J. Chem. Educ. 2013, 90 (9), 1137−1143.
(16) Cadaret, C. N.; Yates, D. T. Retrieval Practice in the Form of Online Homework Improved Information Retention More When Spaced 5 Days Rather than 1 Day after Class in Two Physiology Courses. Adv. Physiol. Educ. 2018, 42 (2), 305−310.
(17) Khanlarian, C.; Shough, E.; Singh, R. Student Perceptions of Web-Based Homework Software: A Longitudinal Examination. In Advances in Accounting Education; Emerald Group Publishing Limited, 2010; Vol. 11, pp 197−220.
(18) Ye, L.; Oueini, R.; Dickerson, A. P.; Lewis, S. E. Learning beyond the Classroom: Using Text Messages to Measure General Chemistry Students’ Study Habits. Chem. Educ. Res. Pract. 2015, 16 (4), 869−878.
(19) Remind. https://www.remind.com/ (accessed May 11, 2018).
(20) Google Forms: create and analyze surveys, for free. https://www.google.com/forms/about/ (accessed May 11, 2018).
