
Characterizing Peer Review Comments and Revision from a Writing-to-Learn Assignment Focused on Lewis Structures

S. A. Finkenstaedt-Quinn,† E. P. Snyder-White,† M. C. Connor,† A. Ruggles Gere,‡ and G. V. Shultz*,†

†Department of Chemistry, University of Michigan, Ann Arbor, Michigan 48109-1055, United States
‡Sweetland Center for Writing, University of Michigan, Ann Arbor, Michigan 48109-1055, United States

Received: August 31, 2018
Revised: December 7, 2018
DOI: 10.1021/acs.jchemed.8b00711




ABSTRACT: Lewis structures are fundamental to learning chemistry, yet many students struggle to develop a complex understanding of their meaning and uses. Writing-to-Learn supports students in developing a deeper conceptual understanding of a topic, making it an ideal pedagogy to apply to student learning of Lewis structures. One difficulty often associated with classroom writing is the capacity of instructors to provide feedback to each student on their written work; however, this practical constraint can be mitigated through incorporating peer review. Peer review and revision are known to support conceptual learning and yet are underutilized in STEM (Science, Technology, Engineering, and Math) classrooms. Additionally, peer review is an authentic, common, and necessary practice in chemistry research, which warrants its incorporation early on in chemistry courses. A major concern regarding the use of peer review-based feedback is the ability of students to provide concept-based feedback that is both correct and detailed enough to enhance student understanding and support substantial revisions. In response, this work investigates the relationship between revision and the characteristics of students' peer review comments in the context of a Writing-to-Learn assignment focusing on student understanding of Lewis structures. Chemistry students wrote a summary of Lewis' 1916 paper introducing Lewis structures, participated in peer review, and revised their work in response to a structured prompt detailing specific chemistry concepts to be covered. The peer review comments and students' revisions were thematically analyzed. The peer review comments were deductively analyzed according to an analytical framework to characterize the usefulness of comments. The extent and type of student revisions were also analyzed and paired, if relevant, with the associated peer review comments. Results indicate that students provided both detailed and conceptually focused comments on their peers' work, irrespective of the specific chemistry content being addressed. Although the assignment and peer review rubric were content focused, students made a mixture of editing and content revisions following peer review. These results suggest that further scaffolding of what constitutes good feedback and revision may further promote student learning.

KEYWORDS: First Year Undergraduate/General, Chemistry Education Research, Communication/Writing, Lewis Structures, Constructivism

FEATURE: Chemical Education Research



INTRODUCTION

Lewis Structures

In chemistry, Lewis structures provide a means to communicate about molecular structures and eventually reaction mechanisms and pathways. In order to successfully progress in chemistry, introductory students must learn to not only draw Lewis structures but also understand their meaning, specifically how a structure might impact a molecule's reactivity. Despite the importance of drawing and understanding Lewis structures, it remains an area in which many students struggle,1 and, in response, many instructional strategies have been proposed.2−5 Much of the previous literature regarding techniques and formulas for teaching Lewis structures focuses on providing step-by-step formulas for determining acceptable Lewis structures that can be memorized to produce Lewis structures efficiently.2−5 However, these methods do not necessarily consider students' understanding of the underlying science behind Lewis structures, which may lead to rote learning rather than conceptual understanding.1 These formulas are often created on the premise that if students are able to accurately construct Lewis structures, understanding of the rules and concepts that guide them will come more naturally later in the curriculum. Although this approach may work for some students, it does not work in all cases, and students' ability to draw Lewis structures and apply them to chemical phenomena breaks
down as complexity increases.1,6 These gaps begin when students are first learning to draw Lewis structures without understanding their purpose or the basis for the rules they are given to guide them. Rote memorization of Lewis structures does not provide students with an understanding of the theory behind the notation's development, which guides the way Lewis structures are drawn and interpreted, as well as where the model deviates from the current understanding of atomic structure.

Writing-to-Learn (WTL) is a promising strategy through which students can learn about Lewis structures because it can be used to direct student focus toward the conceptual basis of the structures rather than only toward memorizing and creating correct representations. WTL assignments can be designed to guide students to consider the limitations and assumptions inherent in Lewis structure models, so that they will be better able to understand them when faced with increasing complexity and application to novel situations. The student data used herein come from a WTL intervention described by Shultz and Gere that was used to develop students' understanding of Lewis structures and explore student conceptions of the nature of science.7 Their intervention included peer review and revision to remediate errors and deeper misconceptions in a lower-risk format than the settings in which students would typically encounter them, such as an exam or a subsequent course.7 Additionally, the revision process encourages students to correct errors they, or their peers, may have included in their initial draft. Shultz and Gere found that students were able to successfully summarize an excerpt from Lewis' 1916 paper describing Lewis structures8 but struggled with contrasting Lewis structures to prior and current theories of molecular bonding.7

Writing-to-Learn

The Writing-to-Learn (WTL) pedagogy can help students move beyond rote memorization to an understanding of Lewis structures because it focuses on developing conceptual understanding. Research on the impact of writing on conceptual learning indicates that WTL is an effective practice, where the level of efficacy depends upon the design of the assignment, how it is embedded in the course, and students' backgrounds.9−12 Anderson et al. identified three key characteristics of writing assignments that lead to effective conceptual learning: students were engaged in meaning-making tasks, the assignments had clear writing expectations, and the writing process was interactive.10 Klein also noted the importance of social interactions in effective writing assignments.11 Despite an increase in research pertaining to its effective use, the uptake of writing for conceptual learning in STEM (Science, Technology, Engineering, and Math) has been slow.13−15 Specifically within chemistry, as compared to other STEM disciplines, writing pedagogies are minimal and generally concentrated on improving laboratory writing and the use of peer review.7,16−23 For example, the Science Writing Heuristic (SWH) applies writing in the context of the laboratory, guiding students through the lab by having them respond to short answer questions pertaining to the choices they are making, and was found to improve conceptual understanding.16,18,24,25 Calibrated Peer Review has also been used in chemistry to develop both scientific practices and conceptual learning, where students write an essay and then participate in a peer review process in which peers provide feedback and assign scores to their writing.17 Focusing specifically on WTL, McDermott and Hand implemented assignments where students were required to include multimodal representations in their writing, and they found that inclusion of the representations improved student conceptual learning.19

Our conceptualization of WTL, informed by the work of Anderson et al. and Klein,10,11 incorporates peer review and revision in a writing process that is directed by a prompt designed to support conceptual learning through applying content knowledge to a scenario. These WTL assignments have been successfully implemented in a range of chemistry contexts and shown to develop student conceptual understanding.7,21,22 Peer review and revision are important aspects of our conceptualization of WTL as they allow students to revisit their understanding of the target content through interaction with their peers. Peer review plays an important role in the learning process because students can learn both through exposure to their peers' work and by providing feedback to peers, where students who both read and provided reviews benefit the most.26,27 Additionally, research comparing peer and expert feedback supports the viability of peer-based feedback during writing assignments.28−30 In a series of studies comparing student revisions following feedback from a single expert, a single peer, and multiple peers, as well as the types of feedback from each group,28,30,31 Cho and Schunn28 found that the students who received feedback from multiple peers had the greatest improvement in their writing. They propose that this improvement arises from students receiving more directive feedback from peers, which may have led the students with multiple peers to make a greater number of revisions that expanded on or clarified content discussions.30,31 Additionally, Patchan et al. found that students provided feedback equivalent in quality to both content and writing experts, validating a peer-mediated feedback process focused on feedback relating to content.29 With the ability of peers to provide feedback that is useful during the revision process, the incorporation of peer review into WTL assignments has the potential to further develop student conceptual understanding. This work branches off that done by Shultz and Gere by focusing specifically on the types of comments elicited by peer review and the characteristics of the revisions that students made.7 From this study we seek to develop a better understanding of the types of comments, e.g., content versus stylistic, that students provide when guided by a content-directed peer review rubric and how these comments may relate to the revisions students make.



THEORETICAL FRAMEWORK

This work is guided by the theories of social constructivism and distributed cognition. In social constructivism, knowledge is theorized as being constructed by the learner as they incorporate new knowledge with existing knowledge within a social context.32 Much of the cognitive theory connecting writing and learning places an emphasis on writing as part of a cyclical process involving recalling pre-existing knowledge, constructing new knowledge through a feedback process, and reinforcing that knowledge through the process of verbalization via the written word.33,34 Additionally, when students "verbalize" their knowledge, gaps in their understanding or connections between concepts are revealed and then may be filled in by developing new relationships.34,35 Distributed cognition posits that knowledge is stored both internally and externally in sources such as other individuals or texts.36 From this theory, the act of writing is supported by the assignment description, or external representation, where the description first directs students toward specific content for which they then retrieve internalized knowledge.37 The assignment described herein incorporates multiple external representations of the content: the section of the Lewis paper that students read and summarize, the assignment prompt (see Supporting Information), and the peer review rubric (see Supporting Information). The Lewis paper contributes external knowledge and can help direct students' writing.8,37 The content-directed prompt and peer review rubric are both topic directive, which helps focus the learning process, a point reinforced by Graham et al., who found that having some form of checklist brought students back to the target content, creating another role for the peer review rubric provided in this assignment.38 Similarly, Butcher and Kintsch found such guidance to be especially helpful when provided during the writing process.39 Furthermore, by incorporating peer interaction in the writing process, our conceptualization of WTL includes the social aspects thought to promote learning via the theories of social constructivism and distributed cognition, both of which view knowledge as constructed and held by a group of individuals. It is the interactions between individuals, specifically via an artifact or object, that can transform knowledge.36 The assignment described here promotes knowledge transformation through the dictated interactions imposed by the peer review process. Following completion of their initial draft, students must engage with their peers in the process of peer review via a content-directed rubric. This creates an environment for further construction of knowledge through social interactions, both in reading peers' conceptualization of content and in providing their own content feedback.26 Finally, through the process of revision, students can apply any expanded understanding of concepts that they have gained to their original draft, thereby engaging in restructuring their knowledge.

Figure 1. General coding scheme for peer review comments and revision. The coding scheme for the peer review comments was adapted from Patchan et al.,29 and the revision coding scheme was developed and applied by the research team. The arrows indicate the order of the analysis. Detailed versions of the coding rubrics can be found in the Supporting Information. *Only those peer review comments receiving the "Type of Feedback" code "problem/solution" were further coded in the rest of the categories.

Herein we focused on how the type and language of peer feedback impacted revision and what types of revisions students made following feedback. The analysis of student peer reviews was guided by the analytical framework developed by Patchan et al., which provides a range of categories for the type and language of feedback in peer review comments.29 Patchan et al. applied their framework to comments focusing on stylistic and argumentation aspects in students' writing.29 The framework was expanded and utilized, as described below, to probe student feedback guided by a content-focused peer review process. This work was guided by three research questions:



1. What types of comments do students give when peer review is guided by a rubric focusing on conceptual learning?
2. What characteristics of peer review comments are associated with student revision?
3. What types of revisions do students make, and are there differences between the content- and editing-focused changes?

EXPERIMENTAL SECTION

Context

This study included analysis of peer review comments and student writing from an undergraduate general chemistry classroom at a large public research university. The students were given a WTL assignment that involved reading and summarizing a 1916 paper written by Gilbert Lewis proposing a new method for modeling molecular structure.8 The assignment is described in detail by Shultz and Gere.7 Briefly, students opted in to participate in an additional activity outside of the regular class. The students wrote a summary of the Lewis paper based on a detailed assignment description that asked students to discuss specific molecular concepts. Following submission of the initial draft, students were assigned their peers' work to review through the online tool Peerceptiv. Each student was assigned three papers to review and received feedback from one to four peers. Peer review was guided by a rubric that directed students to provide feedback on specific content, rather than writing mechanics. (Writing prompt and peer review rubric are provided in the Supporting Information.) Both peer review and revision were graded based on completion. Each student then received reviews of their own work and was required to revise and resubmit their writing. We obtained IRB approval to collect and use student data, and every student provided consent for our use of their responses.

Peer Review Comment Coding

The primary data source for this work was student peer review comments from the aforementioned WTL assignment.7 The data consisted of 1,132 comments from 66 reviewers, who commented on 70 summaries. Each review included comments associated with a specific question from the peer review rubric. The areas covered in the peer review rubric included (1) comparing Lewis structures to pre-Lewis theories, (2) comparing Lewis theories of bonding and molecular structure to current theories, (3) summarizing Lewis structures, (4) summarizing Lewis' 1916 paper, (5) writing style and organization, and (6) including important terms related to bonding and molecular structure (Supporting Information). Each summary was reviewed by one to four peers, with the majority reviewed by three. The peer review process was anonymous.

Student comments were analyzed deductively using a coding scheme described by Patchan et al., which compared the type and language of student, content expert, and writing expert comments on student writing by coding various types of feedback.29 An initial subset of the comments was coded using the Patchan et al. dictionary by two researchers (general scheme depicted in Figure 1A).29 Comparison and discussion following this led to the expanded set of codes, whereupon another subset of comments was coded. From this second subset, the coders again compared and refined the coding definitions. The reasoning behind added or changed codes is described below. One researcher then coded the full set of comments. Each comment was coded as a unit, irrespective of whether it was made up of one or multiple sentences. In a few cases, we found that students used the same review comments for multiple papers. In these instances, the duplicates were removed and any unique comments by that reviewer were kept.

As our work was focused on students' abilities to provide comments related to Lewis structures, the coding dictionary was expanded to allow for more detail related to the specificity of chemistry-related feedback as well as to incorporate additional characteristics identified in the student comments (Figure 1A; full coding scheme in the Supporting Information, Table S1). To the "Type of Feedback" category, "verification" was added, as many students just replied with a "yes", indicating that the rubric criterion had been fulfilled. There was also greater variation noted in the affective language that students used. To accommodate this, the category was modified by adding the code "hedging", which combined the codes "downplay" and "question", as students used these in similar ways. Comments were coded this way when students provided feedback using tentative language or posed their comment as a question, seemingly to minimize the weight of their comment. Additionally, to "Explanation of the Problem" and "Explanation of the Solution", the code "concept" was added, applied when students provided a conceptual explanation of the incorrect chemistry concept they identified in their comment. In the categories "Focus of Problem" and "Focus of Solution", the code "multiple" was added to parallel the code "combination"; in this case, "multiple" was used to code comments where students broadly mentioned a concept ("higher order") and then went into detail about what part needed improvement ("substance"). The adapted dictionary allowed us to capture whether students were identifying conceptual trouble spots in their peers' work and the extent to which they were providing constructive feedback related to the chemistry content.

Each peer review comment was scored for either the presence (1) or absence (0) of each code presented in the coding dictionary. Comments were first coded for the type of feedback they provided, the first category in the dictionary. If a comment was scored as a 1 for any of the codes other than "problem/solution", it automatically received a score of zero for all the subsequent codes. The problem/solution codes are the most interesting in the "Type of Feedback" category, as they represent students' ability to provide their peers with feedback that can aid in revision, thereby developing conceptual understanding through the means described in our theoretical framework. Within each subsequent category, the comment was only coded as having one of the characteristics presented in that category, with the exception of the "Focus of the Problem" and "Focus of the Solution" categories. Here comments could be coded as both "higher order" and "substance", whereupon they also received the code "multiple". Similarly, they could receive the codes "higher order" and "lower order", also receiving the code "combination". Additionally, comments relating to visuals or paragraph transitions were coded as "word-sentence" in the "Scope" category. Researchers determined interrater reliability (IRR) for the "problem/solution" "Type of Feedback" code and the "Type of Problem/Solution" codes with kappa values from 0.70 to 0.95, which aligns with the kappa values found by Patchan et al.29
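The reported IRR values are kappa statistics computed on the two coders' binary judgments for a given code. Assuming Cohen's kappa for two raters, a minimal sketch of that calculation is shown below; this is not the authors' code, and the two coder arrays are hypothetical stand-ins for the actual coding records.

```python
# Sketch of an interrater-reliability check for one binary code,
# e.g., the "problem/solution" "Type of Feedback" code.
# The arrays are hypothetical: 1 = code applied, 0 = code not applied.
from sklearn.metrics import cohen_kappa_score

coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
```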

Revision Coding

In addition to the student comments, the changes between the initial and final drafts of student writing were analyzed to identify whether they were related to the peer review comments the student had received and what types of changes students made related to Lewis structures. Of the 70 summaries, only 55 had corresponding initial and revised drafts and thus were the only ones coded when considering changes between drafts. First it was determined whether summaries had been revised. Revised documents included writing with any change between drafts. The summaries containing revisions were then categorized according to the degree and type of revision (Figure 1B; full coding scheme in the Supporting Information, Table S2). Degree of revision was coded as negligible revision, two to three sentences, or more than three sentences. The revisions were categorized by type as content related to Lewis structures, editing, or both. Here, the code editing was applied when students made structural or grammatical changes. The summaries with any change between drafts were identified as revised but may have received the code "negligible revision" if the changes were on a scale of less than two sentences. The editing-coded revisions were broken into sentence-level and paragraph-level changes. Similarly, the content revisions were coded as either minor or major changes. Minor content changes were revisions where a student built on, clarified, expanded, refined, or otherwise altered existing discussion of bonding and molecular structures. The addition of a figure to supplement existing content was also coded as a minor change. Major changes involved the introduction and elaboration of a new concept related to molecular structures or bonding, a complete change in the explanation of an already included concept, or the linking of concepts.

The drafts were coded using the Compare Documents functionality in Microsoft Word to improve the consistency of scoring. All revision changes were scored by two of the researchers. A subset was scored first to apply and develop the scoring rubric. After modifying the rubric, the two researchers each scored every summary and discussed any discrepancies to reach consensus. The peer reviews associated with each completed assignment (where students had submitted both initial and revised drafts) were checked to determine if any could be associated with changes between the drafts. Peer review comments associated with changes were coded as one, and those that were not received a zero. As not all students submitted both initial and revised drafts, the number of peer review comments that received this coding was 874 out of the 1,132 total comments.



Data Analysis

Following coding, data analysis was conducted using Stata. To examine relative frequencies of codes within categories, Cochran's Q test was performed, followed by McNemar's test post hoc, to test for significance.40 Pearson's chi-squared test of independence was used to analyze differences in the occurrence of codes between the students who did and did not revise, as well as differences between the peer review rubric criteria.40 Statistical significance was set at 0.05, and p values below that threshold are reported throughout. Effect sizes were also calculated when appropriate. Cohen's d was used throughout, with the exception of comparisons between rubric criteria, where phi was used (data and pertinent effect size values presented in Table 3). For the effect size calculations using Cohen's d, 0.2−0.49 was considered small, 0.5−0.79 medium, and 0.8 or greater large.
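As a point of reference only, the snippet below sketches this frequency comparison in Python rather than Stata; the binary comment-by-code matrix is simulated, and the number of codes is arbitrary.

```python
# Sketch of the within-category frequency comparison (not the authors' Stata code).
# `codes` is a simulated binary matrix: one row per comment, one column per code.
import numpy as np
from statsmodels.stats.contingency_tables import cochrans_q, mcnemar

rng = np.random.default_rng(0)
codes = rng.integers(0, 2, size=(100, 4))  # 100 comments x 4 codes in one category

# Cochran's Q: do the codes occur at different rates across the same comments?
print(cochrans_q(codes))

# McNemar's test as a post hoc comparison for one pair of codes
a, b = codes[:, 0], codes[:, 1]
pair = [[np.sum((a == 1) & (b == 1)), np.sum((a == 1) & (b == 0))],
        [np.sum((a == 0) & (b == 1)), np.sum((a == 0) & (b == 0))]]
print(mcnemar(pair, exact=False, correction=True))
```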

Focusing specifically on whether certain characteristics of peer review comments led to a higher occurrence of revisions, we performed a logistic regression analysis on the peer review data where association with revision was the outcome variable.41 To construct the logistic regression model, we first tested a series of predictor variables, informed by the results of the Cochran's Q and McNemar's tests, using a bivariate analysis. The tested variables with a p value less than 0.25 were included in the final model. We included "problem/solution" type feedback and "Type of Problem/Solution" in the final model, but we had also tested "Explanation of Problem/Solution" and "Focus of Problem/Solution" for inclusion. While we acknowledge that the peer review process creates additional sources of variability due to the author and peer reviewer interactions, we felt that the increased complexity of using a mixed-effects logistic regression model with nested variables was not necessary in this situation. As each author both gave peer reviews and received them from one to three randomly assigned peer reviewers, the complex error intercorrelation matrix needed to account for the random effects is beyond the scope and computational power feasible here. Application of the model is described in the Results and Discussion section.
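A minimal sketch of this screen-then-model procedure is given below. The data frame, its column names, and the use of single-predictor logistic screens at p < 0.25 are illustrative assumptions rather than the authors' implementation, and the data are simulated.

```python
# Sketch of predictor screening followed by a fixed-effects logistic regression
# (illustrative only; simulated data, hypothetical column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "problem_solution": rng.integers(0, 2, n),
    "type_solution": rng.integers(0, 2, n),
    "focus_higher": rng.integers(0, 2, n),
})
# Simulated outcome: 1 if the comment was associated with a revision.
p_revise = 1 / (1 + np.exp(-(-1.5 + 2.0 * df["problem_solution"] + 0.3 * df["focus_higher"])))
df["revised"] = rng.binomial(1, p_revise)

candidates = ["problem_solution", "type_solution", "focus_higher"]

# Bivariate screening: keep predictors with p < 0.25 in a one-variable model.
kept = []
for var in candidates:
    fit = smf.logit(f"revised ~ {var}", data=df).fit(disp=0)
    if fit.pvalues[var] < 0.25:
        kept.append(var)

# Final model on the retained predictors.
final = smf.logit("revised ~ " + " + ".join(kept), data=df).fit(disp=0)
print(final.summary())
```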

RESULTS AND DISCUSSION

This work involved the analysis of a WTL assignment where students wrote a summary of Lewis' 1916 paper, underwent peer-mediated review, and revised their drafts based on feedback. Specifically, the peer review comments and the changes upon revision between initial and final drafts were analyzed in an effort to (1) characterize the types of feedback that students provide their peers when guided by a rubric focusing on developing understanding of Lewis structures, (2) identify if certain types of feedback were more likely to lead to revision, and (3) characterize the types of revisions that students made. Changes between student drafts were correlated with specific peer review comments, and then the degree and type of change, content or structural, was characterized. By looking at these two sources of data in conjunction, we were able to learn more about students' ability to provide chemistry-specific feedback that promotes revision in the areas of molecular structures and bonding.

Peer Review Comments

Table 1. Type of Feedback Coding Scheme with Definitions Used during the Scoring Process and Exemplars for Each Code

Verification
  Definition: Identifies that the rubric was addressed.
  Exemplar: "Yes, both terms from chemistry 130 lecture and from Lewis' paper are included in the summary in a smooth manner."
Summary
  Definition: A list of the topics, a description of the claims, or an identified action.
  Exemplar: "Yes, terms such as the Helmholtz Electron Theory of Valence and tautomers are included in this essay."
Praise
  Definition: A complimentary comment or identification of a positive feature.
  Exemplar: "The paper wonderfully summarized the Lewis structures."
Problem/Solution
  Definition: Identifying what needs to be fixed and/or suggesting a way to fix an issue.
  Exemplar: "This paper included most if not all of the important points made by Lewis. Just make sure you talk about the cubical atom theory and the postulates that go with it. This was the main part of the paper written by Lewis. Make sure you use the terminology that Lewis used in his paper, such as the octet rule."

Note: "Type of Feedback" codes were applied to the whole set of 1,132 comments.

The peer review process was scaffolded through a rubric, whereby students responded to each paper they read on six criteria, five of which targeted specific chemistry content, with the sixth focused on writing. An initial analysis of the student peer review comments showed that students were primarily using the peer review rubric as directed, with their comments directly related to the Lewis structure concepts they were supposed to address. As seen in the exemplars presented in Table 1, students used discipline-specific terminology in their responses. According to the coding scheme, student peer review comments were first characterized as either verification, summary, praise, or problem/solution (Table 1, with full
dictionary presented in Table S1) under "Type of Feedback". Of the comments, 48% (542 out of 1,132) were coded as "problem/solution", indicating that the student had addressed some area of difficulty with their peer's writing by either identifying the issue or providing a suggestion for improvement. Here, students used a mix of the terminology from the related rubric criteria and more specific terminology related to structures and bonding. Often students identified missing comparisons of Lewis' theory of the atom to previous and current theories of atomic structure and bonding. They also identified a series of terms that had not been fully incorporated into the writing, such as kernel, tautomer, and valence electrons. The rest of the comments were primarily coded as "verification", at 29% (325 of 1,132), where the student indicated that their peers' writing incorporated all the specified elements, or "praise", at 20% (226 of 1,132). In the comments coded as "verification", students frequently used the chemistry-specific terminology given in the rubric criterion to indicate that their peers had covered the desired content. For example, the exemplar presented in Table 1 for "verification" was in response to the criterion probing use of terms from the general chemistry course and Lewis' paper. Conversely, those comments coded as "praise" used less specific language and were succinct. Only 3% (39 of 1,132) of the comments were coded as "summary", where the comment was a description of what their peer had written about. The language used in the summary comments was similar to that used in the "problem/solution" comments, where it was a mix of rubric terms and more specific terms like bonding, polarity, and octet. The fact that almost half of the comments identified areas of difficulty indicates that students are able to provide constructive feedback to their peers, as seen by Cho and MacArthur and Patchan et al.27,29

We also examined the types of feedback that were associated with revisions (Figure 2). The majority of comments associated with revision were coded as "problem/solution", with only 3 of the 172 comments associated with a revised assignment not having received the "problem/solution" code. Of the comments identifying areas of difficulty, 42% were relevant to revisions that students made and 58% did not have related changes. Therefore, despite the fact that comments identified areas for peers to improve their summaries, the "problem/solution" comments did not always lead to changes (p ≤ 0.001, effect size = 1.53 when comparing revision and no revision for "problem/solution"). While we saw that students did not always make changes when areas of difficulty were identified for them, they almost exclusively made changes only when feedback prompted them to do so. Additionally, almost half of the peer review comments were characterized as "problem/solution", which indicates that students are able to provide substantive feedback on content related to Lewis structures when guided by a content-focused rubric. Instructors may be able to promote feedback similar to the comments coded "problem/solution" by providing examples of constructive feedback to their students in class and drawing attention to the fact that comments identifying issues were almost the only comments tied to revision. Student revision may also need to be reinforced with suggestions for revision so that students incorporate more of the feedback they receive that identifies areas in their writing where the discussion of bonding and molecular structures should be modified.

Following characterization of the general form of feedback, the "problem/solution" type feedback comments were further coded to capture how students addressed identified areas of difficulty in their comments ("Type of Problem/Solution", Table 2), language characteristics, and the content-relevancy and complexity of the comments (full coding scheme in Table S1). Focusing first on the "Type of Problem/Solution", the 542 comments identified as "problem/solution" under "Type of Feedback" were coded as "problem", "solution", or "both" (Table 2). Overall, 24% of these comments identified a trouble spot in the peer's work ("problem", 133 of 542), 46% proposed improvements or corrections ("solution", 249 of 542), and 30% both identified a problem and provided its solution ("both", 160 of 542). Overall, the chemistry-specific language was similar between comments of these three codes, where students used the language from the rubric. When the comments were more explicit, there was a focus on content related to Lewis' cuboidal model of the atom, discussions of polarity, and how electrons were depicted. The comments coded as "solution" and "both" tended to provide greater detail on the chemistry content students thought should be added. For example, comments suggested adding discussions on bond order, electronegativity, and covalent bonds. None of these types of "problem/solution" comments were more or less likely to occur (p > 0.05 for each comparison between problem, solution, and both). Additionally, the way an area of difficulty was targeted, i.e., identifying a trouble spot or proposing a solution, appeared to be irrelevant to students using the feedback when revising their work (p > 0.05 when comparing revision versus no revision associated comments for each of problem, solution, and both).

The language used also did not appear to impact student revision (see Figure 1A and Table S1 for codes, p > 0.05). Most students used neutral affective language in their comments (72%), with the next most common being hedging (17%). Most students (89%) did not provide any language localizing the problem they had identified. The lack of localizing language was generally not problematic, as the specific content in the draft could be identified easily from the bonding and molecular structure related terminology students used.

Figure 2. Type of feedback made in peer review comments. Students provided primarily "problem/solution" type feedback, followed by "verification" type feedback. For all types of feedback, comments were statistically more associated with no revision; however, students almost exclusively revised only when they received "problem/solution" feedback. **p < 0.01, ***p ≤ 0.001. (a) Effect size: Verification = 0.72; Summary = 0.43; Praise = 0.60. (b) Effect size: Problem/Solution = 1.2.


Table 2. Type of Problem/Solution Coding Scheme with Definitions Used during the Scoring Process and Exemplars for Each Code

Problem
  Definition: Only a problem is explicitly identified.
  Exemplar: "At the beginning, it was a little unclear of what ideas Lewis brought to the table regarding polarity."
Solution
  Definition: Only a solution is explicitly offered.
  Exemplar: "Like stated above, use terminology that Lewis used, such as octet rule. You could also add the term tautomerism (known today as resonance structures). This deals with drawing a compound formula multiple ways."
Both
  Definition: Both a problem and solution are provided.
  Exemplar: "The essay did not really mention the differences of the 1916 model compared to today's. One thing you could mention is the model of the cubes, since you show it. You can talk about how he thought that they bonded like cubes but today we know that's not how they actually bond."

Note: "Type of Problem/Solution" codes were applied to the 542 comments that received the "problem/solution" code in "Type of Feedback".

The "Scope" of the comments was coded primarily as "midlevel" (77%) and "word-sentence" (23%), focusing on specific chemistry concepts or terms. The last categories for the peer review feedback included "Explanation of Problem/Solution" and "Focus of Problem/Solution". Here, comments coded as "problem" in "Type of Problem/Solution" were coded in the "Explanation of Problem" and "Focus of Problem" categories. Similarly, "solution" coded feedback was coded in the "Explanation of Solution" and "Focus of Solution" categories. Feedback that had been coded as "both" was coded for both sets of explanation and focus categories. The "Explanation" category was used to characterize whether the comment explained why a change was necessary, whereas the "Focus" category characterized what needed to be changed. For the "Explanation of Problem/Solution", the majority of the feedback was coded as "absent", where students did not go into why something was an area of difficulty. The code "concept" was only applied to a small number of comments, indicating that while students were identifying areas of difficulty during the peer review process, they were not using it as a place to explain concepts related to bonding or molecular structure to their peers. For the "Focus of Problem/Solution" categories, comments were coded almost entirely as "higher order" or "substance" (Table 3), with "substance" appearing more frequently when proposing improvements or corrections related to bonding or molecular structures. It also shows that students were providing more in-depth discussion of Lewis structure related concepts when they proposed improvements or corrections than when just identifying something incorrect. There were no differences seen in the likelihood of revision between any of the codes in "Explanation of Problem/Solution" or "Focus of Problem"; however, in "Focus of the Solution", comments with "higher order" solutions were more associated with revised drafts (p ≤ 0.01, effect size = 0.44) and those with "lower order" solutions were more associated with drafts that did not revise (p ≤ 0.01, effect size = 0.61).

We were also interested in how the specific chemistry content students addressed in the different rubric criteria impacted the characteristics of their comments. To investigate this impact, we conducted three chi-squared tests of independence comparing the frequency of codes against the six peer review rubric criteria to determine whether the following code subsets were independent of rubric criteria categorization: (1) the prevalence of "problem/solution" type feedback, (2) variations in "Type of Problem/Solution", and (3) the number of comments associated or not with revision. The frequency of codes corresponding to rubric criteria was found to be statistically significant; however, the chi-squared values were small (Table 4). Further analysis of the effect sizes
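For reference, the sketch below shows how a phi effect size can be obtained from one such chi-squared test of independence; the 2 x 6 contingency counts are hypothetical and are not the study's data.

```python
# Sketch of a chi-squared test of code frequency against the six rubric criteria,
# with phi = sqrt(chi^2 / N) as the effect size (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: comment received the "problem/solution" code or not.
# Columns: the six peer review rubric criteria.
observed = np.array([
    [95, 88, 102, 110, 60, 87],
    [90, 97, 83, 75, 125, 120],
])

chi2, p, dof, _ = chi2_contingency(observed)
phi = np.sqrt(chi2 / observed.sum())
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, phi = {phi:.2f}")
```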

Table 3. Comparative Frequency of "Focus of the Problem" and "Focus of the Solution" Codes

Code            Focus of the Problem    Focus of the Solution
Assignment                 2                        0
Combination                3                        1
Lower Order               68                       88
Higher Order             184                      150
Substance                 55                      200
Multiple                  11                       32
Total                    323                      471

Table 4. Comparison of Peer-Review Comment Characteristics by Rubric Criteria

Subset of Codes                    N       χ2 Value    p Value    Effect Size
"Problem/solution" feedback        542     74.04
"Type of Problem/Solution"         1132    25.39
Association with revision          874     35.87