Developing Student Process Skills in a General Chemistry Laboratory

Article pubs.acs.org/jchemeduc

Cite This: J. Chem. Educ. XXXX, XXX, XXX−XXX

Gil Reynders,† Erica Suh,‡ Reneé S. Cole,† and Rebecca L. Sansom*,‡

†Department of Chemistry, University of Iowa, Iowa City, Iowa 52242-1002, United States
‡Department of Chemistry and Biochemistry, Brigham Young University, Provo, Utah 84602, United States




* Supporting Information

ABSTRACT: Laboratory coursework is widely considered to be an integral part of chemistry undergraduate degree programs, although its impact on students’ chemistry knowledge is largely unsubstantiated. Laboratory experiences provide opportunities to learn skills beyond chemistry content knowledge, such as how to use scientific instrumentation appropriately, how to gather and analyze data, and how to work in a team. The acquisition of process skills, including critical thinking, problem solving, and communication, is an integral part of becoming a scientist and participating in the scientific community. As apprentice scientists, chemistry students interact with each other in a context-rich environment where the need for process skills can arise organically. This study seeks to understand the role of laboratory courses in developing process skills. Students in a first-year chemistry laboratory course used rubrics to assess their own process skills. During the course, the students also received feedback via rubrics from a teaching assistant trained in rubric use. Additionally, students reported their understanding of process skills and their perceived improvements over the course of the semester. Our results suggest that students understand group dynamics process skills such as teamwork and communication better than they understand cognitive process skills such as critical thinking and information processing. While the evidence suggests that students improved their process skills, and students themselves reported improvement, they showed inconsistent abilities to self-assess and to justify their self-assessments using the rubrics.

KEYWORDS: First-Year Undergraduate/General, Chemical Education Research, Laboratory Instruction, Interdisciplinary/Multidisciplinary, Communication/Writing, Problem Solving/Decision Making, Learning Theories, Student-Centered Learning

FEATURE: Chemical Education Research



INTRODUCTION

Laboratory coursework is an important part of undergraduate chemistry programs, where students develop both specific technical skills and more general process skills that are useful for their future careers. A recent national report1 and chemistry education editorial2 highlight the lack of evidence for laboratory experiences supporting science content learning. However, this contrasts with the deeply held belief among chemistry faculty that laboratory experiences are an essential part of an undergraduate chemistry program and help to develop important process skills like critical thinking, teamwork, and communication.3 Furthermore, the National Association of Colleges and Employers has specified these process skills as necessary for graduates entering science, technology, engineering, and mathematics (STEM) careers.4 Given that laboratory coursework requires costly human and material resources, it is important to fully understand the effects of laboratory instruction on student development, for learning both technical and process skills.

Process skills, which include both cognitive skills (information processing, critical thinking, and problem solving) and group dynamics skills (interpersonal communication, teamwork, and management), are important learning goals because they are needed in the workforce5 and are expected skills for members of the scientific community.6,7 Indeed, the Framework for K−12 Science Education8 defines science and engineering practices that characterize the norms and values of the scientific community. These include asking questions, developing and using models, planning and carrying out experiments, analyzing and interpreting data, using mathematics and computational thinking, constructing explanations, supporting arguments with evidence, and obtaining, evaluating, and communicating information.8 The process skills evaluated in this study are information processing, critical thinking, problem solving, interpersonal communication, teamwork, and management.9 These skills have been identified by the Process Oriented Guided Inquiry Learning (POGIL) Project10 as skills that all STEM undergraduates should develop. There is significant overlap between process skills and science practices. For example, problem solving includes asking questions along with planning and carrying out experiments. Teamwork and management also play a part in planning and carrying out experiments. Rather than conceptualizing learning as the acquisition of knowledge, we draw on ideas from Sfard11 and advocate for a broader view of learning as participation in the scientific community and doing science.

Recognizing the value of science practices and process skills, several novel laboratory pedagogies have attempted to integrate conceptual learning with behaviors associated with doing science and being scientists. Some examples include Argument Driven Inquiry (ADI)12 and the Science Writing Heuristic,13 which both emphasize asking scientific questions, designing appropriate procedures to test those questions, supporting conclusions with experimental evidence, and communicating ideas clearly. Specifically, the ADI study’s indicator of success was students’ abilities to “construct a scientific argument and to participate in scientific argumentation”, assessed through students’ written and oral arguments.12 The Model−Observe−Reflect−Explain (MORE) Thinking Frame14 encourages students to develop models of chemical phenomena that span macroscopic and submicroscopic worlds, and to revise those models using evidence gathered in a series of experiments. In one case, the MORE Thinking Frame has been used to increase the frequency of correct ideas and lower the frequency of incorrect ideas in student-generated models of aqueous solutions.15 POGIL10 encourages students to develop critical thinking skills while taking on a variety of group roles that mirror the interactions within scientific communities. The observed learning gains of students in courses using these pedagogies, including an increased ability to develop and support valid arguments,12 solve problems, and engage in metacognition,14 support the idea that an enriching laboratory environment can help students develop as scientists. Some laboratory courses operate within constraints (e.g., lack of resources for TA training or department-prescribed experiments for all laboratory sections) that prevent the formal adoption of one of these evidence-based pedagogies, but instructors in these programs may still wish to emphasize process skills as part of a more traditional course.

Despite the importance of process skills, they usually are not named as explicit learning goals, taught directly, or assessed as part of undergraduate coursework. Studies have shown that students are not always aware that developing process skills is a goal in their laboratory courses,16 believe that technical skills are the primary goal for learning in lab,17 and may not be developing expected cognitive skills while in the laboratory.18,19 Additionally, students do not always receive specific feedback on their development of these skills, nor are students expected to develop the ability to assess their own skills. The Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project has created rubrics to directly assess process skills and provide feedback to students.20−22 The goal of the ELIPSS Project is to create instructional tools that STEM undergraduate instructors can use to assess students’ process skills and give students feedback on their skill development. The rubrics include definitions for each process skill that were adopted from the POGIL Project9 and represent a consensus understanding of each skill by a community of STEM faculty. The rubrics were designed to provide faculty with information to reflect on their teaching and to encourage students to be metacognitive and regulate their learning. The ELIPSS rubrics have been implemented in multiple chemistry classrooms and resulted in changes to instructor practices to better align assessments with intended learning outcomes.22 The current study uses ELIPSS rubrics to assess student process skills in an introductory chemistry laboratory course.

RESEARCH QUESTIONS

This study sought to answer the following research questions:
1. How well did students understand process skills after using rubrics?
2. To what extent did student process skills improve during a laboratory course when process skills were an explicit learning goal?
3. How well did students assess their process skills using rubrics?

Received: May 9, 2019
Revised: July 25, 2019

© XXXX American Chemical Society and Division of Chemical Education, Inc.
DOI: 10.1021/acs.jchemed.9b00441
J. Chem. Educ. XXXX, XXX, XXX−XXX



THEORETICAL FRAMEWORK: SITUATED LEARNING

The theoretical foundation of this research is situated learning,23 which posits that learning is based in social coparticipation. In science, learning means becoming part of, sharing the values of, adhering to the norms of, and skillfully practicing as a member of the scientific community. This learning takes place through “legitimate peripheral participation”23 as learners interact with each other and an enriching environment to gradually become fuller participants in the community. Legitimate peripheral participation is “an opening, a way of gaining access to sources for understanding through growing involvement.”23 Our work with process skills rubrics is an attempt to increase access to the scientific community by making the norms of the community more transparent for students, supporting them as they practice those norms within the quasi-authentic context of the laboratory course, and helping them reflect on the degree to which they have achieved learning and become part of the scientific community. While ELIPSS rubrics were created to assess both written work and interactions,20 we have focused solely on interactions, even for skills traditionally considered to be cognitive in nature, such as information processing, critical thinking, and problem solving. This choice is rooted in the idea that learning happens via social coparticipation and that spoken interactions can provide evidence of the ways that students are adopting the norms of the community. For example, in their interactions with their lab partners, we hoped to see evidence that students were moving from colloquial to scientific vocabulary, increasing their ability to support claims with evidence, and planning and carrying out experiments with greater efficiency.
The methods used in this study include observations and field notes from student groups working in lab along with surveys that asked students to reflect on their experiences in lab, their own behavior, and their interactions with their lab partner. These methods were chosen to obtain data about students’ social coparticipation from multiple sources and to triangulate our findings.



METHODS

Participants and Setting

This study was conducted at a large, selective, private university in the United States. The university’s Institutional Review Board approved this study before participants were contacted or data were collected. Participants were students enrolled during summer 2018 in the general chemistry laboratory course for science and engineering majors. Traditionally, students take the lab course during the second semester of the general chemistry lecture sequence, although some students take it as late as their senior year. Approximately 80−85% of students were studying life sciences. The remaining students had majors outside the life sciences, and many of these students were pursuing careers in the health professions. There are 11 experiments each semester. During this course, a teaching assistant was assigned to each room, and one supervising instructor monitored the work of all students. There were 51 students who agreed to participate in the study. Because this study occurred during the summer term, there were two differences from how laboratories normally operate during traditional fall and winter semesters: the frequency of the experiments and the number of students enrolled in the course. During normal semesters, students perform one laboratory experiment each week, but during the summer students performed two experiments per week. Additionally, because student enrollment at the university is much smaller during the summer, there were only about 60 students split between three lab rooms. Typically, fall and winter sections have about 120 students split between four lab rooms.

The current study took place in the context of a course redesign and reform, in which a traditional general chemistry laboratory course (emphasizing quantitative and qualitative techniques to find the correct or accepted value by following a detailed procedure) was modified to place greater emphasis on conceptual understanding, making connections between observations and the molecular level or chemical symbolism,24 and creating opportunities for students to engage with science and engineering practices.8 As part of these changes, the intention was that students would use some laboratory time to engage in scientific dialogue with their lab partners, but we found that they chose to finish the lab as quickly as possible16 and save all the thinking for afterward.25 It is a challenge to convince students that practicing the norms of the scientific community is worth their immediate time and effort, given the many competing demands on their time. In an attempt to satisfy the needs of both students and instructors,3,26 we placed prompts for discussion directly in the experimental procedure and linked the discussion prompts to questions on the lab report. Ideally, this change would simultaneously accomplish two instructional goals: (1) enable students to complete the lab report more efficiently by allowing them to analyze data during the lab with the help of their lab partner and TA, and (2) increase the amount and quality of scientific discourse during lab. We used the ELIPSS rubrics to communicate to students that the development of process skills is an important part of their laboratory experience, and to help students assess their progress. This approach is consistent with constructive alignment, in which an instructor aligns their intended learning outcomes, tasks, and assessments to clearly indicate their values to students and where the students should focus their efforts.27,28 The use of rubrics as a tool for self-reflection is also consistent with Zimmerman’s theory of self-regulated learning29 because it allows students to use feedback to reflect on past performance and plan future performances.

Study Design and Instruments

We used a convergent mixed methods design30 to address our research questions, drawing on a variety of data sources to triangulate our findings and provide multiple types of evidence about the impact of rubric use on students’ process skills, as well as their perceptions of the rubrics and their own learning. Descriptions of each data source are below.

Student Rubrics. Students worked in groups of two for all experiments, remaining with the same lab partner throughout the semester. After each laboratory experiment, each student completed a rubric (Figure 1) addressing a single process skill (e.g., interpersonal communication, critical thinking, etc.) to assess their group’s performance. The instructor for the course chose which process skills were assessed for each experiment on the basis of her expectations for the students and the most appropriate skill for the experiment. For example, the four categories for critical thinking are analysis, synthesis, argument, and critique. The critical thinking rubric was used during a kinetics experiment where students analyzed different types of data to determine what conclusions could be drawn from each type, synthesized spectrometric data and data from trials with different initial concentrations to write an experimental rate law, constructed an argument about which steps in the hypothetical mechanism could be the rate-limiting step, and critiqued arguments for different steps on the basis of their experimental data. For each category on the rubric, students scored themselves from 1 to 5, where 1 represents poor performance and 5 represents the highest level of performance. A score of zero indicates no evidence was observed. In addition, students provided evidence statements that described the specific actions they took in lab to justify their self-reported scores on the rubric. The complete set of rubrics that we used is included in the Supporting Information.

Figure 1. Rubric for interpersonal communication. The definition for the skill is at the top. Each category has six possible levels (zero through five). Level zero indicates that no evidence of the category was observed.

The rubrics used in this study assessed the following process skills:9

Cognitive Process Skills
• Information processing: Evaluating, interpreting, manipulating, or transforming information.
• Critical thinking: Analyzing, evaluating, or synthesizing relevant information to form an argument or reach a conclusion.
• Problem solving: Identifying, planning, and executing a strategy that goes beyond routine action to find a solution to a situation or question.

Group Dynamics Process Skills
• Interpersonal (oral) communication: Exchanging information and understanding through speaking, listening, and nonverbal behaviors.
• Teamwork: Interacting with others and building on each other’s individual strengths and skills, working toward a common goal.
• Management: Planning, organizing, directing, and coordinating one’s own and others’ efforts to accomplish a goal.

TA Rubrics. Three times during the semester, a research assistant (and former teaching assistant for the course) also evaluated each group’s performance and completed the same rubric with scores and evidence statements to justify the scores. After students assessed themselves, the research assistant shared these rubrics with the students during the following laboratory period. This sequence allowed students to judge their own work without the influence of the research assistant’s evaluation and, afterward, provided feedback that the students could use to gauge the accuracy of their self-reports. The research assistant was only able to observe approximately 20 of the 60 students enrolled in the course during each laboratory period. Thus, students received TA rubrics once every three weeks, and all students received TA feedback three times during the semester. Because the research assistant was trained in the use of the rubrics, we treated the TA rubrics as a more valid and reliable measure of student performance on process skills and used them as our measure of student improvement throughout the semester.

Survey. After the 10th experiment, students completed a short survey that asked them to describe process skills in their own words. Additionally, they were asked how they changed their behaviors as a result of using the process skills rubrics. This allowed us to investigate how well the students understood process skills and how rubric usage helped students improve their skills. The complete survey is included in the Supporting Information.

Student Assessment of Learning Gains. After the final experiment, students completed a process-skills-specific version of the Student Assessment of Learning Gains (SALG) survey.31 For each process skill, the students were asked to rate their gains for the skill as a whole and for the component categories of the skill. This served as another way to measure students’ perceptions of their own growth. The complete SALG survey is included in the Supporting Information. A review of self-assessment literature by Ross32 indicated that the validity and reliability of student self-assessments may vary, but self-assessments can improve student learning and lead to productive instructor−student interactions.

Interviews. The rubrics and reflection surveys were also implemented in the same course in winter 2019. Four students agreed to be interviewed about their rubric use and their responses to the surveys to help clarify how they understood the rubrics and how they used them for self-assessment.



QUANTITATIVE DATA ANALYSIS

We computed the average score for each completed rubric for each of the experiments. If a score of zero was recorded, meaning that a category was not observed, the average score was calculated from the remaining category scores for that rubric. To investigate whether students improved specific process skills over time, we used paired sample t tests for the communication and information processing rubrics, where the TA evaluated the same students on the same skill at two different experiments. For critical thinking, the rubric was used twice, but the TA evaluated different groups of students at the two instances during the course, so we used an independent samples t test for that comparison. To investigate whether students improved in their ability to self-assess accurately over the course of the semester, we used a split-plot analysis of variance (ANOVA) to compare rubric scores from experiments one and seven (communication) and five and eight (information processing). We chose this statistical comparison because it allowed us to test whether student scores were different from TA scores, whether student and TA scores were changing over time, and whether the two groups changed in the same way. This comparison was possible because the same students had TA rubrics for those matched pairs of experiments. Because of our small sample sizes, we ran the analogous nonparametric tests to ensure that the results were the same. We report the parametric statistics for convenience because they will be more familiar to readers.

QUALITATIVE DATA ANALYSIS

Student Definitions of Process Skills

In the survey assigned in the 10th week of the semester, students were asked to define process skills in their own words. We analyzed these responses in order to understand how students may perceive process skills and their importance beyond an educational setting. To measure the quality of the responses and to determine the students’ level of understanding, student responses were scored on a scale of zero to 3. We arrived at this scale after reading and discussing student responses and testing several scoring schemes to ensure that they adequately described the breadth of student responses. A student’s response received a 3 if it included three components: (1) understanding of the role of process skills beyond this course, (2) understanding of cognitive process skills, and (3) understanding of group dynamics process skills. A 2 point response included two of the three components. A 1 point response included only one component. Responses with the score of zero included vague answers that did not mention any skills or components by name and did not recognize the purpose of process skills. Each student response was coded by the first and second authors independently, and then the two sets of scores were compared. In cases of disagreement, the researchers discussed the reasoning behind each score and came to consensus, occasionally seeking the fourth author’s opinion.

Table 1. Coding Structure Used To Analyze Students’ Definitions of Process Skillsa

Score 3 (9.8% of responses; cognitive,b noncognitive,c purposed): “Process skills are critical thinking and management skills that are independent of actual chemical knowledge. They are general principles that can help in any class and they function to enhance chemistry learning by helping us apply the knowledge to the actual carrying out of experiments.”

Score 2 (17.7%; noncognitive, purpose): “Process skills are skills that will help us effectively communicate and work together. This can be applied both in the lab as well as in our lives with the people we interact with.”

Score 2 (2.0%; cognitive, purpose): “Process skills are skills of processing! They are the skills that you develop as you process information and apply to a variety of different situations.”

Score 2 (7.8%; cognitive, noncognitive): “How effectively we are able to communicate and go over the lab as a group. This includes understanding, communicating, analyzing, and organizing information pertinent to the lab. Also how well we are all participating in the lab.”

Score 1 (3.9%; purpose): “They are basic skills and characteristics of good scientists and professionals”

Score 1 (27.5%; noncognitive): “Skills that help students communicate well with each other.”

Score 1 (17.6%; cognitive): “They are the skills we use to analyze experiments and come to conclusions.”

Score 0 (13.7%): “Demonstrating proper techniques when performing certain laboratory skills”

Across all responses (N = 51): cognitive, 37.3%; noncognitive, 62.8%; purpose, 33.3%.

aExemplar survey responses to the prompt, “In your own words, what are process skills?” are shown for each score. bCognitive process skills include critical thinking, information processing, and problem solving. cNoncognitive process skills include interpersonal communication, teamwork, and management. dPurpose involves the application of process skills within and beyond an educational environment.

Student Evidence Statements

We also analyzed the evidence statements given by students to justify their rubric scores to see how well the evidence they provided matched the category they were scoring. Some answers were not related to the targeted process skill at all, some were related to the targeted process skill but not to the category for which they were used, and some were related to the targeted skill and category. These were awarded zero, 1, or 2 points, respectively. We used this as a measure of how well students understood the process skills; if they understood the category, they should have been able to provide relevant evidence to support their self-assessment. For example, students provided evidence statements for their information processing self-assessments after experiments five and eight. The definition of information processing, which was provided to the students, was “evaluating, interpreting, and manipulating or transforming information.”9 Evidence statements such as “We did pretty good at evaluating the information and its relevance” were coded as a zero because the student did not give sufficient evidence that they understood what evaluating meant or what information they evaluated. These types of statements were common. Additionally, statements that were off-topic, such as “We double checked instructions to make sure we had all the right measurements and steps right before starting each experiment”, were also coded as zero because they gave no evidence about the student’s information processing. When assessing the “manipulates and transforms” category in week five, one student said “Today we saw precipitate and knew that that reaction was either a single or double replacement reaction.” This statement was coded as a 1 because it displayed evidence for the correct skill, but the wrong category: the student was describing an interpretation they made of their visual observations during lab, not a transformation. After the experiment in week eight, the same student gave evidence for manipulating and transforming: “We were able to transform information from our graph into words and explain what was happening on a molecular level during the titration.” This statement displays evidence of the correct skill (information processing) and the correct category (transform), so it received a 2.

The students’ evidence statements were assessed by one researcher, and a subset was assessed by another. The two researchers had an inter-rater agreement of 80% or greater for each rubric category and 90% for all rubrics combined.
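As a concrete illustration, the rubric averaging, paired comparison, and inter-rater agreement calculations described in the analysis sections can be sketched in Python. This is a hypothetical sketch only: all scores below are invented for illustration (they are not the study’s data), the function names are ours, and the split-plot ANOVA used for the self-assessment comparison is not reproduced here.

```python
from math import sqrt
from statistics import mean, stdev

def rubric_average(category_scores):
    """Average one rubric's category scores (1-5).

    A zero means "no evidence observed," so zeros are excluded
    from the average, as described in the quantitative analysis.
    """
    observed = [s for s in category_scores if s != 0]
    return mean(observed) if observed else None

def paired_t(before, after):
    """Paired-sample t statistic and Cohen's d for matched scores."""
    diffs = [b - a for a, b in zip(before, after)]
    sd = stdev(diffs)
    t = mean(diffs) / (sd / sqrt(len(diffs)))
    d = mean(diffs) / sd  # effect size based on the paired differences
    return t, d

def percent_agreement(codes_a, codes_b):
    """Percentage of items on which two raters assigned the same code."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

# Invented TA rubric category scores for three groups at two experiments
first = [rubric_average(r) for r in [[3, 4, 0, 3], [2, 3, 3, 4], [3, 3, 4, 2]]]
second = [rubric_average(r) for r in [[4, 5, 4, 4], [4, 4, 5, 5], [3, 4, 4, 4]]]
t_stat, effect = paired_t(first, second)

# Invented evidence-statement codes (0, 1, or 2) from two raters
agreement = percent_agreement([2, 1, 0, 2, 2], [2, 1, 1, 2, 2])
```

In practice, library routines such as scipy.stats.ttest_rel (and scipy.stats.wilcoxon for the nonparametric analogue) implement these tests directly; the hand-rolled version above only makes the arithmetic explicit.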


Table 2. t Test Comparisons of TA Rubric Scores

Process Skill            N    First Experiment Mean    Second Experiment Mean    p Valuea    Cohen's db
Communicationc           19   3.05                     4.48
Information processingd  18   4.24                     4.91
Critical thinkinge       16   3.00                     3.39