Article pubs.acs.org/jchemeduc
Effect of the Level of Inquiry of Lab Experiments on General Chemistry Students’ Written Reflections

Haozhi Xu and Vicente Talanquer*

Department of Chemistry and Biochemistry, University of Arizona, Tucson, Arizona 85721, United States

Supporting Information available
ABSTRACT: The central goal of this exploratory study was to characterize the effects of experiments involving different levels of inquiry on the nature of college students’ written reflections about laboratory work. Data were collected in the form of individual lab reports written using a science writing heuristic template by a subset of the students enrolled in the first and second semester of general chemistry at a research-intensive university. Our findings indicate that the level of inquiry of the experiments seems to affect three main areas of students’ reflections: knowledge, evaluation, and improvements. In the case of knowledge, our findings were particularly interesting as higher levels of inquiry were associated with a smaller proportion of reflective statements in this area. However, these types of reflections shifted from mostly focusing on factual knowledge to largely concentrating on procedural knowledge and metacognitive knowledge. In general, our results elicit trends and highlight issues that can help instructors and curriculum developers identify strategies to better support and scaffold student thinking in different learning environments.

KEYWORDS: First-Year Undergraduate/General, Chemical Education Research, Inquiry-Based/Discovery Learning, Student-Centered Learning

FEATURE: Chemical Education Research
■ INTRODUCTION

Recent reform efforts in science education emphasize the need for engaging students in scientific inquiry.1,2 In particular, science educators stress the importance of creating opportunities for students to generate and evaluate scientific explanations, and to participate in scientific practices.3,4 At the university level, such opportunities are more likely to exist in the laboratory setting, where science students typically work in small collaborative groups for extended periods of time. Unfortunately, educational research indicates that traditional laboratory activities often fail to engage students in the discussion and analysis of central ideas and do not effectively promote the development of valued science practices.5−11 The lack of authentic and meaningful opportunities for inquiry in university science laboratories has prompted efforts to transform laboratory instruction.12,13 In the particular case of chemistry, changes to conventional views of practical work have been suggested by several authors,14,15 and different models of reform, such as those that follow the science writing heuristic (SWH) framework16,17 or a process-oriented, guided-inquiry learning (POGIL) approach,18 have been implemented in different college settings. However, only a few research studies have explored what these new educational models afford in terms of college-level student learning,19 actual engagement in practical work,20 and development of science practices.21−23

Efforts to reform our general chemistry laboratories in recent years have led to the implementation of a set of experiments that engage students in different levels of inquiry, from highly structured lab activities to open-inquiry projects.24 Additionally, the structure of students’ individual written laboratory reports (called “lab reports” subsequently) has been changed by adopting the SWH as a framework to guide the presentation, analysis, and discussion of results.16 As part of the evaluation of this reform process, we completed a small-scale pilot investigation to explore how engagement in different types of experiments affected the structure of students’ written reflections about laboratory work. In particular, we analyzed the lab reports written by individual students to communicate and analyze their experimental results. Our central goal was to explore the effects of experiments involving different levels of inquiry on students’ reflection patterns, looking to develop a better understanding of how the openness of experimental chemistry tasks affects students’ postlaboratory reasoning. This information is critical not only to assess what different types of laboratory experiences may afford in terms of student reasoning, but also to design strategies to better support and scaffold student thinking.

■ WRITING IN THE LABORATORY

Several educational researchers have suggested that the introduction of writing activities in the science classroom and laboratory creates unique opportunities for students to develop argumentation and reasoning skills and become active and reflective learners.25,26 In particular, analytical writing can help students organize their ideas into more coherent and interconnected conceptual frameworks.27

Published: November 19, 2012
© 2012 American Chemical Society and Division of Chemical Education, Inc.
dx.doi.org/10.1021/ed3002368 | J. Chem. Educ. 2013, 90, 21−28
In contrast to oral discourse, which tends to be divergent and highly flexible, written discourse is convergent and facilitates the generation and analysis of arguments.28 When properly planned and scaffolded, writing encourages students to hypothesize, interpret, organize, elaborate, synthesize, and reflect on ideas.29 Thus, writing can be used to promote higher levels of reasoning and facilitate learning in inquiry-based contexts. However, research also indicates that students need guidance and support in order to engage in effective argumentative writing.30 Many students do not necessarily conceptualize writing as a tool to develop scientific knowledge31 and need guidance as they engage in the process of developing meaning and understanding through writing.32

The science writing heuristic is a writing framework designed to provide such guidance to students conducting laboratory investigations.17,33,34 The SWH presents students with a template that facilitates the writing process and creates opportunities for understanding the value of writing to learn. This template scaffolds reasoning by prompting students to generate test questions, design experimental procedures, collect and interpret data, propose claims, build arguments based on evidence, and reflect on their understandings.34 The SWH can be used as an alternative format for laboratory reports, but it can also serve as an instructional model that frames laboratory work as a process of knowledge construction and incorporates argument into inquiry-based instruction.16 Results from diverse research studies indicate that the SWH template is useful for helping students generate meaning from data; make connections among procedures, data, evidence, and claims; and engage in metacognition.29,33 The use of the SWH framework also seems to have a positive effect on learning outcomes and achievement.35

Greenbowe and colleagues16,19,36 have implemented and tested the SWH approach in college chemistry laboratories. For example, they have explored the effect of laboratory work guided by the SWH framework on students’ ability to build connections between test questions, data, claims, and evidence in investigations related to the concept of physical equilibrium.36 Their results showed that students in experimental lab sections outperformed students in the control sections. These researchers have also demonstrated the positive effects that laboratory work framed using the SWH has on students’ overall academic performance in general chemistry.19 However, to our knowledge, there is little information about how the inquiry level of chemistry experiments may influence college students’ reflections as framed by the SWH. The present exploratory study was designed to increase our knowledge in this area.

■ RESEARCH GOALS

The central goal of this study was to characterize the effects of experiments involving different levels of inquiry on the nature of students’ reflections about laboratory work. In particular, our investigation was guided by the following research question: What differences exist in the nature of the written reflections in students’ lab reports for experiments that use different levels of inquiry?

■ METHODOLOGY

Setting and Participants

This study was conducted at a public university in the southwestern United States. The student population includes over 30,000 undergraduate students representing diverse groups (52% female, 48% male; 34% from minority groups, mostly Hispanic). The institution offers a two-semester sequence of general chemistry courses for science and engineering majors. Students in these courses attend a 150-min weekly laboratory class (known as the “lab”) where they work in self-selected groups of four people under the supervision of a teaching assistant (TA). On average, 24 students divided into 6 groups of 4 are engaged in experimental work in a given laboratory class, and they attend 14 laboratory sessions in a given academic semester. Experiments in the general chemistry course involve students in applying common analytical techniques (e.g., chromatography, spectroscopy) to the study of diverse chemical systems.

Data Collection

The main results of our study are based on data collected in laboratory classes taught by the same TA during the first and second semesters of the general chemistry course sequence: General Chemistry I (GCI) and General Chemistry II (GCII). This TA was a Ph.D. student in chemical education with extensive experience teaching general chemistry labs. These laboratory classes were selected because they were part of a pilot project designed to increase the level of inquiry of different experiments. Experimental activity in these laboratory classes was framed using the science writing heuristic, and students were asked to build their written lab reports using the SWH template. This template includes the following major components:16

- Beginning questions
- Safety considerations
- Procedures and tests
- Data, calculations, and representations
- Claims
- Evidence and analysis
- Reflections and additional questions

Student writing in this last section of the written report was guided by the following questions: What did you learn in this lab? What do you not completely understand? How have your ideas changed as a result of this lab? What new questions do you have? How would you improve what you did?

Data were collected in the form of individual lab reports written by a subset of the students enrolled in the pilot laboratory classes taught by this TA. In particular, we obtained copies of the individual reports written by each member of one group of students per lab experiment (a single experiment could correspond to work completed in one to three consecutive lab sessions). Individual lab reports written by students in this randomly selected group were collected for several experiments (see Table 1). However, in order to increase the diversity of the lab reports subject to analysis, we chose to collect the work of different randomly selected groups of students throughout two academic semesters. This approach allowed us to collect and analyze a total of 36 lab reports written by 16 students organized in four different groups. More detailed information about the composition of these student groups and the experiments in which they were engaged is presented in Table 1. Given that students in the GCII labs were not asked to complete written reports for several experiments, the number of collected reports at that level is lower than for the GCI level. Analysis of general achievement data indicated that students in our sample had, on average, a grade point average (GPA) of 2.40/4.00, slightly lower than the average GPA for all of the students in the general chemistry course (GPA = 2.60/4.00); however, the difference in the distribution of grades was not statistically significant. Signed consent forms approved by our Institutional Review Board were collected from all of the participants.
Table 1. Comparative Characteristics of the Groups and Experiments

Course | Student Group | Number of Students, F/M | Lab Experiment (Brief Description) | Number of Lab Reports Collected | Inquiry Level
GCI | G1 | 4/0 | Measurement (Identification of plastics based on densities) | 4 | Structured
GCI | G1 | | Separation (Chromatography of food pigments) | 4 | Guided
GCI | G1 | | Identification (Identification of gases by estimation of molar masses) | 4 | Verification
GCI | G2 | 3/1 | Light Properties (Determination of efficiency of different light sources) | 2 | Guided
GCI | G2 | | Emission (Detection of ions by emission spectra) | 3 | Verification
GCI | G2 | | Absorption (Quantification of colorants in beverages) | 3 | Structured
GCII | G3 | 1/3 | Qualitative Analysis I (Detection of cations in solution) | 4 | Guided
GCII | G3 | | Indigo Synthesis | 4 | Structured
GCII | G4 | 0/4 | Kinetics (Determination of bleaching rate orders) | 4 | Guided
GCII | G4 | | Water Project (Analysis of local water composition) | 4 | Structured
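The comparisons of coded reflections across the three inquiry levels reported in this study rely on chi-square tests of contingency tables followed by post-hoc inspection of standardized residuals with a Bonferroni correction. The sketch below illustrates that kind of analysis; the cell counts, row/column layout, and the hard-coded critical value are illustrative assumptions, not the paper's actual data.

```python
# Chi-square test of independence on a hypothetical contingency table of
# coded word counts (rows: inquiry levels; columns: reflection categories).
import numpy as np

# rows: verification, structured, guided; columns: Knowledge, Evaluation,
# Improvements. All counts are invented for illustration.
counts = np.array([
    [820.0, 390.0, 110.0],
    [1300.0, 850.0, 420.0],
    [1050.0, 900.0, 600.0],
])

n = counts.sum()
row_tot = counts.sum(axis=1, keepdims=True)
col_tot = counts.sum(axis=0, keepdims=True)
expected = row_tot * col_tot / n            # expected counts under independence

chi2 = ((counts - expected) ** 2 / expected).sum()
df = (counts.shape[0] - 1) * (counts.shape[1] - 1)

# Adjusted standardized residuals for the post-hoc cell-by-cell check
adj_resid = (counts - expected) / np.sqrt(
    expected * (1 - row_tot / n) * (1 - col_tot / n)
)

# Bonferroni-corrected two-sided critical z for 9 cells at alpha = 0.05:
# a tail probability of 0.05 / (2 * 9) corresponds to z of roughly 2.77
notable = np.abs(adj_resid) > 2.77
```

Cells flagged in `notable` are the ones that drive a significant overall chi-square, which is how the residual-based post-hoc comparisons described later can be read.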
Although all of the experiments in the observed laboratory section had been framed using the SWH, each of the experiments offered different opportunities for students to make independent choices before, during, and after the experimental work. To characterize the level of inquiry of each lab experiment, we built a rubric based on similar rubrics used by other authors to describe the inquiry continuum,24,37,38 although we took into consideration the specific nature of the activities that our study participants were asked to complete as part of their lab assignments. Our rubric, which is included in the Supporting Information for this paper, focused on the analysis of five key activities:

1. Asking research questions
2. Obtaining background information
3. Collecting data through a procedure
4. Interpreting data to generate arguments and explanations
5. Reflecting on the experience

This rubric was applied to evaluate the level of inquiry of each of the lab experiments from which lab reports were collected. Our evaluation was based on direct observations of student work in the lab, as well as on the analysis of what students were asked to do during each experiment as described in the students’ lab manual and in the written notes that TAs were required to use to guide lab activities. The resulting categories for each experiment are indicated in the rightmost column in Table 1, and brief descriptions of specific examples of experiments at each level of inquiry are presented in Table 2. The level of inquiry of each experiment emerged from independent analysis of the data by two researchers, followed by a discussion in which all disagreements were resolved. As shown in Table 1, written reports were collected for experiments corresponding to three different levels of inquiry as defined by our rubric: verification (level 1), structured (level 2), and guided (level 3). Although students completed some experiments that corresponded to the highest level of inquiry in our rubric (open inquiry), they were not required to write a lab report in any of these cases.

Data Analysis

This study focused on the analysis of the “Reflections and Additional Questions” section in students’ lab reports. A process of iterative analysis of the written reflections led to the identification of three major types of reflective statements (general codes), each of them divided into different subtypes (specific codes). Table 3 includes the list of general and specific codes in our analytical system, together with concrete examples from the data. Codes were assigned to capture different types of reflective statements made by the study participants regarding knowledge, evaluation, and improvements. In designating Knowledge codes, we considered instances in which students made explicit declarative statements about facts, concepts, or procedures that they knew or understood as a result of lab work (as well as statements about what they did not clearly understand). In this part of the analysis, we adopted the revised Bloom’s Taxonomy39 to characterize reflections in four different subtypes (identified by specific codes): factual knowledge, conceptual knowledge, procedural knowledge, and metacognitive knowledge (mostly indicative of students’ awareness of gaps in their declarative or procedural knowledge, but not necessarily indicative of control or regulation of cognition). For assigning Evaluation codes, we noted when students made evaluative statements about different aspects of lab work, lab results, and these results’ implications. Specific codes in this area corresponded to evaluative statements on methodological issues, assumptions made, and application of lab results. Improvements codes were recorded when students included statements about what and how they would improve their lab work. Specific codes were used to differentiate suggested improvements in the areas of methodology and experimental design. It is important to point out that although reflective statements in the areas of knowledge, evaluation, and improvements may be indicative of similar learning outcomes, they reveal different reflective stances that we wanted to differentiate. For example, as shown in Table 3, a student may adopt an evaluative stance when recognizing that a caliper would have been a more effective tool for determining the volume of regular shapes, while another student may reflect on the same idea in a propositive manner by suggesting the use of such a tool to improve accuracy in future measurements. The first type of statement is indicative of reflection-on-action, while the other is more characteristic of reflection-for-action.40

To complete our analysis, we segmented student writing into reflective statements with boundaries determined by perceived changes in the focus of the reflections. The nature of each statement was then characterized by a general and a specific code, for example: Knowledge - Procedural Knowledge; Evaluation - Methodology; Improvements - Design. We used the number of words to explore the relative weight of different types of reflective statements in students’ reports. The decision to use number of words was based on the careful analysis of written reflections associated with experiments in each of the three inquiry categories.
Table 2. Specific Experiment Examples at Each of Three Levels of Inquiry for Which Written Reflections Were Analyzed

Level 1, Verification: Emission Spectroscopy Instructions
1. Follow a set of steps to determine the accuracy of an emission spectrometer using a helium discharge lamp and known values of the emission spectrum for helium.
2. Follow a set of steps to determine the wavelength associated with the emission lines from a hydrogen discharge tube.
3. Determine the percentage of relative deviation between the experimental wavelengths and those predicted by Rydberg’s equation for the hydrogen atom.
4. Follow a set of steps to generate the emission spectra of known ions in solution (e.g., Na+, Sr2+).
5. Compare the emission spectra of known ions with those of unknown mixtures of ions to identify main components.

Level 2, Structured: Absorption Spectroscopy Instructions
Prepare a set of solutions of a food dye to build a Beer–Lambert reference graph and determine its molar absorptivity. [No specific steps for how to prepare the solutions or build the graph were provided.]
Determine the concentration of a food dye present in a commercial product. [No specific steps for how to carry out the quantification were provided.]

Level 3, Guided: Light Properties Instructions
Determine the efficiency of different light sources (e.g., incandescent light bulb, fluorescent light bulb, LED) using a spectrometer. [Students were asked to design their own experimental procedures after a discussion of the concept of efficiency.]
Generate an additional question about the different light sources that they could answer with the resources provided.
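The Level 2 structured lab asks students to build a Beer–Lambert reference graph and invert it to quantify a dye. A minimal sketch of that calculation follows; the path length, concentrations, and absorbance readings are invented values for illustration only, not data from the study.

```python
# Beer-Lambert analysis sketch: fit A = epsilon * l * c to reference
# solutions, then use the fit to find an unknown concentration.
path_length_cm = 1.0                       # assuming a standard 1 cm cuvette

conc_M = [2e-6, 4e-6, 6e-6, 8e-6]          # reference dye concentrations (mol/L)
absorbance = [0.21, 0.43, 0.62, 0.85]      # hypothetical readings at lambda_max

# Least-squares slope of A vs c through the origin: slope = epsilon * l
slope = sum(c * a for c, a in zip(conc_M, absorbance)) / sum(c * c for c in conc_M)
epsilon = slope / path_length_cm           # molar absorptivity, L/(mol*cm)

# Concentration of the dye in a commercial product from its absorbance
sample_absorbance = 0.50
sample_conc_M = sample_absorbance / slope
```

Forcing the fit through the origin reflects the Beer–Lambert assumption of zero absorbance at zero concentration; a full treatment would also check linearity over the measured range.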
Table 3. Coding Categories for Students’ Written Reflections with Some Example Text Extracts

Knowledge
- Factual Knowledge: “I learned what the actual meaning of ‘energy efficiency’ is. I had heard the term before, but I now know the exact meaning of that: I know that it means the percentage of visible energy given off out of the total energy.”
- Conceptual Knowledge: “This lab directly tied into what we have been learning in lecture about intermolecular forces within a substance. In order for a solvent to separate the pigments in a substance, it has to overcome the intermolecular forces holding the molecules closely together.”
- Procedural Knowledge: “I learned the amount of food dye in a sample can be calculated by taking an absorption spectrum and by doing mathematical manipulations with the values obtained through the spectroscopy and physical experiment like making different concentrations.”
- Metacognitive Knowledge: “A question did arise during experimentation, which I still do not fully understand. Our group had some trouble with some of the anion/reagent solutions not reacting the same as the week before when applied to the cation solution the second week. I am not sure if it was because the solutions sat around for a week or if it was some sort of error that we were introducing...”

Evaluation
- Methodology: “Originally, we thought that the best way to measure volume would be the water displacement method, but since we had regular shapes, using the caliper would have been more effective.”
- Assumption: “We thought that once we found the densities, it would be easy or straightforward to determine what types of plastic each of the unknown were, but that was not the case. The values for density only narrowed down the possibilities.”
- Application: “This data would be very helpful in defending a campaign that wants to replace incandescent light bulbs with compact fluorescent light bulbs.”

Improvements
- Methodology: “To make the experiment more accurate instead of using the water displacement to find volume to use the vernier calipers.”
- Design: “If I were to improve what I did I would use different brands of bottled water to see if there was a difference between the companies.”
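The study characterizes reflection lengths at each inquiry level by each distribution's mean (M), standard deviation (SD), and skewness (Sk). A minimal sketch of those descriptive statistics is shown below; the word counts passed in are invented, and only the formulas (sample SD, population skewness coefficient) mirror the reported metrics.

```python
# Descriptive statistics for a list of per-report word counts:
# mean M, sample standard deviation SD, and skewness Sk.
def describe(word_counts):
    n = len(word_counts)
    m = sum(word_counts) / n
    # sample (n - 1) standard deviation, the usual SD reported in studies
    sd = (sum((x - m) ** 2 for x in word_counts) / (n - 1)) ** 0.5
    # population central moments for the skewness coefficient m3 / m2^1.5
    m2 = sum((x - m) ** 2 for x in word_counts) / n
    m3 = sum((x - m) ** 3 for x in word_counts) / n
    sk = m3 / m2 ** 1.5
    return m, sd, sk

# e.g., hypothetical word counts for the reports at one inquiry level
m, sd, sk = describe([150, 170, 180, 200, 210, 250])
```

A positive `sk`, as reported for all three inquiry levels, indicates a right-skewed distribution: a few unusually long reflections pull the mean above the typical report length.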
Numerical distributions for the total number of words in students’ reflections (N) in each of the three inquiry levels were similar, as revealed by analysis of variance and by each distribution’s mean (M), standard deviation (SD), and skewness (Sk): Verification (N = 7, M = 203, SD = 41, Sk = 0.5); Structured (N = 15, M = 182, SD = 85, Sk = 0.7); Guided (N = 14, M = 182, SD = 87, Sk = 0.6). All of the reflections for verification labs were written by participants who also wrote reflections associated with the other two types of labs, and at least two-thirds of the reflections for structured- and guided-inquiry labs were written by the same students. All of our comparisons indicated that, on average, none of the inquiry groups included students who were significantly more succinct or verbose in their reflections than the others, and that the level of inquiry did not significantly affect the total number of words written by the participants. Thus, we deemed it appropriate to use the number of words to elicit the relative weight of the various types of reflective statements at different inquiry levels.

In general, data analysis was completed in various steps. Once an initial analytical framework and coding system had been discussed by the two main researchers, the first author of this paper applied it to the analysis of the lab reports for one lab experiment. During this task, coding segments were defined at the statement level in the analysis of students’ reflections. Both the proposed segmentation of the lab reports and the specific codes assigned to each subunit were then independently reviewed by the second researcher, who either agreed with the assignations or proposed alternative segmentations or codes. Comparison and discussion of ideas helped refine the coding system and the identification of boundaries between statements. Once agreement was reached in the analysis of the initial data set, the process was repeated with more lab reports until over 90% agreement was achieved across 8 of the 36 collected reports (22.2%). The refined analytical procedure was then applied by the first author to analyze the totality of the data.
■ MAJOR FINDINGS

As indicated above, our analysis of students’ written reflections elicited three major types of reflective statements, categorized as Knowledge, Evaluation, and Improvements. The presence of these types of statements was likely influenced by the specific questions used in the observed labs to guide students’ reflections. (For example: What did you learn in this lab? What do you not completely understand? How have your ideas changed as a result of this lab? What new questions do you have? How would you improve what you did?) Overall, comparison of the number of written words in all lab reports across different types of experiments indicated that students invested the largest portion of their reflections (51.6%, SD = 8.7) in writing about knowledge acquired as a result of lab work (Knowledge). Second, students made evaluative statements about lab methods and results (Evaluation: 31.7%, SD = 12.9); last, they reflected upon potential improvements to their lab work or to the actual experiments (Improvements: 16.8%, SD = 8.5). However, as shown in Figure 1, the relative weight of these different areas of reflection changed with the level of inquiry of the lab experiments.

Figure 1. Effect of the level of inquiry of lab experiments on the types of reflective statements in students’ written reflections.

Statistical analysis of these data using a chi-square test indicated the existence of a significant difference in focus of attention across inquiry levels (χ2 = 100.6; df = 4; p < 0.01). A subsequent post-hoc test used to evaluate the standardized residuals with Bonferroni correction for multiple pairwise comparisons (at a level of significance of 0.05 before correction) indicated that contributions to this significance were mainly associated with the decrease of reflections about Knowledge and the increase of reflective statements about Improvements in going from verification to guided-inquiry labs. Working in more open experiments seemed to prompt students to make more propositive statements about how to improve experimental methodology or design.

Focus on Knowledge

We used the revised Bloom’s Taxonomy39 to characterize students’ reflective statements about different types of knowledge gained as a result of lab work:

- Factual (knowledge of facts or terminology)
- Conceptual (knowledge of interrelationships between concepts or ideas)
- Procedural (knowledge of techniques, methods, skills, and algorithms)
- Metacognitive (knowledge or awareness of what is understood or not)

Table 3 presents examples of students’ reflections in these different categories. Overall, reflections about learning factual knowledge were the most prominent (40.8%, SD = 31.1) in written reports across all types of labs. On average, emphasis on each of the other knowledge categories was quite similar: Conceptual (20.3%, SD = 13.4); Procedural (17.5%, SD = 16.9); and Metacognitive (21.4%, SD = 19.1). Most statements categorized as metacognitive knowledge were indicative of students’ awareness of gaps in their declarative or procedural knowledge, but not necessarily indicative of control or regulation of cognition.41 As shown in Figure 2, students’ reflections related to these different types of knowledge seemed to be influenced by the level of inquiry of the experiments.

Figure 2. Effect of the level of inquiry of lab experiments on type of knowledge targeted in reflections in the learning area.

A chi-square analysis of these data revealed a significant change with the inquiry level (χ2 = 663.8; df = 6; p < 0.01), mostly associated with the decreased focus on factual knowledge and increased focus on procedural and metacognitive knowledge in students’ reflections associated with experiments at higher inquiry levels. Working on less-structured labs seemed to promote more reflections not only about knowledge of experimental skills or procedures, but also about what was or was not understood as a result of lab work. However, we did not observe any correlation between the level of inquiry and the amount of reflection on chemical concepts or ideas.

Focus on Evaluation

As part of their reflections, students also made evaluative statements about their lab work. As exemplified in Table 3, our analysis elicited three major types of reflective statements concerning evaluation. First, Methodology denotes instances in which students evaluated their performance in the lab in terms of their methodological choices and actions. Overall, 69.0% (SD = 24.8) of all of the written reflections in the Evaluation category were of this type. Second, Assumptions indicates statements in which students reflected on changes in prior assumptions or beliefs as a result of completing the lab (21.1%, SD = 20.4). Third, Applications designates statements in which students reflected on the application of their results to solve or understand other problems (9.9%, SD = 20.4). The data distributions for these two latter categories were rather skewed, with at least half of the students’ reports not including any reflections in these two areas. Figure 3 shows the effect of the level of inquiry of different lab experiments on the distribution of students’ reflections among these different categories; chi-square analysis revealed that this effect was significant (χ2 = 138.2; df = 4; p < 0.01). A subsequent post-hoc test with Bonferroni correction indicated that the significance was largely associated with an increased focus on the evaluation of both prior assumptions and potential applications of experimental results in written reflections for guided-inquiry experiments.

Figure 3. Effect of the level of inquiry of lab experiments on evaluation of lab work and lab results.

Focus on Improvements

Students also reflected on potential improvements to lab work; see Table 3 for examples. They offered suggestions related to changes to their experimental procedure in order to solve problems that they encountered (Methodology: 89.1%, SD = 11.6), or they discussed potential modifications to the design of the experiments to enrich or improve the results (Design: 10.1%, SD = 11.6; highly skewed data distribution). This latter category of reflections was observed in reports associated with structured- and guided-inquiry labs (with no significant differences between results from these two levels of inquiry), but not in reports from verification labs.

■ FURTHER DISCUSSION AND IMPLICATIONS

Our study was designed to explore the effect of the level of inquiry of lab experiments on students’ reflections in written lab reports in college general chemistry. Given the limitations associated with the small number of lab groups involved in our study, as well as the limited number of experiments at each level of inquiry, our results should be interpreted cautiously. However, we believe that our findings elicit trends and highlight issues that can help educators identify strategies to better support and scaffold student reasoning in the laboratory. According to the data from this study, changes in the level of inquiry of lab experiments seemed to correlate with significant changes in the nature of dominant reflective statements. These changes were noticeable in the areas of Knowledge, Evaluation, and Improvements. In the case of Knowledge, the results were particularly interesting, as higher inquiry levels of experimentation were associated with a smaller proportion of reflections in this area. However, these reflections shifted from mostly focusing on factual knowledge to largely concentrating on procedural knowledge and metacognitive knowledge. Student engagement in the actual design of the experimental procedures to solve a problem in guided-inquiry experiments seemed to shift students’ reflections from the facts that they had learned to the problem-solving skills that they had developed. It also motivated a larger proportion of metacognitive reflections about the extent to which relevant concepts, ideas, or methodologies were understood. Unfortunately, we did not detect a significant correlation between the level of inquiry and reflections on conceptual knowledge in chemistry.

Although the relative weight of students’ evaluation of laboratory work in their written reflections was similar across different types of experiments, reflections associated with guided-inquiry labs extended to different areas. In particular, students were more inclined to reflect on the relevance or value of their experimental results for solving problems beyond the scope of the laboratory. Engagement in guided-inquiry laboratories seemed to reduce the emphasis on the evaluation of methodological errors or difficulties that was characteristic of the reflections associated with verification and structured labs. Experiments with higher levels of inquiry also prompted students to move beyond reflecting on how they could have improved their experimental performance (reflection-on-action) to begin suggesting improvements in the actual experimental design in order to collect better or more useful data (reflection-for-action). Specific examples of some of the major shifts that were observed can be seen in Table 4, which presents the full reflections written by the same student working on experiments with different levels of inquiry. Notice that the lab experiment corresponding to the guided-inquiry level was performed before the other two lab tasks, indicating that experience gained writing lab reports using the SWH was not responsible for the shifts observed in these written reflections.

The results of our study are in line with those of related investigations that highlight the cognitive benefits of engaging college general chemistry students in more open investigations.21,22 However, our findings also underscore the need for explicit interventions to improve the quality of students’ reflections, particularly in the area of conceptual knowledge. Only a few of our study participants focused their reflections on how the results of their investigations contributed to their understanding of the models and theories developed by chemists to make sense of the properties and behavior of chemical substances and processes. We contend that reflective thinking in this area needs to be explicitly modeled by lab TAs and carefully scaffolded with guiding questions that direct students’ attention to conceptual issues. Otherwise, the procedural and methodological challenges and demands of more open-inquiry labs are likely to capture most of our students’ attention. From our perspective, scaffolding reflective thinking requires that we help TAs become better at motivating, pressing, and guiding students to talk about chemistry concepts and ideas in the laboratory. In this regard, we would like to suggest that the training of college chemistry TAs should include opportunities to analyze and apply discourse tools such as those generated by Windschitl and colleagues.42 These tools are designed to engage students in three types of discourse activities: (i) eliciting hypotheses and ideas, (ii) making sense of material activity, and (iii) generating evidence-based explanations. These types of resources could help TAs initiate and sustain group conversations that focus on conceptual issues rather than on procedural matters. These types of discourse tools could also be useful in generating and guiding small group conversations in which TAs engage in the critical analysis of students’ reports. TAs need to learn how to better analyze student work to provide the type of formative feedback that
dx.doi.org/10.1021/ed3002368 | J. Chem. Educ. 2013, 90, 21−28
Journal of Chemical Education
Article
Table 4. Full Reflection Examples Written by the Same Student for Experiments with Different Levels of Inquirya Level 1, Verification: Emission Spectroscopy Student Quote
Level 2, Structured: Absorption Spectroscopy Student Quote
Level 3, Guided: Light Properties Student Quote
A huge source of error that was probably made is if the nicrome wire was not cleaned before testing a different substance, which would have skewed the data. Another error that was observed is the presence of something in the gas of the flame which showed an extra peak in the spectrum. The helium spectrum was measured and the experimental results were compared to those of known literature values, which showed the accuracy of the spectrometer that was used in the experiment. In other words, the flame itself was making a peak shown in the graph. What could have been done different in this lab to make the wavelengths and intensities of the unknown substances easier to find the substances that they are made of? In many cases, it was really difficult to figure out what the unknown was made of. This lab ties perfectly with combustion reaction, which occurs with burning of substances which in this case, products light and color. Combustion reaction is defined as a chemical reaction where a substance combines with oxygen to form of heat and light in the form of a flame, just like it was seen in this particular lab experiment. In this case, when the ions cool after they rise from the flame, they give out photons, which created the emission spectrum that was seen in all the substances that were tested. These measurements were used to determine the deviation and the average relative deviations.
There were some sources of error in this experiment. The pipet could have affected the data because it sometimes did not let out the measured amount of the liquid. The instrument measurements could have skewed the data. Another error that is seen in the experiment is in the absorbance vs concentration graph because the y-intercept should have been 0, but the graph shows that the y-intercept is −0.224. This miscalculation could have skewed the rest of the data and results. A new question that I have after doing this lab would be, if there was more than one color in the solution, would it have affected the results that were obtained from the color that was measured? Absorption spectroscopy, which is used in this lab, relates to what was talked about in class because it is a direct measurement of the number of molecules in a solution, which makes a very reliable and valuable device. What we did could be applied to the testing of foods to make sure that the food is safe to eat. Since there were many concerns about food coloring, the FDA tested and made sure that the chemicals that the foods were safe to consume.
My ideas about a light bulb and fluorescent light have changed after this lab experiment. Before the experiment I thought that a light bulb would be more efficient than a fluorescent light. This lab concluded the opposite. It concluded that the fluorescent light is more efficient with an efficiency of 84.5% whereas the incandescent light bulb is only 71.8% efficient. This lab ties very well with the concepts that were discussed in class about electromagnetic radiation, which is energy transmitted. The frequency range of these waves is tremendous. The radiation that is released can hint properties of the system or the changes that are occurring. One way this lab could have improved is to remove any other light sources in the room where the experiment is taking place. In other words the removal of the room lights would have resulted in more accurate results. The room light source, which was not measured in the experiment, could have emitted significant electromagnetic radiation. Our data would be very helpful in defending a campaign that wants to replace incandescent light bulbs with compact fluorescent light bulbs. The data clearly shows the huge efficiency difference between incandescent light bulbs and fluorescent light bulbs. The efficiency of fluorescent light bulbs is 85.5% whereas the efficiency of incandescent light bulbs is only 71.8%.
a
This set of reflections illustrates some of the major changes observed in our pilot investigation: increased focus on procedural versus factual knowledge, increased focus on evaluation of assumptions, and applications of results versus evaluation of errors. (9) Lunetta, V. N.; Hofstein, A.; Clough, M. In Handbook of Research on Science Education; Lederman, N., Abel, S., Eds.; Lawrence Erlbaum: Mahwah, NJ, 2007; pp 393−441. (10) National Research Council. America’s Lab Report: Investigations in High School Science; Singer, S. R., Hilton, M. L., Schweingruber, H. A., Eds.; National Academies Press: Washington, DC, 2006. (11) Psillos, D.; Niedderer, H., Eds. Teaching and Learning in the Science Laboratory; Kluwer: Dordrecht, The Netherlands, 2002. (12) Leonard, W. H. In Handbook of College Teaching; Prichard, K. W., Mclaran-Sawyer, R., Eds.; Greenwood Press: Westport, CT, 1994. (13) McNeal, A.; D’Avanzo, C. Student-Active Science: Models of Innovation in College Science Teaching; Saunders College Publishers: Philadelphia, PA, 1997. (14) Johnstone, A. H.; Al-Shuaili, A. Univ. Chem. Educ. 2001, 5, 1− 10. (15) Reid, N.; Shah, I. Chem. Educ. Res. Pract. 2007, 8 (2), 172−185. (16) Burke, K. A.; Greenbowe, T. J.; Hand, B. M. J. Chem. Educ. 2006, 83 (7), 1032−1038. (17) Hand, M. M., Ed. Science Inquiry Argument and Language: A Case for the Science Writing Heuristic; Sense Publishers: Rotterdam, The Netherlands, 2008. (18) Lamba, R. S.; Creegan, F. J. In Process Oriented Guided Inquiry Learning; Moog, R. S., Spencer, J. N., Eds.; ACS Symposium Series No. 994; American Chemical Society: Washington, DC, 2008; Chapter 16, pp 186−199. (19) Poock, J. R.; Burke, K. A.; Greenbowe, T. J.; Hand, B. M. J. Chem. Educ. 2007, 84, 1371−1379. (20) Krystyniak, R. A.; Heikkinen, H. W. J. Res. Sci. Teach. 2007, 44 (8), 1160−1186. (21) Hand, B.; Choi, A. Res. Sci. Educ. 2010, 40, 29−44. (22) Russell, C. B.; Weaver, G. C. Chem. Educ. Res. Pract. 2011, 12, 57−67. (23) Sandi-Urena, S.; Cooper, M. 
C.; Gatlin, T. A.; Bhattacharyya, G. Chem. Educ. Res. Pract. 2011, 12, 434−442. (24) National Research Council. Inquiry and the National Science Education Standards; National Academies Press: Washington, DC, 2000.
can transform students’ views about the role of reflections in written lab reports.
■
ASSOCIATED CONTENT
* Supporting Information S
Rubric used to characterize levels of inquiry of laboratory experiments. This material is available via the Internet at http://pubs.acs.org.
■
AUTHOR INFORMATION
Corresponding Author
*E-mail:
[email protected]. Notes
The authors declare no competing financial interest.
■
REFERENCES
(1) American Association for the Advancement of Science. Benchmarks for Scientific Literacy; Oxford University Press: New York, 1993. (2) National Research Council. National Science Education Standards; National Academies Press: Washington, DC, 1996. (3) National Research Council. Taking Science to School: Learning and Teaching Science in Grades K−8; Duschl, R. A., Schweingruber, H. A., Shouse, A., Eds.; National Academies Press: Washington, DC, 2007. (4) Osborne, J. F.; Dillon, J. Science Education in Europe; Nuffield Foundation: London, U.K., 2008. (5) Hofstein, A.; Lunetta, V. N. Rev. Educ. Res. 1984, 52 (2), 201− 217. (6) Hofstein, A.; Lunetta, V. N. Sci. Educ. 2004, 88, 28−54. (7) Lazarowitz, R.; Tamir, P. In Handbook of Research on Science Teaching and Learning; Gabel, D. L., Ed.; Macmillan: New York, 1994; pp 94−130. (8) Lunetta, V. N. In International Handbook of Science Education; Fraser, B. J., Tobin, K. G., Eds.; Kluwer: Dordrecht, 1998. 27
dx.doi.org/10.1021/ed3002368 | J. Chem. Educ. 2013, 90, 21−28
(25) Halliday, M. A. K.; Martin, J. R. Writing Science: Literacy and Discursive Power; University of Pittsburgh Press: Pittsburgh, PA, 1993.
(26) Lemke, J. L. Talking Science: Language, Learning and Values; Ablex: Norwood, NJ, 1990.
(27) Rivard, L. P.; Straw, S. W. Sci. Educ. 2000, 84, 566−593.
(28) Yore, L. D.; Bisanz, G. L.; Hand, B. M. Int. J. Sci. Educ. 2003, 25 (6), 689−725.
(29) Hohenshell, L. M.; Hand, B. Int. J. Sci. Educ. 2006, 28 (2−3), 261−289.
(30) Bereiter, C.; Scardamalia, M. The Psychology of Written Composition; Lawrence Erlbaum: Hillsdale, NJ, 1987.
(31) Prain, V.; Hand, B. Sci. Educ. 1999, 83, 151−162.
(32) Wray, D.; Lewis, M. Extending Literacy: Children Reading and Writing Non-Fiction; Routledge: London, 1997.
(33) Hand, B.; Keys, C. W. Sci. Teach. 1999, 66 (4), 27−29.
(34) Keys, C. W.; Hand, B.; Prain, V.; Collins, S. J. Res. Sci. Teach. 1999, 36 (10), 1065−1084.
(35) Hand, B.; Wallace, C. W.; Yang, E. Int. J. Sci. Educ. 2004, 26 (2), 131−149.
(36) Rudd, J. A.; Greenbowe, T. J.; Hand, B. M.; Legg, M. J. J. Chem. Educ. 2001, 78, 1680−1686.
(37) Abrams, E.; Southerland, S. A.; Evans, C. A. In Inquiry in the Science Classroom: Realities and Opportunities; Abrams, E., Southerland, S. A., Silva, P., Eds.; Information Age Publishing: Greenwich, CT, 2007.
(38) Fay, M. E.; Grove, N. P.; Towns, M. H.; Bretz, S. L. Chem. Educ. Res. Pract. 2007, 8 (2), 212−219.
(39) Krathwohl, D. R. Theory Pract. 2002, 41 (4), 212−218.
(40) Schön, D. The Reflective Practitioner; Basic Books: New York, 1983.
(41) Schraw, G.; Moshman, D. Educ. Psychol. Rev. 1995, 7 (4), 351−371.
(42) Windschitl, M. A.; Thompson, J. J. Tools for Ambitious Science Teaching. http://tools4teachingscience.org/ (accessed Nov 2012).
dx.doi.org/10.1021/ed3002368 | J. Chem. Educ. 2013, 90, 21−28