Research: Science and Education
Advanced Chemistry Classroom and Laboratory
Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course
edited by Joseph J. BelBruno, Dartmouth College, Hanover, NH 03755

Katherine C. Lanigan
Department of Chemistry and Biochemistry, University of Detroit Mercy, Detroit, MI 48221; [email protected]

The phrase "analytical method development" is not commonly used in chemical education terminology, even though it is a critical component of chemical research. In fact, there have been only three articles in this Journal with either the title or a keyword containing this phrase since 1987 (1–3). In general, analytical method development refers to the improvement of chemical analysis techniques and is commonly associated with method assessment and validation by industry and government agencies. Method development requires the application of critical-thinking and problem-solving skills. To teach these skills, chemical educators have developed alternative teaching methods such as problem-based learning (PBL), case studies, inquiry-based learning (IBL), and process models like process-oriented, guided-inquiry learning (POGIL) (4–10). Ironically, one of the articles referred to above, the 1987 article titled "Analytical Methods Development", actually described an early version of problem-based learning in which students were asked to formulate a simple method for metals detection after being provided with minimal guidance (1).

Analytical method development and problem-based learning have similar features. PBL experiments generally begin with a problem (e.g., lead in urban housing) that needs to be solved (detection of concentrations) and end with students developing and carrying out the experiment (11). Students are typically given less information than in traditional methods and may be required to search the literature for detection methods (8). PBL has many positive attributes conducive to student learning and is being incorporated into teaching laboratories with increasing frequency. However, barriers to the implementation of PBL in classrooms include large class sizes and the need to develop sophisticated problems that can support PBL exercises (12).
This article explains an alternative approach for teaching problem-solving skills through adoption and adaptation: method development of experiments from the literature. The difference between PBL and method development is that more information is given to students in the latter case, making the outcome more predictable. Method development is therefore a useful compromise between PBL and traditional cookbook methods. In both PBL and the analytical method development described here, students are required to contribute their own ideas to accomplish the project, learn to work in groups, and improve their communication skills.

The pedagogical approach described here uses group-meeting style discussions and method development to teach problem-solving skills. In these ways it is similar to undergraduate research and shares some of the same benefits (13, 14). Just as with research, there is no manual provided, only the primary literature. In addition, the experiments do not always work the first time. Contrary to student perception, experiment failure is usually an opportunity for greater learning. However, an advantage of the approach used here is that because the instructor has prior knowledge of the experiments, major problems can be avoided so that success can be achieved in the end. This promotes students' confidence in their work and alleviates some of the frustration associated with experiments that do not give useful results in the time allotted. Another similarity to undergraduate research is students' sense of ownership of the experiments, knowing that they have some role in deciding how the experiment will be carried out.

General Description of Course

The objective of the method development component of the instrumental analysis course is for students to gain hands-on experience in adopting, adapting, and assessing experiments from the literature. To accomplish this objective, students are required to work in small groups, to participate in pre-lab and post-lab group meetings, and to contribute ideas for the adaptation of experiments. So that all students present the same type of experiment at each group meeting, the instrumental analysis course is run using one instrument at a time. As a result, the material presented by all groups at each meeting is very similar. The familiarity with the material and the open format of the group meetings give students the confidence to participate in the discussions to a greater degree than in past semesters, when this format was not used.

Pre-Lab Group Meeting

Groups of two students are each given different journal articles describing laboratory experiments that utilize instrumentation for chemical analysis. The assignment has two components: one is to create a single-page outline and the other is to give a 10-minute presentation describing the experiment. Each student within every group is in charge of one of the two components and is assigned a grade for it. These details must be included: title, author, main objective, experimental methods, and basic description of results.
Journal articles are carefully chosen ahead of time from chemical education journals and from research journals such as Analytical Chemistry. Suitability of experiments is determined by the equipment and chemicals required, the cost of purchasing new equipment or chemicals, the level of difficulty, and the potential for success.

Experimental Development

Following the pre-lab group meetings, groups are assigned experiments. Depending on the lab, students carry out different experiments (their pre-lab experiments), the same experiment, or similar experiments. When the students carry out the same experiment, the class evaluates the common results during the post-lab meeting. In other cases, they perform very similar experiments. For example, groups are assigned to present journal articles describing HPLC experiments that examine caffeine content. The differences between the experiments are the type
Journal of Chemical Education • Vol. 85 No. 1 January 2008 • www.JCE.DivCHED.org • © Division of Chemical Education
Table 1. Learning Outcomes by Method Development Tasks

Assignment: Pre-lab group meeting
Skills: Primary literature comprehension; communication (verbal and written); teamwork

Assignment: Method development (adopting, adapting, and developing the experiment)
Skills: Primary literature comprehension; experimental design; problem solving; time management; teamwork

Assignment: Post-lab group meeting
Skills: Data assessment; data presentation; communication (verbal and written); teamwork
of samples (coffee, soda, analgesic tablet) and experimental parameters (mobile phase, column type, etc.) (15–18). In this experiment, students are assigned to analyze caffeine content in a variety of samples while using different experimental parameters. At the post-lab meeting, the class compares results to literature values to assess which protocol gives the most reasonable results.

After experiments are assigned, students begin the design of the experiments. Students are responsible for fleshing out from the articles the necessary details, such as the chemicals and equipment needed. This is often a challenge for undergraduate students and is therefore an opportunity for scientific growth. It should be noted that the supplements containing step-by-step procedures that accompany some of the published experiments are not provided to students. The design step also includes organizing classroom time for sample preparation and instrument use. Groups meet with the instructor for 10–30 minutes to discuss their plans. The instructor gives feedback and makes suggestions based on what is available for the experiment. Ideas for similar, yet novel, experiments are encouraged, since this gives students a sense of ownership over the experiment and therefore motivation to make it work.

Post-Lab Group Meeting

The post-lab meeting is meant to be a wrap-up session. This component was not used before the fall semester of 2006. Students in 2002 and 2004 had difficulty knowing what to include in the results and discussion section; therefore, to assist them, students are now required to discuss and present their results in class prior to writing the reports. Students are assigned a 10–15 minute presentation. This is an opportunity to report on and assess their results, to describe problems encountered, and to receive feedback from the instructor and peers. This step is as vital to students as it is to every scientist writing a research paper for publication.
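The post-lab comparison of results against literature values amounts to a simple percent-error calculation for each protocol. The sketch below is illustrative only: the sample names and every numeric value are hypothetical, not data from the course.

```python
# Illustrative post-lab comparison of measured caffeine content
# (mg per serving) against literature or label values.
# All sample names and numbers here are hypothetical.

literature = {"coffee": 95.0, "soda": 34.0, "analgesic tablet": 65.0}

measured = {"coffee": 101.2, "soda": 30.5, "analgesic tablet": 66.1}

def percent_error(measured_value, reference_value):
    """Unsigned percent deviation of a measurement from its reference."""
    return abs(measured_value - reference_value) / reference_value * 100

for sample, value in measured.items():
    err = percent_error(value, literature[sample])
    print(f"{sample}: measured {value} mg, "
          f"literature {literature[sample]} mg, error {err:.1f}%")
```

In class, each group's protocol could be tabulated this way and the protocol with the smallest average error taken as giving the "most reasonable" results, mirroring the comparison described above.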
Just as with the pre-lab group meetings, one partner creates a single-page results handout while the other gives an oral presentation.

Outcomes Assessment

Prior to the use of pre-lab group meetings, students carried out their experiments without meaningful discussion during the laboratory period. As a result of the pedagogical changes described here, students asked questions and offered suggestions to their peers during the pre-lab group meeting and during the lab. Not only did this help them understand their own experiments, but they could then apply what they had learned to understanding their peers' experiments. Additionally, because the experiments were similar, there was a greater comfort level with the material (experimental parameters, type of results, etc.). Other evidence of success was that students were better prepared to actually carry out the experiments. Based on conversations, students seemed to understand better what they were doing and the goals of the experiments compared to students in previous semesters, when pre-lab group meetings were not required.

The learning outcomes from the method development component are shown in Table 1. The table does not include traditional learning outcomes (i.e., understanding chemistry principles and instrumentation, data analysis, notebook and report writing skills, etc.), as these are well documented for instrumental analysis courses.

Students' Assessment and Comments

In 2006, students were given the opportunity to assess the instrumental analysis course using two online attitude surveys, one located on the course's Blackboard Web site and one on the SALG Web site (19, 20). The survey questions and full results are included in the online supplement. In the Blackboard survey, which specifically asked about pre-lab and post-lab assignments, the strongest sentiment obtained through Likert-type questions was toward the post-lab presentations. These were considered most helpful when compared to pre-lab presentations, pre-lab outlines, and post-lab outlines. The survey, taken by six of the seven students in the 2006 course, indicated that 67% "strongly agreed" that the post-lab presentations were "helpful to their understanding of the instrumentation and the experimental results". The other components also received "strongly agree" ratings, although in the range of 33–50%. Results from the SALG survey gave similar approval ratings. The SALG survey revealed two important trends.
The first was that post-lab meetings were considered more helpful than pre-lab meetings: 57% thought that post-lab meetings were "very much help" compared to 29% for pre-lab meetings. The second trend was found in the responses to several questions concerning whether the course added to the students' skill sets on topics including adaptation of experiments, working with analytical instrumentation, analyzing and interpreting data, oral presentation of data and results, and writing lab reports. There was 86% agreement that the course helped "a lot" in adding to each of these skill sets. Reading and interpreting journal articles also scored high marks, with 57% for "a lot" and 29% for "a great deal".

Written comments also attested to the success of the new format in terms of attitudes toward the course. Table 2 includes the most pertinent student comments (approval and suggestions) from the University's course evaluations (2002, 2004, and 2006) and a Blackboard survey tool (2006). A comment provided on a course evaluation in 2004 concerned waiting time. This was, for the most part, not in regard to waiting for the instruments but rather to waiting to meet with the instructor to discuss the experiment design. In 2006, the instructor shortened design discussions to 10 minutes and encouraged students to use any "waiting" time for cleaning glassware, preparing notebooks, or planning.

Another comment concerned the amount of work required. This concern was also evident in the dropping score in student evaluations in response to the statement, "the amount of work required was reasonable". The score for this question dropped over the last three times the course was taught, from 4.00 for the 2002 semester (traditional format, five students) to 3.85 for the 2004 semester (method development with pre-lab meetings,
Table 2. Typical Student Evaluation Comments by Laboratory Format

Year: 2002. Laboratory format: Traditional.
Example approval statements:
"Practical experience on many commonly used instruments in chemistry."
Example suggestion statements:
"Give guidelines before the report is due as to what is required."

Year: 2004. Laboratory format: Method development with pre-lab meetings.
Example approval statements:
"The small class size, and the way the teacher ran the lab, not like a lab course, but as if it was like undergrad research."
"I learned a lot about reading scientific papers & was exposed to aspects of instruments I didn't know about before & instruments I'd never used before."
Example suggestion statements:
"The organization of the laboratory sessions could be improved. There were many times when the majority of the class sat waiting."

Year: 2006. Laboratory format: Method development with pre-lab meetings and post-lab meetings.
Example approval statements:
"The fact that it stimulated me to learn the lab before trying to just do the lab."
"The post lab helped greatly in summing up what should be included in the lab report, and also helped assess the data that other groups produced as well as their thoughts on the results which benefited the class as a whole."
"The post lab meetings were very helpful in understanding the data and writing the lab reports."
Example suggestion statements:
"The amount of work was a bit much. I feel that I spent more time on this 2 credit hours class than I did a couple of 3 credit hours class combined."
"The pre labs seemed a bit unnecessary and made for extra work."
seven students) and then to 3.60 for the 2006 semester (method development with pre-lab meetings and post-lab meetings, seven students). This style of lab required more work than traditional cookbook-style labs for both students and instructor. This fact also speaks to the level of work required to gain problem-solving skills, as well as to actually solve problems concerning experiment adaptation and data analysis.

A third criticism of the course concerned the necessity of the pre-lab presentations. The pre-lab presentations served two purposes. The first was to get students to actually read the lab experiment ahead of time so that they could understand the basic principles prior to working at the bench. The second was to give students a chance to become the "experts" within the class on a particular experiment. This gave students a sense of responsibility for the details of the experiment and thus a motivation to read and learn the experiment. Nevertheless, a useful tool for quantitative assessment of the pre-lab presentations would have been to give quizzes before the pre-lab assignments and after the pre-lab meetings. This would have evaluated students' gain in understanding of basic principles of instrumentation as a direct result of the pre-lab group meetings.

Conclusions

The pedagogical approach described here utilizes adoption and adaptation of experiments, along with pre-lab and post-lab group meetings, for teaching problem-solving skills. It has some of the same benefits as undergraduate research and problem-based learning. Yet because this approach is not as open ended, it may be more palatable for faculty with more traditional style courses or for those making the transition to PBL.

Acknowledgments

I would like to thank the students in the instrumental analysis course during 2002, 2004, and 2006 for their patience during the development of this course and for their helpful comments and suggestions in course evaluations and surveys.
Literature Cited

1. Basel, C. L. J. Chem. Educ. 1987, 64, 528–529.
2. Street, K. W., Jr. J. Chem. Educ. 1988, 65, 914–915.
3. Remcho, V. T.; McNair, H. M.; Rasmussen, H. T. J. Chem. Educ. 1992, 69, A117–A119.
4. Adami, G. J. Chem. Educ. 2006, 83, 253–256.
5. Draper, A. J. J. Chem. Educ. 2004, 81, 221–224.
6. Arnold, R. J. J. Chem. Educ. 2003, 80, 58–60.
7. Ram, P. J. Chem. Educ. 1999, 76, 1122–1126.
8. Wenzel, T. J. Anal. Chem. 1999, 71, 693A–695A.
9. Woodget, B. W. Anal. Chem. 2003, 75, 307A–310A.
10. The POGIL (Process-Oriented Guided-Inquiry Learning) Project Home Page. http://www.pogil.org/ (accessed Oct 2007).
11. Fitch, A.; Wang, Y.; Mellican, S.; Macha, S. Anal. Chem. 1996, 68, 727A–731A.
12. Wenzel, T. J. Anal. Chem. 1998, 70, 790A–795A.
13. Wenzel, T. J. Anal. Chem. 2000, 72, 547A–579A.
14. Murray, R. W. Anal. Chem. 2001, 73, 237A.
15. Ferguson, G. K. J. Chem. Educ. 1998, 75, 467–469.
16. Beckers, J. L. J. Chem. Educ. 2004, 81, 90–93.
17. McDevitt, V. L.; Rodriguez, A.; Williams, K. R. J. Chem. Educ. 1998, 75, 625–629.
18. Bergen, H. R., III; Benson, L. M.; Naylor, S. J. Chem. Educ. 2000, 77, 1325–1326.
19. Blackboard Academic Suite Home Page. http://www.blackboard.com/products/Academic_Suite/index (accessed Oct 2007).
20. Student Assessment of Learning Gains Home Page. http://www.wcer.wisc.edu/salgains/instructor/ (accessed Oct 2007).
Supporting JCE Online Material

http://www.jce.divched.org/Journal/Issues/2008/Jan/abs138.html

Abstract and keywords
Full text (PDF)
Links to cited URLs and JCE articles

Supplement
SALG and Blackboard online survey questions
Student response data