Journal of Chemical Education, Vol. 82, No. 6, June 2005, pp 895–897
In the Classroom

Realizing Workplace Skills in Instrumental Analysis

John H. Kalivas Department of Chemistry, Idaho State University, Pocatello, ID 83209; [email protected]

Using problem-based learning (PBL) and cooperative learning (CL) methods in both analytical lecture and laboratory courses offers students advantages over the passive, conventional teaching approaches they often encounter (1–7). Applying PBL in the laboratory means that students are given little information on how to perform experiments, engaging them in active learning by requiring that they make their own decisions. Cooperative learning essentially involves students working in groups, allowing them to share knowledge and explain concepts to peers. To incorporate PBL and CL into the analytical curriculum, numerous investigators have successfully integrated real-world experiences into analytical laboratory courses (1, 2, 8–16 and references therein).

College graduates who majored in chemistry typically enter the workforce lacking specific skills required in industry (1, 17–22). For example, companies hiring chemists usually look for job candidates who can perform in a teamwork environment to define a problem, collect and analyze samples, validate results through quality assurance and quality control, and effectively communicate recommendations for solutions based on analysis results. These skills can be cultivated in appropriately designed PBL and CL laboratory courses that use real-world samples.

This paper describes a new approach to teaching an instrumental analysis laboratory that provides students with a fundamental analytical curriculum in chemistry and prepares them to graduate with skills that match common expectations of the chemical industry. The approach described here can be applied by chemistry educators to develop students' practical as well as theoretical skills in analytical instrumentation.

Course Overview

Instrumental analysis at Idaho State University (ISU) is a three-hour laboratory taught twice per week in the spring term, preceded by a lecture course in instrumental analysis offered in the fall.
In the new course described here, students acquire greater familiarity with a few instruments rather than obtaining operating experience with numerous instruments. The trade-off is that students should achieve an enhanced understanding of the instruments they do use: learning more with less. The course is divided into two sequential parts: development and evaluation of instrument standard operating procedures (SOPs), and a real-world research project. Nine weeks are used for SOP development and evaluation, followed by seven weeks for the research project. Students are divided into groups of two or three so that each group contains a mixture of scholastic abilities. During the SOP phase, each group develops a protocol and determines specifications for an instrument. Subsequently, two other groups evaluate and edit each SOP. Towards the end of the SOP phase, groups write a project



research proposal for approval by the instructor. Because SOPs have been written for most instruments, the focus of the student work is on the research problem, not on learning a new instrument. The course concludes with written and oral project reports.

Standard Operating Procedure

Using the instrument manuals, the students find the information needed to write an SOP for a given instrument. Because the manuals are the only information provided, most students are at first intimidated by the idea of working on an instrument that they have only seen in illustrations and learned about in lecture. It is not uncommon in commercial and research laboratories for an employee to be assigned SOP development for a new instrument that the employee has never worked with.

The SOP must provide enough fundamental detail for a new user to operate the instrument, run a sample, and obtain a spectrum or chromatogram, as the case may be. Students decide on a simple analyte and use least squares to build a calibration model at a selected wavelength, mass-to-charge ratio, retention time, and so forth, depending on the instrument. The figures of merit (precision, accuracy, sensitivity, detection limit, linear dynamic range, and selectivity) are determined for the calibration, and these values represent the instrument's specifications. An instrument block diagram is required, as well as an analyte spectrum or chromatogram. If it is possible for their instrument, students must also include a procedure for exporting spectra or chromatograms into Excel.

Students are allowed five weeks to develop their first SOP. This provides ample time for them to read the manuals and develop a protocol for running the instrument. While no real-world samples are involved, the students do gain the advantages of PBL and CL. Students in a group work together to agree on what to include and what not to include in the SOP. Group members operate as a team, with each student responsible for different aspects of the SOP.
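The least-squares calibration and figures of merit described above can be sketched as follows. The concentrations, responses, and blank readings below are hypothetical illustration values, not data from the course; the detection limit uses the common 3-sigma-of-the-blank convention.

```python
# Sketch of a least-squares calibration and figures of merit.
# All numbers are hypothetical illustration values.
import statistics

# Standard concentrations (ppm) and instrument responses (e.g., absorbance)
conc = [0.0, 2.0, 4.0, 6.0, 8.0]
resp = [0.002, 0.105, 0.198, 0.304, 0.401]

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(resp) / n

# Least-squares fit: response = slope * conc + intercept
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, resp))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x

# Sensitivity: the calibration slope (response per unit concentration)
sensitivity = slope

# Detection limit: 3 * (sample std. dev. of blank replicates) / slope
blank_reps = [0.001, 0.003, 0.002, 0.002, 0.004]  # replicate blank readings
detection_limit = 3 * statistics.stdev(blank_reps) / slope

def predict(response):
    """Concentration of an unknown from its measured response."""
    return (response - intercept) / slope
```

Precision and accuracy would come from replicate measurements of standards, and the linear dynamic range from where the calibration curve departs from linearity; only the slope-based figures are shown here.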
Because the group writing the initial SOP has gone through the manuals, these students are considered the resident experts for that instrument. All future questions concerning the instrument are directed to them. Student names, phone numbers, and e-mail addresses are listed on the SOP. This group mimics the instrument manufacturer's technical service. After an initial SOP is written, it is distributed to other groups for evaluation and editing. In the workplace, employees are often required to critique and edit a colleague's report. Two weeks are allowed for the groups to review the SOP by making solutions and measuring samples to reproduce the calibration curve in the SOP, to compare figures of merit, and to assess the clarity of the SOP. While a reviewing group can consult the manual if it is unsure of a step described in the SOP, this is discouraged. Instead, reviewing groups are

expected to contact the authors of the original SOP. At the end of the two-week period, the groups give the instructor a rewritten SOP, an itemized list of changes, and their evaluation. Copies of the itemized list of changes and the evaluation are provided to the original authors of the SOP. The revised SOPs are then distributed to new groups, and the students are allowed another two weeks to work through the SOP and revise it. The same tasks described for the first SOP rotation are required for the second. The two SOP rotations are designed to provide students with experience on molecular and atomic spectrometers and a chromatographic system.

Teaching with this approach has changed students' outlook on learning new instrumentation. We have found, from class evaluations, informal instructor observations during the course, comments from faculty whose research students had taken the course, and graduating-student exit interviews with the department chair, that at the completion of the SOPs, students are no longer apprehensive about using a new instrument. Confidence increases, and they develop a sense of independence in the lab that is useful for the project portion of the course. This newfound confidence stems from the fact that the students had to figure out how to run an instrument that, for the most part, was alien to them. Students overcome their fear of breaking the instrument. Because they have ample time to learn the instrument, they soon adapt to trying out buttons and switches that they were initially unsure of from reading the manual. They realize that the instrument is not always going to break when a new feature is explored. Additionally, instruments commonly need repair during the semester. Having students troubleshoot malfunctions provides invaluable experience that develops competence and boosts confidence.
Research Project

In the seven-week research project portion of the course, students solve a real-world problem instead of executing a prescribed analysis. The students select a sample type and analyte and design the analysis using at least one instrument. If pertinent, students can consider why an analysis did not work, for example, by determining the possible sources of error and what can be done to test for their presence. The project portion of the course concludes with students presenting written and oral reports summarizing their work. Different aspects of the research component of the course are described below.

Proposal

Regardless of whether students pursue industrial or academic paths, chances are they will have to prepare a proposal with a budget. To provide this kind of experience, students write a proposal for their project, due towards the end of the SOP portion. It must contain a title, an abstract, a project narrative with objectives, an experimental plan, the equipment needed with costs (obtained from the Fisher Scientific catalog for chemicals and from an instrument manufacturer for the chosen instrument), and references. If possible, the official reference method of analysis for the analyte must be included, for example, from the American Society for Testing and Materials (ASTM), the Environmental Protection Agency (EPA), or the Standard Methods for Examination of Water and Wastewater (23).




Paper Review

Because most students have not had experience writing a formal laboratory report, a published paper is provided for class discussion during the last week of the SOP phase. The first lab period of the project portion of the course is devoted to understanding the paper and its structure. Students turn in a written overview of the paper, and each group is assigned a different section of the paper to present orally. After the oral presentations, the class discusses the paper, addressing topics such as the conciseness and clarity of the introduction in setting up the subsequent text, the completeness of the experimental section, the thoroughness of the data analysis, and whether the conclusions are warranted. Discussing the structure of the different sections and their respective content teaches students the proper way to write the project report.

Quality Assurance and Quality Control

After the SOP phase, students should be knowledgeable and experienced in obtaining the analytical figures of merit needed to help validate their results. Also important to any analysis is quality assurance and quality control (QA/QC) (23–26). Various methods for assessing QA/QC exist, including recovery of known additions, analysis of externally supplied standards, analysis of duplicates, control charts, and so forth. References 23 and 25 provide a complete list and description of these methods. For the projects, students perform recovery of known additions to the real-world samples.

Evaluating the SOPs and Projects

The SOPs are graded for technical correctness, clarity, organization, ease of use, and whether all required material is included. Written reports are graded in a manner similar to the SOPs. Oral presentations of the paper review are graded individually, as are project presentations. Paper-review presentations are graded primarily on quality rather than content, for example, organization and visual aids. Oral project reports are graded on how well the presentation summarizes the written report, that is, on concise descriptions of the experimental design, instrument theory and operation, results, and conclusions. Project presentations are also graded on quality: organization, vocabulary, and visual aids. Not all projects result in a complete analysis, owing to circumstances beyond the control of the students. In such cases, written and oral report grades reflect the organization of the approaches and steps taken to obtain meaningful results. To minimize situations where a weaker student relies on a stronger student in a group, peer evaluations are performed and represent 5% (half a letter grade) of the final course grade.
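The recovery-of-known-additions check that the students apply to their project samples can be sketched as below. The sample readings and spike level are hypothetical, and the 80–120% acceptance window is a common rule of thumb rather than a requirement stated in the course.

```python
# Spike-recovery (recovery of known additions) QA/QC check.
# All numbers are hypothetical illustration values.

def percent_recovery(unspiked, spiked, added):
    """100 * (spiked result - unspiked result) / known amount added."""
    return 100.0 * (spiked - unspiked) / added

# Example: a sample reads 1.8 ppm; after spiking with 2.0 ppm of analyte,
# the measured result is 3.7 ppm.
recovery = percent_recovery(unspiked=1.8, spiked=3.7, added=2.0)

# Recoveries far outside a typical 80-120% window hint at matrix effects
# or a biased method and warrant further investigation.
acceptable = 80.0 <= recovery <= 120.0
```

A recovery near 100% suggests the matrix is not interfering with the measurement; a low or high recovery points the students toward the error-source discussion required in their project reports.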
Conclusion

A redesigned instrumental analysis course that includes PBL and CL with SOPs and a research project emphasizes depth of problem solving rather than breadth of analytical methods, the usual focus of a conventional approach. Additionally, in conventional laboratory experiments, students learn that mistakes are usually costly, both in time and in their grade. In the
instrumental analysis course, the students realize that mistakes are really learning experiences, not something to fear. The students gain experience in the complete process of performing an analysis: identifying the problem, collecting samples, conducting sample workup and pretreatment, making measurements, analyzing data, and validating the results.

Supplemental Material

An example instrument standard operating procedure (SOP) is available in this issue of JCE Online.

Literature Cited

1. Kuwana, T. Curricular Developments in the Analytical Sciences: A Report from NSF Workshops (October 28–30, 1996 and March 13–15, 1997). http://www.chem.ku.edu/TKuwana/CurricularDevelopment/PDF_report.pdf (accessed Mar 2005).
2. Wenzel, T. J. Anal. Chem. 1999, 71, 693A–695A.
3. Ross, M. R.; Fulton, R. B. J. Chem. Educ. 1994, 71, 141–143.
4. Wright, J. C. J. Chem. Educ. 1996, 73, 827–832.
5. Wenzel, T. J. Anal. Chem. 1998, 70, 790A–795A.
6. Wenzel, T. J. Anal. Chem. 2000, 72, 293A–296A.
7. Wenzel, T. J. Anal. Chem. 2000, 72, 359A–361A.
8. Walters, J. P. Anal. Chem. 1991, 63, 1179A–1191A.
9. Hughes, K. D. Anal. Chem. 1993, 65, 863A–889A.
10. Wenzel, T. J. Anal. Chem. 1995, 67, 470A–475A.




11. Wilson, G. S.; Anderson, M. R.; Lunte, C. E. Anal. Chem. 1999, 71, 677A–681A.
12. Hope, W. W.; Johnson, L. P. Anal. Chem. 2000, 72, 460A–467A.
13. Wener, T. C.; Tobiessen, P.; Lou, K. Anal. Chem. 2001, 73, 84A–87A.
14. Phillips, D. N. Anal. Chem. 2002, 72, 427A–430A.
15. Calascibetta, F.; Campanella, L.; Favero, G. J. Chem. Educ. 2000, 77, 1311–1313.
16. Houghton, T. P.; Kalivas, J. H. J. Chem. Educ. 2000, 77, 1314–1318.
17. DePalma, R. A.; Ullman, A. H. J. Chem. Educ. 1991, 68, 383–384.
18. Thorpe, T. M. J. Chem. Educ. 1986, 63, 237.
19. Thorpe, T. M.; Ullman, A. H. Anal. Chem. 1996, 68, 477A–480A.
20. Kratochvil, B. J. J. Chem. Educ. 1991, 68, 838–839.
21. McKinnon, I. R.; Nunn, E. K.; Patti, A. F. What laboratory-related skills do employers want from chemistry graduates? Proceedings of the Royal Australian Chemical Institute Chemical Education Division Conference, University of Central Queensland: Rockhampton, Australia, July 1998; pp 155–156.
22. Beck II, C. M. Anal. Chem. 1991, 63, 993A–1003A.
23. Standard Methods for Examination of Water and Wastewater, 19th ed.; American Public Health Association: Washington, DC, 1975.
24. Laquer, F. C. J. Chem. Educ. 1990, 67, 900–902.
25. Marcos, J. M.; Ríos, A.; Valcárcel, M. J. Chem. Educ. 1995, 72, 947–949.
26. Bell, S. C.; Moore, J. J. Chem. Educ. 1998, 75, 874–877.
