Chemical Education Today
NSF Highlights: Projects Supported by the NSF Division of Undergraduate Education

LIMSport: Optimizing a Windows-Based Computer Data Acquisition and Reduction System for the General Chemistry Laboratory

by Ed Vitz and Brenda P. Egolf
The guiding philosophy of the LIMSport program over its 15-year history at Kutztown University has been to provide a mechanism for automatically acquiring data from a variety of sensors into a spreadsheet, so that students and teachers need only spreadsheet skills to acquire and analyze data. Two goals were set for the current project: to develop a robust Windows/Excel data acquisition system for LIMSport, and to evaluate its effectiveness in promoting student learning in general chemistry laboratories. Our focus has been on teaching general chemistry students the data reduction and presentation techniques used in research and industrial environments, while keeping the details of data acquisition in the background. Students in all general chemistry laboratories at Kutztown University learn spreadsheet techniques for data analysis and presentation, in preparation for the variety of computer applications that they will practice in subsequent courses.

Our goal of direct data acquisition into spreadsheets has led to a controlled evolution of the LIMSport program from early designs involving assembly language, BASIC, and Lotus running under DOS (1) to the current version (2), which uses an ActiveX strategy to embed LabVIEW Virtual Instruments (VIs) into Excel running under Windows 2000. Excel-style toolbuttons are used for data acquisition tasks, just as they are used for graphing, formatting, and editing purposes (3); they are implemented by attaching an Excel “Add-In” called LIMS_LV.ADN, which controls the execution of several LabVIEW VIs. The name LIMSport combines the standard acronym for Laboratory Information Management System with the program's goal: to import and organize data from a variety of laboratory sources.
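The LIMSport implementation itself lives in LabVIEW VIs driven from Excel VBA, which we do not reproduce here. As a language-neutral sketch of the underlying acquire-into-a-table pattern (all names hypothetical, with a mock sensor standing in for a VI and a CSV table standing in for a worksheet):

```python
import csv
import io
import math

def read_temperature(t):
    """Mock sensor: a stand-in for a LabVIEW VI that returns one reading
    (hypothetical; the real system samples a multifunction card channel)."""
    return 25.0 + 0.5 * math.sin(t)

def acquire(n_points, dt, sensor):
    """Sample a sensor at fixed intervals, collecting (time, value) rows
    the way LIMSport drops readings into spreadsheet columns."""
    return [(i * dt, sensor(i * dt)) for i in range(n_points)]

def to_table(rows):
    """Serialize the rows under a header line, giving a template-ready table."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time_s", "temperature_C"])
    writer.writerows(rows)
    return buf.getvalue()

rows = acquire(5, 1.0, read_temperature)
table = to_table(rows)
```

In the real system the sensor call is a VI invoked through ActiveX and the table is a live worksheet range, so students manipulate the incoming columns with ordinary Excel commands.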
Our general chemistry laboratory has 12 computers, upgraded to current technology every three years, each equipped with an HP LaserJet 1100 or 4050N printer and a National Instruments PCI-1200 multifunction card, but the LIMSport program is largely hardware-independent and can be implemented through a variety of commonplace strategies. Early versions achieved similar results using a combination of standard interfaces (serial, parallel, game ports, and ISA multifunction cards) in 286 computers with no hard drives. It is relatively easy to maintain our consistent general approach while incrementally upgrading the various elements—operating system, productivity software, data acquisition hardware, and sensors—that make it up. The system includes devices and interfaces for measuring temperature, pH, optical absorbance in three spectral regions, conductivity, Geiger counts, and chemical vapors via a Taguchi sensor (4). The entire suite of sensors can be
assembled for less than $300 per computer, with the Geiger counter ($150) and pH electrode ($50) accounting for most of the expense. Because the student interface uses only Excel commands, no major changes are required in the locally written laboratory manual or the tested student templates when the data acquisition strategy or operating system changes, but revisions are occasionally required to keep abreast of changes in spreadsheet functionality. We used LabVIEW and a fairly expensive multifunction board because this strategy eliminates the need for driver development, employs high-level development languages, and encourages gradual incorporation of native LabVIEW programming into subsequent courses. LabVIEW is underutilized in general chemistry, and in the future we may develop strategies using only Visual Basic and one of the many inexpensive, commercially available multifunction boards. Incidentally, the computer systems in the laboratory also support the usual suite of molecular modeling, Web access, word processing, and other scientific software.

We generally make spreadsheet templates for each experiment available to students, both in the manual and as Excel files in the laboratory, so that students need not spend a lot of time on layout. Of course, students might also be presented with only a blank spreadsheet when they arrive at the laboratory, and possibly be required to design a template as a prelaboratory assignment, forcing them to plan the experiment carefully. We do this for only one experiment, in which students determine the rate law for phosphorescence decay, after they have done other kinetics experiments that employ similar methods.

Computer Pedagogy

Because LIMSport uses standard productivity software, designing a new experiment to take advantage of the computer’s data acquisition capabilities is not difficult.
It is also easy to change the style of each template so that different laboratory sections can be presented with different versions of the same experiment to evaluate the effectiveness of each style. We did this in the current two-year study, where some sections of general chemistry used templates that almost completely automated the calculations and graphing associated with the experiment (control groups), while other sections (experimental groups) had much sparser templates that required students to develop experimental strategies and design templates to obtain final results. The latter approach would be supported by the hypothesis that students might learn more if they were required to think more about data
Journal of Chemical Education • Vol. 79 No. 9 September 2002 • JChemEd.chem.wisc.edu
(Chemical Education Today is edited by Susan H. Hixson, National Science Foundation, Arlington, VA 22230, and Richard F. Jones, Sinclair Community College, Dayton, OH 45402-1460.)
analysis during the laboratory, integrating the theoretical and empirical aspects of the experiment. The alternative, control approach would be supported by Johnstone’s suggestion (5) that students suffer from information overload in the laboratory. Having many of the details of data analysis done for them might help the control groups focus on the empirical strategy of the experiment and avoid “missing the forest for the trees”.

Students in both groups were required to write laboratory reports with detailed sample calculations for all results, and all attended the same lecture course. All groups used computers, because our chemistry faculty think the advantage is so great that it would have been unethical to deprive students of the experience; we were interested in learning how much computer use was optimal. Generally, laboratory instructors were each assigned one experimental group and one control group. While about 135 students began each year of the two-year study, fewer than 100 completed the course, and roughly two-thirds were assigned to experimental groups.

Laboratory reports were judged by the laboratory instructor using a standardized Laboratory Report Scoring Rubric (6). Occasionally the reports were rated by two instructors to ensure reliability of scoring. The Myers–Briggs Personality Inventory, as well as locally designed surveys of prior laboratory experience and science curiosity (6), were administered. Under these conditions, what differences there were between the two groups were barely significant. The group that used the control templates usually performed better. These templates automated data analysis and provided results—including “live” graphing—as data were acquired, so that reasonableness could be monitored, providing “closure” (7) during the execution of the experiment. Examples of the experimental and control group templates for one experiment, in which the half-life of barium-137 is determined, are available on the LIMSport Web site (8), along with the table of contents and a selection of introductory essays for a few
JChemEd.chem.wisc.edu • Vol. 79 No. 9 September 2002 • Journal of Chemical Education
1061
Chemical Education Today
experiments from the laboratory manual.

Student laboratory report rubric scores (on a 6-point scale) were higher for experimental groups in 1997–1998 (fall, 3.97; spring, 4.18) than for control groups (fall, 3.87; spring, 4.01), but this marginal difference proved anomalous. For the 1998–1999 year, the control groups appeared to increase more from first to second semester (fall, 3.10; spring, 3.60) than the experimental groups (2.97 to 3.37). When the effects of students dropping the course and switching from control to experimental groups are considered, the results are more revealing. Students who were in experimental groups both semesters improved 0.19 from first to second semester the first year but declined 0.35 the second year, while the performance of those switching from experimental to control groups increased slightly. Scores of students in control groups both semesters declined, but less than those of students in experimental groups. Myers–Briggs NT (intuitive/thinker) types in control groups showed the greatest increase, while scores of NF (intuitive/feeler) types in experimental groups actually declined. The control groups also did marginally better in other areas such as laboratory grade, tests, the final exam, and the final grade. There was no correlation between SAT scores and rubric scores, but class rank did correlate (0.45, P = 0.01). Surprisingly, previous computer experience, even with spreadsheets, had a small negative correlation with performance on laboratory reports. Instructor comments and these data support the information overload hypothesis: delaying detailed analysis until after the experiment has been completed is helpful, and students perform better when given more prompting by the computer in the laboratory.
A somewhat more cynical view might be that students do not do very much analysis after the experiment in either case, but they respond better to “spoonfed” information because they are in the habit of memorizing information that is presented to them, rather than actively processing information to gain a deeper, and probably more long-lasting, understanding of the overarching concepts. But our impression is that even better students, who do think about the experiment afterwards, gain by having the spreadsheet-processed results, and rarely complain that they’ve
been cheated of a “eureka experience”. We found that clear statistical evidence was difficult to obtain because of the number of students who drop general chemistry, switch sections, or perform sporadically because they are not committed to a science career; but instructors agree on the general comments above. Copies of the detailed report describing this study, along with copies of the rubric, informed consent release forms, and other survey forms and documents used in the study, are available on the Web site (6). The LabVIEW VIs and Excel Visual Basic code that implement LIMSport (for both Windows 2000 and 9X), and a selection of templates, are available without charge for downloading (8).

Acknowledgment

The authors gratefully acknowledge the support of the National Science Foundation (DUE 9652855) for development and evaluation of the LIMSport program.

Literature Cited

1. Vitz, E.; Reinhard, S. J. Chem. Educ. 1993, 70, 245.
2. Beasley, C.; Vitz, E. Scientific Computing and Instrumentation, April 1999, pp 29–38.
3. C. Beasley and T. A. Betts contributed Excel VBA programming and toolbutton design.
4. Vitz, E.; Chan, H. J. Chem. Educ. 1995, 72, 920, and references therein.
5. Johnstone, A. H. J. Chem. Educ. 1993, 70, 701–705.
6. LIMSport Downloads: http://faculty.kutztown.edu/vitz/limsport/limsdls.htm (accessed July 2002).
7. Pickering, M. J. Chem. Educ. 1987, 64, 521.
8. LIMSport Home Page: http://faculty.kutztown.edu/vitz/limsport/limshome.html (accessed July 2002).
Ed Vitz is in the Chemistry Department, Kutztown University, Kutztown, PA 19530;
[email protected]; Brenda P. Egolf is in the Center for Social Research, Lehigh University, Bethlehem, PA 18015.
Journal of Chemical Education • Vol. 79 No. 9 September 2002 • JChemEd.chem.wisc.edu