Article Cite This: J. Chem. Educ. XXXX, XXX, XXX-XXX

pubs.acs.org/jchemeduc

Teaching Students To Be Instrumental in Analysis: Peer-Led Team Learning in the Instrumental Laboratory

Jacob L. Williams, Martin E. Miller,† Brianna C. Avitabile, Dillon L. Burrow, Allison N. Schmittou, Meagan K. Mann, and Leslie A. Hiatt*

Department of Chemistry, Austin Peay State University, Clarksville, Tennessee 37044, United States

Supporting Information available.

ABSTRACT: Many instrumental analysis students develop limited skills as the course rushes through different instruments to ensure familiarity with as many methodologies as possible. This broad coverage produces superficial learning and a lack of student confidence and engagement. To mitigate these issues, a peer-led team learning model was developed to give each student in-depth experience operating and troubleshooting six common instruments. Electronic cigarette solutions were chosen for all work because of their current relevance. Small groups became "class experts" on their assigned instrument, and students were responsible for teaching their peers how to utilize that instrument for experimentation. Each student rotated through their peers' instruments and learned to apply the knowledge gained from one instrument to others while answering questions from peers. The students developed troubleshooting and communication skills, foundational tools for chemists. This model proved successful in promoting cognitive flexibility and critical thinking about experimental design, as reflected by oral quizzes and peer teaching. This adaptation of peer-led team learning helped engage students, promote confidence, and facilitate a deeper understanding of instrumentation.

KEYWORDS: Upper-Division Undergraduate, Analytical Chemistry, Laboratory Instruction, Collaborative/Cooperative Learning, Inquiry-Based/Discovery Learning, Problem Solving/Decision Making, Instrumental Methods

One of the biggest issues facing chemistry educators is relevance. If a student cannot see the relevance of a specific discipline or concept and how it helps promote their long-term goals, they are more likely to have difficulty learning the material.1 Students who do not appreciate the application of the material in a course often fail to commit the time and cognitive resources needed to be successful. For many decades, textbooks have included real-world applications to encourage students to discover the relevance of the material to their lives. These observations are not merely anecdotal; it has been shown that students gain a greater understanding of course content when they are provided with a real-world application that resonates with their interests.2−5 General, organic, and biochemistry courses often explore potential biological or medical applications to help engage the preprofessional students who often predominate in those classes. We adapted this approach to instrumental analysis by using a medically interesting solution as a vehicle for students to learn instrumentation and experimental design. Using a single sample throughout a course has been shown to help students focus on the instrumentation without getting lost in the details of an experiment, and it also helps students see how their work fits into the broader scope of communal scientific knowledge.6,7 They are able to use problem-based learning in a real-world context, which improves engagement and enhances learning.8,9

In this study, the nicotine-containing solution in electronic cigarettes (e-cigs) was used to teach an entire one-semester instrumental analysis lab course. This solution was chosen for several reasons. In recent years, the number of young adults using e-cigs has grown, and the controversy about health consequences has increased rapidly.10,11 Additionally, the structural properties of nicotine make it an ideal model for learning a wide variety of instruments: its structural core of pyridine and pyrrolidine rings makes nicotine diprotic and chromophoric, and it is sold as the single bioactive enantiomer, (S)-nicotine.

It is not enough to simply study an interesting solution; students in instrumental analysis need to learn to utilize instrumentation to solve diverse problems and to gain confidence working with the instruments. By designing their own experiments, students work independently and engage in deeper critical thinking than would otherwise be possible in traditional teacher-directed laboratories.12 This type of problem-based learning allows students the opportunity to improve their critical thinking skills, instrumental skills, and soft skills such as self-confidence, communication, and independence, making them more marketable and desirable to potential employers.13,14 This study used a

Received: May 2, 2017 Revised: October 8, 2017


DOI: 10.1021/acs.jchemed.7b00285 J. Chem. Educ. XXXX, XXX, XXX−XXX


For example, the AA students were told that tobacco products may contain heavy-metal contamination and were shown which lamps were available. The students decided which metals to look for in the e-cig samples based on our instrument capabilities. They constructed calibration curves on the AA for multiple metals before determining which metals the other groups should look for. All the groups had to make decisions like this one, with instructor guidance helping them find the necessary resources (see the list of provided resources in the Supporting Information). Whenever possible, students were allowed autonomy in the research process.
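As an aside, the external-calibration step described here amounts to fitting a least-squares line of signal versus concentration and inverting it for an unknown. The sketch below is purely illustrative: the standard concentrations, absorbance values, and function names are invented and are not data from this study.

```python
# Hypothetical sketch of an external-calibration workflow like the AA group's:
# fit absorbance vs. concentration for a set of standards, then invert the
# fitted line to estimate an unknown sample's concentration.
# All data and names here are invented for illustration.

def fit_line(x, y):
    """Ordinary least-squares fit of y = m*x + b; returns (m, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return m, my - m * mx

conc = [0.0, 0.5, 1.0, 2.0, 4.0]                  # standard concentrations, ppm (invented)
absorbance = [0.002, 0.051, 0.104, 0.199, 0.402]  # measured absorbances (invented)

slope, intercept = fit_line(conc, absorbance)

def conc_from_absorbance(a):
    """Invert A = m*c + b to estimate the concentration of an unknown sample."""
    return (a - intercept) / slope

print(f"estimated sample concentration: {conc_from_absorbance(0.150):.2f} ppm")
```

A real analysis would of course add replicate standards, blanks, and a check of the linear range before trusting the inverted fit.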

version of peer-led team learning (PLTL) to immerse students in the course content. The PLTL method is an apprenticeship model for learning that relies upon a lead student mastering a topic. This student then teaches their classmates, who emulate their behavior and strive to achieve a similar understanding of the instrument.15,16 When students teach each other, they gain a deeper understanding of course content and feel more ownership of their work.17 In this study, student teams were assigned individual instruments available in our department. Each team was responsible for a self-study of the characteristics, use cases, and operation of their instrument. After the team developed sufficient mastery of their instrument, they were required to teach their peers its basics. These skills and this understanding were evaluated in lab through oral quizzes and in lecture through presentations and exams. Incorporating experimental design in PLTL further enhances critical analysis and problem-solving skills.18

METHODS AND FRAMEWORK

Development of Instrument Experts

The instruments selected for this work were atomic absorption spectrometry (AA, PerkinElmer AAnalyst 400), electrochemical analysis (EChem, BASi 100B), mass spectrometry (MS, ThermoFinnigan LCQ), fluorescence spectrometry (Fluorescence, PerkinElmer LS55), and high-performance liquid chromatography (HPLC, Waters 1525 with UV−vis detector). The class consisted of 11 students, divided into five groups of two, with one student performing an independent study. The students were required to operate the above five instruments plus the UV−vis (Hewlett-Packard 8453) during the semester. Gas chromatography (GC), Fourier transform infrared spectroscopy (FTIR), and nuclear magnetic resonance (NMR) were all used extensively in prerequisite courses and therefore were not required for instrumental lab. Each group chose one of the five instruments listed above to become the class "expert" on, using it regularly for the first 4 weeks of the 15-week semester. Each group took on the following responsibilities:
• Learn and design an instrument schematic.
• Understand how to operate the instrument and troubleshoot issues.
• Design an experiment for analyzing e-cig samples.
• Determine positive and negative controls and the desired results of their experiment.
• Write an operations manual and experimental directions for their peers to follow.
The students then exchanged these written documents with other groups. As they learned the other instruments, they had to make sure their own directions were understood and help when troubles arose. The objective for the rest of the semester was to analyze the one e-cig sample their group was assigned using every instrument in the curriculum. If the students had questions about any instrument or experimental protocol, they had to ask the "instrument experts" before asking the instructor. Each group was responsible for answering questions posed by their peers about their assigned instrument during lecture and lab.

Operation and Procedure Manuals Written by Students

The operating manual designed by each group included the instrument schematic, an explanation of how each part worked and why it was important, a description of how the entire instrument worked, a procedure explaining how to operate the software that runs the instrument, and details explaining how to use the instrument to analyze their e-cig sample. Students were expected to update this document every week based on observations from the peers who used the manual to run their experiments. Even though this regular updating was not a formal part of their grade, the students updated their procedures regularly.
On the first day of class, an oral tour of the lab was used to discuss every instrument: how to operate each one, how it worked, what to avoid to keep it functional, and some basic ideas for how the instruments could be used to analyze e-cig solutions. Interestingly, the students had more questions at the end of the period than at the beginning.

UV−vis Spectroscopy Experimental Design

As almost all students had used the UV−vis spectrophotometer in previous courses, and it did not pose sufficient difficulty for the students to master, no group was assigned responsibility for writing an operations manual for it. Instead, this instrument was used as a teaching tool to remind students how to design an experiment. Every group was required to analyze their assigned e-cig sample using this instrument. The students were not given any directions on how to operate it, collect data, or develop an objective, but they were encouraged to incorporate appropriate positive and negative controls in their experiment. They were expected to operate the instrument, design an experiment, and troubleshoot any problems that arose in one class period.

Oral Quizzes

Oral quizzes were used to improve student communication and assess verbal reasoning skills. The students could take a maximum of two quizzes per week and were required to complete at least one quiz, on any instrument, by midterm. These quizzes covered the six instruments mentioned previously, plus FTIR and GC, which were covered in lecture. The students could take the quizzes in front of their peers or alone. Every quiz began with two questions: (i) Do you have any questions about this instrument for your instructor? (ii) What can you tell me about this instrument? On the basis of the response to the second question, the instructor would follow up with additional questions as necessary. Questions continued until all the instructor's questions were fully answered or missed, with the number of questions depending on the student's answers and the complexity of the instrument. These quizzes lasted 5−15 min, at the end of which the student was assigned a numerical grade, with 5 points out of 100 lost for every incorrect answer. If the student was not able to reason their way to a correct answer with a little prompting, 5 more points were lost. The lowest-scoring quiz was dropped from the final scores.
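The scoring scheme just described (start at 100, lose 5 points per incorrect answer, lose 5 more when prompting does not lead to a correct answer, and drop the lowest quiz) can be sketched in a few lines. The class and function names below are hypothetical and are not part of the course materials.

```python
# Illustrative sketch of the oral-quiz scoring scheme described in the text.
# Names (Question, score_quiz, final_quiz_average) are invented.

from dataclasses import dataclass

@dataclass
class Question:
    answered_correctly: bool     # answered correctly without help
    recovered_with_prompt: bool  # reasoned to the answer after a little prompting

def score_quiz(questions):
    """Start at 100; lose 5 points per incorrect answer,
    and 5 more if prompting did not lead to the correct answer."""
    score = 100
    for q in questions:
        if not q.answered_correctly:
            score -= 5
            if not q.recovered_with_prompt:
                score -= 5
    return score

def final_quiz_average(quiz_scores):
    """Drop the lowest-scoring quiz before averaging, as described."""
    kept = sorted(quiz_scores)[1:] if len(quiz_scores) > 1 else quiz_scores
    return sum(kept) / len(kept)
```

For instance, a quiz with one question missed outright scores 90, while one recovered with prompting scores 95.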


Student Survey

Students were not required to perfect or demonstrate their ability to teach their peers prior to teaching. The instructor wanted the students to learn about their weaknesses from each other, not from the instructor. For this reason, the students were given 2 weeks to conduct their first peer experiment. It is difficult for students to use an instrument on which they have not been trained, especially with unclear instructions written by fellow students. During the first weeks of lab that used peer protocols, instructor input was greater: the instructor listened for groups expressing confusion and asked guiding questions to ensure critical thinking. With time, the manuals improved, and the end-of-semester survey results appeared to indicate increased satisfaction with group dynamics (Table 2).

At the end of the semester, one ordinal survey was disseminated through SurveyMonkey to determine how the students felt about different aspects of the course: skills, knowledge, collaborations, oral quizzes, and teamwork. The survey asked students to respond to questions by indicating their agreement using a Likert scale, with “Strongly Disagree” given a value of 1 and “Strongly Agree” given a value of 5. Their responses were classified into several subcategories. Additionally, the students were asked to reflect on their initial thoughts about the course.
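The x̅ ± s values reported for such a survey are straightforward to compute from raw Likert responses. The sketch below uses invented responses for a single statement rather than the study's actual data.

```python
# Hypothetical sketch: computing mean ± sample standard deviation for
# Likert responses (1 = Strongly Disagree ... 5 = Strongly Agree).
# The response values below are invented, not the study's data.
from statistics import mean, stdev

responses = {
    "This course was a positive experience for me, overall.":
        [5, 5, 4, 5, 5, 4, 5, 5, 4, 5, 5],  # N = 11 invented responses
}

for statement, scores in responses.items():
    print(f"{statement}  {mean(scores):.1f} ± {stdev(scores):.1f}")
```

Note that `stdev` is the sample standard deviation (n − 1 denominator), the usual choice when reporting survey spreads for a small class.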



RESULTS AND DISCUSSION

Course Goals: Improvements in Student Experience and Abilities

The students responded positively to the format of this course. As they worked in groups, their fear of failure diminished, their leadership skills flourished, and their competency in real-world applications increased. While there was a lot of initial apprehension, the end-of-semester survey results indicated positive reflections on the course (Table 1). The last four statements had the highest scores (4.7 ± 0.5, 0.5, 0.6, and 0.5, respectively) and demonstrated confidence in the major goals for the course: critical thinking and understanding of performing analysis with instrumentation. The instructor felt the students met or exceeded expectations, and all students grew stronger in the skills listed in Table 1. The 2016 instrumental analysis lab cohort's final average was 8% higher than that of a previous cohort that used project-based learning but did not examine e-cigs for a whole semester.

Table 1. Comparative End-of-Semester Student Rankings of Course Objectives
• Career preparation: "This course helped prepare me to use any instruments I might need to use in my future career." 4.5 ± 0.8
• Experimental design: "I feel more confident in my ability to design an experiment as a result of this course." 4.6 ± 0.5
• Critical thinking: "My critical thinking skills improved this semester as a result of this course." 4.7 ± 0.5
• Confidence: "I feel more confident in my ability to use instruments as a result of this course." 4.7 ± 0.5
• Understanding: "I was able to see how the different instruments could help further the analysis of my samples." 4.7 ± 0.6
• Experience: "This course was a positive experience for me, overall." 4.7 ± 0.5
Students ranked aspects of the course objectives using a scale ranging from 1 ("Strongly Disagree") to 5 ("Strongly Agree"); N = 11.

Mimicking Collaborations in the Work Force

Table 2. Comparative End-of-Semester Student Rankings of Collaborative Aspects of the Course
• Learning: "I learned more working in a group than I would have individually." 3.7 ± 1.1
• Peer lab work: "I liked performing experiments my peers created." 3.7 ± 1.4
• Comfort: "I was comfortable with the group requirements of this course." 3.9 ± 0.9
• Partner enjoyment: "I enjoyed working with a partner and getting feedback from other groups." 4.2 ± 0.8
• Partner collaboration: "My partner and I worked well together." 4.3 ± 1.1
• Class collaboration: "The class mostly worked well together." 4.9 ± 0.3
Students ranked collaborative aspects of the course using a scale ranging from 1 ("Strongly Disagree") to 5 ("Strongly Agree"); N = 11.

Working with a partner in lab has been shown to have many positive outcomes, but a fear persists that one person in the partnership will not contribute. Under the PLTL model, students could not get by with simple surface learning; they were forced to meet the demands of their nongroup peers by improving their troubleshooting skills. Partnerships were generally positive, with both partners demonstrating engagement. Some groups naturally had a stronger "leader," but both students could help their nongroup peers. Anecdotally, students with higher lecture scores were often not as good at explaining the instruments as the students who had to work harder to master them. The lowest score on the survey was for student enjoyment of performing peer-created laboratories. This was hard for them but seemed to improve as the semester progressed. The highest score was for how well the class worked together. These results might demonstrate that the format of this class helped increase overall cooperativity, or the high score might simply reflect how well this specific group of students worked together, independent of course format. While students were forced to master difficult information as they learned new techniques and instrumentation, they also developed many soft skills, most notably communication and critical thinking.19 As one student commented, "The most valuable aspect of this course was my peers. We spent a lot of time talking through schematics and detectors and teaching one another about our instruments." While there were frustrations, the students learned to teach and help each other succeed.

The students were required to work as a class and within their individual groups to mimic collaborations in the work force. Interpersonal dynamics assessed included enjoyment, comfort, and cooperativity. Enjoyment was an interesting factor with PLTL, as students were willing to share at midterm the frustrations they felt while performing their peers' laboratories. Typically, the first time a group performed the experiment designed by their peers, there were many difficulties and much confusion.


Semester-Long Improvements in Instrument Knowledge

In addition to surveying student confidence in instrumentation, student familiarity was assessed informally through observations during lab time and formally through lab quizzes and lecture exams. The lab protocol for electrochemical analysis generated the most questions for the instructor and the EChem "expert" group. This observation matched the difficulty students reported having with this instrument. It appears not all questions were answered in lab, as the lowest quiz scores and the largest number of students dropping a quiz were recorded for this instrument. Possibly because of the poor quiz scores in lab, the students also scored lowest on the EChem section of their final exam in lecture. The last question on the final exam asked students to draw a schematic for six of eight instruments, explaining in detail the workings and purpose of each. The students scored lowest on electrochemistry (76%), with 6/11 students skipping this instrument section. In this case, the lack of confidence in lab translated directly to a lack of confidence in lecture. The students' mastery of instrument concepts was otherwise strong, averaging 90% or higher on the final exam for the HPLC, AA, MS, and UV−vis, and 80% or higher for the Fluorometer, GC, and IR. While students are sometimes self-aware, their self-perception can be lacking: the student rankings of their confidence with instruments did not correlate perfectly with their final exam or quiz scores. For instance, the students ranked their understanding of HPLC as 4.2/5.0, the second lowest ranked instrument, but they scored the highest on both the HPLC quizzes and the HPLC section of the final exam. Additionally, no student chose to skip the HPLC section of the final exam, possibly due to their confidence in their ability to explain and reason about this instrument. The subconscious confidence built through the HPLC lab work and quizzes appears to have translated well to confidence on the final exam.

The students were asked to indicate their level of agreement with the following statements: "I have a working knowledge of the operation and design of the: EChem, HPLC, GC, MS, Fluorometer, IR, AA, and UV−vis." The lowest average score was a 4.1, which was between "agree somewhat" and "strongly agree" on the survey scale (Table 3).

Table 3. Comparative End-of-Semester Student Rankings of Instrument Operation and Design
• EChem: 4.1 ± 0.8
• HPLC: 4.2 ± 1.0
• GC: 4.2 ± 0.8
• MS: 4.2 ± 0.4
• Fluorometer: 4.4 ± 0.7
• IR: 4.4 ± 0.7
• AA: 4.5 ± 0.7
• UV−vis: 4.7 ± 0.5
Students ranked their understanding of instrument operation and design by responding to the prompt "I have a working knowledge of the operation and design of the [instrument listed]" using a scale ranging from 1 ("Strongly Disagree") to 5 ("Strongly Agree"); N = 11.

Unsurprisingly, the students ranked their familiarity with the UV−vis the highest. The data demonstrate that students feel the most comfortable with simple instruments that they have previous experience operating. Of interest, the next-highest-scoring instruments were all light-based. One of the goals of the class was to demonstrate to students that they can take concepts learned on one instrument and apply them to similar instruments. This may explain why the next highest scores after the UV−vis were the IR, AA, and Fluorometer, all schematically similar to the UV−vis. Students scored the MS as 4.2 ± 0.4. This number might have been higher had the students had a working MS available throughout the semester: while there was an MS in the instrumental lab, technical difficulties resulted in only one exposure to using MS, at a satellite location in the latter half of the semester. Chromatography was also ranked high, and one might wonder whether the high standard deviation for HPLC reflects the experts becoming better at training their peers later in the semester; some groups were effective teachers from the beginning, while others had a steeper learning curve. The survey did not list the instruments in the order shown in Table 3 (they were listed randomly), and yet the confidence levels the students expressed grouped the instruments into categories. While four spectroscopic instruments were available for class discussion, only one electrochemical workstation was available. The students appear to have had the hardest time working with this instrument, possibly due to its dissimilarity to the others. Other instruments offered ways to take them apart and visualize their function; there was no easy way to visualize the important electrical components of electrochemical analysis, and the abstract nature of electrons and circuitry was hard for the students to grasp. Additionally, in the courses taken before instrumental analysis at our institution, spectroscopy was used routinely, whereas electrochemistry was not. The difficulty students had with electrochemistry was important to note in finding ways to improve electrochemical education in future semesters.

Effects of Oral Quizzes on Learning

The use of oral quizzes in place of written prelab quizzes better suited the asynchronous format of this course, where each lab group was using a different instrument each week. This allowed for assessment of deeper knowledge instead of the surface-level knowledge traditional prelab quizzes seem to assess. The ability to explain how something works in a highly technical manner while answering questions is a desirable skill in the workplace. In addition to improving communication skills, oral quizzes helped students focus on how specific instruments operate and were designed. One of the main goals of this class was preventing the "black box" view of instruments, where you put your sample in and an answer is magically given.20 It was the instructor's desire to utilize quizzes that might improve student understanding of instrument design. One positive aspect of oral quizzes is the opportunity for the student to immediately reason toward a correct answer: failure is no longer a permanent condition but something they could face and overcome. Students seemed scared and reluctant to take the quizzes in the first half of the semester, as most had never taken an oral quiz before. The students did not think they had enough knowledge or experience with the instruments until after midterm. The survey helped determine the initial intimidation level, changes in intimidation level with time, and student preparation and learning. The statements students were asked to consider for the quizzes are shown in Table 4.


Oral quizzes offered students a chance to work hard and risk failure and discomfort, with a high probability of improving their soft skills and instrumentation knowledge.

Table 4. Comparative End-of-Semester Student Rankings of Qualities of the Oral Quizzes
• Intimidation: "I was intimidated by oral quizzes." 4.0 ± 0.8
• Improvements in comfort: "I was more comfortable with oral quizzes with each quiz I took." 4.0 ± 0.8
• Studying required: "I studied more for oral quizzes than I would have for paper quizzes." 4.3 ± 1.1
• Challenging: "I found oral quizzes more challenging than paper quizzes." 4.4 ± 0.9
• Learning: "Oral quizzes helped me learn more than paper quizzes would have done." 4.5 ± 0.8
Students ranked their perceptions of qualities of the oral quizzes using a scale ranging from 1 ("Strongly Disagree") to 5 ("Strongly Agree"); N = 11.

Importance of Work Ethic on Learning

The students were asked to reflect on their work ethic by rating their agreement with the following statements:
• "I gave my best effort in this course" (4.4 ± 0.7).
• "The skills I learned from this course are equivalent with the effort I put into this course" (4.3 ± 1.0).
The students responded more positively to the first statement, with 10/11 students agreeing or strongly agreeing. From the professor's vantage, the students worked very hard with each other and with the material. Most also felt they gained skills equivalent to their effort (9/11 agreed or strongly agreed). They also felt lab reinforced the ideas learned in lecture (3.9 ± 1.1).


Overall, the oral quizzes were found to be intimidating at first (4.0 ± 0.8), but students grew more comfortable with each quiz taken (4.0 ± 0.8) (Table 4). These values indicate that the number of students who were intimidated at first matches the number who said they became more comfortable with oral quizzes over time. This study would have benefited from a prequiz assessing confidence and intimidation at the beginning of class; we did not survey the students at the beginning of the semester and instead asked them to reflect on their feelings about the course. The following three statements in this category asked students to reflect on their preparation and learning. The students reported studying more (4.3 ± 1.1), finding oral quizzes more challenging (4.4 ± 0.9), and learning more from oral quizzes (4.5 ± 0.8) compared to paper quizzes. This self-reporting suggests that students thought they studied more, although there was more variability among the students for this statement than for others in this category. It might be easy to think that the more the students were intimidated by quizzes, the more they would study, but this was not the case: looking at individual respondent data, there is little correlation between the level of intimidation and how hard the students studied (correlation coefficient of 0.2). In fact, the best, but still not significant, correlation was found between how comfortable students felt with time and how much they thought the oral quizzes helped them learn (correlation coefficient of 0.63). It is possible that with a larger sample size one could see whether improvements in self-confidence lead to increased learning, but these data do not show this. The data do show that, overall, the students were intimidated by oral quizzes but became more confident with time.

It might be worth noting that feeling a little intimidation in a safe, low-risk environment is beneficial to student growth, especially when students are given opportunities to reason toward a correct answer. The data also demonstrate that students thought the oral quizzes were more challenging than paper quizzes. The highest score reported was for student learning: the students reported feeling that they learned more from oral quizzes. From the instructor's perspective, the students grew better at explaining themselves as the semester progressed. From their first quizzes (91% average; 3 students failed to take one by midterm) to their last (94% summative average), the students grew in their understanding and ability to answer difficult questions, especially given that the questions grew harder with time. Since they were given a chance to think and reason through wrong answers, the students learned to think aloud and learn during every quiz.
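The correlation values quoted here amount to Pearson correlation coefficients between pairs of Likert items. A minimal sketch follows; the response lists are invented, since the study's raw respondent data are not reproduced here.

```python
# Hypothetical sketch of the correlation check described in the text:
# Pearson r between two sets of Likert responses, e.g. intimidation vs.
# reported study effort. The data below are invented, N = 11 for shape only.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

intimidation = [4, 5, 4, 3, 4, 5, 4, 3, 4, 5, 3]  # invented responses
studying = [5, 3, 4, 5, 4, 3, 5, 4, 4, 5, 5]      # invented responses

print(f"r = {pearson_r(intimidation, studying):.2f}")
```

With only 11 respondents, even an r of 0.63 is weak evidence, which is consistent with the caution expressed above about sample size.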

Overall Student Engagement and Reception

As others have found, students enjoyed the freedom they had to work independently and saw improvement in their lab skills.14 The students designed their own experiments, drew block diagrams of instruments, gained hands-on experience with the instruments, taught their peers, took oral quizzes, wrote about their work, and then gave oral presentations in lecture. These processes have been demonstrated in the literature to be effective ways of teaching mastery of instrumental analysis, and this study provides additional evidence.21 The students did make mistakes; none of their procedures was perfect, yet in general these imperfections led to greater learning. Easy success is not worth the low price paid; true learning requires students to confront their misunderstandings and revise their practices to match the evidence presented.22 Many instrument groups designed experiments that contained flaws. The students were much better at finding problems with positive or negative controls, sensitivity, or selectivity in their peers' work than in their own. By the time students came to these realizations, other students had already rotated through their station and collected insufficient data to draw good conclusions. The students had to work quickly to overcome the deficiencies in their own work, as they learned from their peers, to find workable solutions. They learned as much from their mistakes as from their successes. The students shared the responsibility for their data collection, analysis, and instrumental skills, much like the independent work required of chemists in the real world.23 This study engaged students in a real-world application while having a positive effect on their collaboration and interactions.

The students not only enjoyed their independence but also gained competence with instrumentation while understanding the strengths and weaknesses inherent in their techniques.24 Problem-based learning relies on uncertainty and challenging conditions to promote student learning.25 This study used e-cig liquids to challenge students' critical thinking skills and used problem-based and peer-led learning to increase student interaction and independence with instrumentation. While course-specific knowledge is traditionally valued, these life lessons will make students better able to contribute to the scientific community. The students in this study truly took ownership of their instruments and projects, and they developed a sense of community that will help them for the rest of their lives.

CONCLUSIONS

This study demonstrates a novel method for teaching instrumental analysis using a PLTL model. It sought to incorporate elements of workforce preparedness, experimental design, hands-on instrument usage, connections to the real world, and troubleshooting, all while destroying the "black-box" view many students have of instruments. Use of a single, medically relevant solution helped increase student interest in their work and let them focus on the instruments rather than on chemical details. Forcing the students to become the class experts on an instrument engaged them in the class and established a truly collaborative environment. The students learned critical thinking skills while relying on their peers instead of their professor. At the end of the semester, the students expressed the confidence they felt in solving problems on their own and said that they no longer felt afraid to try something that might not work. The oral quizzes also helped prepare the students for the impromptu questioning they will face daily in their careers. The students in this study experienced a taste of the real world within the comfort of academia.

ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.7b00285.
Student analysis of e-cigs (PDF)
Atomic absorption schematic (PDF)
Resources for instrumental analysis (PDF)
Instrumental analysis (PDF)
Instrumental analysis lab (PDF)

AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected].

ORCID
Leslie A. Hiatt: 0000-0002-8737-4376

Present Address
†M.E.M., Union University College of Pharmacy, 1050 Union University Drive, Jackson, Tennessee 38305, United States.

Notes
The authors declare no competing financial interest.

ACKNOWLEDGMENTS

This work was supported by the Austin Peay State University Chemistry Department's operating budget. We would like to thank John McLean and Andrzej Balinski of Vanderbilt University for allowing our students to utilize their mass spectrometers when our instrument was overheating. We thank all the students who participated in Instrumental Analysis in Spring 2016 and made this class the success it was. We also thank our research students who helped plan and develop the idea that many instruments could be used to analyze and learn about the nicotine content of electronic cigarettes.

REFERENCES

(1) Zaimi, O.; Blizzard, A. C.; Sorger, G. J. Teaching Water Quality Analysis with Community Collaboration: Helping Students Connect Important Concepts in the Science Class with Issues in the Real World. J. Coll. Sci. Teach. 1994, 24 (2), 105−110.
(2) Hughes, K. D. Using an Aquarium to Teach Undergraduate Analytical Chemistry. Anal. Chem. 1993, 65 (20), 883A−889A.
(3) Porter, V. J.; Sanft, P. M.; Dempich, J. C.; Dettmer, D. D.; Erickson, A. E.; Dubauskie, N. A.; Myster, S. T.; Matts, E. H.; Smith, E. T. Elemental Analysis of Wisdom Teeth by Atomic Spectroscopy Using Standard Additions. J. Chem. Educ. 2002, 79 (9), 1114−1116.
(4) González-Ruiz, V.; Martín, M. A.; Olives, A. I. An Easily Built Smoking Machine for Use by Undergraduate Students in the Determination of Total Particulate Matter and Nicotine in Tobacco Smoke. J. Chem. Educ. 2012, 89 (6), 771−775.
(5) Fan, X.; Lam, M.; Mathers, D. T.; Mabury, S. A.; Witter, A. E.; Klinger, D. M. Quantitative Determination of Nicotine and Cotinine in Urine and Sputum Using a Combined SPME-GC/MS Method. J. Chem. Educ. 2002, 79 (10), 1257.
(6) Fitch, A.; Wang, Y.; Mellican, S.; Macha, S. Lead Lab: Teaching Instrumentation with One Analyte. Anal. Chem. 1996, 68 (23), 727A−731A.
(7) Kesner, L.; Eyring, E. Service-Learning General Chemistry: Lead Paint Analyses. J. Chem. Educ. 1999, 76 (7), 920−923.
(8) Cancilla, D. A. Integration of Environmental Analytical Chemistry with Environmental Law: The Development of a Problem-Based Laboratory. J. Chem. Educ. 2001, 78 (12), 1652−1660.
(9) Adami, G. A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry. J. Chem. Educ. 2006, 83 (2), 253.
(10) Singh, T.; Arrazola, R. A.; Corey, C. G.; Husten, C. G.; Neff, L. J.; Homa, D. M.; King, B. A. Tobacco Use Among Middle and High School Students - United States, 2011−2015. CDC Morb. Mortal. Wkly. Rep. 2016, 65 (14), 361−367.
(11) Palazzolo, D. L. Electronic Cigarettes and Vaping: A New Challenge in Clinical Medicine and Public Health. A Literature Review. Front. Public Health 2013, 1 (56), 1−20.
(12) Gao, R. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum. J. Chem. Educ. 2015, 92 (3), 444−449.
(13) Kalivas, J. H. Realizing Workplace Skills in Instrumental Analysis. J. Chem. Educ. 2005, 82 (6), 895.
(14) Grannas, A. M.; Lagalante, A. F. So These Numbers Really Mean Something? A Role Playing Scenario-Based Approach to the Undergraduate Instrumental Analysis Laboratory. J. Chem. Educ. 2010, 87 (4), 416−418.
(15) Wilson, S. B.; Varma-Nelson, P. Small Groups, Significant Impact: A Review of Peer-Led Team Learning Research with Implications for STEM Education Researchers and Faculty. J. Chem. Educ. 2016, 93 (10), 1686−1702.
(16) Gosser, D. K.; Kampmeier, J. A.; Varma-Nelson, P. Peer-Led Team Learning: 2008 James Flack Norris Award Address. J. Chem. Educ. 2010, 87 (4), 374−380.
(17) Shen, H.-Y.; Shen, B.; Hardacre, C. Using a Systematic Approach To Develop a Chemistry Course Introducing Students to Instrumental Analysis. J. Chem. Educ. 2013, 90 (6), 726−730.
(18) Lanigan, K. C. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course. J. Chem. Educ. 2008, 85 (1), 138.
(19) Wells, M.; Clougherty, R. Use of Wikis in Chemistry Instruction for Problem-Based Learning Assignments: An Example in Instrumental Analysis. J. Chem. Educ. 2008, 85 (10), 1446.
(20) Bougot-Robin, K.; Paget, J.; Atkins, S. C.; Edel, J. B. Optimization and Design of an Absorbance Spectrometer Controlled Using a Raspberry Pi To Improve Analytical Skills. J. Chem. Educ. 2016, 93 (7), 1232−1240.
(21) King, D.; Fernandez, J.; Nalliah, R. Writing Instrument Profiles for Mastery of Instrumental Analysis. J. Chem. Educ. 2012, 89 (6), 728−731.
(22) Vitt, J. E. Troubleshooting 101: An Instrumental Analysis Experiment. J. Chem. Educ. 2008, 85 (12), 1660.
(23) Schaber, P. M.; Dinan, F. J.; St. Phillips, M.; Larson, R.; Pines, H. A.; Larkin, J. E. Juicing the Juice: A Laboratory-Based Case Study for an Instrumental Analytical Chemistry Course. J. Chem. Educ. 2011, 88 (4), 496−498.
(24) Feng, Z. V.; Buchman, J. T. Instrumental Analysis of Biodiesel Content in Commercial Diesel Blends: An Experiment for Undergraduate Analytical Chemistry. J. Chem. Educ. 2012, 89 (12), 1561−1565.
(25) Stetzik, L.; Deeter, A.; Parker, J.; Yukech, C. Puzzle-Based Versus Traditional Lecture: Comparing the Effects of Pedagogy on Academic Performance in an Undergraduate Human Anatomy and Physiology II Lab. BMC Med. Educ. 2015, 15 (1), 107.

DOI: 10.1021/acs.jchemed.7b00285 J. Chem. Educ. XXXX, XXX, XXX−XXX