Research: Science and Education

Chemical Education Research
edited by Christopher F. Bauer, University of New Hampshire, Durham, NH 03824-3598

Journal of Chemical Education, Vol. 83, No. 3, March 2006
Comparing the Effectiveness on Student Achievement of a Student Response System versus Online WebCT Quizzes
Diane M. Bunce,* Jessica R. VandenPlas, and Katherine L. Havanki
Department of Chemistry, The Catholic University of America, Washington, DC 20064; *[email protected]

One of the basic tenets of constructivism is that knowledge cannot be transferred to a passive receiver. New information must be integrated with previous knowledge in the mind of each individual learner. This integration of, or reflection on, incoming information may require a reorganization of the student’s existing knowledge structure. While the reorganization of knowledge is ultimately an individual undertaking, social interaction (or group learning) can play a vital role (1).

Conventional lectures are not always effective in facilitating learning, since students are rarely given time to process incoming information and integrate it into existing knowledge structures. Eric Mazur’s Peer Instruction method (2) is based on the premise of social constructivism and allows students to engage in active learning by incorporating cooperative activities into conventional lectures. A key feature of Peer Instruction is the incorporation of ConcepTests, short conceptual questions on the subject being discussed, at various points during lecture. As part of each ConcepTest, students are given time to formulate and discuss the question before recording their answers. Mazur suggests that this not only forces students to think through the arguments being developed, but also provides both teacher and learner with a way to assess understanding of the given concept.

The use of computers to gather student responses is not new to science education. Use of electronic response systems, especially in large lectures, dates back to the 1960s (3). Research on the effectiveness of this approach has been limited to its influence on increased rates of passing the course (4). More work is needed to test the effectiveness of computers and ConcepTests on student achievement. It is this question that led to the incorporation of the Student Response System (SRS) into a second-semester nursing course.
SRS is a Web-based questioning system (5) designed to assist instructors in receiving and analyzing student responses to questions posed in lecture or recitation. In this study, SRS was used primarily as a means of delivering electronic ConcepTests to students working in pairs. SRS was matched with daily online quizzes delivered outside of class via WebCT. Research shows that increasing quiz frequency results in higher student achievement in chemistry (6). In this study, both SRS and the daily quizzes were thought to provide opportunities for students to practice applying the knowledge they had just learned, and both provide immediate feedback on whether students have reached an understanding of the material. One of the differences between SRS and online quizzing is
that students did not have access to the SRS questions following lecture, whereas WebCT quiz questions were available for review throughout the semester. Some of the SRS questions used in class were paralleled by WebCT quiz questions; others were not. The goal was to investigate which of these two teaching–learning approaches was more effective in helping students achieve success in chemistry. The interaction of the two techniques was also investigated.

Research Questions

The following research questions were explored to determine whether the use of SRS and WebCT online quizzing had an effect on student achievement on both teacher-written exams and an American Chemical Society (ACS) final exam (7).

1. What is the effect of each of the four within-subject treatment groups (SRS, WebCT, SRS–WebCT combined, and neither SRS nor WebCT) on student achievement on
   • teacher-written hour-long exam questions?
   • American Chemical Society exam questions?

2. Do students with different logical reasoning ability achieve differentially on each of the two achievement measures in the presence or absence of
   • SRS?
   • WebCT quiz questions?

3. What is the effect of SRS questions on WebCT quiz scores?

4. What effect does the use of SRS have on student attitudes toward the course?
Research Sample Population

The study population was 41 second-semester nursing students enrolled in a spring 2004 General, Organic, and Biochemistry course at a small private university in the Mid-Atlantic region. Most of these students had been enrolled in the first semester of this course in fall 2003 with the same instructor. They had used WebCT online quizzing on a regular basis during the fall semester and were familiar with its use at the start of the spring semester. None of these students had used computer-assisted SRS in any other chemistry
course prior to the spring semester. During the fall semester, students had used a generalized ConcepTest approach without computers. Institutional Review Board (IRB) approval was granted because the experimental design was in keeping with normal classroom procedures.

Research Methodology

The study was completed over nine weeks of the second-semester course. Students completed the GALT (Group Assessment of Logical Thinking) test (8) at the beginning of the semester and an online survey on the usefulness of SRS at the end of the semester. In addition, students completed a paper-and-pencil course evaluation on the last day of class.

GALT Test

The GALT test measures logical reasoning ability. Students complete a total of 12 questions in this paper-and-pencil test, providing both an answer and a reason for that answer. The range of GALT scores is 0–12. Students’ GALT scores were categorized as high (10–12), medium (8–9), and low (0–7). These cut-off points were chosen to ensure an equal distribution of students among the three levels.

The class was divided into pairs based on students’ GALT scores. Each pair was assigned so that partners had GALT scores within three points of each other, and one wireless laptop was assigned to each pair of students. This method was chosen so that the pairs of students working together on a single computer had similar logical reasoning ability.

Electronic Student Response System

Questions were projected onto a classroom computer screen using SRS. When multiple-choice questions were used, student computer screens displayed only the answers to the questions. As students responded via the Web, a graph on the classroom screen was updated to display the number of groups choosing each answer.

Over the course of the 16-week semester, SRS was used during the last 9 weeks. Of these 9 weeks, the first 2 weeks of SRS data were excluded from the analysis: these data were viewed as unduly affected by the newness of the procedure and therefore not accurate measures of its effectiveness. The questions chosen for the SRS component of the project were multiple-choice questions selected from the practice tests in the course textbook’s study guide (9). Questions were selected by four chemical educators, including the course instructor. Students answered SRS questions immediately after a concept was introduced in lecture. For the purposes of this research, student answers were tracked by computer IP address.

WebCT Online Quizzes

WebCT online quizzes were available to students six hours after each lecture and were due before the next lecture. Students could log in to their WebCT account to complete the quiz from any computer on or off campus. WebCT online quiz questions were multiple-choice questions selected from the course textbook test bank (10). Questions for the WebCT online quizzes were chosen by the course instructor and based upon the lecture just completed. Questions for both SRS and WebCT online quizzes were not written by the instructor, in order to preserve the objectivity of the research. Questions for the WebCT online quizzes were judged by a panel of four chemical educators, including the course instructor, to be of equal difficulty to those used for the in-class SRS component. A sample pair of SRS and online WebCT questions is included in Table 1.

Teacher-Written Hour-Long Exams

Student achievement was measured using three one-hour teacher-written exams containing open-ended questions, administered approximately three weeks apart. The questions for the hour exams were written by the instructor and, in some cases, based upon assigned homework questions from the course textbook (11). Questions were graded with partial credit, and the score for each question was recorded as a percentage of total possible points. For example, if a student earned 4 of a possible 8 points for a given question, the score was recorded as 50% in our database. Student achievement on exam questions related to SRS or WebCT online quizzes was compared to achievement on questions that did not have an SRS or WebCT parallel.

American Chemical Society Exam

The final exam for the course consisted of the Organic and Biochemistry subsections of the ACS General, Organic, and Biochemistry Exam, Form 2000 (7). ACS tests contain multiple-choice questions, so questions were graded as right (“1”) or wrong (“0”). Student achievement on ACS exam questions related to SRS or WebCT online quizzes was compared to achievement on ACS questions that did not have an SRS or WebCT parallel.

Online Survey

During week 14 of the course, students were asked to complete an online survey evaluating the usefulness of SRS. The questions on this survey were developed by four chemical educators, including the course instructor. The survey consisted of both Likert-type and free-response items and was completed anonymously. A copy of this survey is included in the Supplemental Material.
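For concreteness, the GALT-based pairing described under Research Methodology can be sketched in code. The article states only that partners had GALT scores within three points of each other; sorting students by score and pairing neighbors is one hypothetical assignment that satisfies this constraint, and the student IDs and scores below are invented, not study data:

```python
# Hypothetical sketch of GALT-matched pairing: sort students by GALT
# score and pair adjacent students, so each pair has similar scores.
def pair_by_galt(scores):
    """scores: dict mapping student id -> GALT score (0-12).
    Returns a list of (id, id) pairs with neighboring scores."""
    ranked = sorted(scores, key=scores.get)  # ids ordered by score
    return [(ranked[i], ranked[i + 1]) for i in range(0, len(ranked) - 1, 2)]

# Invented example roster
galt = {"s1": 4, "s2": 11, "s3": 7, "s4": 10, "s5": 8, "s6": 9}
pairs = pair_by_galt(galt)
# Every pair produced this way is matched within three GALT points
assert all(abs(galt[a] - galt[b]) <= 3 for a, b in pairs)
```

With an odd class size, one student would remain unpaired and need to be attached to a nearby pair by hand; the article does not specify how such cases were handled.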
Table 1. SRS and WebCT Sample Question and Answers

SRS Question (9): Denaturation of a protein:
A. Occurs at a pH of 7.
B. Causes a change in protein structure.*
C. Hydrolyzes a protein.
D. Oxidizes the protein.

WebCT Question (10): Denaturation of a protein:
A. Changes the primary structure of a protein.
B. Disrupts the secondary, tertiary, or quaternary structures of a protein.*
C. Is always irreversible.
D. Hydrolyzes peptide bonds.
E. Can only occur in a protein with quaternary structure.

*Correct response
On the survey, students were asked to rate the effect of using SRS on their learning of chemistry and on their preparation for the WebCT online quizzes and the teacher-written hour-long exams. Students were also asked about their confidence in selecting the correct answers for the SRS questions.
Paper-and-Pencil Course Evaluation

On the last day of class, students were asked to complete a short paper-and-pencil course evaluation, included in the Supplemental Material. The evaluation asked students how often they used each of several features of the course, such as office hours, test reviews posted on WebCT, and reviewing the WebCT quiz questions to prepare for the teacher-written hour exams. In addition, students were asked open-ended questions regarding what both they and the teacher could have done differently to help them succeed. Space was provided for additional comments. Students completed the surveys anonymously.

Results

All 41 students in this study experienced the same SRS in-class questions and took the same WebCT online quizzes. Questions on each of the two achievement measures, the teacher-written exams and the ACS exam, were analyzed and divided according to whether they had an SRS and/or a WebCT parallel question. Student responses to 32 teacher-written and 20 ACS exam questions were therefore divided into four within-subject treatment groups:

1. SRS
2. WebCT
3. SRS–WebCT combined
4. Neither SRS nor WebCT
Table 2. Number of Achievement Questions in Each Within-Subject Treatment Group

Within-Subject Treatment Group    Teacher-Written Exam Questions    ACS Exam Questions
SRS                                9                                 4
WebCT                              6                                 4
SRS–WebCT Combined                 7                                 4
Neither SRS nor WebCT             10                                 8
Total Questions Analyzed          32                                20
Table 3. Mean Scores for Teacher-Written Exams by Treatment Group

Treatment Group          Mean Score    Standard Deviation
SRS                      71.35         13.66
WebCT                    89.87         12.25
SRS–WebCT Combined       78.47         16.70
Neither SRS nor WebCT    75.18         15.41

Note: Maximum score is 100 points; n = 41.

Table 4. Mean Scores for ACS Final Exam by Treatment Group

Treatment Group          Mean Score    Standard Deviation
SRS                      .70           .20
WebCT                    .59           .24
SRS–WebCT Combined       .59           .29
Neither SRS nor WebCT    .64           .21

Note: Maximum score is 1 point; n = 41.
Questions within each treatment group cover several chemistry topics, but no chemistry topic appears in more than one group; thus the groups are non-overlapping with respect to chemistry content. The four within-subject treatment groups and the number of achievement questions corresponding to each group are summarized in Table 2.
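The four-way classification summarized in Table 2 amounts to a simple set-membership test on each exam question. The sketch below is a hypothetical illustration, with invented question IDs and parallel sets, not data from the study:

```python
# Classify an exam question by whether it had an SRS parallel,
# a WebCT parallel, both, or neither (the four treatment groups).
def treatment_group(q, srs_parallels, webct_parallels):
    if q in srs_parallels and q in webct_parallels:
        return "SRS-WebCT combined"
    if q in srs_parallels:
        return "SRS"
    if q in webct_parallels:
        return "WebCT"
    return "Neither SRS nor WebCT"

srs_parallels = {1, 2, 5}    # invented question ids
webct_parallels = {2, 3}
groups = {q: treatment_group(q, srs_parallels, webct_parallels)
          for q in range(1, 7)}
assert groups[2] == "SRS-WebCT combined"
assert groups[4] == "Neither SRS nor WebCT"
```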
Comparing Student Achievement by Treatment Group

To answer the first two research questions, one overall analysis of variance (ANOVA) was used. Student scores on the teacher-written and ACS exams were analyzed in a 4 × 3 mixed ANOVA with treatment (SRS, WebCT, SRS–WebCT combined, and neither SRS nor WebCT) as a within-subject factor and GALT level (high, medium, and low) as a between-subjects factor.

For teacher-written hour-long exams, the main effect of treatment was significant: Wilks’s Λ = .24, F(3, 38) = 40.07, p < .01, partial η² = .76. The means of each treatment group are given in Table 3. Questions were graded with partial credit, and the score for each question was recorded as a percentage of total possible points (0–100). Scores were then averaged using a two-step process: first, a mean was established for each student over all questions within a treatment group (e.g., SRS, WebCT); next, these per-student means were averaged over all 41 students for each treatment group.

To investigate the differences between treatment groups, post-hoc comparisons were performed using the LSD confidence-interval adjustment. Pair-wise comparisons show that students do not perform significantly differently when both SRS and WebCT are present (M = 78.47, SD = 16.70) than when both are absent (M = 75.18, SD = 15.41). Compared to these treatments, students score significantly higher when teacher-written exam questions have only a WebCT parallel (M = 89.87, SD = 12.25) and significantly lower when questions have only an SRS parallel (M = 71.35, SD = 13.66). Of the four treatments, WebCT alone resulted in the highest achievement on teacher-written exams.

For ACS exam questions, the main effect of treatment group was significant: Wilks’s Λ = .77, F(3, 38) = 3.73, p = .02, partial η² = .23. The means of each treatment group are given in Table 4. Questions were graded as right (“1”) or
wrong (“0”), and scores were again averaged using the two-step process: first, a mean was established for each student over all questions within a treatment group; next, these per-student means were averaged over all 41 students for each group. Because of the grading scale, means range from 0 to 1.

To investigate the differences between treatment groups, post-hoc comparisons were performed using the LSD confidence-interval adjustment. Pair-wise comparisons show students perform significantly better with SRS alone (M = .70, SD = .20) than with either WebCT alone (M = .59, SD = .24) or SRS–WebCT combined (M = .59, SD = .29). There is no significant difference between WebCT alone and SRS–WebCT combined. Students do not perform significantly differently when neither SRS nor WebCT is present (M = .64, SD = .21) than under any of the other treatments. In summary, although SRS alone is significantly better than WebCT alone or SRS–WebCT combined, none of these treatments results in a significant increase in achievement compared with using neither SRS nor WebCT.
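The two-step averaging used before these analyses (a mean per student within each treatment group, then a class mean over the per-student means) can be sketched as follows. This is not the authors’ analysis code; the ANOVA and LSD post-hoc comparisons were presumably run in a statistics package, and the scores below are invented:

```python
# Two-step averaging: per-student means within each treatment group,
# then a class mean of those per-student means.
def two_step_means(scores):
    """scores: dict student -> dict treatment -> list of question scores.
    Returns dict treatment -> class mean of per-student means."""
    per_student = {}
    for student, by_treatment in scores.items():
        for treatment, qs in by_treatment.items():
            per_student.setdefault(treatment, []).append(sum(qs) / len(qs))
    return {t: sum(ms) / len(ms) for t, ms in per_student.items()}

# Invented scores for two students (percent credit per question)
data = {
    "s1": {"SRS": [50, 100], "WebCT": [100, 100]},
    "s2": {"SRS": [100, 50], "WebCT": [75, 100]},
}
print(two_step_means(data))  # {'SRS': 75.0, 'WebCT': 93.75}
```

Note that the per-student step weights each student equally even when treatment groups contain different numbers of questions, which is why it precedes the class-level average.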
Comparing Student Achievement by GALT Level

Students were assigned to one of three GALT levels as previously described. GALT level was used as the between-subjects variable in the mixed between–within subjects ANOVA, which investigated both the main effect of GALT level and the interaction between GALT level and treatment on the two achievement measures, the teacher-written exam questions and the ACS exam questions.

For teacher-written hour exams, the main effect of GALT level was not significant: F(2, 38) = 1.91, p = .16. In addition, the interaction between treatment group and GALT level was not significant: Wilks’s Λ = .73, F(6, 72) = 2.05, p = .70.

For ACS exam questions, the main effect of GALT level was not significant: F(2, 38) = 2.83, p = .07. In addition, the interaction between treatment group and GALT level was not significant: Wilks’s Λ = .83, F(6, 72) = 1.17, p = .34.

Comparing the Effect of SRS on WebCT Achievement

A paired-samples t-test was used to test the effect of SRS on WebCT quiz questions, with student WebCT quiz-question scores as the achievement measure. For this comparison, the two treatment groups were WebCT questions with SRS parallels and WebCT questions without SRS parallels. Student achievement for a total of 32 WebCT quiz questions was analyzed: 12 questions with SRS parallels and 20 questions without. The two groups and the number of achievement questions in each are summarized in Table 5.

Questions were graded as right (“1”) or wrong (“0”), and scores were averaged using the two-step process: first, a mean was established for each student over all questions within each group (SRS parallel, no SRS parallel); next, these per-student means were averaged over all 41 students. Because of the grading scale, means range from 0 to 1. There was no significant difference in WebCT quiz scores between questions that had an SRS parallel (M = .89, SD = .12) and those that did not (M = .87, SD = .13); t(39) = 0.80, p = .43.

Table 5. WebCT Quiz Questions by Treatment Group

Treatment Group            Number of WebCT Online Quiz Questions
SRS                        12
No SRS                     20
Total Questions Analyzed   32

SRS Effects on Student Attitudes

Two instruments were used to survey student attitudes: one concerning use of SRS (online survey instrument) and the other regarding student use of WebCT questions to prepare for teacher-written tests (written course evaluation).

Online Survey

Students were asked to respond to 12 questions on this Likert-type survey using a scale ranging from 1 (strongly agree) to 5 (strongly disagree). The survey was completed anonymously during week 14 of the 16-week semester. To facilitate interpretation of the results, scores of 1 (strongly agree) and 2 (agree) were combined into a category called “agree” in the following tables; likewise, scores of 5 (strongly disagree) and 4 (disagree) were combined into a category called “disagree”.

Table 6 shows that a majority (71%) of the 41 students agreed that the use of SRS questions helped them learn the material in class. Students were not as positive about the use of SRS questions to help prepare for either the WebCT quizzes (49%) or the teacher-written exams (37%). Some typical student comments on these points include the following:

It was helpful to see sample questions like would be on the quiz. (Student 23)

It reinforced the lecture and allowed me to see the types of questions that could be asked on an exam about one particular topic. (Student 8)
Table 6. Student Agreement–Disagreement Responses to Selected Online Survey Attitude Questions (n = 41)

Response Statement                                                     Agree, %    Disagree, %
I enjoyed the SRS questions.                                           67           5
The SRS questions helped me learn the material covered in class.       71          12
The SRS questions helped me prepare for the WebCT quizzes.             49          22
The SRS questions helped me prepare for the hour exams.                37          29
I was confident in my and my partner’s answers to the SRS questions.   83           2
On the same survey, students were asked to choose characteristics they found useful about SRS from a list provided. Students could check all that applied (Table 7). Analysis of the responses found that students chose the following as most useful: SRS helped me reinforce what I learned in class (83%); having the ability to see how my answer compared to others in the class helped me learn the topic more effectively (63%); and SRS kept me interested in class (46%). This student comment on the topic is typical: I think it’s (SRS) a good way to reinforce what we just learned in class. It also helps to talk to your classmates more to see how far they are in understanding the material. (Student 1)
In addition to the ways students felt SRS helped them, how students used SRS was also investigated. The data in Table 6 show that 83% of students were confident that their answers to SRS questions were correct. The data in Table 8 show that when students were unsure of the answer to an SRS question, 42% reported guessing and 32% reported waiting to see the projected graph of group responses before answering. Typical student comments regarding what students did when unsure of an answer include:

In class if the graph comes up before everyone turns in an answer… people who don’t know the answer simply wait for the graph and no real learning occurs. (Student 7)

I found that sometimes if my group was unsure about the answer we would wait to see what the rest of the class put. Maybe, the graph results should be kept hidden from the students or only for the teacher to see until everyone in the class has finished answering. (Student 40)
Paper-and-Pencil Course Evaluation

On the last day of class, students completed a short open-ended survey anonymously. The evaluation asked them how often they used features of the course. Of the students responding, 61% reported reviewing the WebCT quiz questions at least sometimes to prepare for the teacher-written exams.

Table 7. Percentages of Students Choosing Answers Regarding the Usefulness of SRS (n = 41)

What did you find useful about the SRS questions?                                  Distribution, %
Helped me reinforce what I learned in class                                        83
Having the ability to see how my answer compared to others in the class
  helped me learn the topic more effectively                                       34
Improved my scores on WebCT quizzes or hour exams                                  15
Having an opportunity to talk over the material with my partner(s)
  helped me to learn the material                                                  63
Kept me interested in class                                                        46
Made class more enjoyable                                                          29

Note: Percentages do not sum to 100% because students could choose more than one response.

Conclusions

The results of this experiment have as much to say about how the treatments were implemented as about their outcomes. The main difference between SRS and WebCT is the availability of each outside of class. SRS is available to students only during lecture. Within lecture, an SRS question is presented at an appropriate time, the lecture stops, and students discuss the answer with their partner, providing an opportunity to practice the material just learned. As student groups submit their answers, a graph is displayed and continually updated, showing the total number of groups choosing each answer. After a short time, the teacher reveals the right answer and, depending on group responses, either moves on with the lecture or reviews the material just covered. The questions and correct answers are not available to the students at any other time for reflection or review. WebCT quiz questions, on the other hand, are presented to students in a testing situation. Within 12 hours of completing the quiz, questions and correct answers are available for student reflection and review, and access to previous WebCT quizzes and answers is available throughout the semester.

This study showed that WebCT quiz questions had a significantly positive effect on student achievement on teacher-written exams; SRS questions did not. The ability to review was not an option with SRS questions, and this fact alone could contribute to the differential effect of the two interventions on achievement on the teacher-written exams. On the paper-and-pencil survey, a majority of students supported this hypothesis by reporting that they reviewed the WebCT questions in preparation for the teacher-written exams. No such review was possible with SRS. In this study, the ability to reflect on and review questions used in the learning process has shown itself to be important in constructing student knowledge, as measured by student achievement on exams.
Table 8. Student Actions When Unsure of an Answer to an SRS Question (n = 41)

If your group was unsure of an answer to an SRS question, what did you do?    Distribution, %
Guessed                                                                       42
Waited to see the graph and then chose the most common answer                 32
This never happened to us                                                     17
Other                                                                         10
The importance of review and reflection on new knowledge is also seen in the failure of SRS use to significantly affect student achievement on the ACS exam compared with conditions in which neither SRS nor WebCT was used. This suggests that the proposed benefit of SRS was not realized in this study. WebCT shows a similar inability to significantly affect students’ achievement on the ACS exam, even though its use significantly improved scores on teacher-written exams. One explanation may be that students, pressed for time while preparing for a comprehensive exam, did not review the WebCT quizzes available to them online.

Both SRS and WebCT questions are used to provide practice in applying concepts from lecture. The data indicate that using SRS questions does not significantly affect achievement on WebCT quizzes, possibly because students did not actively engage with the SRS questions in class. Support for this comes from the student surveys, in which students report not submitting SRS answers until viewing the graphical display of other groups’ responses. Early display of class results, which is meant as an aid to students and teachers, may prevent late-responding or non-confident students from meaningful practice of concepts. Giving teachers control over the timing of the graphical display may result in more productive use of SRS.

The role of reflection in the learning process is to provide an opportunity to integrate incoming knowledge with students’ existing knowledge structures. SRS is designed to incorporate reflection by having the teacher review with the class why one answer is better than another. If time constraints truncate this process, the main benefit of reflection may be lost. Teachers may not be aware of how such a small change in their teaching can have a large impact on student learning.
SRS, without the opportunity for reflection, could fall short of its full potential as a learning tool. WebCT quizzes, on the other hand, afford students the opportunity to reflect when they compare their answers to the correct answers shortly after taking the quiz. Time constraints are not an issue with WebCT reflection because students use it outside of lecture.

Students of different logical reasoning levels (GALT) would be expected to achieve differentially on both teacher-written and ACS exams. That expectation was not demonstrated in a statistically significant way in this study, nor were any of the treatments differentially effective for different GALT levels. One explanation may be that students develop unique learning strategies to compensate for differences in logical reasoning ability. These strategies may include emphasis on one or more of the following: familiarity with the instructor’s teaching style, use of an online message board, daily support from the professor or teaching assistants, online posting of test reviews, small-group activities in recitation, use of advance organizers in lecture, and a detailed
lecture schedule with suggested homework problems, all of which were available in this course.

The implication of this research for teaching is that all three components of the learning process (practice, reflection, and review) are necessary for effective student learning. Small changes in teaching style can both positively and negatively affect these learning components. Such changes include teacher control of the graphical display of SRS class results to encourage student engagement in meaningful learning (practice). In addition, providing adequate opportunity in lecture for students to think about the learning activity (reflection) and making materials available at students’ convenience (review) are of paramount importance.
Supplemental Material
The text of the student online attitudinal survey and the written student course evaluation are available in this issue of JCE Online.

Literature Cited

1. von Glasersfeld, E. Questions and Answers about Radical Constructivism. In The Practice of Constructivism in Science Education; Tobin, K., Ed.; Lawrence Erlbaum Associates: Hillsdale, NJ, 1993.
2. Mazur, E. Peer Instruction: A User’s Manual; Prentice Hall: Upper Saddle River, NJ, 1997.
3. Judson, E.; Sawada, D. J. Comput. Math. Sci. Teach. 2002, 21 (2), 167–181.
4. Poulis, J.; Massen, C.; Robens, E.; Gilbert, M. Am. J. Phys. 1998, 66, 439–441.
5. Ward, C. R.; Reeves, J. H.; Vetter, R. Student Response System (SRS); Department of Chemistry, University of North Carolina at Wilmington: Wilmington, NC, 2003.
6. Duty, R. C. J. Chem. Educ. 1982, 59 (3), 218–219.
7. General–Organic–Biochemistry Examination (Form 2000); ACS Examinations Institute: Clemson, SC, 2000.
8. Roadrangka, V.; Yeany, R. H.; Padilla, M. J. Group Assessment of Logical Thinking; University of Georgia: Athens, GA, 1982.
9. Timberlake, K. C. Study Guide with Selected Solutions to Accompany Chemistry: An Introduction to General, Organic, and Biological Chemistry, 8th ed.; Benjamin Cummings: San Francisco, 2003.
10. Timberlake, K. C.; Carlson, C. L.; Timberlake, W. Test Bank to Accompany Chemistry: An Introduction to General, Organic, and Biological Chemistry, 8th ed.; Benjamin Cummings: San Francisco, 2003.
11. Timberlake, K. C. Chemistry: An Introduction to General, Organic, and Biological Chemistry, 8th ed.; Benjamin Cummings: San Francisco, 2003.