Teaching with Technology, edited by James P. Birk, Arizona State University, Tempe, AZ 85287
LUCID: A New Model for Computer-Assisted Learning

Troy Wolfskill* and David Hanson
Department of Chemistry, SUNY at Stony Brook, Stony Brook, NY 11794-3400; *[email protected]

An ever-growing body of materials provides computer-assisted learning for students in introductory chemistry courses. Most of these materials do not employ artificial intelligence (1), but instead provide supplements or enhancements to traditional modes of instruction (2). A number of these materials enhance traditional textbooks or lectures with sound, video, animations, or interactivity, such as the ability to rotate images of molecules or vary parameters in graphs (3–5). Others enhance traditional homework assignments by providing instant feedback on the correctness of an answer (3, 4, 6–8). Email and Internet discussion boards have been used to enhance students' interactions with each other and with the instructor outside of the classroom (9, 10). In a few cases, whole courses are delivered on computers (11).

While these efforts to provide more stimulating and accessible learning materials are commendable, there is a need for materials that more effectively engage students in the learning process. While video, sound, animation, and interactivity may be stimulating, they do not necessarily promote critical thought. While instant grading eliminates the lengthy delays associated with the return of homework, it does not directly develop skills in solving complex problems. While email and Internet discussion boards provide an additional means through which students can interact, they do not necessarily improve understanding.

In this paper we present ideas for promoting student engagement in the learning process that we have incorporated in a software product called LUCID, for Learning and Understanding through Computer-based Interactive Discovery. We also report on student assessments of LUCID. These ideas and the results of the student assessments should be of interest not only to developers and users of educational software but also to anyone who is searching for new ways to engage students in learning, even in the absence of technology.

Process Workshops—A Context for Engagement

In our first effort to increase student engagement, we replaced recitation sections for General Chemistry with process workshops (12). In these workshops, students work in teams of three or four on activities that are facilitated by a graduate student teaching assistant. In the activities (12–14), students examine models or examples and respond to critical-thinking questions that compel them to process information, verbalize and share their understanding, and make inferences and conclusions; that is, to construct knowledge. They then apply this knowledge in simple exercises and in problems that require higher-order thinking. The teams report their work to the class using a chalkboard, assess how well they have done and how they could do better, and reflect on what they have learned.
This format, which could be used in all class meetings, emphasizes subject mastery as well as skill development in the key areas of teamwork, communication, management, assessment, information processing, critical thinking, and problem solving. These workshops produced remarkable improvements in student efforts, attitudes, and accomplishments (12).

LUCID was developed to enhance workshop activities in a number of ways. For example, text-based models are static; LUCID provides interactive models to increase student engagement. As students progress through exercises and problems, they often are slowed by a lack of confidence in their answers; LUCID provides instant feedback through pop-up windows to promote confidence. Reporting the team's work on the chalkboard is time-consuming and disruptive; LUCID provides network reporting to instantaneously share detailed solutions to problems. Students often hesitate to challenge the work of other teams or to express their lack of understanding; the reporting utility in LUCID provides peer assessment features to promote critical review and discussion. Students are reluctant to assess their own performance; LUCID provides class performance distributions to aid self-assessment.

Features of LUCID
Modular Activities Are Organized around Major Course Concepts and Skills

Typically, two to four activities cover the material from a chapter in a standard general chemistry text. For example, four activities are provided for gases: Measurement of Pressure, The Ideal Gas Law, Gas Mixtures, and Kinetic–Molecular Theory. The current version includes a total of 63 activities.

Students Navigate Freely through Activities

A variety of navigational tools provide access to any part of any activity at any time. These tools include a table of contents with hyperlinks to each activity; buttons, menu items, and hot keys to move forward and backward within or between activities; hyperlinks to prerequisites; and a “Going On” menu that presents related activities and provides alternate routes through the curriculum.

An Orientation Prepares Students for the Learning Process

Each activity begins with an orientation that provides the motivation for the learning, both within the context of the course and beyond; two to three learning goals; and performance criteria to help students determine the level of mastery expected of them. The orientation also lists new concepts, needed vocabulary, and prerequisite concepts and skills, with hyperlinks to prerequisite activities as noted above.
Key Questions Guide the Exploration of Interactive Models and the Development of Understanding

Rather than organizing and presenting information as in texts or lectures, LUCID aims to develop conceptual understanding through guided discovery. This process is structured by providing five to ten questions that guide students' exploration of interactive models that embody particular concepts or skills. Interactivity is typically provided with controls that allow parameters to be varied. Examples of such models are provided in Figures 1 and 2. Sample questions accompanying the model for kinetic–molecular theory are presented in Box 1. Students answer these questions individually prior to a workshop. During a workshop they discuss the answers within their teams and work to produce higher-quality answers that may be shared with other teams.

Features not found in our models are notable. We do not generally employ animation or video, as we view these as more suited to the presentation of information; instead, we provide simulations that students can control. Nor do we attempt to simulate a laboratory environment in which needed devices must be identified, retrieved, and assembled. While a number of our models incorporate virtual instruments, our focus is on providing concrete examples from which students can develop understanding, not on developing laboratory skills.
Figure 1. An interactive model for the kinetic–molecular theory of gases. This model provides a simulation of gas particles confined to a cylinder. Small circles represent the particles that are in constant motion, colliding with the walls and each other. Students can change the volume or temperature by moving the sliders located on the volume and temperature controllers above and below the cylinder. Particle identity can be changed with the radio buttons at the bottom right. As the temperature or particle mass changes, the speeds of the balls change and the probability distribution shifts. Students can determine which balls are moving in a particular speed range by clicking on a bar in the graph. This sets the color of the bar in the chart and changes the color of balls moving within that speed range.
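The behavior the model in Figure 1 displays (heavier particles move more slowly, higher temperatures shift the speed distribution upward, and the container volume leaves the speeds unchanged) follows from sampling speeds from the Maxwell–Boltzmann distribution. The following is a minimal Python sketch of that underlying calculation, not the Toolbook code used by LUCID; the particle masses and temperatures are illustrative.

```python
# Illustrative sketch only: speeds sampled from the Maxwell-Boltzmann
# distribution, as an interactive model like Figure 1 might do internally.
import numpy as np

K_B = 1.380649e-23    # Boltzmann constant, J/K
AMU = 1.66053907e-27  # atomic mass unit, kg

def sample_speeds(mass_amu, temperature_k, n=1000, rng=np.random.default_rng(0)):
    """Draw n particle speeds (m/s) for a gas of the given atomic mass."""
    sigma = np.sqrt(K_B * temperature_k / (mass_amu * AMU))
    # Each Cartesian velocity component is normal with standard deviation sigma;
    # the speed is the magnitude of the velocity vector.
    v = rng.normal(0.0, sigma, size=(n, 3))
    return np.linalg.norm(v, axis=1)

for gas, mass in [("He", 4.0), ("Kr", 83.8)]:      # approximate atomic masses
    for t in (300.0, 600.0):
        speeds = sample_speeds(mass, t)
        print(f"{gas} at {t:.0f} K: mean speed ~ {speeds.mean():.0f} m/s")
```

Note that the volume does not appear in the calculation at all, which is the answer students are expected to discover for the first item in Box 1.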
Box 1. Key Questions Accompanying the Model for Kinetic–Molecular Theory

1. What is the effect of each of the following on the average speeds of particles?
   changing the container volume
   changing the atomic mass
   changing the temperature

2. What is the effect of each of the following on the distribution of particle speeds?
   changing the container volume
   changing the atomic mass
   changing the temperature

3. In terms of the dynamics of the gas particles, why does increasing the quantity of gas increase its pressure when the temperature and volume are held constant?

4. In terms of the dynamics of the gas particles, why does increasing the volume of a gas decrease its pressure when the temperature and moles of gas are held constant?

5. In terms of the dynamics of the gas particles, why does the pressure increase when the temperature is increased at constant volume and moles of gas?

6. Why is the pressure of one mole of krypton the same as that of one mole of helium at the same volume and temperature when the average speed of the krypton atoms is slower?

7. What are two situations in which the distribution of speeds in a gas would play a role?
Figure 2. Interactive model of atomic orbital hybridization. Students can combine the 2s and 2p atomic orbitals (top) to produce the hybrid atomic orbitals represented at the bottom of the model. The orbitals are combined by spinning the “Orbital Mixer” in the center of the model. The equations below the hybrid atomic orbitals show the fractional contributions of the 2s and 2p orbitals. The questions accompanying the model help students consider how and why the shape of the orbital would change as the 2s and 2p combine.
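For readers who want the algebra behind the "fractional contributions" mentioned in the caption, the simplest textbook case is shown below: two equivalent sp hybrids built from the 2s and one 2p orbital. The model itself may combine the orbitals in other ratios; this form is given only as an illustration.

```latex
% sp hybrids as equal-weight combinations of the 2s and one 2p orbital;
% the squared coefficients (1/2 each) are the fractional s and p contributions.
\psi_{sp_{\pm}} = \tfrac{1}{\sqrt{2}}\,\psi_{2s} \pm \tfrac{1}{\sqrt{2}}\,\psi_{2p}
```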
Easy-to-Use Tools Support a Variety of Modes of Communication and Improve Understanding

A major area of research in chemical education is the study of student misconceptions (15). This work has revealed the difficulty students encounter in identifying appropriate pictorial representations for chemical concepts. For example, because students think of gases as lighter than air, they are apt to picture a gas as a collection of particles clustered at the top of a container instead of dispersed throughout the container. We view this as a particular case of the general problem of translating understanding from one mode of representation to another. We have identified six modes of representing understanding in which chemistry students need to be adept: (i) verbal, (ii) symbolic, (iii) mathematical, (iv) tabular, (v) graphical, and (vi) pictorial. LUCID provides easy-to-use tools for students to work in each of these modes, as follows.

A Rich Text Editor supports verbal constructions in the text fields associated with each question and problem. This editor supports the inclusion of characters such as subscripts, superscripts, Greek letters, and common chemical symbols.

An Equation Solver supports the evaluation of mathematical expressions embedded within the above text fields.

A Table Editor supports the tabulation and manipulation of information and data obtained with virtual instruments.

A Graphing Tool supports the plotting of functions and tabulated data.

A Drawing Tool supports the pictorial representation of ideas.

A Lewis Structure Tool supports the inclusion of Lewis structures.

The information created with these tools is automatically saved and retrieved as needed.

Instant Multilevel Feedback Promotes Confidence While Developing Problem-Solving Skills

As students work through exercises and problems, their progress is often slowed by a lack of confidence. One strategy for promoting confidence is to provide instant feedback on answers, often in the form of hints, answers, or solutions. We view such devices as undermining the learning process, however. Our experience suggests that effective problem solving requires effort, proceeds from understanding, enriches understanding, and yields general strategies that are transferable to new contexts. The provision of hints, answers, or solutions short-circuits this process and undermines the effectiveness of problem solving as a learning activity.

To avoid these problems we developed a series of algorithms to analyze student answers. These algorithms determine not only whether an answer is correct or incorrect, but whether there is anything correct about it. For example, in writing a molecular formula for a compound: are the correct symbols for the elements included? are the atoms in standard order? are the subscripts correct? Students are then informed through pop-up windows of the aspects of their answer that are correct. To arrive at an answer that is completely correct, students must use available resources to develop their understanding of the remaining components that are incorrect. A structured reflection at the end of each activity (discussed below) encourages them to extract general learning and problem-solving strategies from this process.

We do provide hints in one instance—when an answer is fundamentally correct but either incomplete or of low quality. In such cases, students are informed that the answer is close and are directed to the issues they need to address, such as significant figures for numerical problems or the states of the species in a chemical reaction equation. An example of a team's interaction with our feedback system for chemical reaction equations is presented in Box 2.

Box 2. A Team Interaction with the Multilevel Feedback System

Students were asked to write a net ionic equation for the reaction between aqueous lead nitrate and aqueous sodium chloride. After a brief discussion, one of the teams entered the following.

ANSWER: PbNO3 + NaCl → PbCl + NaNO3
FEEDBACK: Incorrect, though the correct symbols for the elements are present and the equation is balanced.

After some discussion, the team recalled that the oxidation state for lead was +2 and tried again.

ANSWER: Pb(NO3)2 + 2NaCl → PbCl2 + 2NaNO3
FEEDBACK: Incorrect, though the correct symbols for the elements are present and the equation is balanced.

This produced some consternation until one student noted that a net ionic equation was needed. After reviewing the text, the students became quite excited and fairly quickly entered the following.

ANSWER: Pb2+ + 2Cl− → PbCl2

As the student at the computer raised her hand to press the Enter key, her teammates sat back to relax, confident that the correct answer had been reached. As her hand left the keyboard, the response came back:

FEEDBACK: Close, though what are the states for each species?

The students jumped forward in their seats, staring at the monitor and asking each other "States? What do they mean by states?" In the ensuing conversation, one student suggested that they needed to know the pressure, volume, and temperature of the system, while another tried to determine why they needed the "state" and not the "path". After considerable research and debate, several attempts that repeatedly failed to produce the correct answer, and some questioning by the instructor, the students were able to determine that the ions are dissolved and that solubility rules predict that the lead chloride must be a solid precipitate.

ANSWER: Pb2+(aq) + 2Cl−(aq) → PbCl2(s)
FEEDBACK: Correct! Good job.

We were fascinated by this observation. We had assumed that students could not correctly answer this question without understanding the nature of a precipitation reaction. We saw students arrive at an answer that would normally have passed for correct simply by matching a pattern from the text. The majority of the learning occurred only after they had achieved this "correct" but poorly understood answer.
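As a rough illustration of the component-wise analysis described above, the Python fragment below checks a molecular-formula answer for several independently correct aspects and reports those aspects without revealing the solution. It is a sketch only; LUCID itself was authored in Asymetrix Toolbook (17), and its actual algorithms are not reproduced here. The parsing helper and the specific checks are assumptions made for the example.

```python
# Sketch of multilevel feedback for a molecular-formula answer.
# Hypothetical helper functions, not LUCID's implementation.
import re

def parse_formula(formula):
    """Return element -> count, e.g. 'H2O' -> {'H': 2, 'O': 1} (no parentheses)."""
    counts = {}
    for symbol, digits in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        if symbol:  # skip the empty match produced at the end of the string
            counts[symbol] = counts.get(symbol, 0) + (int(digits) if digits else 1)
    return counts

def feedback(answer, key):
    """Report which aspects of the answer are correct, without giving the key."""
    ans, ref = parse_formula(answer), parse_formula(key)
    correct_aspects = []
    if set(ans) == set(ref):
        correct_aspects.append("the correct element symbols are present")
    if re.sub(r"\d", "", answer) == re.sub(r"\d", "", key):
        correct_aspects.append("the atoms are in the expected order")
    if ans == ref:
        return "Correct!"
    if correct_aspects:
        return "Incorrect, though " + " and ".join(correct_aspects) + "."
    return "Incorrect."

print(feedback("H2O2", "H2O"))  # Incorrect, though symbols and order are right
print(feedback("OH2", "H2O"))   # Incorrect, though the correct symbols are present
print(feedback("H2O", "H2O"))   # Correct!
```

The point of the design is the same as in Box 2: the student is told what is already right and left to work out the rest.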
Figure 3. A sample network report with peer assessment. A typical page is illustrated. This report covers the nine questions accompanying the interactive model for VSEPR theory. Each page presents one question (top left), with the responses of two randomly selected teams beneath it. Students navigate from question to question with the "Previous" and "Next" buttons (bottom right). Questions can also be accessed by clicking on the bars in the vertical bar chart at the right. The radio buttons above each response, labeled "Good", "OK", and "Challenge", are used by the teams to assess the reported responses. Immediately to the right of each response is a column of +, ✓, and − signs. These indicate each team's assessment of that response; + indicates "Good", ✓ indicates "OK", and − indicates "Challenge". All assessments for all questions in the current report are summarized in the bar chart (right), in which the light bars to the right show the relative number of "Good" assessments and the dark bars to the left show the relative number of "Challenge" assessments. A quick glance thus suffices to determine the degree of consensus among the teams.
Network Reporting and Peer Assessment Promote Critical Review and the Achievement of Consensus

Our feedback system is used only for answers to exercises and problems having a single correct answer. To provide feedback for more complex work, such as answers to key questions or solutions to problems requiring estimates and approximations, we rely on a reporting process. In our text-based workshops, these reports are written periodically on a chalkboard. This is disruptive in that it requires the spokesperson to leave the team, and students are apt to minimize the disruption by reporting abbreviated solutions. While discussions of these reports help eliminate errors and misconceptions documented on the board, it is difficult to identify the uncertainties that remain within each team.

The network reporting system in LUCID provides rapid sharing of such work along with utilities for students to assess each other's answers (16). This system randomly selects a few teams to report for each question. As teams review the reports, they assess each one as "Good", "OK", or "Challenge". The assessments are summarized in a graph that shows the relative number of "Good" and "Challenge" evaluations for each report. The facilitator can then focus class discussion on challenged answers to improve quality and achieve consensus. A sample report is shown in Figure 3.
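A minimal sketch of how such peer assessments might be tallied into the consensus summary shown in Figure 3 follows. The data structure, names, and values here are illustrative assumptions, not LUCID's network code.

```python
# Illustrative tally of peer assessments for network reports.
# Data structure and ratings are hypothetical.
from collections import Counter

# Each report collects one rating per reviewing team: "Good", "OK", or "Challenge".
assessments = {
    "Question 1 / Team A": ["Good", "Good", "OK", "Challenge"],
    "Question 1 / Team B": ["Challenge", "Challenge", "OK", "Good"],
}

for report, ratings in assessments.items():
    tally = Counter(ratings)
    # The summary chart needs only the Good/Challenge balance for each report.
    flag = "  <- discuss in class" if tally["Challenge"] >= tally["Good"] else ""
    print(f"{report}: {tally['Good']} Good, {tally['OK']} OK, "
          f"{tally['Challenge']} Challenge{flag}")
```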
Performance Distributions Provide a Perspective for Self-Assessment

At the close of each workshop, students reflect on what they have learned and assess their performance by identifying strengths, areas for improvement, and strategies to improve. This process is quite a challenge for them. To assist them, LUCID provides four performance distributions in the form of bar graphs that summarize the work of all the teams in a section. Graphs are included for the distributions of the quantity of work completed, the care with which that work was completed, the quality of the reports that were made, and the quality of the evaluations of other teams' work. These distributions are illustrated and discussed in Figure 4.

Assessments of LUCID

Two methods were used to assess the effectiveness of LUCID. The first was a survey of students' attitudes toward the computer-based activities. The second was a comparison of grades earned by students assigned to computer-based workshops with grades of students assigned to text-based workshops. Both studies involved students registered for the second semester of a two-semester general chemistry course during the spring semester of 1998.
Figure 4. A sample reflection and self-assessment activity with performance distributions. A typical page is illustrated. Each team enters their response to the questions in the boxes provided. The responses are saved to a network drive from which the instructor can print them for review. The bar graphs to the right constitute the performance distributions discussed in the text. The darker bar indicates the range in which the team performed, providing an easy comparison with the performance of other teams. The basis of the graphs is as follows. Questions Answered: the relative number of answers completed by each team during the workshop. Questions per Attempt: the relative number of questions completed for each attempt; this discourages students from entering hasty answers and encourages careful thinking so that the first answer is correct. Reports: the relative quality of reports as determined by peer assessments; the most points are awarded to reports that receive a final assessment of “good” and are unmodified by the reporting process, the fewest are awarded to reports that receive a final assessment of “ok”. Evaluations: the relative quality of peer assessments; the most points are awarded to challenges that lead to modifications of reports, the fewest are awarded to challenges that fail to produce modifications.
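As an illustration of how the four measures described in Figure 4 could be computed from logged workshop data, consider the sketch below. The record format, field names, and point values are assumptions made for the example, not LUCID's actual scoring scheme.

```python
# Sketch: computing the four per-team performance measures of Figure 4.
# The record layout and point values are hypothetical.
from dataclasses import dataclass

@dataclass
class TeamLog:
    answers_completed: int   # questions answered during the workshop
    attempts: int            # total answer submissions
    report_scores: list      # per-report peer-assessment outcomes, 0-2 points each
    evaluation_scores: list  # per-challenge outcomes, 0-2 points each

def performance(team: TeamLog) -> dict:
    return {
        "Questions Answered": team.answers_completed,
        "Questions per Attempt": team.answers_completed / max(team.attempts, 1),
        "Reports": sum(team.report_scores),
        "Evaluations": sum(team.evaluation_scores),
    }

# Example team; the section-wide bar graphs would bin these values for all teams.
print(performance(TeamLog(answers_completed=12, attempts=15,
                          report_scores=[2, 1], evaluation_scores=[2, 0])))
```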
This course included three 55-minute large-class lectures each week and one 80-minute workshop (12). A total of 779 students were registered for 27 sections of the course; of these, 108 were assigned to three computer-based workshops, all taught by the same instructor. The remaining 671 students were assigned to 24 text-based workshops taught by an additional eight instructors.
Assessment of Student Attitudes

During the final week of classes, students using LUCID in two of the three workshops were asked to assess the activities. A number of these students had experienced both text-based and computer-based workshops. The version of LUCID they were using included interactive models and multilevel feedback; network reporting was not successfully implemented until the following semester.

The first part of the assessment asked students to agree or disagree with positive statements about the software. Figure 5 presents a graphical overview of student responses. Students were generally quite positive about the software, the most positive responses being for statements 1, 3, and 5. All of the statements are listed in Table 1, which also shows the percentage of students who were positive in their response and the percentage who were not negative. The most positive responses were associated with the multilevel feedback, interactive models, and overall enjoyment of using the software.
The lowest positive response was for statement 8, regarding the value of our approach for individual study. It is interesting to compare this response to the considerably higher positive response to statement 9, which was aimed at determining the value of the workshop environment for students. Our interpretation is that while students see the software as potentially valuable for individual use, they recognize the enhanced value provided in the process-workshop environment by interactions with their peers and the instructor.

The second part of the assessment asked students to freely answer three questions about the most valuable aspect of the software, the greatest disadvantage of the computer-based activities, and the area for greatest improvement. The questions and the most frequent responses are listed in Table 2. Students valued the instant feedback and interactive models most highly and suggested that the greatest disadvantage lay in the use of the Lewis structure and drawing tools. These tools have since been redesigned and are now significantly easier to use. With regard to the area for greatest improvement, it is not surprising that many students wanted more explanations and solutions to problems. We have no intent to provide these, however, because their absence is one of our principal strategies for engaging students in discussion, thinking, and learning.
Figure 5. An overview of student assessments of LUCID from the spring of 1998. Student responses to the positive statements concerning LUCID are illustrated as a bar chart of the number of each response (A, strongly agree; B, mildly agree; C, neutral; D, mildly disagree; E, strongly disagree) for each statement. The questions to which the statement numbers refer are presented in Table 1.
Students also noted that better recognition of answers was required, particularly with regard to the input of chemical reaction equations. We have eliminated this problem by developing algorithms that analyze reaction equations, chemical formulas, and isotopic symbols to support more flexible entry.
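A sketch of the kind of normalization that makes entry more flexible is shown below: tolerate extra spaces, accept more than one arrow glyph, and ignore the order in which species are listed on each side. The specific rules are examples chosen for illustration, not the exact set LUCID applies, and the fragment handles only simple molecular equations (charged species such as Pb2+ would need more careful parsing).

```python
# Sketch: tolerant comparison of chemical reaction equations.
# Normalization rules are illustrative, not LUCID's exact algorithm.
def normalize(equation: str):
    equation = equation.replace("=>", "->").replace("→", "->")
    left, right = equation.split("->")
    # Ignore spacing and the order in which species are listed on each side.
    side = lambda s: frozenset(term.replace(" ", "") for term in s.split("+"))
    return side(left), side(right)

student = "2NaCl  +  Pb(NO3)2 -> PbCl2 + 2NaNO3"
key = "Pb(NO3)2 + 2NaCl → PbCl2 + 2NaNO3"
print(normalize(student) == normalize(key))  # True: same equation despite formatting
```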
Analysis of Student Grades

Grades were determined from six components: weekly computer-based quizzes, weekly workshops, three midsemester tests, and a final examination. The weekly quizzes were prepared with the CAPA system (6), which provides each student with a unique set of 5 to 10 problems. Workshop grades are assigned to each team by the workshop instructors and are determined primarily by the quality of teamwork and reports. All tests use multiple-choice questions that emphasize problem solving and are prepared from commercially available test banks supplemented with questions prepared by the course lecturers. The overall grade is determined with weights of one-sixth for each test, one-third for the final, and one-twelfth each for the quizzes and workshops.

Table 3 summarizes average performances for students assigned to computer-based workshops and for students assigned to text-based workshops on each graded component of the course. Students assigned to the computer-based workshops performed better on every component of the grade and had an overall average 4% higher than that earned by students in text-based workshops. While the greatest improvement (11%) was associated with the weekly quizzes, students in computer-based workshops also performed 4% above students in text-based workshops on the first test and on the final.
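For concreteness, the stated weights (one-sixth per test, one-third for the final, one-twelfth each for the quizzes and workshops) combine as shown below. The component scores are taken from the computer-based workshop row of Table 3 (the CAPA homework column serves as the quiz score), and the weighted sum reproduces the reported overall average of 53%.

```python
# Worked example of the overall-grade weighting described above,
# using the computer-based workshop row of Table 3.
weights = {"test1": 1/6, "test2": 1/6, "test3": 1/6,
           "final": 1/3, "quizzes": 1/12, "workshops": 1/12}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights sum to 1

scores = {"test1": 45, "test2": 50, "test3": 46,
          "final": 46, "quizzes": 91, "workshops": 77}

overall = sum(weights[k] * scores[k] for k in weights)
print(f"overall = {overall:.0f}%")  # about 53%, matching Table 3
```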
Table 1. Summary of Student Responses to Positive Statements Regarding LUCID
(Percentages of positive responses and of nonnegative responses)

1. Computer-based activities are more enjoyable than text-based activities. Positive 90%, nonnegative 97%.
2. Computer-based activities help me learn more than text activities. Positive 81%, nonnegative 97%.
3. Computer models … help me understand concepts better than models I can't interact with. Positive 87%, nonnegative 97%.
4. Instant feedback … increases the speed at which I can learn. Positive 84%, nonnegative 94%.
5. Instant feedback improves my skill in applying what I have learned. Positive 90%, nonnegative 97%.
6. Instant feedback increases my self-confidence in my learning. Positive 84%, nonnegative 97%.
7. I would recommend computer workshops over text workshops to a friend. Positive 77%, nonnegative 94%.
8. The computer activities would be helpful even without a workshop. Positive 71%, nonnegative 84%.
9. Working with others in a workshop with an instructor present helps me learn more than I would on my own. Positive 84%, nonnegative 94%.

NOTE: All nonnegative responses are included on the recommendation of administrators, who tell us that, from their perspective, neutral responses to general chemistry are positive.
Table 2. Most Common Responses to Open-Ended Assessment Questions Regarding LUCID
(Percentages are of all responses; responses given by fewer than 10% of the students are not included.)

What was the most valuable aspect of the computer activities?
   Instant feedback, 53%; interactive models, 16%.

What was the greatest disadvantage of the computer activities?
   None noted, 22%; drawing was too time consuming (a), 19%.

What was most in need of improvement?
   Nothing noted, 19%; more explanations needed (b), 16%; better recognition of answers (c), 16%.

(a) The early version of the Lewis structure drawing tool that prompted this response has since been improved.
(b) Students continue to express a strong desire for explanations and solutions, which we do not intend to provide.
(c) This response was primarily associated with the need to enter chemical reaction equations according to a strict format requiring the exact placement of spaces. This problem was eliminated by developing algorithms to analyze students' answers, allowing significantly more flexible entry.
Discussion

The positive student assessments of LUCID, the higher grades for students in computer-based workshops, and the improvements in student attitudes, retention, and performance that accompanied our earlier implementation of process workshops (12) have been very encouraging. We believe many of the features of LUCID would be valuable in other teaching and learning contexts as well. In more traditionally structured courses, interactive models could be projected during lectures, and exercises and problems with multilevel feedback could be assigned as homework. For independent study, our interactive models, critical-thinking questions, and multilevel feedback provide features not currently found in the CD-ROM products that accompany most texts.

While LUCID should have value in these other contexts, we would not expect the results reported above. These positive results arose out of the positive environment created in a process workshop, where students are engaged with learning, with each other, and with the facilitator. They frequently find themselves actually enjoying the learning process.
Table 3. Comparison of Grades (%) for Students Assigned to Computer-Based and to Text-Based Workshops

All Students (779 students): homework (a) 81, workshops (b) 72, tests (c) 42/48/45, final exam (d) 42, overall (e) 49.
Text-Based Workshops (671 students): homework 80, workshops 71, tests 41/47/45, final exam 42, overall 49.
Computer-Based Workshops (108 students): homework 91, workshops 77, tests 45/50/46, final exam 46, overall 53.

(a) Computer-based personalized homework is assigned each week using the CAPA system.
(b) Teams are assigned weekly workshop grades based on the quality of their teamwork and reports.
(c) Each of the three midsemester tests provides 20–25 multiple-choice questions that focus on problem solving.
(d) The final consists of 50 multiple-choice questions with an emphasis on problem solving.
(e) Each test contributes one-sixth, the final contributes one-third, and the quizzes and workshops each contribute one-twelfth to the overall grade.
Affective issues, such as frustration, which can arise and go unaddressed in independent study, are addressed constructively in the learning teams through reflection and assessment activities and through encouragement from the instructor. While LUCID provides a model for enhancing this environment with technology, we view the environment itself as fundamental to creating positive student attitudes.

As used in process workshops, we see LUCID as a new model for computer-assisted learning. In this model, the authority for the learning is placed not in the artificial intelligence of the computer but in the real intelligence of the students and the instructor. The primary role of the computer is not to enhance presentations or provide drill and practice, but to facilitate the expression, communication, discussion, and improvement of ideas, problem-solving strategies, and the learning process itself. Within the development of this new model, a number of issues remain to be addressed.
Availability and Technical Issues

A beta version of LUCID can be obtained from the Office of Learning Communities, State University of New York, Stony Brook, NY 11794-3357. LUCID currently requires a PC running Windows 3.1, 95, 98, or NT. The network reporting and peer assessment utilities require a LAN. While our software is readily adopted, installation of the network features may require assistance from a system administrator, and authors will require expertise in Asymetrix Toolbook (17). To address these issues, we are porting LUCID to a Web-compatible format. The Web version will include authoring tools to facilitate the adoption, adaptation, and extension of our work by others. Student work will be stored in a database where it can be more easily shared, tracked, and studied. Research tools will be included to support the study of student learning. Such studies should provide insights for refining our instructional strategies.

Curriculum Design Issues

Our design for activities is intended to structure the learning process for students so that they are more readily engaged in it. One aspect of the learning process that we do not currently address is reading.
While guided readings (reading assignments accompanied by critical-thinking questions) are a natural way to provide such structure, they need to be adapted for each text. A more general approach may be to use a formal reading journal (18) that students could submit periodically for peer review and assessment. Placing such a journal on the computer for use with information resources on the Web or a CD-ROM would greatly facilitate peer review and assessment.

Our greatest challenge remains that of developing students' problem-solving skills. Network reporting promotes the sharing of solutions, but students want these solutions to appear neat and ordered, as they are in texts. We believe improvements in problem solving may come from focusing students on the messier thinking process from which these solutions are generated. We are currently exploring ways for students to more clearly document the solution process for peer and self-assessment.
Issues of Extension: Other Courses, Other Disciplines, Other Levels

Our approach is readily adapted to other courses and other disciplines, and we are actively seeking collaborations for this purpose. Our activity design should be well suited to courses in the natural sciences, social sciences, mathematics, and other disciplines. Even in the humanities, we believe significant value may be found in our strategy for organizing discussions, in which individuals record their personal responses, followed first by team and then by interteam discussion.

There may be great value in extending our approach to the secondary level of education. Selected cases suggest that junior high students are enthusiastic about our interactive models and sufficiently mature to answer key questions, and that they enjoy this mode of learning, which is inherently social and fun. We do not see any value in extending our approach beyond the introductory college level, however, without significant modification. Our approach promotes student engagement by structuring the learning process; mature learners need the ability to structure their own learning. In making this transition from structured learning to more independent learning, our guided discoveries should be replaced by reading or research assignments, our easy-to-use tools should be replaced by professional tools, and our feedback mechanisms should aim at developing the ability of learners to validate their own solutions. Reflection, peer assessment, and self-assessment activities, however, should continue to play a central role in developing effective learning processes.

Conclusion

Teaching methods in the sciences often aim to structure and present information in a competitive environment in which the work of individual students is evaluated by authorities in order to assign a grade. In process workshops we aim to structure the learning process by asking questions in a supportive environment in which student work is assessed by the students themselves in order to improve the quality of that work. LUCID's interactive models, easy-to-use tools, multilevel feedback, network reporting, peer assessment, and performance distributions significantly enhance this approach.
When used in this context, LUCID provides a new model for computer-assisted learning in which the role of the computer is to facilitate the expression, communication, discussion, and improvement of conceptual understanding, problem-solving strategies, and the learning process itself.

Acknowledgments

We would like to thank the National Science Foundation, the Long Island Consortium for Interconnected Learning, and the Chemistry Department at SUNY Stony Brook for financial support of this project. We also recognize Dan Apple of Pacific Crest Educational Technologies for introducing us to many of the ideas from which process workshops and LUCID were developed, and John Ranck and the members of the MoleCVUE Project for discussions that underlie many of the interactive features of the program. This project would not have been realized without the support of the Instructional Computing staff at SUNY Stony Brook; we extend particular thanks to Eric Johnfelt, Behzad Barzideh, and Nancy Duffrin.

Literature Cited

1. Cognitive Tutor Programs; Carnegie Learning: Pittsburgh, PA, 1999; http://www.carnegielearning.com/ (accessed Jun 2001).
2. Bell, M.; Gladwin, R.; Drury, T. J. Chem. Educ. 1998, 75, 781–784.
3. Burke, K. A.; Greenbowe, T. J.; Windschitl, M. A. J. Chem. Educ. 1998, 75, 1658–1661.
4. Appling, J.; Frank, D. Discover Chemistry 2.0 [CD-ROM]; Brooks/Cole: Pacific Grove, CA, 1999.
5. Interactive General Chemistry CD-ROM 2.5 [CD-ROM]; Saunders: Philadelphia, PA, 1999.
6. Morrissey, D. J.; Kashy, E.; Tsai, I. J. Chem. Educ. 1995, 72, 141–146.
7. Wegner, P. Mastering Chemistry; California State University: Fullerton, CA, 1998; http://titanium.fullerton.edu/mc/ (accessed Jun 2001).
8. Spain, J. Chemi-Skill-Bildr, version 4.1 [CD-ROM]; Electronic Homework Systems: Pendleton, SC, 1995.
9. Pence, L. E. J. Chem. Educ. 1999, 76, 697–696.
10. Paulisse, K. W.; Polik, W. F. J. Chem. Educ. 1999, 76, 704–708.
11. Harcourt e-Learning. Archipelago Distributed Learning Courses; http://www.archipelago.com/archdl/index.html (accessed Jun 2001); see the General Chemistry section on this page.
12. Hanson, D. M.; Wolfskill, T. J. Chem. Educ. 2000, 77, 120–130.
13. Hanson, D. Foundations of Chemistry, 2nd ed.; Pacific Crest Software: Corvallis, OR, 1996.
14. Hanson, D. Discovering Chemistry: A Collaborative Learning Activity Book; Houghton-Mifflin: New York, 1997.
15. Gabel, D. J. Chem. Educ. 1999, 76, 548–553.
16. For another implementation of peer assessment in general chemistry, see Russell, A.; Chapman, O.; Wegner, P. J. Chem. Educ. 1998, 75, 578–579.
17. Toolbook II Instructor; Asymetrix: Bellevue, WA, 1996.
18. See, for example, Carroll, S.; Beyerlein, S. The Learning Assessment Journal; Pacific Crest: Corvallis, OR, 1996.