Problem Analysis: Lesson Scripts and Their Potential Applications


Teaching with Technology

Maria Oliver-Hoyo, Department of Chemistry, North Carolina State University, Raleigh, NC 27695; [email protected]

In the quest to help students learn how to solve chemistry problems, diverse theories of learning and teaching strategies have been put to the test. Relying on documented problem-solving difficulties and strategies is helpful (1–4). However, this study showed that additional information could be learned through basic data collection, which in turn could be used to develop lesson scripts like the one discussed in this article. Lesson scripts can then serve as the foundation for instructional materials, including highly interactive computer materials.

Data Collection

Chemistry instructors predicted wrong answers to common problems in general chemistry, including nomenclature, stoichiometry, and mass percentages. To collect as many inputs from students as possible, the questions were included on exams given to groups of 300 to 350 general chemistry students. All answers were examined and sorted. For exercises that required text input, such as nomenclature, reasonable inputs were readily collected. For problems with numerical responses the task was more tedious: every incorrect answer had to be examined closely to see whether it could have arisen from some recognizable line of thinking. For each type of problem a sample question was examined to find typical inputs.

Sample Question: Mass Percentages

Eight possible mistakes were predicted by seven instructors for a question taken directly from an 1867 textbook (5), hence the appearance of the pound unit:

"In 14 pounds of iron-rust (Fe2O3) how much O?"

The question appeared on a test on which students had to show their work to receive credit for the answer. Students were instructed to express their answer in pounds; conversion factors, including pounds to grams, were given along with a periodic table. Each of the 348 student answers was individually examined and sorted.

Results

Of the eight wrong answers predicted by seven instructors for the sample question above, seven appeared on the students' tests; eleven others, however, were not anticipated.
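The examine-and-sort step lends itself to a small script. The following Python sketch is a hypothetical illustration, not the author's actual procedure: it bins each numeric answer into one of the anticipated ranges (a few illustrative ranges are taken from Table 1 below) and tallies frequencies, setting aside anything that falls outside every range for closer inspection.

```python
# Hypothetical sorting step: bin numeric answers into anticipated ranges
# and tally frequencies; ranges shown are examples from Table 1.
from collections import Counter

ANTICIPATED = [
    ((4.0, 4.4), "correct"),
    ((1800.0, 2000.0), "answer in grams"),
    ((30.0, 31.0), "answer as mass %"),
    ((0.10, 0.12), "started from 1 mol Fe2O3"),
]

def classify(answer: float) -> str:
    """Return the recognized line of reasoning for an answer, if any."""
    for (low, high), reasoning in ANTICIPATED:
        if low <= answer <= high:
            return reasoning
    return "unrecognized"  # kept for the close examination described above

answers = [4.2, 1923.0, 0.11, 7.77]  # hypothetical student inputs
print(Counter(classify(a) for a in answers))
```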

Table 1. Examination of Written Input for the Mass Percentage Sample Question

| Input | Input Range | Frequency | Reasoning | Reply |
|-------|-------------|-----------|-----------|-------|
| 1 | 4.0–4.4 | 192 | Correct | You are right. |
| 2 | 1800–2000 | 2 | Answer: grams | Your answer in grams is correct, but the question asked for lb. |
| 3 | 30–31 | 2 | Answer: mass % | You were asked for lb of O, not mass %. |
| 4 | 0.28–0.32 | 0 | Given: 1 lb Fe2O3 | You are close, but made your calculation for 1 lb instead of 14. |
| 5 | 0.10–0.12 | 8 | Given: 1 mol Fe2O3 | You started with 1 mole Fe2O3 instead of 14 lb Fe2O3. |
| 6 | 1.4–1.5 | 3 | Given: 14 mol Fe2O3 | Check to see if you found lb O in 14 mol instead of 14 lb of Fe2O3. |
| 7 | 0.42–0.49 | 4 | Mole ratio 1 O : 3 Fe2O3 | Check to see if your mole ratio is upside down by using 1 mol O for 3 mol Fe2O3. |
| 8 | 1.2–1.4 | 15, 4 | Mole ratio 1 O : 1 Fe2O3, or 4.2 g O3 = 1.4 g O | First see if you used the right number of moles of O for 1 mole of Fe2O3. If so, note that 4.2 g O3 is 4.2 g O. |
| 9 | 0.80–0.86 | 2 | Mole ratio 3 O : 5 Fe2O3 | Find the moles of O in 1 mole of Fe2O3 (not in 5). |
| 10 | 5.2–5.3 | 2 | M = 128 from (Fe2O) | You found the molar mass for Fe2O. |
| 11 | 4.6–4.7 | 1 | M = 144 from (Fe2O2) | You found the molar mass for Fe2O2. |
| 12 | 6.4–6.5 | 2 | M = 104 from (FeO3) | You found the molar mass for FeO3. |
| 13 | 7.8–7.9 | 3 | M = 86 from (F2O3) | Remember to look up the atomic weight of iron (Fe), not fluorine (F). |
| 14 | 2.6 | 1 | M = 86 (F2O3); mole ratio 1 O : 1 Fe2O3 | Be sure to look up the atomic weight of iron (Fe), not fluorine (F). |
| 15 | 8.4 | 6 | 3/5 × 14 | The formula gives the number of atoms, but they have different masses. |
| 16 | 2.1–2.2 | 3, 2 | M of O2 = 16, or 4.2 g O3 = 1.4 g O | Using 3 mol O2 : 2 mol Fe2O3 is OK, but then use 1 mol O2 = 32 g (not 16). Or note that 4.2 g O3 contains 4.2 g O. |
| 17 | 40–44 | 3 | Given: 141 lb Fe2O3 | The given amount was 14 lb, not 141 lb. |
| 18 | 3.2–3.4 | 3 | 48 g O in 14 lb Fe2O3 | There are 48 g O in 1 mol, not in 14 lb. Find the molar mass of Fe2O3. |
| 19 | 6300–6400 | 4 | 14 lb Fe2O3 to g | Next find the number of moles of Fe2O3. |
| 20 | Other | 86 | Unrecognizable | Try again or type h or help for a hint. |


Incorrect answers for which a line of reasoning could be identified include a few that appeared just a single time and, occasionally, some that did not turn up at all (answers predicted by instructors) but are possible in a larger sample. These answers are identified as "inputs" in Table 1. More than half of the 348 student papers had a correct answer (input 1) accompanied by a consistent line of reasoning. Depending on how students did their arithmetic, answers ranged from 4.0 to 4.4; no answer in this range was arrived at by incorrect reasoning. About 75% of the correct answers came from a solution resembling the one below, in which students first found the mass percentage of oxygen and then converted to pounds:

48 g O / 160 g Fe2O3 = 0.30 = 30%;  then  14 lb × 0.30 = 4.2 lb

The rest used a string of conversion factors:

14 lb Fe2O3 × (454 g / 1 lb) × (1 mol Fe2O3 / 160 g Fe2O3) × (3 mol O / 1 mol Fe2O3) × (16 g O / 1 mol O) × (1 lb / 454 g) = 4.2 to 4.4 lb
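As a quick arithmetic check (not part of the original article), the following Python sketch confirms that the two routes agree; it uses the integer atomic weights the students used, and it makes the redundancy discussed in the next paragraph explicit.

```python
# Verify that both solution routes give the same result.
M_FE2O3 = 2 * 56 + 3 * 16   # 160 g/mol, integer atomic weights as in the article
G_PER_LB = 454              # conversion factor supplied on the exam

# Route 1: mass percent of O first, then pounds.
route1 = (3 * 16) / M_FE2O3 * 14                      # 0.30 * 14 lb
# Route 2: string of conversion factors (lb -> g -> mol -> g O -> lb O).
route2 = 14 * G_PER_LB / M_FE2O3 * 3 * 16 / G_PER_LB

print(route1, route2)  # 4.2 4.2 -- the two factors of 454 cancel exactly
```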

There were a few variations on this method using mol O2 or O3 and the corresponding molar masses of 32 and 48. When using the "string of factors," everyone first converted pounds to grams, then to moles, then back to grams and finally pounds, rather than using the pound-mole; it is interesting that none of the students using this method noticed the redundancy involved.

Inputs 2 and 3 have minor arithmetic errors or a unit that is readily converted to the requested one. Misreading the question by starting out with 1 lb Fe2O3 or 14 mol Fe2O3 or, most often, 1 mol Fe2O3 led to inputs 4–6. Trouble in finding the relation between mol O and mol Fe2O3 produced inputs 7–9, including the single most common error, in which students ignored the formula and used 1 mol O for 1 mol Fe2O3 (the third conversion factor below):

14 lb Fe2O3 × (454 g / 1 lb) × (1 mol Fe2O3 / 160 g Fe2O3) × (1 mol O / 1 mol Fe2O3) × (16 g O / 1 mol O) × (1 lb / 454 g) = 1.2 to 1.4 lb

Other misinterpretations of the formula were 1 mol O/3 mol Fe2O3 (upside down) or 3 mol O/5 mol Fe2O3. The molar mass was often calculated incorrectly even though the formula was given in the original question; in inputs 10–13 this was the only error (the solutions still used 3 mol O/1 mol Fe2O3). For the atomic weight of iron, some students looked up F rather than Fe (inputs 13, 14). Another common error (input 15) was finding the percentage of O atoms in the formula and then multiplying by the given quantity, 3/5 × 14 lb; in this case every answer was 8.4, since the arithmetic was straightforward. When a diatomic element is involved in a question, its molar mass, should it be needed, is often miscalculated; here students used 16 as the molar mass of oxygen rather than 32 (input 16). Some inputs came from combinations of errors (see input 14). Inputs 17–19 are tied specifically to the sample problem and are not likely to be reproduced for similar ones.

Table 2. Results of the Examination of Student Responses

| Question | No. of Students | No. of Predicted Wrong Answers | No. of Unanticipated Answers | % of Incorrect Answers That Were Predicted |
|----------|-----------------|-------------------------------|------------------------------|--------------------------------------------|
| Mass percentage | 348 | 8 (includes one not found on students' papers) | 11 | 39 |
| Stoichiometry (a) | 347 | 6 | 6 | 50 |
| Stoichiometry (b) | 330 | 7 | 6 | 54 |

An arithmetic error crept into the problem here because of the use of the pound (lb) unit: students read the letter "l" in lb as the numeral one and included it in their given quantity, for example, 141 lb of iron oxide rather than 14 lb (input 17). The final few responses are very far from a complete answer (inputs 18, 19). Input 20 lumps together those labeled unrecognizable.

Two different incorrect approaches lead to the ranges 1.2–1.4 and 2.1–2.2. For instance, the range 1.2 to 1.4 (input 8) can be reached by using a 1:1 ratio of mol O to mol Fe2O3; the identical answer arises when the molar ratio is correct but the student thinks 4.2 g O3 is only 1.4 g O. Both answers are off by a factor of three, but for entirely different reasons. It is advisable to craft questions for which this cannot happen, but it is nearly impossible to foresee all approaches from the start; using the spreadsheet templates described in the next section helps to identify questions like this.

Two stoichiometry problems were also examined:

1. How much H2O can be obtained from 3 lb of sal ammoniac (NH4Cl)? (5)
2. How much O can be obtained from 6 oz. of KClO3? (5)

The gathering and examination of responses for these stoichiometry problems was conducted in the same way as described for the oxygen-in-iron-oxide sample question. The results are summarized in Table 2. The following points should not be overlooked:

1. No answer in the range of the correct response was arrived at by incorrect reasoning.
2. Students are so mechanically inclined in the use of conversion factors that not one used the pound-mole concept.
3. The incorrect responses included grammatical, mathematical, and conceptual errors.

Creating Lesson Scripts

The extensive lists of questions and answers were collected and sorted into reasonable lines of thinking. These reasonable inputs became the set of anticipated answers that serves as the database for developing lesson scripts. Replies tailored to match each input are organized into lesson scripts; these replies are tabulated in Table 1 along with their corresponding inputs. A comprehensive collection of anticipated inputs is needed to create effective lesson scripts. The information they provide can be used to direct how certain topics are taught, to assess students with grading tools, and to give students feedback. Lesson scripts were transformed into interactive applications using an authoring system, Director, from Macromedia (6).¹ The programming necessary to incorporate these results is left to the choice of the developer.²
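The article does not specify how the database was stored, but as an illustration, each row of Table 1 can be held as a small record pairing an anticipated answer range with its inferred reasoning and tailored reply. A hypothetical Python sketch:

```python
# Hypothetical record structure for a lesson script; each record corresponds
# to one row of Table 1 (three rows shown).
from dataclasses import dataclass

@dataclass
class ScriptEntry:
    low: float        # lower bound of the anticipated answer range
    high: float       # upper bound
    reasoning: str    # inferred line of thinking
    reply: str        # tailored feedback shown to the student

IRON_RUST_SCRIPT = [
    ScriptEntry(4.0, 4.4, "correct", "You are right."),
    ScriptEntry(0.10, 0.12, "given 1 mol Fe2O3",
                "You started with 1 mole Fe2O3 instead of 14 lb Fe2O3."),
    ScriptEntry(6300.0, 6400.0, "converted 14 lb to grams only",
                "Next find the number of moles of Fe2O3."),
    # ...the remaining rows of Table 1 would follow.
]
```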


Spreadsheet Template

Numerical answers for similar questions were calculated by using a spreadsheet that generates all answers corresponding to previously identified errors. Inspecting the results reveals at a glance whether incorrect reasoning leads to a correct answer and whether the different lines of incorrect reasoning produce answers far enough apart to be distinguished from one another. The spreadsheet in Figure 1 is designed to produce answers corresponding to the errors encountered in the mass percentage question used in the student survey. A new question is defined by inserting the formula and the given quantity. Atomic weights are entered for A and B, and it is easy to make guesses about what element students may mistake A′ for; for example, Ca is often confused with C, and this is what happened in the sample question above when eight students looked up the atomic weight of F instead of Fe. The subscripts a and b as well as the given quantity are entered (grams are used for all but the sample problem above). Formulas are set up to reproduce the errors encountered in the survey for O in 14 lb of Fe2O3, and inputs are assigned the same numbers as in Table 1.

Figure 1. Spreadsheet template for mass % problems. Question: How much B is present in a given amount of AaBb?

Parameters:

| Formula | A | a | B | b | Given | Unit | A′ |
|---------|------|---|----|---|-------|------|----|
| Fe2O3 | 56 | 2 | 16 | 3 | 14 | lb | 19 |
| CaS | 23.4 | 1 | 32 | 1 | 50 | g | 12 |
| SO2 | 32 | 1 | 16 | 2 | 90 | g | 23 |

Predicted answers:

| No. | Reasoning | Fe2O3 | CaS | SO2 |
|-----|-----------|-------|------|------|
| 1 | Correct: b B : 1 AaBb | 4.2 | 28.9 | 45 |
| 2 | Answer: grams | 1907 | 28.9 | 45 |
| 3 | Answer: mass % | 30 | 57.8 | 50 |
| 4 | Given 1 unit | 0.3 | 0.58 | 0.5 |
| 5 | Given 1 mol | 0.11 | 0.03 | 0.13 |
| 6 | Given 'x' mol | 1.48 | 1.56 | 11.3 |
| 7 | 1 B to b AaBb (upside down) | 0.47 | 28.9 | 11.3 |
| 8 | 1 B to 1 AaBb (1:1) | 1.4 | 28.9 | 22.5 |
| 9 | b B to (a+b) AaBb | 0.84 | 14.4 | 15 |
| 10 | M = AaB | 5.25 | 28.9 | 60 |
| 11 | M = AaBa | 4.67 | 28.9 | 60 |
| 12 | M = ABb | 6.46 | 28.9 | 45 |
| 13 | Correct, but with A′ (not A) | 7.81 | 36.4 | 52.4 |
| 14 | 1 B to 1 A′aBb | 2.6 | 36.4 | 26.2 |
| 15 | Atom % × given | 8.4 | 25 | 60 |

Flawed questions that produce the same answer for two different lines of reasoning can easily be spotted. For example, a simple binary formula such as CaS will not test the students' ability to interpret a formula, because inputs 7, 8, and 10–12 are identical to the correct answer. A formula involving S and O may cause arithmetic accidents because the atomic weight of S is twice that of O: for SO2 it is not possible to tell whether the user misread the formula or calculated "atom percent," since in both cases the numerical answer is 60. The spreadsheet shown can be expanded to include longer formulas (AaBbCc…), questions about the amount of A rather than B, additional choices for A′ as well as B′, and so on. Using spreadsheets to predict answers and modify questions can be added to the long list of spreadsheet uses such as grading (7, 8), scientific model building (9, 10), and providing students with feedback (11).
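The spreadsheet's logic is compact enough to sketch in code. The following Python fragment is an assumed reimplementation of a representative subset of the fifteen Figure 1 formulas (labels paraphrased from Table 1), not the published spreadsheet itself; the values it prints for the Fe2O3 question match the Fe2O3 column of Figure 1, and the collision check reproduces the CaS problem noted above.

```python
# Generate the answer each identified error would produce for AaBb,
# then flag questions where two reasonings collide on the same value.
from collections import Counter

def predicted_answers(A, a, B, b, given, A_prime):
    M = a * A + b * B                 # correct molar mass of AaBb
    M_wrong = a * A_prime + b * B     # molar mass using the look-alike element A'
    return {
        "correct (b B : 1 AaBb)":           round(b * B / M * given, 2),
        "answer as mass %":                 round(b * B / M * 100, 2),
        "calculated for 1 unit":            round(b * B / M * 1, 2),
        "ratio upside down (1 B : b AaBb)": round(B / b / M * given, 2),
        "1:1 ratio (1 B : 1 AaBb)":         round(B / M * given, 2),
        "b B to (a+b) AaBb":                round(b * B / ((a + b) * M) * given, 2),
        "M = AaB":                          round(b * B / (a * A + B) * given, 2),
        "M = ABb":                          round(b * B / (A + b * B) * given, 2),
        "A' instead of A":                  round(b * B / M_wrong * given, 2),
        "atom % x given":                   round(b / (a + b) * given, 2),
    }

# Fe2O3 question: A = Fe (56), B = O (16), given 14 lb, A' = F (19).
print(predicted_answers(A=56, a=2, B=16, b=3, given=14, A_prime=19))
# -> 4.2, 30.0, 0.3, 0.47, 1.4, 0.84, 5.25, 6.46, 7.81, 8.4 (as in Figure 1)

# Collision check for CaS: several reasonings collapse onto the correct
# answer (28.88 here; Figure 1 rounds to 28.9), flagging a flawed question.
cas = predicted_answers(A=23.4, a=1, B=32, b=1, given=50, A_prime=12)
dupes = [v for v, n in Counter(cas.values()).items() if n > 1]
print("colliding answers:", dupes)
```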

Results of the impact on the student population are summarized in Table 3. The percentage of students with correct answers ranged from 38 to 55%; therefore, 45–62% had difficulties in solving these problems. Among the students with incorrect answers, 36–57% received a specific response to their inputs. The impact of this approach should not be underestimated: reaching this fraction of the students who have difficulty with problem solving makes the labor-intensive effort worthwhile.

Table 3. Impact on the Student Population

| Question | No. of Students | Correct Answers (%) | Incorrect Answers (%) | No Apparent Rational Reasoning (%) | No Attempt at Solving the Problem (%) | Incorrect Answers That Received a Specific Response (%) |
|----------|-----------------|---------------------|-----------------------|------------------------------------|---------------------------------------|---------------------------------------------------------|
| Mass percentage | 348 | 55 | 45 | 19 | 9.8 | 36 |
| Stoichiometry (a) | 347 | 44 | 56 | 20 | 15 | 38 |
| Stoichiometry (b) | 330 | 38 | 62 | 17 | 10 | 57 |

Applications

The material described here can be used as a teaching tool in various ways:

1. To improve teaching by noting the types of errors that appear and emphasizing or pointing them out when teaching a topic.
2. To create better multiple-choice questions.
3. To assemble the information needed to create grading guides for examinations.
4. To develop highly interactive computer programs to use as electronic study guides.


For computerized instructional materials, lesson scripts can serve as the core of the database required for high interactivity. High interactivity requires a "two-way communication" between the computer and the student; that is, the computer must address specific inputs in a specific manner. In human–computer interaction (HCI), feedback is defined as "the information sent back to the user by a computer system resulting from a user's input to the system" (12, 13). The problem analysis method described here can be used to create material that produces a specific reply matched to each anticipated input for a given question or problem. Table 1 gives student inputs, their frequency, and the suggested replies given to students in the instructional module. Replies were designed to guide the student toward working out the correct answer, not to give the limited feedback of a right-or-wrong response. After a student submits a new input, a different response is given as a guide toward the correct solution. For correct answers, photographs, films, or animations related to the question can accompany the text reply.

The relevance of the computerized-instruction application should not be underestimated. The use of computers in chemistry now dates back four decades. As early as the 1970s, computerized surveys were developed to collect student responses in which "responses are measured against 'decision rules' established by the professor" (14). Computers have provided different levels of interactivity in tutorial programs (15–24); however, the formats used in producing question banks have been limited to multiple choice, hot spots on graphical images, and simple textual responses.
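The dispatch step of such two-way communication can be very small. This hypothetical Python sketch (the published modules were built in Macromedia Director's Lingo, per Note 2, not Python) matches a student's input against the ScriptEntry records sketched earlier and falls through to the generic hint of input 20 when nothing matches:

```python
# Hypothetical dispatch loop over the ScriptEntry records defined above.
def reply_to(answer: float, script: list) -> str:
    for entry in script:
        if entry.low <= answer <= entry.high:
            return entry.reply                          # specific, tailored feedback
    return "Try again or type h or help for a hint."    # input 20: unrecognizable

# Example session for the iron-rust question:
# reply_to(0.11, IRON_RUST_SCRIPT)
#   -> 'You started with 1 mole Fe2O3 instead of 14 lb Fe2O3.'
# reply_to(9.99, IRON_RUST_SCRIPT)
#   -> 'Try again or type h or help for a hint.'
```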


Technological advances have brought more software and hardware power, yet critics have stated that the quality of computer-assisted instruction (CAI) has not kept up with advances in computing power (25, 26). Alfred Bork, a pioneer in computing science, criticized the way computers have been used in education and the absence of high-level interactivity (25); his examples of weak interaction are multiple-choice quizzes and programs that depend only on pointing with a mouse. According to Craig Bowen, current CAI is no more effective than the modules of the 1970s, in spite of computing power that allows the development of far more elaborate CAI materials (26). Software that recognizes and responds specifically to user input has enormous potential to improve the effectiveness of the computer as a teaching and learning tool (25, 27–29). Unfortunately, little attention has been devoted to exploring models of feedback in the interactive context or to developing a useful theory of feedback (30, 31). The effectiveness of computers as teaching tools has already been documented (32–34). Although the challenges in implementing CAI are not trivial (35), the effort could be repaid in better outcomes for students if an appropriate database is used as the foundation of interactivity.

Conclusions

The results clearly indicate that prediction of student errors, even by the most experienced instructors, is insufficient. The instructors predicted only 39% of the possible mistakes for the sample question discussed in detail in this article; the percentages rose to 50% and 54% for the stoichiometry problems. Extensive surveying is therefore essential to the development of effective lesson scripts. Of the 45–62% of students who gave wrong answers in this study, 36–57% were assisted by having their inputs addressed with specific responses. This means that more than a third of the students who had difficulties could benefit from this type of problem analysis and the consequent creation of lesson scripts. Lesson scripts are intended to build upon what the student already knows how to do while providing special attention where the student is having difficulty. They do not give answers; they challenge students to find the right answer, much as an instructor would do in a one-on-one setting. One way to create highly interactive materials is to develop a detailed and precise database from which relevant and specific information can be extracted. Students in long-distance learning environments should find this application particularly useful. This type of problem analysis, tailored to the wide range of incorrect methods of problem solving, offers the potential to transform merely "interactive" modules into "highly interactive" ones.

Acknowledgment

Special thanks are due to Sally Solomon of Drexel University for helpful suggestions and for her unconditional support of my work.


Notes

1. Readers may access demo files of interactive modules, including the iron-rust problem discussed in this paper, at http://chemdept.chem.ncsu.edu/maria.
2. Inquiries about the Lingo code created in our labs can be addressed to the author.

Literature Cited

1. Kramers-Pals, H.; Lambrechts, J.; Wolff, P. J. J. Chem. Educ. 1982, 59, 509–513.
2. Mettes, C. T. C. W.; Pilot, A.; Roossink, H. J.; Kramers-Pals, H. J. Chem. Educ. 1980, 57, 882–885.
3. Frank, D. V.; Baker, C. A.; Herron, J. D. J. Chem. Educ. 1987, 64, 514–515.
4. Bodner, G. M. J. Chem. Educ. 1987, 64, 513–514.
5. Steele, J. D. Fourteen Weeks in Chemistry; A. S. Barnes: New York, 1873.
6. Director, version 7; Macromedia, Inc.: San Francisco, CA, 1998.
7. Sparrow, G. J. Chem. Educ. 1985, 62, 139–140.
8. Suder, R. J. Chem. Educ. 1985, 62, 499–500.
9. Hayes, B. Sci. Am. 1983, 249 (4), 22.
10. Atkinson, D. E. In Using Computers in Chemistry and Chemical Education; Zielinski, T. J.; Swift, M. L., Eds.; American Chemical Society: Washington, DC, 1997; pp 143–161.
11. Keiser, J. E. J. Chem. Educ. 1988, 65, 513.
12. Brandes, A.; Wilensky, U. In Constructivism; Harel, I.; Papert, S., Eds.; Human/Computer Interaction Series; Ablex Press: Norwood, NJ, 1991; pp 391–415.
13. Norman, D. A. The Psychology of Everyday Things; Basic Books: New York, 1988.
14. Shakhashiri, B. Z. J. Chem. Educ. 1975, 52, 588–595.
15. Beatty, J. W.; Scott, E. S. J. Chem. Educ. 1982, 59, 130–131.
16. Jain, D. C.; McGee, T. H. J. Chem. Educ. 1980, 57, 253–254.
17. Wood, W. F.; Kent, C. D. J. Chem. Educ. 1981, 58, 47.
18. Breneman, G. L. J. Chem. Educ. 1979, 56, 783.
19. Glasser, L.; Bradley, J. D.; Brink, G.; van Zyl, P. J. Chem. Educ. 1996, 73, 323.
20. Lower, S. K. J. Chem. Educ. 1970, 47, 143–146.
21. Anderson, R. H. J. Chem. Educ. 1982, 59, 129–130.
22. Vaughn, C. J.; Morris, R.; Block, T. F. J. Chem. Educ. 1980, 57, 251.
23. Bendall, V. I. J. Chem. Educ. 1980, 57, 252–253.
24. Byrd, J. E.; Burt, J. T. J. Chem. Educ. 1980, 57, 619–623.
25. Bork, A. J. Sci. Educ. Technol. 1995, 4 (2), 97–102.
26. Bowen, C. W. J. Chem. Educ. 1998, 75, 1172–1175.
27. The Future of Learning: An Interview with Alfred Bork; Educom Rev. 1999, 34 (4); http://www.educause.edu/ir/library/html/erm9946.html (accessed May 2001).
28. Birk, J. P. J. Chem. Educ. 1992, 69, 294–295.
29. Spain, J. D. J. Chem. Educ. 1996, 73, 222–225.
30. Ford, N.; Wood, F.; Walsh, C. Online CD-ROM Rev. 1994, 18 (2), 79–86.
31. Spink, A.; Losee, R. M. Annu. Rev. Information Sci. Technol. 1996, 31, 33–78.
32. Stolow, R. D.; Joncas, L. J. J. Chem. Educ. 1980, 57, 868–873.
33. Smith, S.; Stovall, I. J. Chem. Educ. 1996, 73, 911–915.
34. Treadway, W. J. J. Chem. Educ. 1996, 73, 876–888.
35. Bell, M.; Gladwin, R. P.; Drury, T. A. J. Chem. Educ. 1998, 75, 781–785.

Journal of Chemical Education • Vol. 78 No. 10 October 2001 • JChemEd.chem.wisc.edu