Research: Science and Education

Chemical Education Research
edited by Diane M. Bunce, The Catholic University of America, Washington, D.C. 20064

Mapping Students' Thinking Patterns in Learning Organic Chemistry by the Use of Knowledge Space Theory

Mare Taagepera* and S. Noori
Department of Chemistry, University of California, Irvine, CA 92697-2025; *[email protected]

Overview

Tracking the development of students' conceptual understanding of organic chemistry during the one-year sophomore course has revealed both expected and unexpected results. As expected, the students' knowledge base increases; but the cognitive organization of the knowledge is surprisingly weak, and misconceptions noted by numerous authors (1–3) persist even after two years of college chemistry. We used a simple analysis of the percentage of correct responses on pretests and posttests and established the connectivity of these responses by using the knowledge space theory (KST) developed by Falmagne et al. (4). The application of KST to science concepts has been demonstrated by Taagepera et al. (5). Student responses were used to define the students' knowledge structure and their critical learning pathway, which reflect the cognitive organization of knowledge. If the responses are completely random, a knowledge structure cannot be constructed. Ideally, there will be some hierarchical ordering of concepts. For instance, in mathematics, one may assume that students need to know how to add before they can multiply.

Hierarchy in Organic Chemistry

Is there a hierarchical ordering in organic chemistry? If we could agree on what that is, could we teach in such a way that students learn to construct their knowledge rather than memorize seemingly unrelated facts for the exam, as has been documented for conceptual vs algorithmic learning (6)? Even if the experts in the field can agree on a logical hierarchy, do students as novices necessarily follow the same path? There is research to indicate that the thinking patterns of experts and novices can differ considerably in solving the same problem (7). Or do we perhaps not even need to agree on what the hierarchy is, as long as students learn to make their own connections through concept mapping similar to that developed by Novak and others (8) or by using alternative methods?

How would one construct a hierarchy in organic chemistry? What is the organizing principle? Students often use the textbook chapter numbers or even the time of the year when something was taught. A number of organizing principles have been used in textbooks, such as the functional group approach and the bonding approach (single, double, triple bonds). In our experience, the most useful fundamental organizing principle is the electron density distribution in the molecule, similar to the approach applied by Shusterman and Shusterman (9).

Our Approach to Hierarchy

Our approach differs somewhat from the one used by Shusterman and Shusterman, in that we ask students to

predict electron densities from simple electronegativities and resonance structures. This does lead to some differences. For example, in formaldehyde the hydrogen atom is the most electron deficient in absolute terms, but the carbonyl carbon is the center of nucleophilic attack, as predicted from electronegativities and the contributing resonance structure. Although the approach is crude, it is simple, and it allows the student to "get to first base" in predicting the physical and chemical properties of new compounds. In regard to chemical properties, it predicts that the negative end of one molecule will attack the positive end of another. This is a first approximation, but it prevents students from making the most obvious mistake: having a negative end of one molecule attack the negative end of another one. In regard to physical properties, the ability to sort out electron densities as reflected in ionic, polar covalent, and covalent bonds leads to predictions of changes of state as well as solubilities from the strength of intermolecular attraction. Further applications lead to an understanding of oxidation and reduction by looking at the electron distribution around a carbon atom, and to an understanding of relative nucleophilicities (which can be related to the concentration of charge and solvation) and leaving-group abilities (which can be related to the diffusion of charge). This fundamental principle, if used with enough practice, will not easily be forgotten and should serve the students well into the future in predicting the physical and chemical properties of any chemical compound. The more sophisticated electron density mapping is easily accessible through the MacSpartan program (10). Lecture demonstrations of density mapping are now possible: the electron-dense areas of the molecule (in this case, red) and the electron-deficient areas (blue) are easily visible. This visualization is a powerful tool in helping students form mental images of electron distributions.
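As a rough illustration of this first-approximation reasoning, the bond-by-bond bookkeeping can be written in a few lines of code. The sketch below is our own illustration, not part of the course materials; the electronegativity values are standard Pauling values, while the molecule encodings and the 0.5 cutoff are arbitrary illustrative choices.

# Sketch: label the polar bonds in a molecule with partial charges from
# Pauling electronegativities -- the "first base" approximation described above.
# The electronegativity values are standard; the 0.5 cutoff and the way the
# molecules are encoded as bond lists are arbitrary illustrative choices.

PAULING = {"H": 2.20, "C": 2.55, "N": 3.04, "O": 3.44}

def bond_polarities(bonds, cutoff=0.5):
    """bonds: list of (atom1, atom2) element-symbol pairs.
    Returns (delta_minus_atom, delta_plus_atom, difference) for each polar bond."""
    labels = []
    for a, b in bonds:
        diff = PAULING[a] - PAULING[b]
        if abs(diff) < cutoff:        # treat C-H and similar bonds as essentially nonpolar
            continue
        neg, pos = (a, b) if diff > 0 else (b, a)
        labels.append((neg, pos, round(abs(diff), 2)))
    return labels

# Methanol, CH3-O-H, encoded as a list of bonds.
methanol = [("C", "H"), ("C", "H"), ("C", "H"), ("C", "O"), ("O", "H")]
# Formaldehyde, H2C=O (the double bond is treated as a single polar linkage here).
formaldehyde = [("C", "H"), ("C", "H"), ("C", "O")]

print(bond_polarities(methanol))      # C-O and O-H bonds: O is delta-, C and the hydroxyl H delta+
print(bond_polarities(formaldehyde))  # C=O bond: O is delta-, the carbonyl carbon delta+

Resonance effects, such as the formaldehyde contributor that places a positive formal charge on carbon, are not captured by this bond-by-bond comparison and have to be layered on separately, as is done qualitatively in the course.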

Defining a Knowledge Space in Organic Chemistry

If we can define a knowledge space in organic chemistry, this would also allow us to present a unified conceptual structure. In our case, the students need to know something about electron densities to predict physical and chemical properties. Facts or concepts that do not connect to anything else should be left out. The Third International Mathematics and Science Study (TIMSS) for K–12 indicated that the science curriculum in the United States covers more topics at a less demanding level than curricula of countries whose students outperform ours (11). Our curriculum is "a mile wide and an inch deep". The same holds for organic chemistry. In an effort to maintain a reasonable size, some textbooks approach an outline format with disconnected topics instead of telling a conceptual story with all the appropriate connections. Too



many of the chapter problems are at a simple recall level. We are also teaching this material to students who do not expect to see linkages because of their K–12 training. Establishing the necessary connections will take time and textbook space, but if we do not do it, the students will forget the material soon after the final exam. It is interesting that working memory seems to be limited (12). Since it contains both facts and the procedural knowledge for chunking those facts, the more facts we present, the less space will be available for processing. Students will therefore not be able to perform the processing necessary for transfer of information from working memory to long-term memory.

The Research Design

The experimental group consisted primarily of biology majors who were enrolled in the regular organic chemistry course at the University of California, Irvine. Students entering this course were required to have passed a one-year course in general chemistry. The academic-year sequence comprises 30 weeks divided into three quarters (CHEM 51A, 51B, and 51C). Most courses in the sequence are offered during each of the quarters, Fall (F), Winter (W), and Spring (S).

Table 1. Schedule of Tests and Number of Students Tested

                                     Students Tested (No.)
Test Date              Class         Pretest    Posttest
1995 Winter Quarter    CHEM 51B        269        237
1995 Spring Quarter    CHEM 51C        171         -
1995 Fall Quarter      CHEM 51C        210        123
1996 Fall Quarter      CHEM 51A        379        290
1997 Winter Quarter    CHEM 51B        254        217
1997 Spring Quarter    CHEM 51C        463         -

Table 2. Content Knowledge in the Three Stages of Organic Chemistry

         Question No.(a)
Class     1    2    3    4    5    6    7    8    9
51A      57   52   36   40   45   26   11   24   13
51B      73   64   78   91   76   34   34   36   31
51C      63   69   64   86   71   28   41   27   37

(a) Values represent the percentage of students who answered the question correctly. Data for pretests and posttests are combined.

Figure 1. Comparison of students in three stages of organic chemistry.

Timetable of Administered Tests

The research design and the KST model required that students be given both a pretest and a posttest so that once a learning pathway was developed, a student's progress through this pathway could be followed. A schedule of tests is shown in Table 1. Approximately 2600 tests were administered. Posttests were not given in classes taught by instructors who did not necessarily emphasize electron density approaches. The same test was given throughout the study to all classes at various stages of learning organic chemistry, to follow the construction of a knowledge structure. It was developed by the research group, which was composed of UCI faculty and students who had taken or were taking organic chemistry, and was based on the first four chapters of an organic textbook—in our case, Eğe's text (13). The test contained nine questions arranged in order of difficulty, as determined by experts on the assumption that if students could determine electron distribution in a molecule on the basis of electronegativities, they should then be able to predict physical properties (state changes and solubilities) from intermolecular attractions, and reactivities by determining the most electron-dense site (base, nucleophile) and the least electron-dense site (acid, electrophile). The test questions and acceptable answers are given in the Appendix.

Scoring the Test

The test was scored in a binary fashion: each question was graded as either right or wrong; no partial credit was given. For example, if the explanation for boiling points or solubilities in question 3 or 4 was incorrect, then the whole answer was incorrect even if the student had circled the right compound. For reactions 5–9, credit was given for a product that was reasonably correct, with no penalty for incorrect electron density distribution. Partial analysis indicated that of the students who did get the correct answer, approximately 75% also had the correct charge distribution.

Results

Results were analyzed by looking at the percentage of correct responses and by applying KST to look for the connectivity in responses (the student's cognitive organization of the material). Some caution is needed in interpreting the data because we did not have the same number of students taking the pretests and posttests, nor did the same students continue the three-quarter sequence in the same section. Since we have two or three parallel sections, students who took the test in CHEM 51A might not have taken it in CHEM 51B and might have taken it again in CHEM 51C. The instructors, and therefore the emphasis, also change from one course to another. However, the trends noted have been reproduced. In CHEM 51B we repeated the testing twice, and in CHEM 51C, three times. Students who took the test multiple times could have recognized the questions, but since the answers were never discussed explicitly, a correct answer would still indicate some learning on the student's part. The only exception to this was in CHEM 51A F '96, where a dedicated effort was made to constantly refer back to electron densities and explicit questions about electron densities also appeared on the examinations.

In general, content knowledge increases during the first two quarters and then drops off in the third, as shown in Table 2 and Figure 1. Students who are better at identifying electron



densities also do better in predicting physical properties (questions 3 and 4), although they should have had quite a bit of practice in general chemistry. They also improve with a simple acid–base reaction (CH3OH + NaOH, question 5). Not surprisingly, the most difficult questions remain the chemical reaction questions applied to carbonyl compounds (questions 7–9). Although their performance improved over time, after two years of chemistry, students still had problems recognizing that the alcohol or carbonyl function can be protonated (questions 6 and 8) and still had difficulty predicting nucleophilic substitution (addition) reactions (questions 7 and 9). Approximately 70% of the students mastered the simpler concepts (questions 1–5) by the end of the year, but only 30–40% were reasonably sure of their carbonyl chemistry (questions 7–9) or recognized that an alcohol could be protonated (question 6).

Analysis of the Data: Building the Knowledge Structure Using the χ² Method

The formal mathematical details of knowledge space theory are presented in the book on knowledge spaces by Doignon and Falmagne (14). KST depends upon collecting student data from a set of questions reflecting different levels of conceptual development. Although multiple-choice tests can be used, a better option is a test with open-ended questions or, failing that, a multiple-choice test with justification. The questions can be arranged in any order, but each response must be scored as correct or incorrect only. In this manner, an N-question test will generate a maximum of 2^N response states, of which only some are populated. In our 9-question test there are 2^9, or 512, possible response states, from the null state φ, in which no questions are answered correctly, to the state Q, in which all questions are answered correctly. For example, a student who answers questions 1 and 3 correctly is in the response state [1,3]. From all the response states, KST attempts to recognize a subset (called the knowledge structure) of 10 to 30 response states that represents the whole response population to within a predetermined χ² value, usually 0.05. The χ² calculation indicates how well the selected states represent the original data set. The procedure to select these states is a systematic trial-and-error process. One starts with the most populated response states, then adds and subtracts response states to minimize the χ² value while forming an interconnected network in which each state (other than φ and Q) has a preceding state and a succeeding state and each successive state has exactly one more question than the preceding one (i.e., the structure is well graded). The states in the final knowledge structure can number from a few to several dozen, depending upon the complexity of the knowledge structure. The relative probability that a particular response state (now called a knowledge state) is in the given knowledge structure is calculated during the χ² analysis. Once the knowledge structure is determined, each knowledge state j has an approximate probability value Pj from the χ² fit.

Several pathways exist in the knowledge structure. The dominant ones are approximated by a sequence of N knowledge states from φ to Q that have the highest probabilities. The most probable pathways will stand out from the rest even though their probabilities might have been undervalued in the fitting process. Therefore, we examine the probabilities expecting to find a few pathways with significantly higher probabilities, and we designate them as critical learning pathways.
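To make this bookkeeping concrete, the following sketch shows one way the response-state tally, the well-gradedness condition, and a χ² goodness-of-fit value could be computed. It is our own simplified illustration, not the Falmagne fitting procedure used in the study (which also models careless errors and lucky guesses); the function names, the toy answer sheets, and the candidate state probabilities are all hypothetical.

# Sketch of the bookkeeping described above (a simplification, not the full
# Falmagne procedure): response states are sets of correctly answered questions,
# a knowledge structure is a chosen subset of those states, and a chi-square
# value measures how well the chosen subset represents the observed data.

from collections import Counter

def response_state(scores):
    """scores: dict question -> 0/1. Returns the frozenset of correct questions."""
    return frozenset(q for q, ok in scores.items() if ok)

def well_graded(structure, questions):
    """True if the structure contains the null and full states and every other
    state has a predecessor in the structure with exactly one fewer question."""
    if frozenset() not in structure or frozenset(questions) not in structure:
        return False
    return all(any(state - {q} in structure for q in state)
               for state in structure if state)

def chi_square(observed_counts, state_probs, n_students):
    """Compare observed state counts with the counts expected from the fitted
    probabilities; observed states outside the structure add their full count."""
    total = 0.0
    for state, count in observed_counts.items():
        expected = n_students * state_probs.get(state, 0.0)
        total += (count - expected) ** 2 / expected if expected > 0 else count
    return total

# Hypothetical binary-scored answer sheets for a 3-question test.
sheets = [
    {1: 1, 2: 0, 3: 0}, {1: 1, 2: 1, 3: 0}, {1: 1, 2: 1, 3: 0},
    {1: 1, 2: 1, 3: 1}, {1: 0, 2: 0, 3: 1},
]
observed = Counter(response_state(s) for s in sheets)

# A hypothetical well-graded candidate structure with fitted probabilities.
state_probs = {
    frozenset(): 0.05, frozenset({1}): 0.20, frozenset({1, 2}): 0.40,
    frozenset({1, 2, 3}): 0.35,
}
print(well_graded(set(state_probs), {1, 2, 3}))                  # True
print(round(chi_square(observed, state_probs, len(sheets)), 2))  # fit quality

In the actual analysis the candidate states and their probabilities are adjusted iteratively, adding and removing states to minimize the χ² value while keeping the structure well graded.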

Table 3. Responses and Response States of Three Students on a Hypothetical 2nd-Grade Math Test

                          Response of Student
Problem                    1        2        3
(1) 3 + 3                  6        6        9
(2) 2 + 2 + 2              6        4        6
(3) 3 × 2                  5        6        6
% Correct                 67       67       67
Response State          [1,2]    [1,3]    [2,3]

From the knowledge structure and the critical learning pathways we attempt to understand the student message and determine how far the novice pathway is from the expert pathway. To augment the process we can use student interviews, interpretations of the justifications of their answers, and other techniques.

A simple example that might result from a second-grade mathematics test consisting of 3 questions is shown in Table 3. Suppose the test were given to 3 students, all of whom answered 2 of the 3 questions correctly for a score of 67%. Yet their response states are different, demonstrating a possibly differing understanding of the addition and multiplication concepts. Student 1, in the [1,2] response state, can probably add but does not yet know how to multiply. Student 2 might be unsure of adding three numbers and might have simply gotten question 3 correct by a lucky guess; the other possibility is that she or he made a careless error in question 2. Student 3 probably made a careless error in question 1.
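The same point can be made in a few lines of code (a hypothetical snippet mirroring Table 3, not part of the study): the three students have identical percent scores, but their response states differ, and it is the state rather than the score that carries the diagnostic information.

# Three students with identical scores (67%) but different response states (cf. Table 3).
tests = {
    "Student 1": {1: True, 2: True, 3: False},
    "Student 2": {1: True, 2: False, 3: True},
    "Student 3": {1: False, 2: True, 3: True},
}
for name, scores in tests.items():
    state = sorted(q for q, ok in scores.items() if ok)
    pct = round(100 * sum(scores.values()) / len(scores))
    print(f"{name}: {pct}% correct, response state {state}")
# Student 1: 67% correct, response state [1, 2]  -> can add, not yet multiply
# Student 2: 67% correct, response state [1, 3]  -> lucky guess or careless error
# Student 3: 67% correct, response state [2, 3]  -> careless error on question 1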

One test will not give a definitive analysis, but repeated tests will. The diagnosis will help the teacher to plan the lesson better for the whole class as well as help each student individually. Falmagne's team built a knowledge space for grades 1–8 mathematics using a set of approximately 500 problems. It is currently in use in Irvine Unified School District schools. Students take a short test (some subset of the knowledge space) each week, and the teacher has immediate feedback on the knowledge state of each student. More information about the project, named ALEKS, is available on the Internet (15).
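Before turning to the organic chemistry data, the sketch below illustrates how a critical learning pathway could be read off a fitted structure, under the simplifying assumption that the knowledge-state probabilities from the χ² fit are already in hand: starting at φ, repeatedly move to the most probable state that adds exactly one more question until Q is reached. The probabilities shown are invented for illustration.

# Sketch: extract a dominant (critical) learning pathway from fitted state
# probabilities by greedily choosing, at each step, the most probable
# successor state that adds exactly one more question.

def critical_pathway(probs, questions):
    """probs: dict frozenset(state) -> probability. Returns a list of states
    from the null state to the full state Q, stopping early if no successor exists."""
    path = [frozenset()]
    while path[-1] != frozenset(questions):
        current = path[-1]
        successors = [s for s in probs
                      if len(s) == len(current) + 1 and current < s]
        if not successors:
            break  # the structure is not well graded past this state
        path.append(max(successors, key=lambda s: probs[s]))
    return path

# Hypothetical probabilities for a 4-question structure (illustrative only).
probs = {
    frozenset(): 0.05,
    frozenset({4}): 0.15, frozenset({1}): 0.05,
    frozenset({3, 4}): 0.20, frozenset({1, 4}): 0.05,
    frozenset({2, 3, 4}): 0.20,
    frozenset({1, 2, 3, 4}): 0.30,
}
for state in critical_pathway(probs, {1, 2, 3, 4}):
    print(sorted(state))   # [] -> [4] -> [3, 4] -> [2, 3, 4] -> [1, 2, 3, 4]

With these invented numbers the walk visits [], [4], [3, 4], [2, 3, 4], [1, 2, 3, 4], which happens to mimic the novice ordering discussed below (physical properties before charge densities).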

Data Analysis

Optimization of 8 of the 10 sets of data produced from 6 different courses gave a well-defined knowledge structure. Two sets of data could not be optimized: the pretest for the beginning of the sequence (51A F '96) and the posttest for 51B W '95. That there was no logical pattern at the beginning of the year (51A F '96) was not surprising, because the students had little knowledge of organic chemistry. The 51B W '95 data could not be optimized because approximately 30% of the students answered all of the questions correctly, and of the remainder there were too few students in too many knowledge states.

Figure 2 presents two critical learning pathways: one for the experts, who base their reasoning on electron densities, and the other for novices, which summarizes most of the main


features of the eight data sets. There were minor variations in every data set, depending on the material that had just been covered. In the specific critical learning pathway for the CHEM 51C F '95 posttest, presented in Figure 3, the charge densities around the carbonyl function are correctly identified somewhat earlier along the pathway. Most of the CHEM 51B and 51C data could best be summarized by the "novice structure". Instead of understanding the structure–reactivity analysis on the basis of electron densities, as in the "expert structure", the students start with algorithmic knowledge: methanol dissolves in water (question 4); methanol has a higher boiling point than ethane (question 3); sodium hydroxide reacts with methanol (question 5), as represented in the novice structure. The implication is that in spite of the reference to polarity in answering questions 3 and 4, students seem not to appreciate the more fundamental fact that polarity is a reflection of differences in electron distribution. If they forget that alcohols could be polar, they will have no way of deducing this from first principles.

The 51A F '96 posttest was unique in being similar to the expert structure in the acquisition of charge distributions before physical and chemical properties. The reason is that, having learned from our previous experience, we did not simply introduce the notion of charge distribution during the first lecture and refer to it casually thereafter, but emphasized it throughout the quarter. Moreover, the students had had no previous organic chemistry, so their basic conceptions could be systematically developed using the electron density approach.

A specific critical learning pathway analysis is given in Figure 3. The data were optimized using only 18 of the possible 512 knowledge states. In this case there were two major pathways, as shown. The majority of students were on the critical learning pathway designated by the thick lines in Figure 3 (the sequence of obtaining correct responses is 4, 3, 2, 1, 5, 7, 8, 9, 6). There was approximately a 50:50 branching in the order of getting questions 1 and 5 correct when progressing from knowledge state [234] to [12345].

The critical learning pathway in CHEM 51C showed that the majority of students answered the questions about physical

properties correctly: first the solubility question (4) and then the boiling point question (3). The electronic structure or charge questions followed: first the charge distribution on the carbonyl function (2) and then on an alcohol (1), again showing that the students did not relate the electron distribution to predicting physical properties. The better knowledge of the charges on formaldehyde might reflect the fact that the carbonyl function had been studied for the preceding two quarters. Then there was about an equal probability of getting the alcohol charge question (1) and the simple acid–base reaction (5) correct. The general pattern that held in Figure 2 is still valid, although the charge density on the carbonyl (question 2) comes earlier. As in the general pattern, the carbonyl reactions are still difficult even after 2 quarters (questions 7–9); and surprisingly, the Lewis base property of alcohols (question 6) is the most difficult of all.

Whether the product predictions (questions 6–9) had been arrived at by making the correct charge assignments was ascertained by looking at a sample of the tests. Approximately 25% of those who got the correct answer had some ambiguity in the designation of the charges. However, the question was counted as correct if the product was correctly designated.

Fifty students were followed through the 3-quarter sequence to study the time dependence of the critical learning pathway. All 50 stayed pretty much on the pathway that they started on, losing not more than one knowledge state and gaining several as the year progressed. A simplified version of the critical learning pathway calculation is given on the Internet for a 10-question test (16).

Common Misconceptions

The misconceptions about physical properties became very obvious in the written responses. Common misconceptions persist throughout the freshman and organic chemistry courses. In this test they could be placed into the following categories: (i) belief that bond polarities depend on absolute electronegativities of atoms only, whether they are connected or not (e.g., hydrogen will always be positively charged); (ii) confusion of boiling and burning, thinking that covalent bonds are broken on boiling; (iii) inability to recognize reaction types

[Figure 2. Critical learning pathways for experts and novices. Expert structure (hypothetical critical learning pathway): charge densities (questions 1, 2), then boiling point (3) and solubility (4), then acid/base (5, 6), then nucleophile/electrophile (7–9); the assumption is made that acid/base chemistry is easier for experts. Novice structure (most common critical learning pathway from student results): the physical-property and acid–base questions precede the charge-density questions.]

[Figure 3. Knowledge structure for students in CHEM 51C F '95. The structure contains 18 knowledge states between φ (no questions correct) and Q (all questions correct); for example, 234 represents students who had answered questions 2, 3, and 4 correctly. The critical learning pathway for most students passes through the states 4, 34, 234, 1234, 12345, 123457, 1234578, and 12345789.]


(tending to do nucleophilic addition reactions to carbonyl groups with strong acids, not recognizing a simple proton transfer reaction); and (iv) belief that hydrogen bonding involves a covalent bond.

Conclusions

This research indicates that we need to spend a lot more time on making connections than we generally do, regardless of the organizing principles that we use. Since students' knowledge is often algorithmic rather than based on basic principles, students have difficulty solving problems in new situations or retaining knowledge. Instructors need to spend more time helping students to construct a knowledge space based on basic principles. This implies that we need to "cover" less in more depth and change our teaching strategies as a result of continuous monitoring. If students are obviously unable to relate new information to basic principles, there is little justification in moving on. Merely testing new information does not necessarily show connections to basic principles unless these are continuously monitored. Nonmajors will not have the luxury of waiting until they obtain their Ph.D.'s to finally gain enough insight to organize their own information. We need to be more aware of our own knowledge structure and make it more transparent for the students. Otherwise the students' cognitive structure will remain weak.

Students are generally not tracked throughout the one-year sequence using the same probe or test, as was the case here. The low level of understanding of carbonyl chemistry at the end of the year was disappointing. The implication is that regular test results show a deceptively high level of understanding of organic chemistry because much of the information is memorized for the test and then forgotten—there is no carry-over into long-term memory. This carry-over is critical in a sequential course such as organic chemistry. A lot of information will be lost if there is no follow-up advanced organic chemistry course where some of the concepts can be reinforced and conceptual understanding developed.

The analysis based on electron densities was introduced during the first lecture of every quarter and referred to thereafter, except for CHEM 51C in S '95 and '97, when other instructors were involved. It is clear that a one-time exposure to electron densities and casual reference thereafter, although it is a very simple idea, is not enough; tests need to include a lot of "why?" questions relating to electron densities. Rote memorization may to some extent suffice for multiple-choice questions. Unless the "why?" questions, which take longer to grade, are asked, we will not be able to probe the students' thinking patterns. For instance, approximately 30% of the students circled the correct molecule in questions 3 and 4 (relative boiling points and solubilities of methanol and ethane) for entirely the wrong reason. Knowing this, we should be able to bring the novice structure closer to the expert structure by continually referring to the expert structure, thereby making it more transparent. This appeared to be the case in CHEM 51A F '96, where a conscious effort was made to repeatedly emphasize the electron density analysis in lecture as well as to include it explicitly on examinations.

Our awareness of the difficulty of countering the misconceptions necessitates addressing these issues head on. For instance, when talking about a hydrogen bond it is necessary to

continue saying that this is not a covalent bond and to describe why that is the case. It is gratifying that textbook authors are beginning to describe the common problems or misconceptions that students have.

What are our goals for teaching the one-year organic chemistry course for nonmajors? Our philosophy is that students should be able to recognize structure–reactivity relationships and have enough practice doing simple synthesis problems to propose a reasonable synthesis (and the relevant mechanisms) for almost any simple compound. These results indicate that the structure–reactivity relationship goal is not achieved unless the underlying principles are continually reinforced. The KST analysis provides a new approach to the assessment of students' cognitive organization of knowledge and therefore enables us to teach with greater insight.

The next phase of the study will make some of the questions less ambiguous, probe for more specificity in the answers to the "why?" questions, and be more selective of the student pool. General chemistry students' understanding of bonding concepts will also be probed using the KST approach.

Acknowledgments

We would like to thank Jean-Claude Falmagne for his unwavering willingness to guide us; Frank Potter, who did all of the KST calculations, for his enthusiasm and support during the project; and the many undergraduate students who collected the data and shared their insights with us: Ronald Picazo, Jon Detterich, Mildred Fabros, Albern Yolo, Janine Dymand, Kristopher Rinehart, Ahmad Kamal, Siamak Abai, Rachel Estrada, Najeeb Khan, and Arash Soroudi.

Literature Cited

1. Zoller, U. J. Res. Sci. Teach. 1990, 27, 1053–1065.
2. Nakleh, M. B. J. Chem. Educ. 1992, 69, 191–196.
3. Peterson, R. F.; Treagust, D. F. J. Chem. Educ. 1989, 66, 459–460.
4. Falmagne, J.-C.; Doignon, J.-P. J. Math. Psychol. 1988, 3, 232–258.
5. Taagepera, M.; Potter, F.; Miller, G. E.; Lakshminarayan, K. Int. J. Sci. Educ. 1997, 19, 283–302.
6. Nakleh, M. B.; Lowrey, K. A.; Mitchell, R. L. J. Chem. Educ. 1996, 73, 758–762.
7. Larkin, J.; McDermott, J.; Simon, D. P.; Simon, H. Science 1980, 208, 1335–1342.
8. Pendley, B. D.; Bretz, R. L.; Novak, J. D. J. Chem. Educ. 1994, 71, 9–15.
9. Shusterman, G. P.; Shusterman, A. J. J. Chem. Educ. 1997, 74, 771–776.
10. MacSpartan, version 1.0, and PC Spartan, version 1.0; Wavefunction, Inc.: Irvine, CA; http://www.wavefun.com/ (accessed Apr 2000).
11. Schmidt, W. H.; McKnight, C. C.; Raizen, S. A. A Splintered Vision: An Investigation of U.S. Science and Math Education; Kluwer: Boston, 1996.
12. Johnstone, A. H. J. Chem. Educ. 1997, 74, 262–268.
13. Eğe, S. Organic Chemistry; Heath: Lexington, MA, 1994; pp 1–154.
14. Doignon, J.-P.; Falmagne, J.-C. Knowledge Spaces; Springer: London, 1999.


15. Falmagne, J.-C. ALEKS at UC Irvine: A Complete Educational System for Arithmetic and Elementary Algebra; http://www.aleks.uci.edu (accessed Apr 2000).
16. Potter, F. Simplified Version of KST Analysis; http://chem.ps.uci.edu/~mtaagepe/KSTBasic.html (accessed Apr 2000).

Appendix: Organic Chemistry Test, Answers, and Comments

NOTE: This test is a diagnostic tool which will not affect your grade.
NAME: ________________________   ID #: ____________

Questions 1, 2. Place "−" next to the atom with the highest concentration of electron density and "+" next to the atom with the lowest concentration of electron density.

Question 1: methanol, CH3–O–H. Question 2: formaldehyde, H2C=O.

According to the expert pathway, questions 1 and 2 are the easiest. Answering them correctly requires only the fundamental concept of electron density. Ordering the atoms present in order of increasing electronegativity (H ≅ C < O) is sufficient information to allow the student to correctly answer these questions. The correct responses place δ− on the oxygen of methanol, with δ+ on the carbon and the hydroxyl hydrogen, and δ− on the oxygen of formaldehyde, with δ+ on the carbonyl carbon; the equivalent formaldehyde resonance contributor with formal charges was also accepted.

Using similar reasoning for electronegativities, students should be able to identify the polarity of the carbonyl group. Although hydrogen and carbon have similar electronegativities, students must realize that the hydrogen does not bear a partial positive charge because of its spatial relation to the oxygen atom. A knowledge of resonance contributors, while not necessary, may help students with this question: the polarity of the carbonyl group in formaldehyde is rationalized by its resonance contributor in which carbon bears a positive formal charge and oxygen bears a negative formal charge. (Students have not seen the computer-generated electron density map for formaldehyde, in which the H's are the most electron deficient in absolute terms.)

Question 3. (a) Circle the compound with the higher boiling point: CH3OH vs CH3CH3. (b) Explain.

The correct response is CH3OH. According to the expert pathway, this question leads the student to the next level of difficulty. The question requires knowledge at two levels. Identification of electronegativities is required to determine the polarity of the molecules. Students are then required to know that owing to its polarity, a methanol molecule has stronger intermolecular attractions. A basic understanding of boiling is also essential: students are expected to know that boiling requires overcoming intermolecular forces. Finally, students should realize that methanol has a higher boiling point owing to its stronger intermolecular forces—in this case, hydrogen bonding.

Question 4. (a) Circle the compound that would be more soluble in H2O: CH3OH vs CH3CH3. (b) Explain.

The correct response is CH3OH. Question 4 is approximately as difficult as question 3. Once again using the concept of electronegativities, the polar and nonpolar molecules should be determined. Using similar reasoning, students should know that water is a polar molecule and realize that the polarities of methanol and water enable the formation of intermolecular attractions—the formation of hydrogen bonds between water and methanol.

Questions 5–9. (a) Choose the molecule that would have the highest concentration of either + or − charge. (b) Place "−" next to the atom with the highest electron density and "+" next to the atom with the lowest electron density. (c) Predict products for the following reactions:

5. CH3OH + NaOH →
6. CH3OH + H2SO4 →
7. H2C=O + NaOH →
8. H2C=O + H2SO4 →
9. H2C=O + NH3 →

These questions require knowledge on three levels: (i) electronegativity, to predict the most positive and negative sites; (ii) selection of the molecule with the most negative (base, nucleophile) and most positive (acid, electrophile) site; and (iii) the product of the reaction. The accepted products were:

5. CH3OH + NaOH → CH3O− Na+ + H2O
6. CH3OH + H2SO4 → CH3OH2+ + HSO4−
7. H2C=O + NaOH → HO–CH2–O− Na+ (other acceptable answers included "no reaction")
8. H2C=O + H2SO4 → [H2C=OH]+ + HSO4−
9. H2C=O + NH3 → H2C(O−)(NH2) (other acceptable answers included H2C(O−)(NH3+) and the imine H2C=NH)
