Research: Science and Education

Chemical Education Research
edited by Diane M. Bunce, The Catholic University of America, Washington, DC 20064

Attitude toward Chemistry: A Semantic Differential Instrument for Assessing Curriculum Impacts

Christopher F. Bauer
Department of Chemistry, University of New Hampshire, Durham, NH 03824; [email protected]

Instructors and curriculum innovators often set goals and make decisions intended to enhance student attitudes toward the subject matter of chemistry. Determining whether such efforts are successful requires assessing those attitudes in a valid and reliable manner. Unfortunately, chemists often perceive "attitude" as a unidimensional construct—which it is not—and believe that attitude assessment requires only a few good survey questions—which is not so. Excellent critical reviews in the education, psychology, and sociology literature over the past 30 years (1–14) have strongly argued this point. In a paper in this Journal concerning the Chemistry Self-Concept Inventory (CSCI) (15), this author elaborated on the issues of (i) failing to distinguish among different mental constructs (e.g., attitude, belief, interest, value, self-concept, self-efficacy, understanding science, scientific habits); (ii) basing analyses on single survey items rather than on multiple items that are averaged (the replication advantage); and (iii) using assessments that are not independent of the intervention being assessed.

This article describes a survey instrument for measuring student attitudes regarding "chemistry" as a body of knowledge or practices. (The survey is named the Attitude toward the Subject of Chemistry Inventory, or ASCI.) The sense of "attitude" used here is the tendency to approach or avoid—to react positively or negatively toward—the subject or discipline of chemistry (16–18). These tendencies may be exhibited through expressions of belief, feeling, and behavioral intention regarding a particular object (19). Our understanding of attitude has evolved with the advent of stronger models of personal and social cognition and of motivation. Attitudes are learned; however, the processes by which they become strong are not well understood (18). In particular, considerable attention has focused on the relationship between attitude and behavior (e.g., the theory of planned behavior) (19–21) and on how attitudes change (18). Thus, "attitude toward chemistry" is an important mental construct that may affect and be affected by student learning behaviors.

In addition to the CSCI (15), other authors have developed new tools for noncontent assessment of student outcomes in chemistry: for example, the Chemistry Attitudes and Experiences Questionnaire (CAEQ) (22), which probes self-efficacy, attitudes (toward chemists, chemistry research, chemistry topics, and chemistry careers), and learning experiences; CHEMX (23), which probes cognitive expectations; and the Student Assessment of Learning Gains (SALG) (24, 25).

Instrument Design

The instrument is designed in the format of a semantic differential: students position themselves on a seven-point scale between two polar adjectives, in reference to how they feel about the attitude object "chemistry". The classic works on semantic differentials are those of Osgood (26–28). Most applications in science precede 1985, as a search of the ERIC database using "semantic differential" and "science" will confirm. There are a few more recent applications of this approach (29–31), and the CAEQ (22) attitude scale includes several survey items with this structure.

A prominent theoretical structure for attitude—emerging from factor analysis of results from many contexts and cultures—holds that attitude is composed of components of evaluation (e.g., good–bad, valuable–worthless), potency (e.g., strong–weak, heavy–light), and activity (e.g., fast–slow, excitable–calm) (17, 18). In practice, the evaluation component typically explains most of the variance, so the adjectives selected for the ASCI emphasize that aspect. Furthermore, the older literature sometimes used terms, particularly for the "activity" component, that could only be interpreted metaphorically (26–28). For example, respondents might be asked to describe chemistry as "large–small" or "fast–slow". This problem was avoided by selecting adjectives that would make sense in a statement from one person to another communicating that person's affect concerning chemistry; for example, "Chemistry as a subject is (beneficial, organized, attractive, hard, fun, secure)". An additional criterion was that the adjectives be understandable to a college-age population.

The semantic differential (express your feeling toward chemistry on a scale anchored by polar adjectives) was selected over the typical alternative (indicate level of agreement with a statement, e.g., "I like chemistry") in order to focus attention on a single attitude object—chemistry. It was also desirable to avoid statements that might risk response bias from social desirability (whether subjects believe they should or should not like chemistry, as opposed to whether they do or do not) or from the directionality of the statement (whether a negatively versus a positively worded statement affects responses) (32–34).

Another important design feature of a semantic differential is focusing respondents on a very specific attitude object (17). The ASCI instructions request that respondents not consider their attitude toward chemistry teachers or chemistry courses, but focus on the subject of chemistry. Other research has used "science" as the attitude object. College students clearly are able to distinguish their feelings and performance among disciplines, such as chemistry versus biology versus physics (15, 35). Thus, it was important to focus on the single discipline of chemistry.

The survey form has important physical design features intended to help a subject focus carefully on the adjectives and on the response range.
Seven choices help strengthen the reliability of the instrument and take advantage of the ability of adults to draw distinctions (17). Adjectives and choices are placed on the same line. Some adjective pairs are listed with the "positive" adjective on the right side and some with the "positive" adjective on the left; this helps minimize response bias. For example, if all adjective pairs are listed with the positive sense on the right, respondents may fall into a pattern of acquiescence or mild agreement and not think about each item independently (32–34). The attitude object "CHEMISTRY IS" is featured prominently in large bold letters at the top of the page. The central choice is labeled "middle" several times down the page as a scale referent. Responses may be placed directly on the sheet or on a scanning form. Seven-choice scan sheets may be purchased from a business forms company. It is also possible to administer the survey through a Web form (currently implemented at UNH). The hardcopy instrument is available in the online supplement.
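To make the response format concrete, the sketch below shows one way the items could be represented and rendered, for example for the Web form mentioned above. It is illustrative only: the adjective pairs are taken from Table 1, but the item order, left–right arrangement, and exact wording of the printed form are not reproduced here.

```python
# Illustrative only (not the published form): a few ASCI adjective pairs from Table 1,
# each rated on a 7-point scale anchored by the two adjectives.
ITEMS = {
    15: ("worthwhile", "useless"),
    2:  ("worthless", "beneficial"),
    4:  ("complicated", "simple"),
    18: ("safe", "dangerous"),
    # ...remaining pairs as listed in Table 1
}

def render_item(number: int) -> str:
    """Return a one-line text rendering of a single semantic-differential item."""
    left, right = ITEMS[number]
    return f"CHEMISTRY IS:  {left}  1  2  3  4  5  6  7  {right}"

print(render_item(18))
```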

Student Populations

For the exploratory factor analysis, students in a general chemistry course at UNH, electing to participate through informed consent, completed the instrument. These students (about two-thirds of the course in the first year) represented diverse majors, including engineering, sciences, health and human services, and liberal arts. The inventory was administered in one specific lab room across an entire week near the end of the first semester. Because students are scheduled into lab sections independent of lecture section, this group represented a cross section with respect to lecture instructor, laboratory teaching assistant, academic major, lab time preference, and day preference. The validity of factor analytic results is strengthened when they are based on a representative sample of respondents from the population of interest (17, 36). The number of usable surveys was 379.

Another student cohort from this same course was surveyed in a subsequent year to gather test–retest reliability data. Students were selected in the same manner.

Table 1. Attitude toward Chemistry Inventory Items with Factor and Loading Profiles

Item^a   Polar Adjectives                    Factor 1^b   Factor 2^b   Factor 3^b   Factor 4^b

Interest and Utility
  15     worthwhile–useless                    −0.85        −0.01        −0.06        −0.11
   2     worthless–beneficial                  −0.79        −0.10        −0.03        −0.04
   6     good–bad                              −0.71        −0.05        −0.20        −0.04
  12     interesting–dull                      −0.67        −0.32        −0.02        −0.15
   3     exciting–boring                       −0.58        −0.38        −0.05        −0.09

Anxiety
  19     tense–relaxed                         −0.14        −0.75        −0.32        −0.02
  16     work–play                             −0.06        −0.74        −0.23        −0.15
   8     scary–fun                             −0.35        −0.60        −0.18        −0.16
  20     insecure–secure                       −0.34        −0.53        −0.23        −0.29
  13     disgusting–attractive                 −0.42        −0.53        −0.01        −0.11

Intellectual Accessibility
   4     complicated–simple                    −0.03        −0.13        −0.80        −0.13
   5     confusing–clear                       −0.24        −0.33        −0.75        −0.06
   1     easy–hard                             −0.13        −0.18        −0.73        −0.34
  10     challenging–unchallenging             −0.29        −0.36        −0.54        −0.01
   9     comprehensible–incomprehensible       −0.38        −0.03        −0.52        −0.41

Fear
  18     safe–dangerous                        −0.03        −0.05        −0.05        −0.85

Emotional Satisfaction
  11     pleasant–unpleasant                   −0.50        −0.44        −0.35        −0.27
  14     comfortable–uncomfortable             −0.48        −0.43        −0.35        −0.28
  17     chaotic–organized                     −0.44        −0.34        −0.32        −0.15
   7     satisfying–frustrating                −0.41        −0.30        −0.46        −0.28

^a Item numbers in bold type must have their scores reversed before averaging.
^b Loadings greater than |0.5| are considered strong (N = 379).


The first survey was presented during the first week of lab and the unannounced retest during the second week. The number of usable surveys was 65. This latter group of students was also given the CSCI (15) at the same time. We were concerned whether responses might be affected by this co-administration of the two instruments. About half of the students (half of the lab rooms) in the retest cohort were given the CSCI first and then the ASCI immediately afterward; the other half completed them in the reverse sequence. A week later, the retest of each instrument was given only to those students who had completed that particular instrument first.

Additional student populations at the same institution—chemistry majors (n = 13) and students acting as study group leaders for the course (n = 19)—provided results for comparison.

To exemplify its use, the ASCI was administered in a Chemistry and Society course. Chemistry and Society is a predominantly lectureless, inquiry-based, science-elective course for nonscience majors taught in two 2.5-hour sessions weekly. Students engaged in hands-on activities, often structured in the form of the learning cycle (exploration, concept invention, application) (37). Content emphasized chemical concepts over mathematical problem solving. The text Chemistry in Context (38) was used. All students had taken one year of high school chemistry but no college-level physical science course and had little college science laboratory work. There were 21 students (12 female, 9 male). The paper-and-pencil form of the ASCI was given in the first and last weeks of the semester.

Data Analysis

Survey responses were manually transcribed or machine scanned to numerical values in the range 1–7. Statistical tests were performed using a combination of tools, particularly Excel, Minitab, and SPSS.
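For readers who wish to script this step, a minimal sketch of reading and screening the transcribed responses follows. The file name and column names are assumptions; the original analysis was carried out with Excel, Minitab, and SPSS.

```python
# Minimal sketch: load transcribed/scanned responses and keep only rows whose
# twenty item codes fall in the valid 1-7 range. File and column names are assumed.
import pandas as pd

df = pd.read_csv("asci_responses.csv")              # one row per student, columns item_1..item_20
items = [f"item_{i}" for i in range(1, 21)]

valid = df[items].isin(range(1, 8)).all(axis=1)     # every item coded 1-7
df = df.loc[valid].copy()
print(f"usable surveys: {len(df)}")
```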

Table 2. Attitude Scores by Various Student Populations (Percent of Scale Range, 0–100%)

Subscale                        General Chemistry   Peer Leaders   Chemistry Majors

Factors
  Interest and Utility                 56                78               82
  Anxiety                              60                46               43
  Intellectual Accessibility           44                49               46

Items
  Fear                                 38                39               41
  Emotional Satisfaction Set           45                67               70

Note: Categories shown in bold type indicate significant differences among populations at p < 0.01.
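The note to Table 2 refers to tests of differences among the three populations; the Discussion identifies these as analysis of variance. A minimal sketch of such a comparison is given below; the group labels and the column name (e.g., anxiety_pct) are assumptions.

```python
# One-way ANOVA across the three student populations for a single subscale score.
from scipy import stats

def compare_populations(groups, column="anxiety_pct"):
    """groups: dict mapping a population label to a DataFrame of subscale percent scores."""
    samples = [df[column] for df in groups.values()]
    f_stat, p_value = stats.f_oneway(*samples)   # test for differences among population means
    return f_stat, p_value                       # Table 2 flags subscales with p < 0.01
```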

Table 3. Attitude Scores by Various Student Populations: Pearson Correlation Coefficient Values^a

Subscale                        Course Grade   Anxiety   Intellectual Accessibility   Fear   Emotional Satisfaction Set

Factors
  Interest and Utility              0.18
  Anxiety                          −0.20
  Intellectual Accessibility

Items
  Fear
  Emotional Satisfaction Set

Additional table entries, in order of appearance: 0.32, −0.15, 0.64, −0.58, 0.15, −0.72, −0.19, 0.62, −0.05^b, 0.39, −0.51, −0.22, 0.24

^a Correlation values are significant at p < 0.05 (N = 379).
^b Correlation value is not significant for this item.

Results

Exploratory Factor Analysis

Factor analysis helps identify survey items that show similar response patterns. A group of similar items defines what is called a "factor". Detailed discussion of the available procedures and the decisions that must be made can be found in the citations (15, 17, 39). The matrix of two-by-two correlations among items contained a large number of values in midrange (0.3–0.7), indicating that the data set was likely to factor well. Factors were extracted by the principal components method. A scree plot indicated three eigenvalues greater than one, which together explained 55% of the variance. Structure was explored by extracting 2–7 factors using Varimax (orthogonal) rotation and by studying the pattern and magnitude of the loading (degree of association) of each survey item on each factor. Ideally, loadings would be close to −1 or 1 for a few survey items on one specific factor and near 0 for the other factors; the other factors would have a different set of survey items with high and low loadings. The stability of the loadings of individual items was compared manually as the number of factors extracted was varied. Three strong, distinct factors emerged, which were given the labels Interest and Utility, Anxiety, and Intellectual Accessibility. The labels attempt to capture the theme represented by the adjectives loading strongly on each factor. Table 1 shows the survey statements organized by factor and ranked by loading magnitude within the factor. (Loadings range from −1 to 1; values greater than |0.5| are considered strong.)
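The extraction-and-rotation sequence just described can be illustrated with a short script. The sketch below is illustrative only and assumes item responses are available with columns item_1 through item_20 (hypothetical names); the original analysis used standard statistical packages (Excel, Minitab, SPSS).

```python
# Sketch of the principal-components extraction, Kaiser criterion, and varimax
# rotation described above. Column names and the input file are assumptions.
import numpy as np
import pandas as pd

responses = pd.read_csv("asci_responses.csv")
X = responses[[f"item_{i}" for i in range(1, 21)]].to_numpy()

R = np.corrcoef(X, rowvar=False)                       # 20 x 20 item correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]                      # sort components by eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("eigenvalues for scree plot:", np.round(eigvals, 2))
k = int(np.sum(eigvals > 1.0))                         # Kaiser criterion (eigenvalue > 1)

loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])       # unrotated principal-component loadings

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal (varimax) rotation of a loading matrix (standard algorithm)."""
    p, m = L.shape
    R_rot = np.eye(m)
    d = 0.0
    for _ in range(max_iter):
        Lam = L @ R_rot
        u, s, vt = np.linalg.svd(
            L.T @ (Lam ** 3 - (gamma / p) * Lam @ np.diag(np.diag(Lam.T @ Lam)))
        )
        R_rot = u @ vt
        d_new = s.sum()
        if d > 0 and d_new / d < 1 + tol:
            break
        d = d_new
    return L @ R_rot

rotated = varimax(loadings)    # compare the rotated loading pattern with Table 1
```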

The high degree of relatedness of the items within each factor suggests that these items can be combined into a single subscale score. For instance, Intellectual Accessibility for one student consists of the sum of the values for items 1, 4, 5, 9, and 10, with one adjustment: items that load with opposite signs (indicated by bold type in Table 1) must be reversed on the scale. Subscale scores may then be calculated as an average of the item ratings. Here, subscale scores are reported as a percent of the scale (scale value 1 = 0%; scale value 7 = 100%), where a larger value indicates a stronger feeling that chemistry is "interesting and useful", "anxiety producing", or "intellectually accessible" for the respective subscale.

Several other adjective pairs cannot be called factors, yet they may provide helpful insight as this instrument is more widely applied. Specifically, one item (statement 18, safe–dangerous) is clearly distinct—this is called the "Fear" item.
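A brief sketch of this scoring procedure follows. The item-to-subscale assignments come from Table 1; which items are reverse-scored is indicated only by bold type in the published table, so the REVERSED set below is a placeholder to be filled in from that table. The percent-of-scale conversion matches the definition above, and the normalized −1 to 1 conversion is the one used later for Figure 1.

```python
# Subscale scoring sketch. REVERSED is a placeholder: fill it with the item numbers
# shown in bold in the published Table 1. Column names (item_1..item_20) are assumed.
import pandas as pd

SUBSCALES = {
    "interest_utility":           [15, 2, 6, 12, 3],
    "anxiety":                    [19, 16, 8, 20, 13],
    "intellectual_accessibility": [4, 5, 1, 10, 9],
    "fear":                       [18],
    "emotional_satisfaction":     [11, 14, 17, 7],
}
REVERSED = set()   # item numbers to reverse-score, per Table 1

def subscale_scores(df: pd.DataFrame) -> pd.DataFrame:
    out = {}
    for name, items in SUBSCALES.items():
        cols = []
        for i in items:
            col = df[f"item_{i}"]
            cols.append(8 - col if i in REVERSED else col)   # reverse a 1-7 rating
        mean = sum(cols) / len(cols)                          # average item rating (1-7)
        out[name + "_pct"] = (mean - 1) / 6 * 100             # percent of scale (1 -> 0%, 7 -> 100%)
        out[name + "_norm"] = (mean - 4) / 3                  # -1..1 scale used in Figure 1
    return pd.DataFrame(out)
```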


Table 4. Reliability Estimates for Subscale Internal Consistency and Retesting

Subscale                        Cronbach's α   Test–Retest Correlation

Factors
  Interest and Utility               0.83              0.74
  Anxiety                            0.77              0.64
  Intellectual Accessibility         0.78              0.71

Items
  Fear                                                 0.47
  Emotional Satisfaction Set         0.79              0.72
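As an illustration of how the two reliability estimates in Table 4 can be obtained, a minimal sketch follows; the array layouts and names are assumptions, and the original values were produced with standard statistical packages.

```python
# Cronbach's alpha from item variances, and test-retest reliability as the Pearson
# correlation between week-1 and week-2 subscale scores for the same students.
import numpy as np
from scipy import stats

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: rows = students, columns = items belonging to one subscale."""
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def test_retest(first: np.ndarray, second: np.ndarray) -> float:
    r, _ = stats.pearsonr(first, second)   # same students, two administrations one week apart
    return r
```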

Table 5. Attitude Scores for the Chemistry and Society Course

Subscale                        Pretest Scores, %   Posttest Scores, %   p Value^a   Effect Size^b

Factors
  Interest and Utility                 73                  77              0.50         0.22
  Anxiety                              62                  48              0.01        −0.87
  Intellectual Accessibility           29                  35              0.19         0.44

Items
  Fear                                 52                  40              0.09        −0.57
  Emotional Satisfaction Set           52                  60              0.18         0.44

^a Probability value for a type I error comparing pretest and posttest means.
^b Posttest minus pretest difference divided by the pooled standard deviation.
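The p values and effect sizes defined in the footnotes to Table 5 can be illustrated with the sketch below. It assumes matched pretest and posttest subscale percent scores for each student and uses a paired t test for the mean comparison; the specific test used in the original analysis is not stated.

```python
# Pre/post comparison sketch: paired t test for the p value, and an effect size of
# (posttest - pretest) mean difference divided by the pooled standard deviation.
import numpy as np
from scipy import stats

def pre_post_change(pre: np.ndarray, post: np.ndarray):
    """pre, post: one subscale percent score per student, in matched order."""
    _, p_value = stats.ttest_rel(pre, post)
    pooled_sd = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2)
    effect_size = (post.mean() - pre.mean()) / pooled_sd
    return p_value, effect_size
```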

Interestingly, despite various constraints on the analysis, this item never loaded on the Anxiety factor; thus, students are perceiving this descriptor differently, perhaps in terms of physical rather than emotional safety. Another set of items (statements 7, 11, 14, and 17) load weakly across all three major factors and never emerge together as a distinct, separate factor. Adding these weak items to one of the scales would raise the intercorrelation of the subscales, which was not desired. Nevertheless, these adjectives seem to have a consistent theme, and as a result they are collectively called the Emotional Satisfaction item set (but not a "factor"). The largest percent-of-scale values here indicate a strong feeling that chemistry is "safe" and "emotionally satisfying".

Validity and Reliability

Validity was evaluated by comparing groups of students whose responses should be predictable: students in general chemistry, chemistry majors, and students who were discussion group leaders for general chemistry (Peer-Led Team Learning, or PLTL) (40). The PLTL leaders were not chemistry majors; about half had taken one or two semesters of organic chemistry. Table 2 contains the percentage score on each subscale for these three populations of students. Validity was also evaluated by comparing instrument subscale scores with student course performance (Table 3) and by comparing subscale scores with each other.

Two types of reliability were evaluated (17, 33, 34): subscale internal consistency (Cronbach's α) and test–retest replication. Internal consistency reflects the relationship of the items in a subscale to each other. Table 4 contains Cronbach α values and test–retest correlations of student subscale scores.

Students' Outcomes for the Chemistry and Society Course

Precourse and postcourse scores on the ASCI subscales are listed in Table 5. Anxiety decreased substantially (nearly a full standard deviation) and Fear to a reasonable extent (0.6 standard deviations). Intellectual Accessibility and Emotional Satisfaction increased moderately (about 0.4 standard deviations), at marginal significance for a population size of N = 19. Interest and Utility did not change significantly (small effect size of about 0.2).

Figure 1 is a compact way to visualize results for a course. It shows the pre-to-post changes for each set of adjectives. The plotted points are the average student responses at the beginning and end of the course. Responses (1–7 on the survey) were converted to a normalized −1 to 1 scale with zero in the middle (position 4 on the survey). The adjectives are grouped by subscale and reversed in sense as indicated in Table 1 (i.e., not in the same order or arrangement as in the survey itself). The direction and magnitude of the changes in Figure 1 for individual adjective pairs are consistent with the average changes listed in Table 5.

Discussion

Validity

The item loadings in Table 1 show a fairly clean distinction among three factors. Items that load strongly do so with values greater than |0.5|; items that load weakly do so with values less than |0.35|.

Evidence for content validity comes from comparing the three student populations (general chemistry, study group leaders, chemistry majors) in Table 2. The students who had less contact, experience, and study with chemistry were significantly lower (analysis of variance) on Interest and Utility and Emotional Satisfaction, and higher on Anxiety, as might be expected. The Fear item and Intellectual Accessibility show little difference. The nonscience-major group (Table 5) at the start of the semester rated highest of all groups on Anxiety and Fear and lowest on Intellectual Accessibility, which seems a reasonable expectation for this population. They fell between the general chemistry students and the peer leaders on Interest and Utility and Emotional Satisfaction, which is consistent with the fact that they elected to take the Chemistry and Society course.

An instrument purporting to measure attitude should not exhibit exceptionally strong correlation with measures of content knowledge acquisition (otherwise the instrument is just another measure of achievement). This is confirmed in Table 3. All correlations with academic performance (course grade) are weak; the strongest is with Intellectual Accessibility. This weak relationship is in line with what others have found—that the relationship between attitude and achievement is complex and indirect (41–44).

The subscales correlate significantly with each other (Table 3). The three factors with a strong separation in loadings (Intellectual Accessibility, Anxiety, Interest and Utility) have weak to modest correlations. The Fear item has a very low relationship to the others.

[Figure 1. Attitude toward the Subject of Chemistry Inventory results for students in the Chemistry and Society course, shown for each survey item at the start (squares) and end (circles) of the course (N = 21). Horizontal axis: average student rating on a normalized −1 to 1 scale; items are grouped by subscale (intellectual accessibility, interest and utility, fear, emotional satisfaction, anxiety).]

The Emotional Satisfaction item set has modestly strong correlations with the first three, which reflects the ambivalence of its loadings across all three factors and emphasizes that it is not a totally independent measure.


Reliability

Reliability coefficients (Table 4) are close to or above 0.7. The Cronbach values here are strong for attitude instruments (17). The test–retest results are weaker than expected, perhaps because retesting was done within the first three weeks of the semester. The one week between administrations represents a significant fraction of the new course experiences at that point, including the first lab experiment. This may have altered students' responses more than just randomly. Notably, the Fear item score changed significantly (p = 0.01) downward from week 1 to week 2 (from 50% of scale to 42%), the only scale showing a significant change. The Fear item has no Cronbach value because it is a single item.

Sequencing

Comparison of the mean scores for students who completed the ASCI first and the CSCI (15) second versus those who completed them in the opposite order showed no significant differences at the p < 0.1 level.

Factor Structure

An attempt was made to further understand the Emotional Satisfaction items and the Fear item. Oblique factor structures (instead of orthogonal) and alternate methods of factor extraction did not provide a more interpretable set of factors. The Fear item and the four Emotional Satisfaction items retained their uniqueness. One could exclude them from the analysis; the following insights suggest keeping them at this time. The Emotional Satisfaction items demonstrate reasonable internal consistency and change in a direction consistent with growing experience with chemistry, but as a group they do not explain much of the variance. The Fear item may be more useful for populations of students with less familiarity with physical science. The results for this item were 43 ± 3% of scale for a population of hundreds of general chemistry students sampled over several years, both in September and in May, which indicates that the item is not very sensitive for a group of students most of whom take chemistry as part of their program requirements. The nonscience students in Chemistry and Society, however, who had little to no college-level physical science experience, showed a change in the Fear score from 52% to 41% of scale over the semester.

Applicability

The Attitude toward the Subject of Chemistry Inventory is intended to provide a measure of students' emotional stance with respect to the subject of chemistry. To the extent that the semantic differential structure can accomplish this, the instrument succeeds in delineating several reasonably independent features of affect, including how:

• Interesting and useful the students feel the subject of chemistry to be.

• Anxious the subject of chemistry makes them feel.

• Accessible they feel the subject of chemistry is intellectually.

Furthermore, one other set of items seemed to broadly tap how emotionally satisfying the subject of chemistry is. The evidence indicates that students respond to these probes in a consistent manner, that the strength of response is consistent with how familiar students are with chemistry, and that the construct being assessed is something other than chemistry knowledge, given the weak association with achievement. There is no claim that this instrument measures fundamental mental states, for example, a student's "anxiety"; this is simply the label the author has placed on a set of survey items with similar student response patterns. It will be most informative to compare "before and after" scores for individuals or groups over time, or to compare different groups with each other.


As an example, the ASCI was used to assess student outcomes in the Chemistry and Society class taught by the author. Table 5 and Figure 1 show that the attitudes of nonmajors moved toward being less anxious and fearful about chemistry and toward more emotional and intellectual satisfaction with the subject. These results provide confirmation that the course structure and instruction were achieving the desired goals.

Changing the physical form of the survey may change how students respond to it—introducing more noise or some bias (17, 32–34). Validity of the instrument (as a measure of attitude) may be more suspect the farther one gets from the college-student population on which it was developed and tested. The tone of presentation is also important. Students are more likely to take the survey seriously if it is presented as an opportunity to help the instructor gain insight into students' responses to the curriculum. The survey takes students at most 10 minutes to complete.

Acknowledgments

I thank previous postdoctoral research associates Kimberly (Rickert) Woznack and Laurie Langdon, and laboratory coordinator Amy Lindsay, for their assistance and advice.

Literature Cited

1. Munby, H. An Investigation into the Measurement of Attitudes in Science Education; ED237347; ERIC Clearinghouse for Science, Mathematics, and Environmental Education: Ohio State University, Columbus, OH, 1983.
2. Mayer, V. J.; Richmond, J. M. Sci. Educ. 1982, 66, 49–66.
3. Blosser, P. E. Attitude Research in Science Education. Information Bulletin No. 1; ED259941; ERIC Clearinghouse for Science, Mathematics, and Environmental Education: Ohio State University, Columbus, OH, 1984.
4. Schibeci, R. A. Stud. Sci. Educ. 1984, 11, 26–59.
5. Osborne, J.; Simon, S.; Collins, S. Int. J. Sci. Educ. 2003, 25, 1049–1079.
6. Haladyna, T.; Shaughnessy, J. Sci. Educ. 1982, 66, 547–563.
7. Shrigley, R. L. Sci. Educ. 1983, 67, 425–442.
8. Shrigley, R. L.; Koballa, T. R.; Simpson, R. D. J. Res. Sci. Teach. 1988, 25, 659–678.
9. Gardner, P. L. Stud. Sci. Educ. 1975, 2, 1–41.
10. Koballa, T. R. Sci. Educ. 1988, 72, 115–126.
11. Gardner, P. L. Res. Sci. Educ. 1995, 25, 283–289.
12. Gardner, P. L. Int. J. Sci. Educ. 1996, 18, 913–919.
13. Reid, N. Res. Sci. Technol. Educ. 2006, 24, 3–27.
14. Munby, H. J. Res. Sci. Teach. 1997, 34, 337–341.
15. Bauer, C. F. J. Chem. Educ. 2005, 82, 1864–1870.
16. Mager, R. F. Developing Attitude toward Learning, 2nd ed.; Lake Publishing Co.: Belmont, CA, 1984.
17. Gable, R. K. Instrument Development in the Affective Domain; Kluwer-Nijhoff Publishing: Boston, MA, 1986.
18. Eagly, A. H.; Chaiken, S. The Psychology of Attitudes; Harcourt Brace Jovanovich College Publishers: New York, 1993.
19. Ajzen, I. Attitudes, Personality, and Behavior; Dorsey: Chicago, 1988.
20. Fishbein, M.; Ajzen, I. Beliefs, Attitude, Intention, and Behavior: An Introduction to Theory and Research; Addison-Wesley: Reading, MA, 1975.
21. Shrigley, R. L. J. Res. Sci. Teach. 1990, 27, 97–113.

22. Dalgety, J.; Coll, R. K.; Jones, A. J. Res. Sci. Teach. 2003, 40, 649–668.
23. Grove, N. P.; Bretz, S. L. J. Chem. Educ. 2007, 84, 1524–1529.
24. Seymour, E.; Wiese, D. J.; Hunter, A.; Daffinrud, S. M. Creating a Better Mousetrap: On-line Student Assessment of their Learning Gains; American Chemical Society 218th National Meeting. http://www.salgsite.org/docs/SALGPaperPresentationAtACS.pdf (accessed Jul 2008).
25. Student Assessment of their Learning Gains Home Page. http://www.salgsite.org/ (accessed Jul 2008).
26. Osgood, C. E.; Suci, G. J.; Tannenbaum, P. H. The Measurement of Meaning; University of Illinois Press: Urbana, 1957.
27. Snider, J. G.; Osgood, C. E. Semantic Differential Technique: A Sourcebook; Aldine Publishing Co.: Chicago, 1969.
28. Osgood, C. E.; May, W. H.; Miron, M. S. Cross-Cultural Universals of Affective Meaning; University of Illinois Press: Urbana, 1975.
29. Mensch, D. L.; Rubba, P. A. Sch. Sci. Math. 1991, 91, 164–168.
30. Lee, J. D. Soc. Psychol. Q. 1998, 61, 199–219.
31. Reid, N.; Skryabina, E. A. Res. Sci. Technol. Educ. 2002, 20, 67–81.
32. Fowler, F. J., Jr. Improving Survey Questions: Design and Evaluation; Sage Publications: Thousand Oaks, 1995.
33. DeVellis, R. F. Scale Development: Theory and Applications, 2nd ed.; Sage Publications: Thousand Oaks, 2003.
34. Nunnally, J. C. Educational Measurement and Evaluation; McGraw-Hill: New York, 1972.
35. Perkins, K. K.; Barbera, J.; Adams, W. K.; Wieman, C. E. Proceedings 2006 Physics Education Research Conference; AIP Press: Melville, NY, 2006. http://link.aip.org/link/?APCPCS/883/53/1 (accessed Jul 2008).
36. Tabachnik, B. G.; Fidell, L. S. Using Multivariate Statistics; Harper and Row: New York, 1989.
37. Abraham, M. R.; Renner, J. W. J. Res. Sci. Teach. 1986, 23, 121–143.
38. Schwartz, A. T.; Bunce, D.; Silberman, R. G.; Stanitski, C. L.; Stratton, W. J.; Zipp, A. P. Chemistry in Context, 2nd ed.; Brown Publishers: 1997.
39. Kline, P. An Easy Guide to Factor Analysis; Routledge: London, 1994.
40. Gosser, D. K.; Roth, V. J. Chem. Educ. 1998, 75, 185–187.
41. Fraser, B. J. Sch. Sci. Rev. 1982, 63, 557–559.
42. Oliver, J. S.; Simpson, R. D. Sci. Educ. 1988, 72, 143–155.
43. Simpson, R. D.; Oliver, J. S. Sci. Educ. 1990, 74, 1–18.
44. Shrigley, R. L. J. Res. Sci. Teach. 1990, 27, 97–113.

Supporting JCE Online Material

http://www.jce.divched.org/Journal/Issues/2008/Oct/abs1440.html

Abstract and keywords
Full text (PDF)
Links to cited URLs and JCE articles

Supplement
The paper form of the Attitude toward the Subject of Chemistry Inventory
A spreadsheet template that automates calculations, with instructions on how to use it
A users' guide
