Developing and Measuring Proficiency

Chemical Education Today

Editorial

One of our major goals as teachers is to enhance our students' proficiency in our subject, in logical and critical thinking, in designing and carrying out experiments, and in many other areas. Such goals naturally raise many questions. How do we define proficiency? How can we measure it? What tools are available to help us define and measure proficiency? Does availability of measurement tools influence our choices regarding which proficiencies to assess? Do others define proficiency in the same terms and according to the same measures as we do?

I was reminded of these questions when I read a newspaper article titled "Meaning of 'Proficient' Varies for Schools Across Country" (1). According to state exams, 87% of fourth-grade students in Mississippi and in Colorado are proficient in reading. These are the highest percentages achieved in any of the states. However, when fourth-grade reading proficiency was measured using the National Assessment of Educational Progress (NAEP), only 18% of Mississippi students were assessed to be proficient, a performance that placed Mississippi last among the states. Clearly the definitions of proficient used in the two tests are far from the same.

Lest those of us in higher education think that we are above such problems, I recommend two recent articles about Ivy League schools (2, 3). Princeton University has put an upper limit of 35% on the fraction of A grades any department can give in any semester. This will be a significant reduction from the current university-wide average, in which nearly half of all grades are A's. In The Atlantic Monthly, a recent recipient of a bachelor's degree from Harvard suggests that lazy undergraduates, lazy or indifferent faculty, and college administrators are part of a system in which a true liberal arts education is almost impossible to achieve. More often than not the path of least resistance results in students' achieving a "gentleman's B-plus". Is this proficiency?
Proficiency will not be achieved unless students are challenged to achieve it and their levels of proficiency are assessed. Fortunately there are good sources of assessment materials. The ACS DivCHED Exams Institute has a variety of standardized examinations at various levels, and its new, interactive score reporting system makes it easy to compare the performance of our own classes with national norms (4). For many years the Exams Institute has been providing our discipline with an important service, and it continues to develop new exams and new types of assessment tools. Another national resource is the Educational Testing Service, which has recently developed a new test that aims to assess whether college-level students can make effective use of the vast resources available on the Internet (5).

There is clear evidence that the types of questions we ask can also influence our ability to assess students' knowledge accurately (6, 7). Students who "solve" problems by memorizing and applying an algorithm often do not understand the concept underlying the problem solution. Such students are likely to forget the memorized algorithm and therefore often cannot solve related problems subsequently. To deal with such issues, concept-oriented questions and assessment tools have been developed in physics (7), chemistry (8), materials science (9),
electromagnetics (10), and biology (11). Developing and testing assessments of this type is difficult and time-consuming, but it is very important work that can be very effective in improving the teaching and learning of chemistry and other subjects.

Though testing and assessment are very important, they can be misused or overused (12, 13). There is evidence in the psychological literature that high-stakes, high-anxiety tests may cause the best students to "choke" more than average students on mathematical problem-solving tasks (14). Apparently, the larger a student's working memory, the more the pressure to succeed competes for working memory with the cognitively based thinking needed to solve problems. This harms the best students more than those with less well developed problem-solving skills. Given that tests may not measure what we think they do, that there may not be tests to measure important learning outcomes such as laboratory skills, and that high-stakes tests may cause poorer performance by the best problem solvers, putting too much emphasis on any single test, or on standardized testing in general, is not a good idea. Nevertheless, it is crucial that we require and carefully assess proficiency.

Literature Cited

1. Saulny, Susan. New York Times, January 19, 2005, p A18.
2. Mulvihill, Geoff. Wisconsin State Journal, January 23, 2005, p A8.
3. Douthat, Ross. The Atlantic Monthly, March 2005, pp 95–99.
4. ACS DivCHED Examinations Institute. http://www3.uwm.edu/dept/chemexams/ (accessed Feb 2005).
5. Zeller, Tom, Jr. New York Times, January 17, 2005, p C1.
6. Nakhleh, M. B.; Mitchell, R. C. J. Chem. Educ. 1993, 70, 190–192.
7. Hestenes, D.; Wells, M.; Swackhammer, G. The Physics Teacher 1992, 30, 141–158; Hestenes, D.; Halloun, I. The Physics Teacher 1995, 33, 8.
8. Mulford, D. R.; Robinson, W. R. J. Chem. Educ. 2002, 79, 739; http://www.jce.divched.org/JCEDLib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html (accessed Feb 2005).
9. Strength of Materials Concept Inventory Assessment Instrument. http://www.foundationcoalition.org/home/keycomponents/concept/strength.html#1 (accessed Feb 2005).
10. Electromagnetics Concept Inventory Assessment Instruments. http://fc1.tamu.edu/home/keycomponents/concept/electromagnetics.html (accessed Feb 2005).
11. Bioliteracy.net Home Page. http://bioliteracy.net/ (accessed Feb 2005).
12. Zare, R. N. Chem. Eng. News 2005, 83 (5), 3.
13. Moore, J. W. J. Chem. Educ. 2001, 78, 855; 991.
14. Beilock, S. L.; Carr, T. H. Psychological Science 2005, 16 (2), 101.

Journal of Chemical Education, Vol. 82, No. 4, April 2005, p 503