
ac educator

The Limits of Written Tests

Thomas J. Wenzel, Bates College

Written tests are the primary form of assessment in most analytical chemistry courses. But are written tests an effective assessment tool for predicting future success? In this column, the structural pitfalls of written tests are examined, and techniques for making them more effective assessment tools are described.


Pitfalls of written tests

A person could study strategies for winning tennis matches and earn an excellent score on a written test. But a written test cannot grade the confidence, agility, or impromptu judgment needed during an actual match. Likewise, a student who excels on tests in analytical chemistry courses may not exhibit the skills needed by analytical chemists on the job. To bridge this disconnect, some advocate expanding assessment measures to include skills valued by employers, such as creative or original thought, motivation, independence, initiative, the ability to express ideas orally and in writing, the ability to work with others, disciplined work habits, and potential for growth (1–3).

The format of written tests makes it difficult to measure a student's critical thinking, problem-solving abilities, and independent thought. Most test questions measure only a student's ability to remember and reproduce material presented by others. The time allowed for completing an exam is often restricted, and students must learn the material and generate a response on cue, often with little or no opportunity for clarification or discussion. In essence, they cram and regurgitate the information, and much of it is forgotten a short time later.

Improving tests

Studies show that students concentrate on learning only the material they think will be on the test (4). Usually, the instructor tells the students generally what material will be covered, without giving away the specific questions. But can tests really be restructured to improve their assessment value and expand the student's learning? Indeed they can.

Expanding the scope of tests to include other valued qualities will enhance a student's learning. It should be emphasized that many of these skills can be assessed in the laboratory component of an analytical course, especially in courses that use problem-based learning (5).

One strategy is to give out in advance a large set of questions that encompasses all the material and includes the ones that will appear on the exam (1). An alternative is to permit students to use resources such as notes, books, and other literature. Another option is to develop questions that require the student to demonstrate or explain the steps involved in answering a quantitative problem; this approach encourages understanding of the material rather than mere memorization. Essays, take-home exercises, and more liberal time allowances are other ways to assess some of these qualities.

Tests that incorporate ranked "degree-of-difficulty" problems offer another method for evaluating student learning on exams (1). These types of problems give an instructor insight into each student's understanding of the material and provide the option, if desirable, to weight earlier exams less than later ones. Retesting more difficult problems or concepts and rewarding improvement can be an exceptional motivational tool for learning.

Finally, it might be desirable to incorporate an oral component into exams. In an oral exam, the student has the chance to ask questions and clarify responses, and the instructor has the chance to probe the extent of the student's knowledge.

I incorporated both a written and an oral component into the final exams for my two analytical chemistry courses this past year. Questions were projected on a screen one at a time, and the students were given a set amount of time to write a response. If confused by a question, they were encouraged to seek clarification. I then opened the question for discussion and either accepted answers from volunteers or called on students. I was able to probe each student's response and ask other students to elaborate or comment on the answers given. Everyone was expected to participate equally during the oral discussions over the entire two-hour exam period. A grade was given on the basis of each student's oral participation and written responses.

The oral format was a hit with students in both classes. They appreciated having the chance to explain their thoughts, and the opportunity for immediate feedback allowed them to reinforce what they understood and clarify areas still causing confusion. Some expressed concern about not being able to go back and refine their answers as they might on a typical written exam; however, the discussion did give them a chance to improve their answers orally. The oral component also created a sense of closure for the course. Based on this successful experiment with my final exams, I intend to try some of the other alternatives described in this article in my introductory and advanced chemistry courses.

Even with other forms of assessment, it is difficult to imagine the elimination of written tests from college courses. Expanding the format of written tests by providing questions in advance, allowing the use of notes or other resources, assigning graded take-home essays or exercises, retesting difficult material, or incorporating an oral component can enable instructors to facilitate learning and assess broader sets of skills.

Thomas J. Wenzel is a professor at Bates College. Address correspondence to Wenzel at the Dept. of Chemistry, Bates College, Lewiston, ME 04240 ([email protected]).

References

(1) Wiggins, G. P. Assessing Student Performance: Exploring the Purpose and Limits of Testing; Jossey-Bass: San Francisco, CA, 1993.
(2) Weber, E. Student Assessment That Works: A Practical Approach; Allyn and Bacon: Boston, MA, 1999.
(3) Angelo, T. A.; Cross, K. P. Classroom Assessment Techniques: A Handbook for College Teachers; Jossey-Bass: San Francisco, CA, 1993.
(4) McKeachie, W. J.; Pintrich, P. R.; Lin, Y.-G.; Smith, D. A. F. Teaching and Learning in the College Classroom: A Review of the Research Literature; National Center for Research To Improve Postsecondary Teaching and Learning, University of Michigan: Ann Arbor, MI, 1986; p 76.
(5) Wenzel, T. J. Anal. Chem. 1999, 71, 693 A–695 A.