Research: Science and Education

Using an Online, Self-Diagnostic Test for Introductory General Chemistry at an Open University

Dietmar Kennepohl,* Matthew Guay, and Vanessa Thomas

Centre for Science, Athabasca University, Athabasca, Alberta, Canada T9S 3A3
*[email protected]

For decades, a great deal of effort has been spent predicting the potential success of students entering university-level introductory general chemistry, primarily with the objective of reducing the rate of failures in the course (1-16). Numerous additional reasons for gauging student success have also been articulated: placing students in the appropriate version of the course; protecting students from a demotivating experience that might affect their performance in other courses; identifying individual weaknesses to improve overall performance; increasing the yield of much-needed students who could go on in science-oriented disciplines; and, finally, simply avoiding wasted effort and resources by both institution and student. However, the usual assumption is that a limited number of places is available for students and, irrespective of the predictive method employed, the predictor ultimately becomes a gating mechanism for students entering the course.

Athabasca University (AU), Canada's Open University with over 37,000 students, has the mission of reducing barriers to university-level education. As an open university, students are not required to have formal prerequisites to register in entry-level courses, but they are still expected to perform satisfactorily once they enter. Within this environment, any adult wishing to enroll in introductory general chemistry can do so. In addition, because of the general scalability of online and distance education, the number of places is not limited to the same extent as at more traditional universities. One immediately realizes the challenges this brings, especially considering that AU faculty and staff also wish to reduce failure rates in general chemistry for many of the same reasons already mentioned.
Furthermore, there is a strong moral obligation among teaching staff to adequately advise and inform students attempting the course, to ensure that students have a reasonable chance to pass and obtain the education they seek and deserve. Here, we report the development and effectiveness of an online, self-diagnostic instrument used to predict student success in our introductory general chemistry course. Unlike many previous studies in the area of placement and aptitude testing, this online test is initiated and controlled by the potential student. The self-diagnostic test analyzes the areas of student background, conceptual basics, critical thinking, mathematical skills, and problem solving, all of which are seen as essential components for success in a chemistry course (14, 17, 18).

Background

In the past, a variety of academic performance predictors have been employed individually and in various combinations.


These have included high school grades (6), Piagetian tasks (7), and the Test of Logical Thinking (TOLT) (16), as well as placement tests such as the Scholastic Aptitude Test (SAT) (9), the Toledo Chemistry Placement Examination (TCPE) (4), the ACT1 (19, 20), the California Chemistry Diagnostic Test (CCDT) (8), and the American Council on Education Psychological Examination (ACE) (21), to name a few. In addition, noncognitive student attributes, such as possessing a student loan, attitude toward science, motivation, self-efficacy, background, and general demographics, have been used in other studies to predict course withdrawals and grades (5, 12, 16, 20-24).

While numerous predictors are available, student success in introductory general chemistry is usually measured by the final grade in the course. Researchers are quick to point out that course grades do not necessarily reflect chemistry understanding (16). While this may be true, the currency of the realm remains the final course grade. A review of the literature indicates that the more powerful predictors seem to be connected with prior content knowledge or chemistry experience, level of high school achievement, mathematical skill, problem-solving ability, and even critical thinking (6, 8-10, 12, 14). Weaker predictors include attributes such as straight recall, intellectual aptitude, and the ability to visualize at the molecular level (10).

However, the major limitation of any predictive model is that it reaches its best-before date as soon as the student starts the course. In the science-fiction thriller Minority Report, starring Tom Cruise, a "precrime" unit apprehends criminals before they commit a crime. Although its predictive ability is quite good, a fundamental flaw of the system is ultimately revealed: if you know your future, you have the ability to change it. Academic performance predictors are similar.
While the course is running, the actual student participates, and responses to those experiences become more important than the initial measure of potential ability. Most predictive models do a mediocre job of accurately determining individual student grades. Still, in broader terms of passes and fails for the entire group, incorrect predictions are in the minority. This has led several chemical educators to shift the focus from grade prediction to identifying at-risk students, with the intention of either redirecting them in their studies or offering remedial help (16).

Details regarding our introductory general chemistry course have been previously reported (25), including the delivery of the laboratory component, which is a combination of supervised face-to-face sessions, computer simulations (26), and home-study laboratories (27, 28). Because of the online and distance nature of our university, coupled with its open status, the learning environment is more accessible and less threatening, which encourages
© 2010 American Chemical Society and Division of Chemical Education, Inc. Journal of Chemical Education, Vol. 87, No. 11, November 2010. pubs.acs.org/jchemeduc. DOI: 10.1021/ed900031p. Published on Web 08/30/2010.

many students to choose distance learning. Paulsen and Moore argue for maintaining a level of freedom and individual choice that is crucial in distance education (29, 30). Essentially, "distance learners perceive themselves as self-directing individuals who are seeking control of their own learning outcomes. The assumption is that they are highly motivated, so course design incorporating a high degree of student freedom is desirable" (28). Not surprisingly, student satisfaction ratings among distance education students are very high. However, that freedom and ease of entry into distance education courses at an open university also result in lower overall pass rates. The typical range of pass rates for program students in undergraduate courses at open distance education universities (including AU) is 55-62% (31, 32). The pass rate is lower in the sciences, particularly in chemistry, where historical pass rates for introductory chemistry are in the 37-49% range (31-33). It is interesting and important to realize that the actual failure rates are relatively small, with a much larger portion of students falling into the nonstart category (i.e., students who register in a course but never complete a single assignment, write an exam, or attend a laboratory session). Our experience has been that nonstarts represent 52% of registrations in our introductory chemistry course (25). One of the goals in creating this online self-diagnostic test was to balance the open-access nature of the course with student advising and clearly communicated, realistic expectations of eventual student performance in the course.

Figure 1. Sample questions from each section of the self-diagnostic test.

Methodology

The aim of the study is to compare scores on the self-diagnostic test, taken before the student starts the course, with

actual student performance in the course. The self-diagnostic instrument employed the historically more powerful predictors, and the questions were modeled on items that performed well in previously published tests. In an iterative process based on individual question performance, questions have been edited, altered, removed, or added over the past 4 years in an effort to give more accurate predictions. The original test had 57 items, which has now been reduced to 45 items. The current version of the test (not including the background section) has a Cronbach's α of 0.83, and the point-biserial correlations of the questions range from 0.18 to 0.56.

The background section (9 test items) accounts for previous academic schooling, including the most applicable courses, such as chemistry, physics, biology, and mathematics (14, 17), which are associated with greater success in the course. However, the majority of the self-diagnostic test draws on other powerful predictors, assessing skills in conceptual basics (18 test items), mathematics (7 test items), critical thinking (4 test items), and problem solving (7 test items). The self-diagnostic test with all items is available online and is completely open to the public.2 Figure 1 provides a brief sample from each section of the test.

Over the past few years, correlations have been made between class grade and self-diagnostic test scores. From these, we created a general table that predicts how well a student will potentially do and suggests strategies, including revision, upgrading, and remedial work. The student receives a raw score together with a table of general comments, organized by score ranges, that predicts performance and offers advice (Table 1). The results of the self-diagnostic test are immediately available for the student to take into consideration.

Anyone taking the online test is then invited to participate in the study by leaving his or her name or student identification number. The self-diagnostic tests accumulated over 2 years (∼2000 tests) were analyzed and matched with students actually enrolled in the course. Matching was done by cross-referencing the identity left in the self-diagnostic test, either full name or student identification number, with course enrollment records. After sorting through the data, we positively matched some 50 students from the self-diagnostic test to enrollment in the class over the past 2 years. Each student's self-diagnostic test components were then matched with the corresponding course grade components, and correlations were drawn between them.

Results

A direct correlation was found between the overall self-diagnostic test score and the student's final grade. The resulting Pearson correlation coefficient, R = 0.749, suggests a strong linear trend, and a t-test (p < 0.001) confirms that this value is significant. A summary of correlations by self-diagnostic test and course components is given in Table 2. Of particular interest in this study is the observation that different sections of the self-diagnostic test correlate better with different course components. Significant R values of interest include the correlations between total score and total grade (0.749); between critical thinking and total grade (0.696), in particular the examinations component (0.675); and between conceptual basics and laboratories, both with outliers (0.557) and without outliers (0.771). The background section of the self-diagnostic test had only a very weak correlation with course performance. We also note that mature students (>25 years) have lower nonstart rates. Such noncognitive attributes should improve the overall predictive power of the instrument.
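The statistics reported above (Cronbach's α for internal consistency, point-biserial correlations for item discrimination, and the Pearson correlation between diagnostic score and final grade with its accompanying t-test) are standard measures. The sketch below illustrates how they can be computed; it is not the authors' analysis code, and all function names and the toy data are invented for the example.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def point_biserial(item, totals):
    """Point-biserial correlation of a dichotomous (0/1) item with total
    scores; numerically identical to the Pearson r with a 0/1 variable."""
    return pearson_r(item, totals)

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score columns
    (one inner list per test item, one entry per student)."""
    k = len(items)
    n = len(items[0])
    def svar(v):  # sample variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(svar(col) for col in items) / svar(totals))

def t_statistic(r, n):
    """t statistic for testing whether a correlation r over n paired
    observations differs from zero (df = n - 2)."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# Toy example with invented data: 4 dichotomous items, 6 students.
items = [
    [1, 1, 0, 1, 1, 0],
    [1, 0, 0, 1, 1, 1],
    [1, 1, 1, 1, 0, 0],
    [0, 1, 0, 1, 1, 0],
]
totals = [sum(col[i] for col in items) for i in range(6)]
for j, item in enumerate(items):
    print(f"item {j}: r_pb = {point_biserial(item, totals):.2f}")
print(f"alpha = {cronbach_alpha(items):.2f}")

# For the reported R = 0.749 with n = 50 matched students, the t statistic
# is about 7.8 (df = 48), far beyond the two-tailed p < 0.001 threshold.
print(f"t = {t_statistic(0.749, 50):.2f}")
```

Where SciPy is available, `scipy.stats.pearsonr` returns the same correlation coefficient together with its two-sided p-value directly.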
Unfortunately, from an advising perspective, this provides no actionable direction for the individual student beyond awareness of the trend. Finally, the most significant improvement may not lie in the design of the test itself: given that only a minority of students were identified as using the test, it may be more important to further encourage students considering the course to take the self-diagnostic test and use its results.

Acknowledgment

This entire project and its future depend on the students who take this self-diagnostic test. Special thanks also to the AU staff who assisted in data compilation and to Athabasca University for funding this research.

Conclusion


General introductory chemistry can be difficult for many students, and pass rates are often lower than in other disciplines. In an environment of individualized study at a distance, coupled with open admission, the pass rates are lower still. Student advising and early identification of at-risk students are therefore all the more important in this setting.


Notes

1. In late 1996, the name changed from American College Testing to ACT. http://www.act.org/ (accessed Aug 2010).
2. Am I Ready For CHEM.217? Web page. http://www.athabascau.ca/courses/chem/217/am_i_ready/ (accessed Aug 2010).

Literature Cited

1. MacPhail, A. H.; Foster, L. S. J. Chem. Educ. 1939, 16, 270–273.
2. MacPhail, A. H.; Foster, L. S. J. Chem. Educ. 1941, 18, 235.
3. Hovey, N. W.; Krohn, A. J. Chem. Educ. 1958, 35, 507–509.
4. Hovey, N. W.; Krohn, A. J. Chem. Educ. 1963, 40, 370–372.
5. Simpson, O. Open Learn. 2006, 21, 125–138.
6. Ozsogomonyan, A.; Loftus, D. J. Chem. Educ. 1979, 56, 173–175.
7. Bender, D. S.; Milakofsky, L. J. Res. Sci. Teach. 1982, 19, 205–216.
8. Russell, A. A. J. Chem. Educ. 1994, 71, 314–317.
9. Spencer, H. E. J. Chem. Educ. 1996, 73, 1150–1153.
10. McFate, C.; Olmstead, J., III. J. Chem. Educ. 1999, 76, 562–565.
11. Legg, M. J.; Legg, J. C.; Greenbowe, T. J. J. Chem. Educ. 2001, 78, 1117–1121.
12. Wagner, E. P.; Sasser, H.; DiBiase, W. J. J. Chem. Educ. 2002, 79, 749–755.
13. Pienta, N. J. J. Chem. Educ. 2003, 80, 1244–1246.
14. Tai, R. H.; Ward, R. B.; Sadler, P. M. J. Chem. Educ. 2006, 83, 1703–1711.
15. Sadler, P. M.; Tai, R. H. Sci. Educ. 2007, 16, 1–19.
16. Lewis, S. E.; Lewis, J. E. Chem. Educ.: Res. Pract. 2007, 8, 32–51.
17. Taylor, J. M. J. Coll. Read. Learn. 2008, 39, 35–53.
18. Akanbi, T. Ilorin J. Educ. 1997, 17, 31–42.


19. Carmichael, J. W., Jr.; Bauer, J., Sr.; Sevenair, J. P.; Hunter, J. T.; Gambrell, R. L. J. Chem. Educ. 1986, 63, 333–336.
20. House, J. D. Res. Higher Educ. 1995, 36, 473–490.
21. Boe, E. E. Educ. Psych. Meas. 1964, 24, 377–383.
22. Hoffert, A. L. Cognitive and Non-cognitive Predictors of Retention and Academic Performance of University Freshmen. Ph.D. Thesis, The University of North Dakota, Grand Forks, ND, 2004.
23. Tuan, H. L.; Chin, C.-C.; Shieh, S.-H. Int. J. Sci. Educ. 2005, 27, 639–654.
24. Anderson-Rowland, M. R. Retention: Are Students Good Predictors? In Frontiers in Education Conference Proceedings; IEEE, 1997; Vol. 1, pp 62–70.
25. Kennepohl, D.; Last, A. Dist. Educ. 2000, 21, 183–197.
26. Kennepohl, D. J. Dist. Educ. 2001, 16, 58–65.
27. Kennepohl, D. J. Chem. Educ. 1996, 73, 938–939.
28. Kennepohl, D. Chem. Educ.: Res. Pract. 2007, 8, 337–348.


29. Moore, M. On a Theory of Independent Study. In Distance Education: International Perspectives; Sewart, D., Keegan, D., Holmberg, B., Eds.; St. Martin's Press: New York, 1983; pp 68–94.
30. Paulsen, M. F. The Hexagon of Cooperative Freedom: A Distance Education Theory Attuned to Computer Conferencing. In The Distance Education Online Symposium (DEOS); http://www.ed.psu.edu/acsde/deos/deosnews/deosnews3_2.asp (accessed Aug 2010).
31. Powell, R. Openness and Dropout: A Study of Four Open Distance Education Universities. In M2009: 23rd ICDE World Conference on Open Learning and Distance Education, Maastricht, The Netherlands, June 7–10, 2009; http://www.ou.nl/Docs/Campagnes/ICDE2009/Papers/Final_paper_262powell.pdf (accessed Aug 2010).
32. The Open University. Course Results 2008. Sesame 2009, Issue 242, 22–24; http://www3.open.ac.uk/events/3/2009922_43263_o1.pdf (accessed Aug 2010).
33. Mosse, J. Monash University First-Level Chemistry Data (2003–2009), personal communication, October 2009.
