Assessment of Student Performance on Core Concepts in Organic Chemistry

Naha J. Farhat,† Courtney Stanford,‡ and Suzanne M. Ruder*,‡

†Department of Mathematics & Economics, Virginia State University, Petersburg, Virginia 23806, United States
‡Department of Chemistry, Virginia Commonwealth University, Richmond, Virginia 23284-2006, United States
ABSTRACT: Assessments can provide instructors and students with valuable information regarding students' level of knowledge and understanding, in order to improve both teaching and learning. In this study, we analyzed departmental assessment quizzes given to students at the start of Organic Chemistry 2 over an eight-year period. The assessment quiz was designed to test students on core concepts from Organic Chemistry 1 that require the use of symbolic representations, such as drawing Lewis structures and using curved arrows. Statistical analysis showed that students performed significantly lower than expected on the assessment quiz and that there was no significant difference in quiz scores over the eight years analyzed. Analysis of each individual question revealed that approximately one-third of the students made mistakes when converting a condensed formula or a chemical name to a chemical structure. For questions that involved more detailed answers, such as drawing structures and showing simple mechanisms, approximately 95% of the students made errors.

KEYWORDS: Second-Year Undergraduate, Organic Chemistry, Misconceptions, Lewis Structures, Mechanisms of Reactions
■ INTRODUCTION

Organic chemistry is a two-semester course required for majors in chemistry, biology, chemical engineering, and the prehealth sciences (medicine, dentistry, pharmacy, and veterinary medicine). Typically taken in the second year of undergraduate education, this service course functions as a foundation for students' understanding of fundamental concepts and as a gateway to more advanced courses in chemistry and other disciplines. In addition to gaining content knowledge, students in organic chemistry also develop skills such as information processing, critical thinking, and problem solving.1,2 These skills, often referred to as workplace skills, transferable skills, or process skills, are becoming increasingly important as employers place growing emphasis on them.3,4 Recent national reports5−8 note that, in order to meet current global challenges, students need to be prepared to think critically, solve problems, and collaborate with individuals from a variety of disciplines. Service courses like organic chemistry reach a large number of students; thus, it is important to assess students' content and process skill development to ensure that these courses meet the institution's and department's goals. In this regard, most chemistry departments conduct assessments of student performance to determine whether the expected outcomes of individual courses and chemistry programs have been met. Assessments have traditionally focused on students' content knowledge. Results from assessments can provide instructors with valuable information regarding their students' level of knowledge and understanding and the effectiveness of their teaching.9,10 A study by Towns10 found that departmental assessment plans varied depending on the number of students being assessed, institutional resources, and whether assessment took place at the course or program level. In addition, institutions have different goals, priorities, and learning objectives based on feedback from informal observations or from accrediting agencies.
When examining the types of assessment used in chemistry departments, Emenike et al.9 found that ACS standardized exams, internal exams, and student research projects were most frequently used to determine whether departmental goals were being met. Furthermore, this study found that the primary motivation for departments to conduct assessment was external influences such as ACS certification, university-level requirements, and external accreditation. Unfortunately, only 7% of the faculty thought that these assessments were important with respect to their teaching practices.9

In order to perform well on chemistry content, students must be able to use various symbolic representations to present, explain, and demonstrate key concepts. Although there have been different interpretations of symbolic representations, such as Johnstone's triangle,11−13 it is widely accepted that addressing this perspective has allowed for a better understanding of how students reason about chemical concepts. Johnstone11,14 theorized that chemistry can be divided into three domains: the macroscopic, the submicroscopic, and the symbolic. The macroscopic level encompasses phenomena that are tangible and visible; the submicroscopic level involves particulate-level models of matter; and the symbolic level includes chemical and mathematical signs and their relationships. All subdisciplines of chemistry observe the same macroscopic and submicroscopic phenomena, but each uses different symbolic representations to visualize, illustrate, and effectively communicate about these macroscopic and submicroscopic processes.

Received: November 7, 2018
Revised: February 20, 2019
Figure 1. Assessment quiz. Questions provided to the students are shown in black, and solutions and point distribution are shown in red.
Representational competency is often measured by an individual's ability to use representations to describe chemical concepts. For example, explaining why a representation is appropriate for a particular purpose, identifying unique features of a representation, describing how different representations illustrate the same thing in different ways, and making connections across different representations are all ways to demonstrate representational competency.15 As representational competency grows, individuals become better able to apply one or more representations to explain relationships between physical properties, use representations to support a claim, and explain why certain representations are more appropriate for a given context. However, to a novice, it can be overwhelming to learn the "language" of these chemical representations12,13,16 in addition to making sense of the content material. Common symbolic representations in organic chemistry include condensed formulas, Lewis structures, line-angle structures, and curved arrow notations. It has been established that students have trouble understanding various symbolic representations17−24 and have difficulty determining chemical and physical properties from Lewis structures.25,26
Development of representational competency is critical for students to gain an understanding of organic chemistry concepts. Students learn chemistry content from interpreting and applying representations, which provide a way to "talk chemistry" with a shared understanding.27 Furthermore, the ability to interpret symbolic representations requires the skill of processing information by evaluating, interpreting, and transforming it into different formats. Combined, these processes should lead to better content understanding. To be successful in organic chemistry, students must be proficient with common tasks such as converting between different chemical notations and structures, drawing resonance structures, and using curved arrow notation, all tasks that require processing different kinds of information. As part of this study, we analyzed results from a departmental assessment quiz to begin to examine connections between how well students are able to use symbolic representations and process information in order to address questions about key concepts. We began with an assessment quiz already being used in the department to help provide instructors with valuable feedback on their students' ability to use symbolic representations in their classes.
■ METHODS

Participants and Settings

This study took place in multiple Organic Chemistry 2 (OC2) courses over an eight-year period at a large, urban, research-intensive university. At this institution, there are three large-enrollment sections of OC2, including an evening section, taught by three different instructors, with no recitation sections. Approximately 150−180 students were enrolled in each section, for a total of approximately 500 students each year. The institution where this study took place has a diverse student population that varies in age, ethnicity, and ability. Students enrolled in OC2 have completed two semesters of General Chemistry and one semester of Organic Chemistry 1 (OC1) with a C or better; they are predominantly science and engineering majors and often prehealth. The majority of the students took their prerequisite chemistry courses at the same institution; however, a fair number of transfer students took General Chemistry at a different institution. Additionally, over the eight-year period, approximately six new instructors were hired to teach General Chemistry and Organic Chemistry because of the rapid expansion of enrollment in chemistry courses. Over the eight years that data were collected for this study, Organic Chemistry was taught by multiple instructors using different pedagogies, such as lecture or active-learning POGIL28,29 techniques.

Table 1. Concepts Tested in Each Question of the Assessment Quiz

Question 1. Concept tested: drawing Lewis structures. Secondary topics students should know to answer correctly: condensed formula, octet rule, lone pair electrons, valence electrons, charges.

Question 2. Concept tested: drawing Newman projections. Secondary topics: nomenclature, condensed formula, molecular geometry, conformation.

Question 3. Concepts tested: drawing Lewis structures; acid−base reactions; equilibrium. Secondary topics: condensed formulas, octet rule, lone pair electrons, valence electrons, charges; Lewis and Brønsted−Lowry acid/base definitions, conjugate acid/base pairs; acid/base stability, resonance, induction, electronegativity.

Question 4. Concepts tested: drawing Lewis structures; E2 mechanism. Secondary topics: condensed formula, octet rule, lone pair electrons, valence electrons, charges; curved arrow notations, nucleophiles, electrophiles, leaving group ability, elimination reactions.

Question 5. Concepts tested: drawing Lewis structures; addition mechanism. Secondary topics: octet rule, lone pair electrons, charges, wedge−dash notations; curved arrow notations, nucleophiles, electrophiles, leaving group ability, stereochemistry, addition reactions.
Data Collection and Analysis

Data for this project were collected from a five-question departmental assessment quiz, shown in Figure 1. The quiz was given unannounced on the first day of OC2 in each of the eight years in order to assess students' knowledge after completing OC1. This departmental assessment quiz was designed to test basic knowledge of OC1 concepts, such as depicting organic structures in Lewis, line-angle, and Newman projection forms; predicting products of acid/base and elimination reactions; and using curved arrow notations to draw mechanisms for addition and elimination reactions. The quiz questions focused on fundamental concepts from OC1, all of which require students to process and apply information contained in multiple symbolic representations. The quiz was developed by an OC2 instructor who did not teach OC1, in order to ensure that students did not have an advantage from familiarity with an instructor's style of questions from OC1. In addition, the assessment quiz was reviewed and approved by all OC1 and OC2 instructors, with full agreement that the quiz represents concepts that students should have mastered after completing OC1. A summary of the concepts assessed in each question, along with a list of secondary topics needed to answer the question, is shown in Table 1. The assessment quiz was designed to provide instructors with information about their students' prerequisite knowledge and to offer students information about what they may need to review. This was especially important because many students had different instructors for OC1 and OC2. Generally, there were three to four sections of OC1 in the fall semester and three sections of OC2 in the spring semester. Students could choose different instructors and pedagogies for OC2; in some cases, the OC1 instructor did not teach OC2.

We chose to look at the eight-year period to see if there were any changes in student performance, particularly given the growth in enrollment and changes in instructional faculty during that period. Future analysis will examine one year in more detail to account for differences in student population and instructional techniques at this institution. In order to gain a thorough understanding of student performance over the eight-year period, 60 quizzes were randomly selected from each year (approximately 10−12% of each year), for a total of N = 480 quizzes. Because the quiz score was not factored into students' grades, many of the quizzes contained only a first name or no name. After collecting the quizzes, each instructor looked over the quizzes from their section in order to get a general idea of students' performance, but they did not formally grade the quizzes. Quizzes were thus selected for analysis prior to grading, and quizzes with no answers for multiple questions were intentionally not selected because little insight would be gained from a blank quiz. Of the 60 quizzes analyzed from each year, 20 were randomly selected from each section of OC2, or approximately 12% from each section per year. The assessment quizzes were graded for content, and a grading key was used to ensure that points were awarded equally across the different years. Finally, statistical analysis was completed on the data to compare how students performed on each question over the eight years. For this analysis, questions were considered to be correct or incorrect, and a question was considered incorrect if any points were deducted from it.

Several statistical tests were performed on the data in this study in order to determine whether there were differences between the eight years. All data were analyzed using JMP Pro 13.2.0 and R 3.3.0 statistical software. In order to evaluate students' performance for each individual year, a one-sample t test was conducted. This statistical test compares the mean of quantitatively measured data to a hypothesized mean value and follows the t distribution. The test determined whether students' mean content score each year was significantly lower than the expected average content score of 13 points (13/20 is 65%, a D grade). This hypothesized average score was chosen as the expected average because, although all students had passed OC1 with a C grade or better, the assessment quiz was taken without reviewing the material and at least one month had passed since taking the prerequisite OC1 course. To compare the students' performance between the eight years, an equal-variance one-way ANOVA test was conducted. This test determined whether the average content score differed significantly between the eight years. A Brown−Forsythe test was conducted to test the variability in the content scores between different years, and the p value from this test was not significant (p ≥ 0.01), which justified the equal-variance ANOVA test. Furthermore, data were merged from all eight years (N = 480), and a one-sample t test was conducted to test whether the overall average score was significantly lower than the expected average score of 13 points regardless of the year. In order to gain more detailed information about which concepts students struggled with during each year, scores for each quiz question were investigated further. Statistical analysis using a one-sample z test of proportions allowed us to analyze which questions students had the most difficulty with. For each of the five questions of the quiz, this test compared the sample proportion to a hypothesized value in order to test whether the proportion of incorrect answers was significantly higher than that hypothesized proportion for each of the eight years. In addition, the data from all eight years were then merged (N = 480), and a chi-square test of proportions was conducted to test whether the proportion of incorrect answers was significantly higher than a hypothesized value regardless of the year. All statistical tests were conducted at a significance level of α = 0.01.
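A minimal sketch of the per-year one-sample t test in R (the software named above) follows. This is an illustration under stated assumptions, not the authors' script; the scores vector is simulated placeholder data standing in for one year's 60 content scores.

    ## One-sample t test of a year's mean content score against 13 points.
    ## `scores` is simulated placeholder data (0-20 scale), not study data.
    set.seed(1)
    scores <- round(pmin(pmax(rnorm(60, mean = 8, sd = 3.4), 0), 20))

    ## H0: mean = 13; H1: mean < 13; alpha = 0.01.
    t.test(scores, mu = 13, alternative = "less", conf.level = 0.99)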
■ RESULTS
Student Performance over Eight Years
The departmental assessment quiz (Figure 1) was originally administered in order to provide OC2 instructors with feedback on which concepts were problematic for students after completing OC1. Additionally, the quiz was intended to provide feedback to the students on prerequisite material they needed to review. In this study, we were interested in investigating whether there were any changes in students' overall performance, particularly given the growth in enrollment and changes in instructional faculty during the eight-year period. The quiz focused on drawing different representations and using curved arrows to show mechanisms, skills that are critical for success in OC2. Each assessment quiz was first graded for correctness in terms of content. Table 2 summarizes the content scores for the 60 randomly selected quizzes from each of the eight years. Overall, the mean score ranged from 6.93 (35%) to 9.07 (45%).
Table 2. Summary Statistics for the Overall Content Grade by Year(a)

Year     Mean Score(b,c)     Standard Deviation
2017     7.78                3.16
2016     9.03                3.08
2015     8.38                3.70
2014     8.23                3.20
2012     9.07                3.89
2011     7.80                3.60
2010     6.93                3.20
2009     8.93                3.37

(a) No data are available for 2013. (b) The scale for this test could range from 0 to 20. (c) The n value for each year is 60.
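As a consistency check added here (not in the original article): because each year contributes the same number of quizzes (n = 60), the overall mean of the merged data reported below is simply the average of the eight yearly means,

\[ \bar{x}_{\text{pooled}} = \frac{7.78 + 9.03 + 8.38 + 8.23 + 9.07 + 7.80 + 6.93 + 8.93}{8} = \frac{66.15}{8} \approx 8.27, \]

which matches the combined mean content score of 8.27 reported in the next subsection.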
In general, average scores for each year were lower than the expected average content score of 13 points (65%). Furthermore, the distribution of scores shown in Figure 2 illustrates that a wide range of scores was found across all the years, with the highest score being 20 and the lowest 0.
One-sample t tests were conducted on data from each year, as shown in Table 3. This test determined whether the average content score for each year was significantly lower than the expected average score of 13. For this analysis, all eight p values were significant (p < 0.001), indicating that the average content score in each of the eight years was significantly lower than 13. In other words, students were consistently performing lower than the expected average over the entire eight-year period. For example, as shown in Table 3, the average content score in 2009 was 4.07 points lower than the expected average passing score of 13, and in 2010 it was 6.07 points lower. The individual-year analysis found that for all eight years students were consistently performing below the hypothesized average of 65% on the assessment quiz.
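As a worked check (added here) of one Table 3 row: for 2010, with a mean of 6.93, a standard deviation of 3.20, and n = 60, the one-sample t statistic is

\[ t = \frac{\bar{x} - \mu_0}{s/\sqrt{n}} = \frac{6.93 - 13}{3.20/\sqrt{60}} \approx \frac{-6.07}{0.413} \approx -14.7, \]

in agreement, up to rounding, with the tabulated value of −14.67.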
Figure 2. Distribution of content scores across the eight years.
Table 3. One-Sample t Test for Each Year(a)

Year     Mean Difference(b)     t Test Value(c,d)
2017     −5.22                  −12.80
2016     −3.97                  −9.99
2015     −4.62                  −9.67
2014     −4.77                  −11.53
2012     −3.93                  −7.83
2011     −5.20                  −11.82
2010     −6.07                  −14.67
2009     −4.07                  −9.35

(a) No data are available for 2013. (b) The hypothesized value of the mean score tested against was 13. (c) The degrees of freedom for each year are 59. (d) The p value for each year is p < 0.001.

We then tested whether there was any significant difference in the average content score between different years; specifically, were scores from one year significantly better or worse than scores from another year? A Brown−Forsythe test was conducted to test the variability in the content scores between different years. The p value was not significant (0.5988 > α = 0.01), indicating that there was no evidence that the variability in the content scores differed significantly from year to year. This nonsignificant p value justified using the equal-variance ANOVA test to determine whether the mean content score differed between years. The overall equal-variance ANOVA model p value was significant, p = 0.0058 < α = 0.01, with F(7,472) = 2.78 (where F is the value of the ANOVA test statistic and 7 and 472 are the degrees of freedom). As a result, a Tukey−Kramer test was conducted for multiple pairwise comparisons to investigate further whether there was a significant difference in the mean content scores between any pair of years, as shown in Figure 3. This one-way ANOVA graph represents the distribution of content scores each year and provides information on how the means differ across the eight years. The dark black dots on the graph (Figure 3) represent the actual content scores (data points); the red plots are the box plots; the green diamonds mark the means; the blue line connects the mean content score from each year; and the overlapping circles represent the eight mean content scores (one circle for each year's mean) and how they compare to each other. The fact that these circles overlap indicates that the values of the eight content score means are very close to each other. At a significance level of α = 0.01, the Tukey−Kramer test did not find any statistically significant difference in the mean content score between any pair of years (all p > 0.01), which indicates that the students' average content score was consistently lower than the expected average of 13 regardless of the year. Moreover, there was no year in which students performed better than in another year. Because the results indicated there was no difference in scores between years, the data from all eight years were merged to perform further analysis.
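In R, this between-years pipeline can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' script: it assumes a long-format data frame with score and year columns (simulated placeholder data below) and the car package, whose median-centered Levene test is the Brown−Forsythe test.

    ## Sketch of the between-years analysis; placeholder simulated data,
    ## not the study's scores. Requires the car package for leveneTest().
    set.seed(2)
    quiz <- data.frame(
        year  = factor(rep(c(2009:2012, 2014:2017), each = 60)),
        score = round(pmin(pmax(rnorm(480, mean = 8.3, sd = 3.5), 0), 20))
    )

    ## Brown-Forsythe test: Levene's test with median centering.
    car::leveneTest(score ~ year, data = quiz, center = median)

    ## Equal-variance one-way ANOVA, then Tukey-Kramer pairwise comparisons.
    fit <- aov(score ~ year, data = quiz)
    summary(fit)
    TukeyHSD(fit, conf.level = 0.99)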
Combined Student Performance (Regardless of Year)
As noted above, analysis of the assessment quiz scores determined that students were consistently underperforming on the quiz and that there were no significant differences in the average content score between the eight years. Therefore, the data from all eight years were merged into one population, and we examined the pooled data to look at performance on core concepts regardless of the year. The combined population from the eight years of quizzes had a sample size of N = 480 quizzes. From the combined data, the overall mean content score was 8.27 (41%), with a standard deviation of 3.46 and a 95% confidence interval of [7.9607, 8.5809]. A one-sample t test was conducted on the combined data to test whether the mean content score was significantly lower than the expected average score of 13 points at a significance level of α = 0.01. This test yielded t = −29.96 (where t is the value of the test statistic, which follows the t distribution), df = 479, and p < 0.001. The p value was significant (less than α = 0.01), indicating that the overall mean content score was significantly lower than the expected mean content score of 13, by about 5 points. In other words, over a period of eight years, students were consistently performing below the expected 65% average on the assessment quiz.
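These reported values are internally consistent; as a worked check (added here),

\[ t = \frac{\bar{x} - \mu_0}{s/\sqrt{n}} = \frac{8.27 - 13}{3.46/\sqrt{480}} \approx \frac{-4.73}{0.158} \approx -29.9, \qquad 8.27 \pm 1.96 \times 0.158 \approx [7.96,\ 8.58], \]

matching the reported t = −29.96 and 95% confidence interval up to rounding.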
Student Performance on Individual Questions

Statistical analysis of the assessment quiz scores over an eight-year period indicated that students consistently scored well below the expected average, regardless of the year. In order to gain more detailed information about which concepts students struggled with, scores for each quiz question were investigated further. Table 1 lists the concepts tested in each question and the types of information needed to answer the questions.
Figure 3. Tukey−Kramer test of one-way analysis of content score by year.
By investigating the questions separately, information about which concepts were problematic could be obtained. Thus, analysis of student performance on the five quiz questions was conducted for each of the eight years; separate analysis of each question for every year was conducted in order to determine whether particular questions were more problematic in different years. A question was considered to have an incorrect answer if any points were deducted on that question; although students may have received partial credit for their answer, for the purpose of this analysis that answer was considered incorrect. It was found that the percentage of incorrect answers ranged from 15 to 53% for Q1 and from 15 to 37% for Q2, whereas Q3, Q4, and Q5 ranged from 93 to 100% incorrect, as shown in Figure 4.
This indicated that students' ability to convert chemical formulas and chemical names to various representational structures (Q1 and Q2) varied, but students consistently performed very poorly on acid−base reactions and curved arrow mechanism problems (Q3, Q4, and Q5). Statistical analysis using a one-sample z test of proportions allowed us to analyze further which questions students had the most difficulty with. This test compared the sample proportion with a hypothesized value, in order to test whether the proportion of students who incorrectly answered Q1 was significantly greater than 0.15 (15%), and similarly for Q2; for Q3, Q4, and Q5, we tested whether these proportions were significantly greater than 0.7 (70%). A value of 0.15 (15%) was chosen because Q1 and Q2 covered basic concepts such that at least 85% of the students should answer correctly. For Q3, Q4, and Q5, we chose to test against 0.7 (70%) because these questions have a higher level of complexity, and we anticipated that fewer students would answer correctly.
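Base R has no dedicated one-sample z test of proportions, so a minimal hand-rolled version is sketched below; the counts are illustrative stand-ins, not the study's data.

    ## One-sample z test of a proportion, H1: p > p0 (sketch).
    z_test_prop <- function(x, n, p0) {
        p_hat <- x / n
        z     <- (p_hat - p0) / sqrt(p0 * (1 - p0) / n)
        c(p_hat = p_hat, z = z, p_value = pnorm(z, lower.tail = FALSE))
    }

    z_test_prop(x = 25, n = 60, p0 = 0.15)   # e.g., Q1 in one year vs 15%
    z_test_prop(x = 57, n = 60, p0 = 0.70)   # e.g., Q4 in one year vs 70%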
Figure 4. Percentage of students who answered each question incorrectly in each of the eight years.
For each of the eight years, the percentage of students who incorrectly answered Q1 was significantly higher than 15% (all p < 0.001) for all years except 2009, where this percentage was about 17% (p > 0.01). This indicates that even after three semesters of chemistry, many students still have difficulty drawing simple Lewis structures that require a firm grasp of bonding rules and functional groups. Many students tended to draw linear structures such as H−O−C−O−H and H−C−O−O−H with varying double and triple bonds and charged atoms throughout the structure. The percentage of students who incorrectly answered Q2 was significantly higher than 15% (all p < 0.001) for all years except 2012, where this percentage was about 15%. The most common mistakes with drawing Newman projections (Q2) included drawing an incorrect form of the Newman projection skeleton or using an incorrect formula, such as butane instead of ethane. Lastly, for Q3, Q4, and Q5, the percentages of students who incorrectly answered these questions were significantly higher than 70% for all years (all p ≪ 0.001). Whereas approximately half of the students did not attempt Q3, those who did predominantly drew an ether attached to a cyclohexene, cyclohexane, or hexane chain and sodium hydroxide as the products. Even fewer students justified the equilibrium direction, with stability and base strength (strong to weak) being the most common reasoning for favoring both reactants and products. Common errors in Q4 and Q5 included missing arrows, arrows in the wrong direction, drawing products with formulas different from the given product, and not accounting for stereochemistry. This demonstrates that the majority of the students have difficulty with drawing Lewis structures, arrow-pushing mechanisms, and acid−base reactions. The Supporting Information includes a chi-square test for one-sample proportions to test whether the proportion of incorrect answers differed significantly between Q1 through Q5, regardless of the year. Future analysis will be conducted to determine exactly what errors students are making for each of these core concepts.
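For the merged data, the chi-square test of a single proportion mentioned above can be run with base R's prop.test, which reports a chi-square statistic; the count below is a hypothetical stand-in, not a result from the study.

    ## Merged-data chi-square test of one proportion (hypothetical count):
    ## H0: p = 0.15 vs H1: p > 0.15 for Q1 across all N = 480 quizzes.
    prop.test(x = 160, n = 480, p = 0.15, alternative = "greater")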
Limitations

The data for this study were collected from one institution over the course of eight years. Although the assessment quiz was not initially designed to examine students' use of symbolic representations, it did provide insight into the problems students have when answering questions that require using and applying symbolic representations. In addition, students at this institution come from a variety of backgrounds, and the current analysis does not account for differences in the student population.
■ DISCUSSION AND CONCLUSIONS

Overall, this study found that students performed significantly lower than expected on an assessment quiz designed to test core concepts from OC1 that require the use of symbolic representations. There was no significant difference in quiz scores over the eight years analyzed. This was of particular interest given the growth in enrollment and changes in instructional faculty during the eight-year period. These results suggest that students continue to have difficulty with core concepts such as drawing Lewis structures, drawing mechanisms using curved arrow notations, and acid−base reactions. The findings support previous research17−23,30−33 showing that students have trouble drawing Lewis structures, working with acid−base reactions, and using curved arrows to show mechanisms. Herein, we have shown that the difficulties students have with these concepts are consistent across students in different sections and over an eight-year period. Analysis of each individual question on the quiz revealed that approximately one-third of the students made mistakes when converting from a condensed formula or a chemical name to a chemical structure. For questions involving more detailed answers, including chemical reactions, drawing structures, and mechanisms, approximately 97% of the students made errors. This is problematic because, after three semesters of chemistry courses completed with a C grade or higher, many students still have difficulty using the symbolic representations necessary to understand core concepts in organic chemistry. Because these difficulties span years, pedagogies, and instructors, we theorize that students might be struggling with the process of extracting, evaluating, interpreting, and transforming the information contained within symbolic representations. This in turn may affect students' ability to apply their chemistry knowledge to questions that require the use of specific symbolic representations.
Future Work

To gain a better understanding of the different aspects of each question that students struggle with, future efforts will involve identifying the specific types of errors made on each question and how often these errors occur. Based on the results reported herein, our focus will be on questions that require drawing chemical structures and using curved arrow notations. Because representational competency requires individuals to be successful at processing different types of information, future efforts will also include examining the connection between processing information and representational competency. By looking for evidence of students' ability to evaluate, interpret, and transform the information embedded in chemical structures and curved arrow notations, we hope to gain insight into how students process information and what role this plays in using content-specific symbolic representations. In addition, because the students at this institution come into organic chemistry with a wide array of backgrounds, we plan to investigate whether certain populations of students are prone to making a specific error more than other populations. Additionally, we plan to compare the different pedagogies used to see if there are any differences in the errors made based on the type of pedagogy instructors used.
■ ASSOCIATED CONTENT

Supporting Information

The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.8b00913: additional statistical analysis (PDF, DOCX).

■ AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected].

ORCID
Courtney Stanford: 0000-0002-1159-0320
Suzanne M. Ruder: 0000-0001-9094-4010

Notes
The authors declare no competing financial interest.
■ ACKNOWLEDGMENTS

Portions of this work were funded through support from the National Science Foundation DUE-IUSE Exploration and Design: Engaged Student Learning grant "Collaborative Research: Eliciting and Assessing Process Skills in STEM" (DUE-1524399). Other portions of the work were funded by the Chemistry Department at Virginia Commonwealth University through a 2017 Small Grant for Innovative Teaching (SGIT). Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation or Virginia Commonwealth University.
■ REFERENCES

(1) Brewster, R. Q. Objectives of the first course in organic chemistry. J. Chem. Educ. 1939, 16 (12), 562.
(2) American Chemical Society Committee on Professional Training. Undergraduate Professional Education in Chemistry: ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs; American Chemical Society: Washington, DC, 2015.
(3) Chickering, A. W.; Gamson, Z. F. Seven Principles for Good Practice in Undergraduate Education. AAHE Bulletin 1987, 39 (7), 3−7. https://files.eric.ed.gov/fulltext/ED282491.pdf (accessed Jan 2019).
(4) Lowden, K.; Hall, S.; Elliot, D.; Lewin, J. Employers' Perceptions of the Employability Skills of New Graduates; Edge Foundation: London, 2011.
(5) Singer, S. R.; Nielsen, N. R.; Schweingruber, H. A. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; The National Academies Press: Washington, DC, 2012.
(6) National Research Council. A Framework for K−12 Science Education; National Academies Press: Washington, DC, 2012.
(7) National Research Council. Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century; National Academies Press: Washington, DC, 2013.
(8) American Association for the Advancement of Science. Vision and Change in Undergraduate Biology Education: A Call to Action; AAAS: Washington, DC, 2009.
(9) Emenike, M. E.; Schroeder, J. D.; Murphy, K.; Holme, T. Results from a National Needs Assessment Survey: A View of Assessment Efforts within Chemistry Departments. J. Chem. Educ. 2013, 90, 561−567.
(10) Towns, M. Developing Learning Objectives and Assessment Plans at a Variety of Institutions: Examples and Case Studies. J. Chem. Educ. 2010, 87 (1), 91−96.
(11) Johnstone, A. H. Macro- and micro-chemistry. School Science Review 1982, 64, 377−379.
(12) Taber, K. S. Revisiting the chemistry triplet: drawing upon the nature of chemical knowledge and the psychology of learning to inform chemistry education. Chem. Educ. Res. Pract. 2013, 14, 156−168.
(13) Talanquer, V. Macro, Submicro, and Symbolic: The many faces of the chemistry "triplet". International Journal of Science Education 2011, 33 (2), 179−195.
(14) Johnstone, A. H. Why is science difficult to learn? Things are seldom what they seem. Journal of Computer Assisted Learning 1991, 7 (2), 75−83.
(15) Kozma, R.; Russell, J. Students becoming chemists: Developing representational competence. In Visualization in Science Education; Gilbert, J. K., Ed.; Springer: Dordrecht, The Netherlands, 2005; pp 121−145.
(16) Taber, K. S. Learning at the Symbolic Level. In Multiple Representations in Chemical Education; Gilbert, J. K., Treagust, D., Eds.; Springer: Dordrecht, The Netherlands, 2009; pp 75−105.
(17) Cooper, M. M.; Grove, N.; Underwood, S. M.; Klymkowsky, M. W. Lost in Lewis Structures: An Investigation of Student Difficulties in Developing Representational Competence. J. Chem. Educ. 2010, 87 (8), 869−874.
(18) Grove, N. P.; Cooper, M. M.; Rush, K. M. Decorating with arrows: Toward the development of representational competence in organic chemistry. J. Chem. Educ. 2012, 89, 844−849.
(19) Tiettmeyer, J. M.; Coleman, A. F.; Balok, R. S.; Gampp, T. W.; Duffy, P. L.; Mazzarone, K. M.; Grove, N. P. Unraveling the Complexities: An Investigation of the Factors That Induce Load in Chemistry Students Constructing Lewis Structures. J. Chem. Educ. 2017, 94, 282−288.
(20) Flynn, A. B.; Featherstone, R. B. Language of mechanisms: exam analysis reveals students' strengths, strategies, and errors when using the electron-pushing formalism (curved arrows) in new reactions. Chem. Educ. Res. Pract. 2017, 18, 64−77.
(21) Grove, N. P.; Cooper, M. M.; Cox, E. L. Does Mechanistic Thinking Improve Student Success in Organic Chemistry? J. Chem. Educ. 2012, 89, 850−853.
(22) Bhattacharyya, G.; Bodner, G. M. "It gets me to the product": How students propose organic mechanisms. J. Chem. Educ. 2005, 82 (9), 1402−1407.
(23) Galloway, K. R.; Stoyanovich, C.; Flynn, A. B. Students' Interpretations of Mechanistic Language in Organic Chemistry Before Learning Reactions. Chem. Educ. Res. Pract. 2017, 18, 353−374.
(24) Stieff, M.; Scopelitis, S.; Lira, M. E.; Desutter, D. Improving representational competence with concrete models. Sci. Educ. 2016, 100 (2), 344−363.
(25) Cooper, M. M.; Underwood, S. M.; Hilley, C. Z. Development and validation of the implicit information from Lewis structures instrument (IILSI): do students connect structures with properties? Chem. Educ. Res. Pract. 2012, 13, 195−200.
(26) Cooper, M. M.; Corley, L. M.; Underwood, S. M. An investigation of college chemistry students' understanding of structure−property relationships. J. Res. Sci. Teach. 2013, 50 (6), 699−721.
(27) Kozma, R.; Chin, E.; Russell, J.; Marx, N. The roles of representations and tools in the chemistry laboratory and their implications for chemistry learning. Journal of the Learning Sciences 2000, 9, 105−143.
(28) POGIL: An Overview. In Process Oriented Guided Inquiry Learning (POGIL); Moog, R. S., Spencer, J. N., Eds.; American Chemical Society: Washington, DC, 2008; Vol. 994, pp 1−13.
(29) Farrell, J. J.; Moog, R. S.; Spencer, J. N. A guided inquiry general chemistry course. J. Chem. Educ. 1999, 76 (4), 570−574.
(30) Ferguson, R.; Bodner, G. Making Sense of the Arrow-Pushing Formalism Among Chemistry Majors Enrolled in Organic Chemistry. Chem. Educ. Res. Pract. 2008, 9, 102−113.
(31) Bretz, S. L.; McClary, L. Students' Understandings of Acid Strength: How Meaningful Is Reliability When Measuring Alternative Conceptions? J. Chem. Educ. 2015, 92 (2), 212−219.
(32) McClary, L.; Bretz, S. L. Development and Assessment of a Diagnostic Tool to Identify Organic Chemistry Students' Alternative Conceptions Related to Acid Strength. International Journal of Science Education 2012, 34, 2317−2341.
(33) Cooper, M. M.; Kouyoumdjian, H.; Underwood, S. M. Investigating students' reasoning about acid−base reactions. J. Chem. Educ. 2016, 93 (10), 1703−1712.