Article pubs.acs.org/jchemeduc

Short and Long-Term Impacts of the Cottrell Scholars Collaborative New Faculty Workshop

Marilyne Stains,*,† Matthew Pilarz,‡ and Devasmita Chakraverty§

† Department of Chemistry, University of Nebraska−Lincoln, Lincoln, Nebraska 68588, United States
‡ Department of Chemistry and Biochemistry, Rowan University, Glassboro, New Jersey 08028, United States
§ Biology Education, IPN Leibniz Institute for Science and Mathematics Education, Kiel D-24118, Germany

Downloaded by GEORGETOWN UNIV on August 27, 2015 | http://pubs.acs.org Publication Date (Web): July 24, 2015 | doi: 10.1021/acs.jchemed.5b00324

Supporting Information

ABSTRACT: Postsecondary chemistry instructors typically have received little pedagogical training as graduate students and postdoctoral research assistants. Moreover, professional development opportunities are often limited at their own institutions. This lack of training has resulted in a gap between the instructional strategies enacted in chemistry courses and the results of discipline-based education research. Members of the Cottrell Scholars Collaborative initiated the New Faculty Workshop (CSC NFW) program in 2012 in order to address this gap. This annual, two-day workshop provides newly hired chemistry assistant professors from research-intensive universities with training on evidence-based instructional practices. This article presents the results of a longitudinal, quasi-experimental study that evaluates the short- and long-term impacts of the workshop. Online surveys were collected immediately before and after the workshop, as well as one year later, from CSC NFW participants and from a control group of newly hired chemistry faculty who did not participate in the workshop. Surveys measured faculty's awareness and use of evidence-based instructional practices, teaching self-efficacy, and beliefs about teaching. Classroom video recordings were also collected during the fall semester following the workshop and two years later. These data were triangulated with the Students' Evaluation of Educational Quality (SEEQ) survey, which was collected from students in the observed classrooms. Findings indicate that, in the short term, the CSC NFW was successful in raising workshop participants' self-efficacy, shifting their teaching beliefs toward student-centered teaching, and increasing their use of interactive teaching. Longitudinal data demonstrate that further pedagogical support is required for these impacts to be sustained.

KEYWORDS: General Public, Chemistry Education Research, Professional Development
FEATURE: Chemical Education Research



INTRODUCTION

Within the last couple of decades, there have been tremendous advances in our understanding of effective instructional practices for Science, Technology, Engineering, and Mathematics (STEM) courses thanks to work conducted by the discipline-based education research community. In particular, numerous evidence-based instructional practices (EBIPs),1−4 such as Process Oriented Guided Inquiry Learning (POGIL)5 and Peer Instruction,6,7 have been proven to enhance student learning and attitudes toward science.1,3,4,8,9 Unfortunately, the pedagogical training of chemistry graduate students and faculty has often been overlooked and undervalued. This has led to limited implementation of EBIPs in college-level chemistry classrooms. The Cottrell Scholars Collaborative New Faculty Workshop (CSC NFW) program10,11 intends to address this problem by providing a short workshop and follow-up activities that introduce newly hired chemistry assistant professors to EBIPs. The CSC NFW program follows similar initiatives in other STEM disciplines12 and complements other chemistry workshops that focus on

specific instructional approaches (e.g., POGIL and the Chemistry Collaborations, Workshops and Communities of Scholars). The goal of this study is to assess its impact. Prior studies investigating the impact of STEM faculty workshops suffer from several methodological weaknesses. First, workshop impact has been mostly evaluated by measuring changes in self-reported awareness and adoption of EBIPs by the workshop participants4,12−14 even though research has shown that self-reported data do not reliably reflect actual instructional practices.4,15−17 Second, studies have focused on short-term outcomes, i.e., differences between surveys conducted immediately before and after the workshop. Longitudinal studies are rare but critical since faculty’s instructional transformation occurs over time.4 Third, few studies have included a control group.18 Since little is known about the evolution of STEM faculty’s instructional practices as they gain teaching experience,19 a control group is necessary to derive meaningful conclusions about impact. Finally, even though


research has demonstrated the critical role that faculty's thinking about teaching (e.g., beliefs about teaching) plays in instructional decisions, and the need for professional development programs to alter such thinking,20−28 few studies18 have investigated the effect of STEM faculty workshops on this construct. This study addresses these methodological shortcomings.

CONCEPTUAL FRAMEWORK

The design of the CSC NFW program is grounded in research on instructional transformation at the undergraduate level26 and organizational change.29−31 In particular, the program is aligned with the Teacher-Centered Systemic Reform (TCSR) Model for a college classroom developed by Gess-Newsome and colleagues (Figure 1).31 This model reveals that faculty's instructional practices are influenced by personal factors, contextual factors, and the instructor's thinking about teaching. It also highlights that reform efforts should target these three factors.

Personal factors include the nature and extent of teaching experience, as well as prior and ongoing training in teaching. Workshops in physics and biology have demonstrated that targeting assistant professors, who typically have little training and experience with teaching, can be effective.12,16,32−34 The CSC NFW program thus targets newly hired chemistry assistant professors from research-intensive institutions.

Contextual factors include influences at the classroom (e.g., class size), department (e.g., norms around teaching), institution (e.g., reward structure), and broader professional community (e.g., presence or absence of instructional reform documents) levels. Extensive research has demonstrated that these influences constitute barriers to instructional transformation.23,26,28,30,35−43 The CSC NFW program targets the departmental and broader professional community levels. At the departmental level, the program requires applicants to submit a nomination letter from their department chair with their application package; this requirement is intended to raise assistant professors' perception that teaching is valued at the highest level of the hierarchy in their department. The broader professional community factor is addressed by hosting the workshop at the American Chemical Society (ACS) headquarters in Washington, DC, and by inviting the ACS Chief Executive Officer to deliver a welcoming message to the participants.

The third factor targeted by the CSC NFW program is faculty's thinking about teaching. This factor includes faculty's beliefs about teaching, sense of dissatisfaction with current teaching practices, and self-efficacy. These three constructs have been shown to influence instructional decision-making processes.28,44−54 The program intends to address this factor by providing opportunities for the participants to experience EBIPs as students, reflect on these experiences, confront their misconceptions about EBIPs, and test these practices with their peers during the workshop. We expected that these experiences would raise their self-efficacy about implementing EBIPs and shift their beliefs about teaching toward student-centered conceptions.

Figure 1. Teacher-Centered Systemic Reform (TCSR) Model for a college classroom. Figure adapted from Gess-Newsome et al.31 Reprinted by permission of SAGE Publications.

RESEARCH QUESTIONS

The goal of this study is to evaluate the impact of the CSC NFW program on chemistry assistant professors' knowledge about teaching, ways of thinking about teaching, and instructional practices. Accordingly, the research question is: To what extent does participation in the CSC NFW program impact new chemistry assistant professors' (1) awareness of EBIPs, (2) attitudes and beliefs about student-centered teaching, (3) teaching self-efficacy, and (4) classroom practices?



METHODS

A quasi-experimental, longitudinal design was implemented to address our research question (Figure 2). Identical data were collected from participants in the CSC NFW program and from a control group consisting of chemistry assistant professors with characteristics similar to those of the program participants. This study was approved by the University of Nebraska-Lincoln Institutional Review Board office (IRB approvals #20120712790 EX and #20120912845 EP).

Figure 2. Data collection associated with the quasi-experimental design used in this study.


Study Participants

Study participants included the chemistry assistant professors who attended the CSC NFW program in 2012 and 2013 (i.e., treatment group) and chemistry assistant professors who did not attend the CSC NFW in summer 2012 or 2013 (i.e., control group). All faculty came from research-intensive institutions. In both groups, assistant professors were either about to start their academic career in the fall of 2012 or 2013 or had been a faculty member for one academic year by the time the first piece of data was collected. Control faculty were identified by searching the websites of chemistry departments at research-intensive institutions in the United States; they were invited to participate in the study by e-mail. A total of 47 treatment faculty, representing 58% of all the 2012 and 2013 CSC NFW participants, and 18 control faculty provided matched pre, post, and delayed surveys.

As discussed earlier, teaching experience and teaching load impact instructional practices. We collected this information in the pre survey to ensure similarities on these characteristics between the control and treatment groups. Results are summarized in Table 1. Chi-square tests showed no significant difference in teaching experience between the two groups (Table 1). The difference in training on teaching between the two groups was also not significant.

Table 1. Teaching Experience of the Treatment and Control Faculty

Teaching experience            | Control (N = 18) | Treatment (N = 47) | p
Lab TA                         | 83%              | 70%                | >0.050
Recitation TA                  | 67%              | 64%                | >0.050
Lecture as a graduate student  | 39%              | 47%                | >0.050
Taught at a 4 year college     | 33%              | 34%                | >0.050
Attended workshops on teaching | 56%              | 77%                | >0.050

Mann−Whitney tests showed no meaningful differences in the academic appointments between the two groups (Table 2). Indeed, even though the levels of teaching and service appointments were significantly different between the two groups (p < 0.05), the effect sizes were small (r values in Table 2).55 Moreover, faculty in the two groups reported holding a similar teaching load: one course per semester, typically at the upper-undergraduate or graduate level. These results indicate that the two groups had similar background characteristics.

Table 2. Academic Appointment of the Treatment and Control Faculty

Proportion of appointment | Control (N = 18) | Treatment (N = 47) | Mann−Whitney test results
Teaching (a)              | 40.6%            | 31.9%              | p = 0.031; r = 0.268
Research                  | 50.1%            | 53.5%              | p > 0.050
Service (a)               | 9.0%             | 13.6%              | p = 0.012; r = 0.310

(a) Statistically significant differences were observed between the treatment and control group.

Workshop

The CSC NFW program consists of a two-day workshop hosted in the summer. The number of participants has ranged from 38 in 2012 to 57 in 2014. Workshop details have been previously reported in this Journal.10 The workshop is built around several cycles (Table 3) that lead to the development of a 10−15 min activity (called a teachable tidbit). Each cycle starts with participants experiencing and/or learning about a student-centered instructional practice (e.g., Just-in-Time Teaching) or principle (e.g., the difference between formative and summative assessments). Participants are then provided with time to work on integrating these ideas into their teachable tidbit. Throughout the process, they work in groups and receive feedback from each other and from program facilitators. On the last day, they test their teachable tidbit on their peers.

Table 3. Topics and Activities Related to Teaching Addressed during the CSC NFW (Days 1−3, in order)

- Just-in-Time Teaching exercise due
- Oral presentation: The Difference Between Teaching and Learning
- Just-in-Time Teaching: model implementation of JiTT and explanation of the technique
- Oral presentation: Introduction to scientific teaching/research-based teaching
- Active learning exercises: participants play the role of students; organizers model the activities
- Teachable tidbit project, Part 1: learning objectives, backward design of syllabi, and selection of content
- Teachable tidbit report-out
- Lunch topic: learning taxonomies
- Teachable tidbit project, Part 2: make a content element active
- Teachable tidbit report-out
- Oral presentation: Assessing student learning
- Teachable tidbit project, Part 3: develop formative assessments to assist student learning
- Teachable tidbit report-out
- Oral presentation: Engaging large classes and model use of clickers
- Dinner table topics: e.g., students' prior knowledge, designing a midterm exam
- Oral presentation: Addressing student diversity
- Teachable tidbit project, Part 4: trial run; participants try out their teachable tidbit on their peers in small groups
- Feedback on teachable tidbit projects: What worked? What was less successful? Were there trends?

Data Collected

Online surveys were collected from study participants 2 weeks before (pre survey), 1 week after (post survey), and a year after (delayed survey) the CSC NFW (Figure 2). Each took 20−30 min to fill out. The three surveys contained three common sections relevant to this study (see Supporting Information): (1) awareness and adoption of EBIPs, (2) attitudes and beliefs about student-centered teaching, and (3) teaching self-efficacy. Questions measuring awareness and adoption of EBIPs were leveraged from prior studies.32,56 Study participants were asked to identify their level of familiarity with, and use of, 16 EBIPs on a six-point Likert scale (see Supporting Information). A short description of each EBIP was also provided. Beliefs about teaching were measured through the Approaches to Teaching Inventory (ATI).57,58 The ATI has been demonstrated to provide valid and reliable data when characterizing postsecondary instructors' positions on both a teacher-centered and a student-centered scale.23,57,59 Measures on this instrument have been shown to reflect instructors' conceptions of teaching.22,52 Self-efficacy in teaching was measured with the Self-Efficacy toward Teaching Inventory-Adapted (SETI-A), which has also been shown to provide valid and reliable data.54 Validity and reliability tests (i.e., Cronbach's alphas and confirmatory factor analyses) on the latter two instruments provided satisfactory results for our study participants (see Supporting Information).


talk to their peer, and vote again),6,7 and Collaborative Learning.


Study participants were also invited in the pre survey to volunteer to have their course videotaped. We mailed a GoPro camera with a tripod and memory card to each volunteer. We asked them to videotape their class for a week and to place the camera in the back of the room. Local IRB offices were contacted in order to obtain approval for these data to be collected. Classroom observations were conducted during the fall semesters immediately following the CSC NFW and two years later (Figure 2). We also asked the videotaped participants to share with their students, at the end of the semester, a URL link to the Students' Evaluation of Educational Quality (SEEQ) survey.60 We chose this instrument because it allows us to triangulate faculty self-reports and observational data with student evaluations of the level of group work used in the study participants' courses. Levels of participation by study participants in the matched pre/post/delayed survey series, observations, and student evaluation surveys are provided in Table 4.



FINDINGS

This study investigated the short- and long-term impacts of a two-day professional development program on its participants' awareness of EBIPs, attitudes and beliefs about student-centered teaching, teaching self-efficacy, and instructional practices. In the following sections, we present the results associated with each of these potential impacts.

Impact on Awareness of Evidence-Based Instructional Practices

Study participants were probed about their awareness of 16 EBIPs on the pre, post, and delayed surveys. For each EBIP, a short description was provided along with a six-point Likert scale. A participant was coded as being aware of an EBIP if s/he selected any of the following options: I am familiar but have not used it; I am familiar and plan to implement it; In the past, I have used all or part of it but I am no longer using it; or I currently use all or part of it. As Figure 3 indicates, the CSC NFW raised

Table 4. Number of Faculty Who Provided Matched Pre/Post/Delayed Surveys, Observations, and Student Evaluation Surveys

Type of data collected                                     | Group     | Invited | Useable data | Response rate
Matched Pre/Post/Delayed Survey Series                     | Treatment | 81      | 47           | 58%
Matched Pre/Post/Delayed Survey Series                     | Control   | 120     | 18           | 15%
Observation (Post/Delayed)                                 | Treatment | 81/22   | 22/3         | 27%/4%
Observation (Post/Delayed)                                 | Control   | 120/5   | 5/0          | 4%/0%
Students' Evaluation of Educational Quality (Post/Delayed) | Treatment | 22/3    | 18/0         | 82%/0%
Students' Evaluation of Educational Quality (Post/Delayed) | Control   | 5/0     | 4/0          | 80%/0%
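The response rates in Table 4 are simply useable data divided by invited faculty. As an illustrative check (the cohort labels below are our shorthand, not the study's terminology), the post-workshop rates can be recomputed as:

```python
# Recompute selected Table 4 response rates (useable / invited), as whole percents.
# Cohort labels are shorthand for this sketch, not terminology from the study.
cohorts = {
    "survey series, treatment": (47, 81),
    "survey series, control": (18, 120),
    "observation post, treatment": (22, 81),
    "SEEQ post, treatment": (18, 22),
    "SEEQ post, control": (4, 5),
}

def response_rate(useable: int, invited: int) -> int:
    """Percentage of invited faculty who yielded useable data."""
    return round(100 * useable / invited)

rates = {name: response_rate(u, n) for name, (u, n) in cohorts.items()}
print(rates)  # e.g., 47/81 -> 58%, 18/120 -> 15%
```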

Figure 3. Average number of evidence-based instructional practices that the CSC NFW participants (i.e., treatment) and the control group knew on the pre, post, and delayed survey. Error bars represent the standard errors of the mean.


participants' level of awareness of EBIPs from an average of eight EBIPs prior to attending the workshop to 14 immediately following participation in the workshop, and to 15 a year later. A Friedman test demonstrated that this increase was significant, χ2(2, N = 47) = 73.880, p < 0.001.
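The coding rule and the Friedman test above can be sketched as follows. The data here are synthetic stand-ins, not the study's (the actual N was 47, with averages of roughly 8, 14, and 15 EBIPs), the response-option labels are abbreviated, and `scipy` is assumed:

```python
from scipy.stats import friedmanchisquare

# A respondent counts as "aware" of an EBIP when they choose any response
# option beyond the "not familiar" ones on the six-point scale
# (labels abbreviated from the survey's wording).
AWARE_OPTIONS = {
    "familiar, not used", "familiar, plan to implement",
    "used in the past", "currently use",
}

def n_aware(responses):
    """Number of EBIPs (out of 16) a respondent is coded as aware of."""
    return sum(r in AWARE_OPTIONS for r in responses)

# Synthetic awareness counts for eight faculty at the three time points.
pre     = [6, 8, 9, 7, 8, 10, 7, 9]
post    = [13, 14, 15, 12, 14, 15, 13, 15]
delayed = [14, 15, 16, 13, 15, 16, 14, 16]

# Friedman's test compares the three related samples (same faculty over time).
stat, p = friedmanchisquare(pre, post, delayed)
print(f"chi2(2) = {stat:.1f}, p = {p:.2g}")  # chi2(2) = 16.0, p = 0.00034
```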

Long-term impacts on the ATI student-centered scale were characterized with Friedman tests, which showed no significant differences across the three time points for either group. Thus, the gain the CSC NFW participants made as a result of their participation in the program was not fully sustained over time. Mixed between−within subjects ANOVA tests were conducted to evaluate the short- and long-term impacts of the program on the teacher-centered scale. There was no significant interaction between type of group and time, and no significant main effect of time (Figure 4b). However, the main effect comparing the two groups was significant, F(1,63) = 4.248, p = 0.043, ηp2 = 0.063. The treatment group was significantly less teacher-centered than the control group (Figure 4b). The same findings were observed when testing changes across all three time blocks. Only the main effect between the control and treatment groups was significant, F(1,63) = 5.675, p = 0.020, ηp2 = 0.083. In conclusion, no impact of the program on its participants' teacher-centered beliefs was observed.


However, there was a significant and strong interaction between time and type of group for the third item, Wilks' Lambda = 0.922, F(1,63) = 5.357, p = 0.024, ηp2 = 0.078 for the pre−post test and Wilks' Lambda = 0.900, F(2,62) = 3.297, p = 0.044, ηp2 = 0.096 for the pre−post−delayed test (Figure 5c). The treatment group significantly increased its agreement over time with I am interested in implementing other strategies than lecturing in my class (from M = 4.1 on the pre survey to M = 4.3 and M = 4.5 on the post and delayed surveys, respectively), while the control group's level of agreement decreased (from M = 4.2 on the pre survey to M = 3.8 and M = 4.0 on the post and delayed surveys). No significant interaction between time and group was observed for the last item, Group work is more appropriate in the recitation part of the course than in lecture, but we found a significant main effect for time in both the short-term, Wilks' Lambda = 0.903, F(1,63) = 6.791, p = 0.011, ηp2 = 0.097, and long-term impact analyses, Wilks' Lambda = 0.825, F(2,62) = 6.573, p = 0.003, ηp2 = 0.175. Both groups decreased in their agreement with this item: M = 3.4, M = 3.04, and M = 2.9 on the pre, post, and delayed surveys, respectively, for the treatment group, and M = 3.6, M = 3.3, and M = 3.3, respectively, for the control group (Figure 5d).

In summary, the CSC NFW program was able to shift its participants toward student-centered beliefs about teaching in the short term, but this effect diminished over time. The CSC NFW participants were less teacher-centered than the control group before participating in the program, and these beliefs were not impacted by the program. The program significantly influenced its participants' interest in implementing strategies other than lecturing, and this interest grew over time; however, no significant differences between the CSC NFW participants and the control group were observed in the short or long term on the other attitudinal measures.

Impact on Teaching Self-Efficacy

Study participants' self-efficacy was measured with the Self-Efficacy toward Teaching Inventory-Adapted (SETI-A). This 32-item measure provides a score between 32 and 128. Mixed between−within subjects ANOVA tests showed a statistically significant interaction between time and type of group for both the short-term, F(1,63) = 6.605, p = 0.013, ηp2 = 0.095, and long-term impact analyses, F(2,62) = 3.799, p = 0.028, ηp2 = 0.109 (Figure 6). The average SETI-A score increased significantly more over time for the treatment group (from 87.5 on the pre survey to 97.7 on the delayed survey) than for the control group (from 96.5 on the pre survey to 99.6 on the delayed survey). Interestingly, the treatment group had significantly lower self-efficacy on the pre survey than the control group; the CSC NFW program essentially closed this gap (Figure 6). This gap may indicate differences in the characteristics of faculty who participate in this type of program versus those who do not. Further studies are necessary to fully understand the causes of this gap. Increased teaching experience has been shown previously to positively influence self-efficacy.54 Our data are consistent with this finding: we saw similar growth in self-efficacy between the post and delayed surveys for the treatment and control groups (Figure 6).

Figure 6. Average score on the Self-Efficacy toward Teaching Inventory-Adapted (SETI-A) for the CSC NFW participants (i.e., treatment) and the control group on the pre, post, and delayed survey. Error bars represent the standard errors of the mean.

Impact on Instructional Practices

We triangulated several data sources to assess the impact of the CSC NFW program on the teaching of its participants. First, we asked study participants to self-report their use of 16 different EBIPs on each survey. Results are presented in Figure 7. We did not expect differences between the pre and post surveys, as these were collected over the course of a single summer month, but rather between the pre and delayed surveys. As expected, there was no change between the pre and post surveys for either group, with each implementing, on average, less than one EBIP out of the 16 provided. However, the average number of EBIPs implemented a year later was significantly higher for the treatment group (M = 5.4) than for the control group (M = 3.7), t(63) = −2.058, p = 0.044, η2 = 0.062.

Figure 7. Average number of evidence-based instructional practices that the CSC NFW participants (i.e., treatment) and the control group self-report adopting in their instructional practices on the pre, post, and delayed survey. Error bars represent the standard errors of the mean.

Second, we asked study participants to report their frequency of use of 11 different instructional behaviors (Table 5). These behaviors came from the Teaching Dimensions Observation Protocol64 and were accompanied by a short description, which ensured that study participants understood what the behaviors entailed (see Supporting Information). Options for frequency of use included Never; Once or twice per semester; Several times per semester; Weekly; Nearly every class; and Multiple times per class. We expected differences between the pre and delayed surveys for the same reason mentioned above. Focusing first on the lecture-type behaviors, Table 5 indicates that the treatment group reported similar levels of use of these behaviors across both surveys, with a frequency between Nearly every class and Multiple times per class; however, the control group reported a lower frequency of use of these behaviors a year later, shifting on average from Multiple times per class to Nearly every class. Mixed between−within subjects ANOVA tests showed that two of these behavioral shifts were significant, with moderate to large effect sizes: F(1,63) = 7.319, p = 0.009, ηp2 = 0.104 for Lecturing on problem solving, and F(1,63) = 7.706, p = 0.007, ηp2 = 0.109 for Interactive lecture. These results indicate that the frequency of use of lecturing strategies nonetheless remained high (i.e., at least every class) for both groups.

Next, we consider behaviors that are characteristic of student-centered teaching, i.e., Group work/Discussion,


Table 5. Average Frequency of Implementation of 11 Instructional Behaviors Reported on the Pre and Delayed Surveys, along with Results of the Mixed between−within Subjects Analysis of Variance (a)

Entries are Average (SE).

Instructional behavior                             | Control Pre | Control Delayed | Treatment Pre | Treatment Delayed | Mixed between−within subjects ANOVA results
Lecture with premade visuals                       | 5.7 (0.4)   | 5.4 (0.4)       | 5.1 (0.2)     | 5.3 (0.2)         | p > 0.05
Lecture with handwritten visuals                   | 6.6 (0.2)   | 6.2 (0.3)       | 5.8 (0.2)     | 5.9 (0.2)         | p > 0.05
Lecturing on problem solving (b)                   | 6.1 (0.2)   | 5.3 (0.3)       | 5.5 (0.2)     | 5.6 (0.1)         | p = 0.009; ηp2 = 0.104
Lecturing with demonstration of topic or phenomena | 4.2 (0.3)   | 3.8 (0.4)       | 3.9 (0.2)     | 3.7 (0.2)         | p > 0.05
Interactive lecture (b)                            | 6.1 (0.2)   | 5.2 (0.4)       | 5.8 (0.2)     | 5.9 (0.2)         | p = 0.007; ηp2 = 0.109
Illustration                                       | 5.9 (0.2)   | 5.3 (0.3)       | 5.8 (0.1)     | 5.5 (0.2)         | p > 0.05
Deskwork                                           | 3.6 (0.3)   | 3.3 (0.3)       | 3.5 (0.2)     | 4.0 (0.2)         | p > 0.05
Group work/discussion (b)                          | 3.6 (0.3)   | 3.4 (0.3)       | 3.6 (0.2)     | 4.5 (0.2)         | p = 0.018; ηp2 = 0.086
Whole class discussion (b)                         | 3.6 (0.3)   | 3.1 (0.3)       | 3.4 (0.2)     | 3.9 (0.2)         | p = 0.047; ηp2 = 0.061
Move into class (b)                                | 4.3 (0.4)   | 3.6 (0.4)       | 4.2 (0.2)     | 4.9 (0.2)         | p = 0.015; ηp2 = 0.091
Problem solving                                    | 5.1 (0.2)   | 4.4 (0.4)       | 4.8 (0.2)     | 4.9 (0.2)         | p > 0.05

(a) The choice of frequency ranges from Never (2) to Multiple times per class (7); (1) indicated Not Applicable. (b) Statistically significant differences were observed between the treatment and control group.
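With only two time points (pre and delayed), the group × time interaction tested in Table 5 is equivalent to comparing per-person change scores between the groups, which can be sketched with an independent-samples t test. The data below are synthetic illustrations, not the study's, and `scipy` is assumed:

```python
from scipy.stats import ttest_ind

# Synthetic pre/delayed frequency codes for "Group work/discussion"
# (scale: 2 = Never ... 7 = Multiple times per class); illustrative only.
control_pre,   control_delayed   = [4, 3, 4, 3, 4, 3], [3, 3, 3, 2, 3, 3]
treatment_pre, treatment_delayed = [4, 3, 4, 3, 4, 3, 4, 3], [5, 4, 5, 4, 5, 5, 4, 4]

# The 2 (group) x 2 (time) mixed-ANOVA interaction reduces to an
# independent t test on each person's change score (delayed - pre).
control_gain   = [d - p for p, d in zip(control_pre, control_delayed)]
treatment_gain = [d - p for p, d in zip(treatment_pre, treatment_delayed)]

t, p = ttest_ind(treatment_gain, control_gain)
print(f"t = {t:.2f}, p = {p:.4f}")  # here the treatment gains exceed the control decline
```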

Figure 8. (a) Average RTOP scores and (b) classification by types of COPUS profile of the video recordings collected in the fall semester following the CSC NFW program (22 treatment faculty and 5 control faculty) as well as two years later (3 treatment faculty).
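Two conversions are useful for reading the statistics reported in this study: an independent-samples t converts to η2 as t2/(t2 + df), and a Wilks' Lambda with one hypothesis degree of freedom converts to F as ((1 − Λ)/Λ)(df_err/df_hyp). A sketch checking two reported values (the small discrepancies reflect rounding of the published statistics):

```python
def eta_squared(t: float, df: float) -> float:
    """Eta-squared effect size from an independent-samples t test."""
    return t * t / (t * t + df)

def wilks_to_F(wilks: float, df_hyp: int, df_err: int) -> float:
    """Convert Wilks' Lambda to an F statistic; exact when df_hyp = 1."""
    return (1.0 - wilks) / wilks * (df_err / df_hyp)

# Reported EBIP-adoption comparison: t(63) = -2.058, eta^2 = 0.062.
print(f"eta^2 = {eta_squared(-2.058, 63):.3f}")  # 0.063

# Reported pre-post interaction (third attitude item): Lambda = 0.922, F(1,63) = 5.357.
print(f"F = {wilks_to_F(0.922, 1, 63):.2f}")  # 5.33
```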

Whole class discussion, and Move into class. Table 5 shows that the control and treatment groups reported an almost identical frequency of use of these behaviors on the pre survey, between Several times per semester and Weekly. Mixed between−within subjects ANOVA tests for each of these behaviors identified significant interactions between the type of group and time (Table 5). The control group reported lower frequencies of use of these behaviors on the delayed survey, averaging at Several times per semester, while the treatment group reported higher frequencies of use (average use is Weekly to Nearly every class on the delayed survey). These results indicate that, a year later, the CSC NFW participants integrated or believed that they were integrating more student-centered strategies in their teaching than the control group. The third piece of data was week-long observations of study participants’ courses. These observations were collected during the fall semester following the CSC NFW and two years later. Unfortunately, none of the control group participants provided

videos two years later. We thus compare the treatment and control groups on the videos collected during the fall semester immediately following the CSC NFW, and changes between the two time periods for the treatment group only. The majority of faculty who participated in the study taught upper-level undergraduate and graduate courses. Only five of the 22 treatment faculty who provided post videos taught lower-level undergraduate courses; none of the control and treatment faculty who provided videos two years later taught this type of course. We used the RTOP and COPUS (see Methods section) to analyze the video recordings and summarized these findings in Figure 8. The RTOP score measures the student-centeredness of teaching practices: a score below 30 indicates a teacher-centered environment, a score between 31 and 49 indicates a transitional environment, and a score between 50 and 100 indicates a student-centered environment.65 On average, workshop participants had a significantly higher RTOP score than the control group, t(58.361) = −2.238, p = 0.029, η2 =


Figure 9. Average level of agreement of students in treatment faculty courses (N = 18) and students in control faculty courses (N = 4) on the four items measuring the Group Interaction construct. 1 = Strongly disagree, 5 = Strongly agree. Error bars represent the standard errors of the mean.
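A course's Group Interaction score is the mean of its students' 1−5 ratings across the four items listed in Figure 9, and the group comparison is an independent t test on course-level scores. A minimal sketch with synthetic course scores, not the study's data (which covered 18 treatment and 4 control courses), with `scipy` assumed:

```python
from statistics import mean
from scipy.stats import ttest_ind

# Synthetic course-level Group Interaction scores (mean of the four
# 1-5 Likert items across a course's students); illustrative only.
treatment_courses = [4.5, 4.2, 4.4, 4.1, 4.6, 4.0, 4.3, 4.2]
control_courses = [3.8, 4.0, 3.7, 4.1]

# Independent t test comparing the two groups of courses.
t, p = ttest_ind(treatment_courses, control_courses)
print(f"M_treatment = {mean(treatment_courses):.2f}, "
      f"M_control = {mean(control_courses):.2f}, t = {t:.2f}, p = {p:.3f}")
```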

treatment group were significant, the effect sizes for each item ranged from moderate-large: p = 0.236, η2 = 0.082 for Students were encouraged to ask questions and were given meaningf ul answers and p = -.161, η2 = 0.010 for Students were encouraged to express their own ideas and/or question the instructor; to large: p = 0.050, η2 = 0.195 for Students were encouraged to participate in class discussions and p = 0.051, η2 = 0.194 for Students were invited to share their ideas and knowledge. Student data thus seem to align with the faculty self-reported data. Overall, the observational data indicate that faculty who participated in the CSC NFW program provided more opportunities for students to share their ideas in class when compared to the control group, although the instructional style was still largely teacher-centered.

0.060 (Figure 8a). The control group average (M = 28.4) was in the teacher-centered category, while the treatment group average (M = 33.8) was at the bottom of the transitional category. The average RTOP score for the three treatment faculty who provided data two years later declined from an average of 33.8−29.4. The transformation of COPUS data into COPUS profiles63 allows further interpretation of RTOP scores (Figure 8b). Over 90% of the control group observations were classified in Lecture profiles compared to two-thirds of the treatment observations. The last third of the treatment group’s observations were classified in Socratic profiles. Two years later, observations from two of the three treatment faculty who submitted data were still classified in the Lecture profiles (Figure 8b). Observations from the third faculty shifted from Socratic and Peer Instruction to Lecture and Socratic (Figure 8b). The observation data thus indicate that the treatment group, while being slightly more interactive with students, still approached their teaching in a teacher-centered manner. The last piece of data was the Student Evaluation of Educational Quality (SEEQ) survey. Since no incentive was provided to the students, the response rates varied widely from faculty to faculty. Moreover, the total number of students enrolled in a course was estimated from the surveys and the videos. The response rates that we calculated are thus inaccurate estimates. We arbitrarily chose to not include in the analysis courses for which less than 10% of the students in the class responded. This resulted in 22 study participants (18 treatment and four control) for which data could be analyzed. The average estimated response rate across these courses was 61% ± 38%. One particular construct measured by the SEEQ survey is called Group Interaction. 
It includes the following items: Students were encouraged to participate in class discussions; Students were invited to share their ideas and knowledge; Students were encouraged to ask questions and were given meaningf ul answers; and Students were encouraged to express their own ideas and/or question to the instructor. Students indicated on a fivepoint Likert scale the extent to which they agreed with these statements (1 = strongly disagree and 5 = strongly agree). An independent t test showed a large effect size although nonsignificant difference in the average level of agreement for the Group Interaction construct between the students of faculty in the treatment (M = 4.3, SE = 0.11) and control group (M = 3.9, SE = 0.25), t(20) = 1.809, p = 0.086, η2 = 0.154. Figure 9 presents the average students’ level of agreement on each of the four items in the Group Interaction construct. Although t tests showed that none of the differences between the control and
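The group comparisons above combine a Welch's (unequal-variances) independent t test, an eta-squared effect size, and the RTOP score bands. As an illustrative sketch only: the instructor scores below are invented placeholders, not the study's data, and the pooled degrees of freedom used for eta squared is one common simplification (the article's Welch test has fractional df).

```python
# Sketch of the kind of group comparison reported in the article.
# Scores here are hypothetical placeholders, NOT the study's raw data.
from scipy import stats

def rtop_category(score):
    """Map an RTOP score (0-100) to the bands described in the article."""
    if score <= 30:
        return "teacher-centered"
    elif score <= 49:
        return "transitional"
    return "student-centered"

# Hypothetical per-instructor RTOP scores for illustration.
control = [25.0, 27.0, 30.0, 28.0, 32.0]
treatment = [31.0, 36.0, 33.0, 35.0, 34.0]

# Welch's t test (unequal variances), as used for the RTOP comparison.
t, p = stats.ttest_ind(treatment, control, equal_var=False)

# One common eta-squared approximation from t; uses simple pooled df,
# whereas the Welch test itself has fractional df.
df = len(control) + len(treatment) - 2
eta_sq = t**2 / (t**2 + df)

print(rtop_category(33.8), round(p, 3), round(eta_sq, 3))
```

With the placeholder scores, the function reproduces the article's categorization of the group means (28.4 → teacher-centered, 33.8 → transitional); the t, p, and η² values depend entirely on the invented data.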



LIMITATIONS

One of the main limitations of this study is the small size of the control group. Although care was taken to ensure its similarity to the treatment group on key characteristics, the control group may be biased because its members were not randomly chosen but volunteered to participate. Moreover, we found that, on the pre survey, the treatment group knew significantly fewer EBIPs and held significantly less teacher-centered beliefs and lower self-efficacy than the control group. These differences indicate that the two groups are not completely equivalent and that the CSC NFW may attract a certain type of assistant professor. The inclusion of the control group nevertheless provided useful comparisons that enabled us to draw more adequate conclusions regarding the extent of the program's impact. A second limitation is the focus of the analyses on the lecture component of the course. Other aspects of the course that were not captured, such as recitation or the laboratory, could provide a student-centered environment. However, study participants mostly taught upper-level undergraduate and graduate courses, which typically contain no components other than lecture. Finally, the collection of student data was difficult, which is reflected in the large variation in response rates and the low number of courses that could be included in the analysis. The nature of this work prohibited us from providing any form of incentive that would ensure higher response rates, such as extra credit or compensation. Nevertheless, the data were insightful, and evaluations of professional development programs should integrate similar measures to triangulate observational data.


DISCUSSION AND IMPLICATIONS

The goal of this study was to evaluate the impact of a two-day professional development program on chemistry assistant professors' instructional practices and ways of thinking about teaching. The findings highlight that even a short program can have a substantial impact. We found support for several of the hypotheses we developed from the TCSR Model for a college classroom. In particular, the CSC NFW program enhanced its participants' knowledge of EBIPs and teaching self-efficacy. It also appeared to enhance the integration of student−student and student−instructor interactions in participants' classes. These results might be improved if faculty were supported upon their return to their home institutions. The CSC NFW program is attempting to provide this support in two ways. First, since 2013, the organizers have invited alumni of the program to participate in monthly themed online discussion sessions. Topics have included reflection on what has and has not worked, midterm assessments, and getting students to "buy in" to active learning. This is intended to provide a trusted community for faculty who may have no other support. Second, since 2014, the program organizers have informed the Teaching and Learning Centers at the faculty's home institutions about their participation in the program and encouraged the faculty and the centers to interact. Studies are ongoing to evaluate the impact of these initiatives, which are intended to promote contextual factors that favor faculty implementation of student-centered instructional practices.

The need for local support was highlighted by the longitudinal data collected as part of this study. The impact of professional development may appear positive, negative, or minimal in the short term, but these effects may diminish or grow over time. For example, we saw a short-term impact on the CSC NFW participants' student-centered beliefs about teaching, but this gain had diminished a year later. On the other hand, we saw growth in the CSC NFW participants' interest in implementing instructional strategies other than lecture during the year following their participation in the program. The program may also have planted ideas that faculty will only put into action once they are granted tenure. To truly characterize the effect of an instructional reform, evaluation studies must collect data longitudinally. It is unclear how long it takes before claims can be made about the success of a reform effort, although some have suggested a period of at least three years.4 Funding programs promoting instructional reforms should thus account for the need to conduct longitudinal studies in order to measure the actual impact of the initiatives they fund.

Notes

The authors declare no competing financial interest.



ACKNOWLEDGMENTS

We thank Matthew Moffitt and Kaitlyn Rosploch for their help with the logistics associated with the collection of video recordings as well as student and faculty surveys. We also thank Andrew Feig, Trisha Vickrey, Rory Waterman, and Jodi Wesemann for their helpful feedback on the manuscript. Finally, we thank the reviewers for their insightful feedback.



ASSOCIATED CONTENT

Supporting Information

The pre, post, and delayed surveys are provided, along with the results of the validity and reliability tests for the ATI and SETIA instruments. The Supporting Information is available on the ACS Publications website at DOI: 10.1021/acs.jchemed.5b00324.



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

REFERENCES

(1) Handelsman, J.; Ebert-May, D.; Beichner, R.; Bruns, P.; Chang, A.; DeHaan, R.; Gentile, J.; Lauffer, S.; Stewart, J.; Tilghman, S. M.; Wood, W. B. Scientific teaching. Science 2004, 304, 521.
(2) Handelsman, J.; Miller, S.; Pfund, C. Scientific Teaching; W. H. Freeman & Company, in collaboration with Roberts & Company Publishers: New York, 2006.
(3) National Research Council. Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics Education: Summary of Two Workshops; The National Academies Press: Washington, DC, 2011.
(4) National Research Council. Status, Contributions, and Future Direction of Discipline-Based Education Research; The National Academies Press: Washington, DC, 2012.
(5) Moog, R. S.; Spencer, J. N. Process-Oriented Guided Inquiry Learning; ACS Symposium Series 994; American Chemical Society: Washington, DC, 2008.
(6) Mazur, E. Peer Instruction: A User's Manual; Prentice Hall: Upper Saddle River, NJ, 1997.
(7) Vickrey, T.; Rosploch, K.; Rahmanian, R.; Pilarz, M.; Stains, M. Research-based implementation of peer instruction: A literature review. CBE Life Sci. Educ. 2015, 14, es3.
(8) Eberlein, T.; Kampmeier, J.; Minderhout, V.; Moog, R. S.; Platt, T.; Varma-Nelson, P.; White, H. B. Pedagogies of engagement in science: A comparison of PBL, POGIL, and PLTL. Biochem. Mol. Biol. Educ. 2008, 36, 262.
(9) Haak, D. C.; HilleRisLambers, J.; Pitre, E.; Freeman, S. Increased structure and active learning reduce the achievement gap in introductory biology. Science 2011, 332, 1213.
(10) Baker, L. A.; Chakraverty, D.; Columbus, L.; Feig, A. L.; Jenks, W. S.; Pilarz, M.; Stains, M.; Waterman, R.; Wesemann, J. L. Cottrell Scholars Collaborative New Faculty Workshop: Professional development for new chemistry faculty and initial assessment of its efficacy. J. Chem. Educ. 2014, 91, 1874.
(11) Cottrell Scholars Collaborative. http://www.chem.wayne.edu/feiggroup/CSCNFW/ (accessed Jan 28, 2015).
(12) Council of Scientific Society Presidents. The Role of Scientific Societies in STEM Faculty Workshops: A Report of the May 3, 2012 Meeting; American Chemical Society: Washington, DC, 2013.
(13) Felder, R. M.; Brent, R. The National Effective Teaching Institute: Assessment of impact and implications for faculty development. J. Eng. Educ. 2010, 99, 121.
(14) Henderson, C. Promoting instructional change in new faculty: An evaluation of the physics and astronomy new faculty workshop. Am. J. Phys. 2008, 76, 179.
(15) Kane, R.; Sandretto, S.; Heath, C. Telling half the story: A critical review of research on the teaching beliefs and practices of university academics. Rev. Educ. Res. 2002, 72, 177.
(16) Ebert-May, D.; Derting, T. L.; Hodder, J.; Momsen, J. L.; Long, T. M.; Jardeleza, S. E. What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience 2011, 61, 550.
(17) D'Eon, M.; Sadownik, L.; Harrison, A.; Nation, J. Using self-assessments to detect workshop success: Do they work? Am. J. Eval. 2008, 29, 92.


(18) Derting, T. L.; Maher, J. M.; Passmore, H. A.; Henkel, T. P.; Arnold, B.; Momsen, J. L.; Ebert-May, D. National Association of Biology Teachers Conference, Cleveland, OH, 2014.
(19) Talanquer, V. DBER and STEM education reform: Are we up to the challenge? J. Res. Sci. Teach. 2014, 51, 809.
(20) Schuster, J. H.; Finkelstein, M. J. The American Faculty: The Restructuring of Academic Work and Careers; Johns Hopkins University Press: Baltimore, MD, 2006.
(21) Murray, K.; Macdonald, R. The disjunction between lecturers' conceptions of teaching and their claimed educational practice. Higher Educ. 1997, 33, 331.
(22) Kember, D.; Kwan, K. P. Lecturers' approaches to teaching and their relationship to conceptions of good teaching. Instr. Sci. 2000, 28, 469.
(23) Trigwell, K.; Prosser, M. Congruence between intention and strategy in university science teachers' approaches to teaching. Higher Educ. 1996, 32, 77.
(24) Hora, M. A situative analysis of the relationship between faculty beliefs and teaching practice: Implications for instructional improvement at the postsecondary level. 2012. http://www.wcer.wisc.edu/publications/workingPapers/Working_Paper_No_2012_10.pdf (accessed Jul 1, 2015).
(25) Dancy, M.; Henderson, C. Framework for articulating instructional practices and conceptions. Phys. Rev. ST Phys. Educ. Res. 2007, 3, 010103.
(26) Henderson, C.; Beach, A.; Finkelstein, N. Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. J. Res. Sci. Teach. 2011, 48, 952.
(27) Henderson, C.; Cole, R.; Froyd, J.; Khatri, R. Five claims about effective propagation: A white paper prepared for January 30−31, 2012 meetings with NSF-TUES program directors. 2012. http://homepages.wmich.edu/~chenders/Publications/2012WhitePaperFiveClaims.pdf (accessed May 4, 2015).
(28) Gess-Newsome, J.; Southerland, S. A.; Johnston, A.; Woodbury, S. Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. Am. Educ. Res. J. 2003, 40, 731.
(29) Kezar, A. J. Understanding and Facilitating Organizational Change in the 21st Century: Recent Research and Conceptualizations; Jossey-Bass: San Francisco, CA, 2001.
(30) Austin, A. Promoting evidence-based change in undergraduate science education: A white paper commissioned by the National Academies National Research Council Board on Science Education. 2011. http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072578.pdf (accessed May 4, 2015).
(31) Gess-Newsome, J.; Southerland, S. A.; Johnston, A.; Woodbury, S. Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. Am. Educ. Res. J. 2003, 40, 731.
(32) Henderson, C.; Dancy, M.; Niewiadomska-Bugaj, M. The use of research-based instructional strategies in introductory physics: Where do faculty leave the innovation-decision process? Phys. Rev. ST Phys. Educ. Res. 2012, 8, 020104.
(33) Derting, T. L.; Ebert-May, D. Learner-centered inquiry in undergraduate biology: Positive relationships with long-term student achievement. CBE Life Sci. Educ. 2010, 9, 462.
(34) Ebert-May, D.; Weber, E. P. First−what's next? CBE Life Sci. Educ. 2006, 5, 27.
(35) Anderson, W.; Banerjee, U.; Drennan, C.; Elgin, S.; Epstein, I.; Handelsman, J.; Hatfull, G.; Losick, R.; O'Dowd, D.; Olivera, B. Changing the culture of science education at research universities. Science 2011, 331, 152.
(36) Brownell, S. E.; Tanner, K. D. Barriers to faculty pedagogical change: Lack of training, time, incentives, and...tensions with professional identity? CBE Life Sci. Educ. 2012, 11, 339.
(37) Childs, P. E. Improving chemical education: Turning research into effective practice. Chem. Educ. Res. Pract. 2009, 10, 189.
(38) Froyd, J. Propagation and realization of educational innovations in the system of undergraduate STEM education: A white paper commissioned for the National Academy of Engineering Forum "Characterizing the Impact and Diffusion of Engineering Education Innovations". 2011. https://www.nae.edu/File.aspx?id=36824 (accessed May 4, 2015).
(39) Henderson, C.; Dancy, M. H. Increasing the impact and diffusion of STEM education innovations: A white paper commissioned for the National Academy of Engineering Forum "Characterizing the Impact and Diffusion of Engineering Education Innovations". 2011. https://www.nae.edu/File.aspx?id=36304 (accessed May 4, 2015).
(40) Hora, M. T. Organizational factors and instructional decision-making: A cognitive perspective. Rev. High. Educ. 2012, 35, 207.
(41) Seymour, E.; DeWelde, K.; Fry, C. Determining progress in improving undergraduate STEM education: The reformers' tale. A white paper commissioned for the National Academy of Engineering Forum "Characterizing the Impact and Diffusion of Engineering Education Innovations". 2011. https://www.nae.edu/File.aspx?id=36664 (accessed May 4, 2015).
(42) Walczyk, J. J.; Ramsey, L. L.; Zha, P. Obstacles to instructional innovation according to college science and mathematics faculty. J. Res. Sci. Teach. 2007, 44, 85.
(43) Henderson, C.; Dancy, M. Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Phys. Rev. ST Phys. Educ. Res. 2007, 3, 020102.
(44) Dole, J. A.; Sinatra, G. M. Reconceptualizing change in the cognitive construction of knowledge. Educ. Psychol. 1998, 33, 109.
(45) Feldman, A. Decision making in the practical domain: A model of practical conceptual change. Sci. Educ. 2000, 84, 606.
(46) Gregoire, M. Is it a challenge or a threat? A dual-process model of teachers' cognition and appraisal processes during conceptual change. Educ. Psychol. Rev. 2003, 15, 147.
(47) Strike, K. A.; Posner, G. J. In Philosophy of Science, Cognitive Psychology, and Educational Theory and Practice; Duschl, R., Hamilton, R., Eds.; SUNY Press: Albany, NY, 1992; p 147.
(48) Sunal, D. W.; Hodges, J.; Sunal, C. S.; Whitaker, K. W.; Freeman, L. M.; Edwards, L.; Johnston, R. A.; Odell, M. Teaching science in higher education: Faculty professional development and barriers to change. Sch. Sci. Math. 2001, 101, 246.
(49) Gess-Newsome, J. In Re-examining Pedagogical Content Knowledge in Science Education; Berry, A., Friedrichsen, P., Loughran, J., Eds.; Routledge Press: London, 2015.
(50) Ghaith, G.; Yaghi, H. Relationships among experience, teacher efficacy, and attitudes toward the implementation of instructional innovation. Teach. Teach. Educ. 1997, 13, 451.
(51) Guskey, T. R. Teacher efficacy, self-concept, and attitudes toward the implementation of instructional innovation. Teach. Teach. Educ. 1988, 4, 63.
(52) Gordon, C.; Debus, R. Developing deep learning approaches and personal teaching efficacy within a preservice teacher education context. Br. J. Educ. Psychol. 2002, 72, 483.
(53) Postareff, L.; Lindblom-Ylänne, S.; Nevgi, A. A follow-up study of the effect of pedagogical training on teaching in higher education. High. Educ. 2008, 56, 29.
(54) Prieto, L. R.; Altmaier, E. M. The relationship of prior training and previous teaching experience to self-efficacy among graduate teaching assistants. Res. High. Educ. 1994, 35, 481.
(55) Lewis, S. E. In Tools of Chemistry Education Research; Bunce, D. M., Cole, R. S., Eds.; American Chemical Society: Washington, DC, 2014; p 115.
(56) Henderson, C.; Dancy, M. The impact of physics education research on the teaching of introductory quantitative physics in the United States. Phys. Rev. ST Phys. Educ. Res. 2009, 5, 020107.
(57) Trigwell, K.; Prosser, M. Development and use of the approaches to teaching inventory. Educ. Psychol. Rev. 2004, 16, 409.
(58) Trigwell, K.; Prosser, M.; Ginns, P. Phenomenographic pedagogy and a revised approaches to teaching inventory. High. Educ. Res. Dev. 2005, 24, 349.

(59) Lindblom-Ylänne, S.; Trigwell, K.; Nevgi, A.; Ashwin, P. How approaches to teaching are affected by discipline and teaching context. Stud. High. Educ. 2006, 31, 285.
(60) Marsh, H. W. SEEQ: A reliable, valid, and useful instrument for collecting students' evaluations of university teaching. Br. J. Educ. Psychol. 1982, 52, 77.
(61) Piburn, M.; Sawada, D.; Turley, J.; Falconer, K.; Benford, R.; Bloom, I.; Judson, E. Reformed Teaching Observation Protocol (RTOP) Reference Manual; Arizona Collaborative for Excellence in the Preparation of Teachers: Tempe, AZ, 2000.
(62) Smith, M. K.; Jones, F. H.; Gilbert, S. L.; Wieman, C. E. The Classroom Observation Protocol for Undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE Life Sci. Educ. 2013, 12, 618.
(63) Lund, T. J.; Pilarz, M.; Velasco, J. B.; Chakraverty, D.; Rosploch, K.; Undersander, M.; Stains, M. The best of both worlds: Building on the COPUS and RTOP observation protocols to easily and reliably measure various levels of reformed instructional practices. CBE Life Sci. Educ. 2015, 14, ar18. DOI: 10.1187/cbe.14-10-0168.
(64) Hora, M.; Ferrare, J. University of Wisconsin−Madison, Wisconsin Center for Education Research, Madison, WI, 2010.
(65) Budd, D.; van der Hoeven Kraft, K.; McConnell, D.; Vislova, T. Characterizing teaching in introductory geology courses: Measuring classroom practices. J. Geosci. Educ. 2013, 61, 461.


DOI: 10.1021/acs.jchemed.5b00324 J. Chem. Educ. XXXX, XXX, XXX−XXX