
An analysis of data from the National Assessment of Educational Progress provides insight into the relationships between mathematics coaches and specialists and student achievement.

Given the ineffectiveness of traditional, whole-group, one-shot professional development workshops (Ball & Cohen, 1999), many school districts have, in recent years, turned to instructional coaches to support the teaching and learning of mathematics (McGatha, Davis, & Stokes, 2015). However, coaching is expensive (Knight, 2012), especially when experienced teachers are removed from the classroom to serve as coaches (Mangin, 2009), and additional research is needed to better understand whether the benefits of coaching outweigh the costs.

In theory, coaching should be an effective professional development model, since coaches can meet with teachers multiple times throughout the school year, closely observe their teaching practice, and help them address specific problems and challenges they face (Desimone & Pak, 2017; Gibbons & Cobb, 2017). As yet, though, little large-scale empirical research has been conducted into how well this model actually improves student achievement. Most studies of this topic have been small-scale and qualitative in nature, and few researchers have conducted large-scale, quantitative studies of the relationships between coaching and student achievement (Hjalmarson et al., in press; Saclarides et al., in press). 

One group of studies has shown some promising results, suggesting that math coaching was related to student gains (Balfanz, MacIver, & Byrnes, 2006; Campbell, 1996; Foster & Noyce, 2004). In these studies, however, the coaching was part of a larger school reform initiative, making it difficult to tease out the extent to which the coaching itself, and not some other part of the reform, made a difference.

To get a clearer picture of how effective mathematics coaching and mathematics specialists really are, we and other researchers have developed a small body of research that attempts to zero in on the effects of coaching itself, apart from any other variables that might influence student achievement (Campbell & Malkus, 2011; Ellington, Whitenack, & Edwards, 2017; Harbour, Adelson, & Karp, 2016; Harbour et al., 2018). As part of this work, we conducted an empirical study that drew upon restricted-use data from the National Assessment of Educational Progress (NAEP), administered by the National Center for Education Statistics (NCES), to explore relationships between the presence and responsibilities of mathematics coaches and specialists and student achievement (Harbour et al., 2016, 2018). To our knowledge, this is the only research to date that has measured the influence of mathematics coaches and specialists on student achievement using both large-scale and nationally representative data.

Delving into the data 

While the words coach and specialist can be defined in a number of ways (National Mathematics Advisory Panel, 2008), a common definition distinguishes a coach as someone who directly supports teachers by providing them with professional learning opportunities, whereas a specialist is someone who works directly with students (McGatha, Davis, & Stokes, 2015). However, NAEP combines the terms coach and specialist in data collection, using the abbreviation MCS to refer to both mathematics coaches and specialists. Thus, our findings and discussions below pertain to both coaches and specialists, as it is not possible to tease out the unique influence of each role.  

We analyzed a data sample that included roughly 191,000 students, from 7,500 elementary schools, who participated in the 2011 4th-grade NAEP in mathematics (including data from the accompanying student, teacher, and school surveys; NCES, 2011a, 2011b). Because NAEP is a large-scale, nationally representative dataset, our findings can be generalized to a larger population than is possible for smaller-scale studies (Schneider et al., 2007). Also, NAEP is the only large-scale assessment dataset that makes it possible to analyze variables pertaining to MCSs.

Not only are NAEP’s data-collection techniques well respected and widely regarded as valid, but its dataset is also particularly rich and varied, including data on overall mathematics achievement as well as achievement in specific mathematical content domains, such as geometry, data analysis, and measurement. It also includes a wealth of other data for each participating student, teacher, and school (for example, on student demographics, teachers’ years of experience, school size, and more), allowing us to analyze various relationships among these variables.

Thus, in addition to analyzing how the presence of MCSs in schools, and their specific job responsibilities, related to students’ mathematics achievement, we were also able to see how these related to variables such as students’ opinions of mathematics, the mathematics content that teachers emphasized in their classes, the percentage of students receiving free or reduced-price lunch, whether a school was public or private, and many more (see Harbour et al., 2018, for a complete list of variables and controls).

To gauge whether the mere presence of MCSs was related to student achievement, we looked at whether an elementary school’s policy of employing an MCS (full time, part time, or not at all) was connected to students’ overall mathematics achievement score and their scores in five specific content domains (number properties and operations; measurement; geometry; data analysis, statistics, and probability; and algebra). Of the elementary schools in our sample, 62% reported that they did not employ an MCS and 28% reported having an MCS available (full time at 50% of those schools and part time at the other 50%).
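For readers curious about how this kind of analysis is set up, the sketch below shows a minimal two-level model (students nested within schools) relating a school-level MCS policy to students’ mathematics scores, written in Python with the statsmodels library. All of the column names are hypothetical, and the sketch leaves out features of the actual study, including NAEP’s plausible values, sampling weights, and the full set of student- and school-level controls (see Harbour et al., 2018); it illustrates the general approach rather than reproducing our analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical extract: one row per student, with a school identifier, a
# mathematics score, a school-level MCS policy ("none", "part_time",
# "full_time"), and a few illustrative controls. Restricted-use NAEP data
# are not actually distributed as a simple CSV; this file name is a placeholder.
df = pd.read_csv("naep_extract.csv")

# Random-intercept model: students (level 1) nested within schools (level 2),
# with the MCS policy entering as a school-level predictor and "none" as the
# reference category.
model = smf.mixedlm(
    "math_score ~ C(mcs_policy, Treatment('none')) + frl_pct"
    " + C(school_type) + teacher_years_exp",
    data=df,
    groups=df["school_id"],
)
result = model.fit()

# The coefficients on the mcs_policy terms describe associations with
# achievement, not causal effects.
print(result.summary())
```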

Next, because the work of MCSs can vary widely from school to school, we looked for connections between mathematics achievement (both overall and in specific areas) and principals’ reports as to how much time those coaches and specialists spent on six specific responsibilities: (1) providing technical assistance/support to individual teachers about mathematics content or the teaching of mathematics, (2) conducting professional development for groups of teachers about mathematics content or the teaching of mathematics, (3) providing mathematics instruction to students on various topics, (4) providing mathematics instruction to students at various grade levels, (5) providing mathematics remediation/intervention to some student groups, and (6) providing mathematics enrichment to some student groups (NCES, 2011b).
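This second analysis can be sketched in the same hedged way: the policy indicator is swapped for principal-reported time codes on each of the six responsibilities, and the model is fit to the schools with a full-time MCS. As before, the column names are hypothetical, and the actual analysis involved NAEP’s plausible values, weights, and many additional controls.

```python
import statsmodels.formula.api as smf

# df as in the previous sketch, restricted here to schools reporting a
# full-time MCS. Each hypothetical time_* column codes how much time the
# principal reported the MCS spending on one of the six responsibilities
# (e.g., 0 = none ... 3 = a lot).
full_time = df[df["mcs_policy"] == "full_time"]

resp_model = smf.mixedlm(
    "math_score ~ time_support_individual_teachers + time_pd_for_teacher_groups"
    " + time_instruction_topics + time_instruction_grade_levels"
    " + time_remediation + time_enrichment"
    " + frl_pct + C(school_type)",
    data=full_time,
    groups=full_time["school_id"],
)
print(resp_model.fit().summary())
```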

Using a large-scale, nationally representative dataset such as NAEP has many benefits for this sort of research, but it also has some limitations that must be considered when interpreting the results. For instance, NAEP represents student achievement at only one particular point in time, so it doesn’t allow for analyses across time. And while NAEP collects data on a rich range of variables, it leaves some things out; for example, it doesn’t define full-time and part-time employment precisely, making it impossible to know exactly how much time the MCSs spent in their schools. In addition, principals, rather than the MCSs themselves, were asked to estimate how much time MCSs spent on specific job responsibilities, even though the MCSs’ own reports might have been more accurate. And most important, while these data allow us to see how specific variables are correlated with each other, they don’t allow us to say for sure that one factor caused the other.

What we learned  

When we analyzed the data, two important findings emerged that have implications for schools and districts. First, we found a statistically significant relationship between elementary schools that had full-time MCSs and 4th-grade students’ mathematics achievement, both overall and across the five specific mathematics content domains. In other words, we can tentatively say that elementary schools with a full-time MCS are likely to have higher 4th-grade mathematics achievement than schools without one, after controlling for many variables, including school and student demographic data and teaching practices. (For details on these variables and how they affect our findings, see Harbour et al., 2018.) This is consistent with prior research (e.g., Campbell & Malkus, 2011; Ellington, Whitenack, & Edwards, 2017; Foster & Noyce, 2004). In contrast, our findings indicated that schools with a part-time MCS did not have significantly different 4th-grade mathematics achievement scores (overall and across content domains) than schools without one.

Second, we noted significant relationships between the responsibilities of full-time MCSs (as reported by principals) and achievement outcomes. Interestingly, results revealed statistically significant positive relationships between the time MCSs spent on the two responsibilities related to supporting teachers (i.e., supporting both individuals and groups of teachers) and student achievement, but there was either no relationship or a negative relationship for their work with students. (See Table 1 for details on specific relationships with overall mathematics achievement and across the five content domains.)

Again, because the data do not indicate a causal relationship, the negative results when MCSs work more with students must be interpreted with caution. These results do not necessarily mean that having coaches and specialists work directly with students results in lower mathematics achievement scores; it is likely that full-time MCSs in schools with lower achievement spend more time providing instruction, remediation, and/or interventions to students than they do in other schools.

What this means for schools 

Given that a significant and positive relationship was found only between full-time coaches and specialists and 4th-grade students’ mathematics achievement, the most pressing implication for district administrators and principals is to prioritize hiring full-time, not part-time, MCSs. Because decades of research stress that effective professional development is ongoing and time-intensive (Darling-Hammond et al., 2009; Desimone, 2009; Hawley & Valli, 1999), our findings make sense. And yet, given the high cost of coaching, it is understandable that some school districts will decide that they can only afford part-time MCSs in individual schools, whether that involves sharing MCSs among several different schools or having an MCS spend half of their time coaching and the other half as an interventionist or classroom teacher. However, our results indicate that schools can expect little to no return on the coaching investment when an MCS works only part time, so district administrators and principals would do well to keep the limitations of part-time coaching in mind when making spending decisions.

District administrators and principals should also think carefully about how they define the responsibilities of their MCSs, particularly the amount of time they spend coaching teachers and/or working directly with students. As they write job descriptions, they should strongly consider emphasizing having MCSs work with teachers in either one-on-one or group settings. Although NAEP does not explicitly define the meaning of one-on-one and group coaching, the literature provides some examples of what it could look like. One-on-one coaching activities may include modeling, co-teaching, and engaging in the coaching cycle (goal setting, working, and reflecting); group coaching may include engaging in lesson study, analyzing classroom videos, examining student work, and studying disciplinary content with other teachers (Gibbons & Cobb, 2017). By focusing coaches’ work on learning opportunities for teachers, administrators can tentatively expect to see a return on their coaching investment.  

This research is a significant step forward in understanding the relationship between MCSs and student achievement. Such research is important, given how quickly schools have invested in hiring MCSs in recent years, as well as the high cost of this professional development model. The evidence of the positive influence of MCSs, particularly when they work with teachers, will be important to leaders as they set professional development priorities. But it is just the beginning — additional research is needed to establish a causal relationship between the work of MCSs and improved student performance and to show how their work relates to student achievement in other subject areas and grade levels.   

References 

Balfanz, R., MacIver, D.J., & Byrnes, V. (2006). The implementation and impact of evidence-based mathematics reforms in high poverty middle schools: A multi-site, multi-year study. Journal for Research in Mathematics Education, 37, 33-64.

Ball, D. & Cohen, D. (1999). Developing practice, developing practitioners: Toward a practice-based theory of professional education. In L. Darling-Hammond & G. Sykes (Eds.), Teaching as the learning profession: Handbook of policy and practice (pp. 3-32). San Francisco, CA: Jossey-Bass Publishers. 

Campbell, P.F. (1996). Empowering children and teachers in the elementary mathematics classrooms of urban schools. Urban Education, 30, 449–475.  

Campbell, P.F. & Malkus, N.N. (2011). The impact of elementary mathematics coaches on student achievement. The Elementary School Journal, 111, 430-454.  

Darling-Hammond, L., Wei, R.C., Andree, A., Richardson, N., & Orphanos, S. (2009). Professional learning in the learning profession. Washington, DC: National Staff Development Council. 

Desimone, L.M. (2009). Improving impact studies of teachers’ professional development: Toward a better conceptualization and measures. Educational Researcher, 38 (3), 181–199. 

Desimone, L.M. & Pak, K. (2017). Instructional coaching as high-quality professional development. Theory Into Practice, 56 (1), 3-12.  

Ellington, A., Whitenack, J., & Edwards, D. (2017). Effectively coaching middle school teachers: A case for teacher and student learning. The Journal of Mathematical Behavior, 46, 177-195.  

Foster, D. & Noyce, P. (2004). The mathematics assessment collaborative: Performance testing to improve instruction. Phi Delta Kappan, 85, 367-374.  

Gibbons, L.K. & Cobb, P. (2017). Focusing on teacher learning opportunities to identify potentially productive coaching activities. Journal of Teacher Education, 68 (4), 411-425.  

Harbour, K.E., Adelson, J.L., & Karp, K.S. (2016, April). Examining the relationships among mathematics coaches and specialists, student achievement, and disability status. [Paper Session]. American Educational Research Association Annual Meeting. Washington, DC. 

Harbour, K.E., Adelson, J.L., Karp, K.S., & Pittard, C.M. (2018). Examining the relationships among mathematics coaches and specialists, student achievement, and disability status: A multi-level analysis using National Assessment of Educational Progress data. The Elementary School Journal, 118 (4), 654-679.

Hawley, W.D. & Valli, L. (1999). The essentials of effective professional development: A new consensus. In G. Sykes and L. Darling-Hammond (Eds.), Teaching as the learning profession: Handbook of policy and practice (pp. 127-150). San Francisco, CA: Jossey-Bass. 

Hjalmarson, M.A., Saclarides, E.S., Harbour, K.E., Livers, S.D., & Baker, C.K. (in press). Mathematics specialists and teacher leaders: An ongoing qualitative synthesis. Proceedings of the 42nd annual meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education. Mazatlán, Sinaloa, Mexico: Department of Mathematics Education of Cinvestav and the Mexican Association for Research on the Use of Technology in Mathematics Education. 

Knight, D.S. (2012). Assessing the cost of instructional coaching. Journal of Education Finance, 38 (1), 52-80.  

Mangin, M.M. (2009). Literacy coach role implementation: How district context influences reform efforts. Educational Administration Quarterly, 45 (5), 759-792.  

McGatha, M.B., Davis, R., & Stokes, A. (2015). The impact of mathematics coaching on teachers and students. (Brief). Reston, VA: National Council of Teachers of Mathematics. 

National Center for Education Statistics. (2011a). The nation’s report card: Mathematics 2011 (NCES 2012-458). Washington, DC: U.S. Department of Education, Institute of Education Sciences.  

National Center for Education Statistics. (2011b). Reading and Mathematics School Questionnaire, 2011, Grade 4. Washington, DC: U.S. Department of Education, Institute of Education Sciences.  

National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education. 

Saclarides, E.S., Baker, C.K., Mudd, A., Livers, S.D., Harbour, K.E., & Hjalmarson, M.A. (in press). An exploration of mathematics teacher leaders in PME-NA Proceedings from 1984-2019. Proceedings of the 42nd annual meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education. Mazatlán, Sinaloa, Mexico: Department of Mathematics Education of Cinvestav and the Mexican Association for Research on the Use of Technology in Mathematics Education. 

Schneider, B., Carnoy, M., Kilpatrick, J., Schmidt, W.H., & Shavelson, R.J. (2007). Estimating causal effects: Using experimental and observational designs. Washington, DC: American Educational Research Association. 

About the authors

Kristin E. Harbour is an assistant professor in the College of Education, University of South Carolina, Columbia, SC.

Evthokia Stephanie Saclarides is an assistant professor in the Department of Curriculum and Instruction at the University of Cincinnati, Ohio.
