Three practical innovations can help district leaders ensure their coaching programs are implemented as intended.

In articles and books about teacher professional learning, you often read promising stories about the value of an instructional coach. Maybe a coach helped a novice teacher succeed in a challenging environment and stay in the profession, for example. Or a coach inspired an experienced teacher to reflect and grow.

However, getting great coaching to happen across several schools or an entire district is really hard. In fact, there’s an alarming pattern in studies of the impact of instructional coaching programs. As the number of teachers in a study sample increases, the positive effect of coaching drops (Kraft, Blazar, & Hogan, 2018). That’s no surprise to those who lead school systems. District leaders may be able to identify one or two extraordinary coaches with the magic touch, but other coaches don’t succeed consistently.

How, then, can we provide high-quality instructional coaching districtwide?

Three strategies that did the trick

In the world of research and development (R&D), reproducibility is an imperative. Nobody wants to invest in developing a program that can’t be tested, refined, and reproduced. What would be the point if only a few students benefit? So, those doing R&D on instructional coaching have wrestled with how to help coaches implement a coaching program with fidelity — truly adhering to the program design. Three innovations from that work are particularly practical for district leaders who want to see high-quality instructional coaching districtwide.

To illustrate, we describe how the three innovations are used in an extensively researched coaching program called MyTeachingPartner (MTP), which was developed by university researchers and is now offered commercially by Teachstone. When the program developers delivered MTP in a randomized controlled trial in middle and high schools, they found positive effects on student achievement (Allen et al., 2011), which they then replicated in a second trial four years later (Allen et al., 2015). More recently, the nonprofit organization Learning Forward has published a series of articles illustrating how the program helps teachers improve student engagement and describing what it entails for coaches and teachers (Carlson, 2020; Flowers, 2019; Foster, 2021).

To build on what has been learned, we recently completed a project, with support from a federal grant, to determine whether the program could be scaled across several middle and high schools with fidelity, without relying on the developer to do the coaching (as was the case in the prior studies of MTP). That is, we wanted to see if local coaches, receiving only technical support from Teachstone, could be just as effective.

Three school districts joined the project: Midway Independent School District in Texas, Hanover County Public Schools in Virginia, and Prince George’s County Public Schools in Maryland. In total, 12 coaches and 51 teachers collectively completed 250 coaching cycles over a period of five months, just before the pandemic. Teachstone, the sole national provider for MTP, was a partner on the project but is not affiliated with the American Institutes for Research (AIR), the organization where we are based and which led the project.

Our independent evaluation of artifacts from a random sample of 38 of these coaching cycles found that Teachstone’s supports were sufficient to maintain the fidelity of the coaching. On average, coaches addressed 21.5 of the 23 aspects of coaching that the model prioritizes (93%), as shown in Table 1. For example, the coaches focused on critical segments of video-recorded instruction, and the questions they posed to teachers followed the program’s specifications. Looking across the columns, the district averages vary only a little.

Table 1. Aspects of coaching model successfully addressed

| Aspect of the coaching model | Average across all cycles | District A average | District B average | District C average |
|---|---|---|---|---|
| Fidelity of video segment selection (3 aspects) | 98% | 100% | 100% | 96% |
| Fidelity of coach prompts (13 aspects) | 96% | 98% | 100% | 95% |
| Fidelity of summary & action plans (7 aspects) | 86% | 89% | 85% | 85% |
| Overall fidelity (23 aspects) | 93% | 96% | 96% | 92% |

Note: Based on artifacts from the 38 coaching cycles randomly selected by AIR.
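For readers who want to see the arithmetic behind these figures, here is a minimal sketch in Python. The category sizes (3, 13, and 7 aspects, 23 in total) come from Table 1; the function and the example numbers are our own illustration, not Teachstone’s actual scoring tool.

```python
# Minimal sketch of the checklist arithmetic behind Table 1.
# The category sizes (3 + 13 + 7 = 23 aspects) come from the table;
# everything else here is illustrative, not Teachstone's actual tool.

CATEGORIES = {
    "video_segment_selection": 3,   # aspects of selecting video clips
    "coach_prompts": 13,            # aspects of writing reflective prompts
    "summary_and_action_plans": 7,  # aspects of the written summary
}

def fidelity_scores(aspects_met: dict) -> dict:
    """Return the percentage of aspects addressed, by category and overall."""
    scores = {
        name: 100 * aspects_met[name] / total
        for name, total in CATEGORIES.items()
    }
    scores["overall"] = 100 * sum(aspects_met.values()) / sum(CATEGORIES.values())
    return scores

# A set of cycles averaging about 21.5 of 23 aspects scores roughly 93%
# overall, matching the figure reported above.
print(fidelity_scores({
    "video_segment_selection": 2.94,   # ~98% of 3
    "coach_prompts": 12.48,            # ~96% of 13
    "summary_and_action_plans": 6.02,  # ~86% of 7
}))
```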

But for district leaders and others trying to deliver coaching programs, the takeaway isn’t the numbers. Rather, it’s how to implement a coaching program with high quality districtwide. Looking closely at how Teachstone supported the coaches, we see three strategies that deserve more attention.

Specify the what and how of the program in a detailed manual

The first strategy is deceptively straightforward: The central office leader responsible for the coaching program assembles a team to put the coaching model in writing, in the form of a manual, for coaches. We’re not referring to the typical set of training materials (agenda, slides, and handouts). Those don’t provide the right starting point. Rather, as a foundation for developing those materials, the team starts by writing out all of the what and how of the coaching program, in a way that will make sense to coaches.

When we work with program providers on R&D projects, we insist on this step because it forces the team to be clear and intellectually honest. It surfaces simple issues — like how much time coaches and teachers will really need — which often send the team back to the drawing board. It also prompts deep thinking about what will drive the program’s impact. In other words, the manual will be prescriptive, in a good way. Yet the manual is not simply a recipe. Rather, it specifies processes and content and informs coaches’ judgment calls — by identifying when and where discretion is important and what principles to apply.

As an illustration, the MTP manuals for coaches and teachers gather Teachstone’s accumulated expertise and know-how in one place, and they can get quite specific. For example, the manual for coaches includes detailed guidance on asking reflective questions called prompts. It specifies three types of prompts a coach can use and describes the key features of each. The coaching manual also includes dozens of example prompts, designed for training. This specificity reflects the Teachstone team’s knowledge about which features of prompts tend to stimulate teacher reflection and learning. But coaches can use their own judgment about which prompt a specific situation calls for, and they can adapt the sample prompts as needed, as long as they incorporate the key features.

The coaching manual also identifies common challenges in implementing the program and how to overcome them. Coaches thus have strategies at their fingertips for getting the coaching back on track when things go wrong. With the right team of collaborators, a manual can specify how a program really works and support coaches with practical knowledge that will help them make decisions that make a difference.

Have coaches and teachers use a web platform that lets you monitor the coaching

Even with a manual, it’s easy for a coach to veer away from the design of a coaching program, toward what the coach feels most confident doing. In fact, many coaches have previously been trained in other coaching models, and principals sometimes share ideas they’d like to see enacted that aren’t part of (and may even conflict with) the district’s coaching model. That’s why the central office leader responsible for the coaching program — or the lead coach — needs a way to monitor and support coaches.

To monitor coaching, the lead coach needs to see what’s really happening. A monthly check-in with each coach won’t be sufficient. And if you shadow a coach a couple of times per year, you’ll still wonder what’s really happening day-to-day. You could have coaches complete logs and check boxes to indicate their fidelity to the program model. But asking someone on paper, “Are you really doing what I told you to do?” isn’t necessarily going to give you an accurate picture.

The method that has emerged from R&D efforts is to have coaches and teachers use a web platform to facilitate some of their interactions, such as the teacher sending the coach a video of classroom instruction. The lead coach can access that platform to “see” some key aspects of what’s happening between the teacher and coach without having to ask for an update or visibly interfere. Today, this solution is readily available and inexpensive: Many web-based tools allow a teacher and coach to exchange video of a classroom and annotate it, as well as to give access to a lead coach.

This use of a web platform was an easy fit for the designers of MTP because they were already using a web platform to facilitate asynchronous interactions between coaches and teachers. (They originally did this so coaching could be delivered remotely.) Specifically, to start each MTP coaching cycle, the teacher uploads a classroom video. The coach then logs in and watches the video and responds by selecting three short clips and writing a reflective question, or prompt, for each. The teacher goes online and writes a response to each prompt, which the coach reads before the coaching conversation. (This process has the side benefit of keeping the coaching conversations relatively short because the most difficult cognitive work is already done.) After the conversation, the coach uploads a summary of the conversation and next steps.

As the MTP development team sought ways to monitor instructional coaching, they realized that the data they needed was already in their web platform (Foster, 2019). With access to the platform, the lead coach can see if the coaching is happening on the intended schedule and even drill down to see the video clips and prompts, teacher responses, and the meeting summaries, all of which reveal the fidelity of the coaching.
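To make that concrete, here is one way a coaching cycle’s artifacts might be modeled in code, with the kind of completeness check a lead coach could run across all cycles. This is a sketch under our own assumptions: the class and field names are hypothetical and do not reflect the actual schema of Teachstone’s platform.

```python
# Illustrative model of one MTP-style coaching cycle's artifacts.
# The three-clips-with-prompts structure mirrors the cycle described
# above; the schema itself is hypothetical, not Teachstone's.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ClipAndPrompt:
    clip_label: str                         # which segment of the lesson video
    prompt: str                             # reflective question from the coach
    teacher_response: Optional[str] = None  # teacher's written reflection

@dataclass
class CoachingCycle:
    teacher: str
    coach: str
    video_uploaded: Optional[date] = None
    clips: list = field(default_factory=list)
    summary_posted: bool = False

    def is_complete(self) -> bool:
        """On track: video uploaded, three clips each with a prompt and a
        teacher response, and the post-conversation summary posted."""
        return (
            self.video_uploaded is not None
            and len(self.clips) == 3
            and all(c.teacher_response for c in self.clips)
            and self.summary_posted
        )

def stalled(cycles: list) -> list:
    """Cycles a lead coach might flag for follow-up."""
    return [c for c in cycles if not c.is_complete()]
```

Records like these would let a lead coach see at a glance which cycles are behind schedule, then drill into the clips, prompts, and responses themselves.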

Leverage the monitoring data to support each coach individually

To make coaching work at scale, coaches need ongoing support. Each coach will have their own struggles with specific features of the coaching program or with assigned teachers who present unique challenges. So, to some degree, a lead coach has to coach the coaches.

Our experience on R&D projects has been that this kind of support is both feasible and productive if there are good monitoring tools (like the online interface described above) and intentional routines. In the MTP program, the monitoring and support routine is built into a monthly one-on-one call between the coach and their assigned specialist at Teachstone. Before each call, the specialist looks at the data in the online platform to see whether the coaching is progressing on schedule. The specialist also picks a recent coaching cycle and goes through its artifacts in depth, assessing each video clip, prompt, and teacher response against the program’s criteria. In the call, the specialist goes over the cycle with the coach, calls out strengths on which to build, and offers individually tailored support for coach development.

Teachstone has found that a full-time specialist can successfully provide ongoing support to up to 18 coaches simultaneously, including monthly one-on-one meetings as well as group meetings. And after a coach has a year of supported experience, the level of support can be scaled back without losing fidelity. This same approach could be used by district-level lead coaches supporting a coaching team across the district.

Toward more effective instructional coaching

Districts and organizations that offer coaching models will have to adapt these strategies to meet their specific needs and match their particular approach to coaching. But whether you invest in an externally developed coaching program or design your own, you can: 1) create a manual or guidebook that describes practices in detail, so coaches don’t have to guess at the right approaches; 2) leverage technology to “see” into coaching so you can make course corrections quickly; and 3) provide ongoing, individual, and data-based support for coaches.

It’s hard to imagine a coaching program working at scale without these basic supports. One or two coaches in a district may succeed, and teaching may improve significantly in some classrooms. But if the goal is to have a broad impact across a district, then the coaching program must include the guidance and supports needed to ensure it is actually implemented as planned.

Note: The American Institutes for Research (AIR) is the organization that led the project described in this article. AIR is a nonpartisan, not-for-profit organization that conducts behavioral and social science research and delivers technical assistance. Teachstone was a partner on the project and is the sole national provider for MTP. The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305A180241 to the American Institutes for Research. The opinions expressed are those of the authors and do not represent views of the Institute, the U.S. Department of Education, Teachstone, or the school district partners.

References

Allen, J.P., Hafen, C.A., Gregory, A., Mikami, A.Y., & Pianta, R.C. (2015). Enhancing secondary school instruction and student achievement: Replication and extension of the My Teaching Partner-Secondary intervention. Journal of Research on Educational Effectiveness, 8 (4), 475–489.

Allen, J.P., Pianta, R.C., Gregory, A., Mikami, A.Y., & Lun, J. (2011). An interaction-based approach to enhancing secondary school instruction and student achievement. Science, 333 (6045), 1034–1037.

Carlson, L. (2020). When teachers listen, students learn: Coaching helps teacher incorporate student voice in the classroom. The Learning Professional, 41 (5), 28–31.

Flowers, J. (2019). Accentuate the positive: Video can motivate teachers to improve their skills. The Learning Professional, 40 (6), 36–40.

Foster, E. (2019). A window into teaching. The Learning Professional, 40 (6), 33–35, 40.

Foster, E. (2021). Seeing teaching through a different lens: The MyTeachingPartner-Secondary coaching model. Learning Forward.

Kraft, M.A., Blazar, D., & Hogan, D. (2018). The effect of teacher coaching on instruction and achievement: A meta-analysis of the causal evidence. Review of Educational Research, 88 (4), 547–588.


This article appears in the February 2022 issue of Kappan, Vol. 103, No. 5, pp. 42–46.

ABOUT THE AUTHORS

Andrew J. Wayne

Andrew J. Wayne is a managing researcher at the American Institutes for Research, Rockville, MD.

Jane G. Coggshall

Jane G. Coggshall is a principal researcher at the American Institutes for Research, Rockville, MD.