Measuring the Impact of Professional Learning


When districts invest in professional learning, they expect results for both educators and students. Yet far too often, evidence of the effect is elusive. This reality can change when district and school leaders plan for the evaluation of professional learning at the same time they design it to strengthen educator effectiveness and student success.

Six districts joined the Frontline Research & Learning Institute and Learning Forward to explore how to measure the impact of their professional learning: Boston Public Schools (Mass.), Greece Central School District (N.Y.), Jenks Public Schools (Okla.), Metro Nashville Public Schools (Tenn.), Prior Lake Savage Area Schools (Minn.), and Shaker Heights City Schools (Ohio). Leaders in these districts set out to answer a series of questions to guide both the planning of their professional learning and its evaluation. Each district entered the small-scale, collaborative study intending to evaluate an existing or new professional learning initiative and wanting to learn with and from other districts engaged in the same work.

The work of these six districts offers others a glimpse of what it takes to design and implement effective professional learning for educators and to evaluate its results. While the study was too short for the desired impact of professional learning to be realized, it gave participating districts an opportunity to grapple with key questions about evaluating professional learning. The lessons that emerged are highlighted in this post.

The quality of an evaluation of professional learning depends on the quality of its planning and implementation.

As the six district leaders weighed the decision to join the study, they first grappled with the question of what to evaluate. Some had existing professional learning programs that were evaluable, that is, ready to be evaluated (Killion, 2018). Others had programs that were not yet evaluable and required more clarity about outcomes for educators and students, indicators of success, and measures of success. Still others were in the early phases of developing professional learning programs to meet identified district needs.

Evaluating either implementation or effectiveness, for educators and for students, requires a clear delineation of desired outcomes, indicators of success, and viable measures of those outcomes. This means district and school leaders must shift the focus of professional learning evaluation to coherent programs rather than isolated or occasional events.

Using extant data sources expedites evaluation of professional learning yet may limit interpretations.

District leaders who launched comprehensive professional learning programs, as Boston Public Schools did, wanted to know whether their efforts were having the desired effects. To measure them, Boston leaders turned to the many data sources already existing in the district and considered which might serve as indicators of the impact of the new professional learning program.

Like most districts, Boston had multiple sources of data that could be repurposed without creating a data burden (Killion, 2018). However, because existing data sources are only approximate measures of the specific outcomes of a professional learning initiative, district leaders analyzed and interpreted the results with careful attention to those limits before drawing conclusions about the program.

A high-quality professional learning system meets the needs of district educators and increases coherence and effectiveness of professional learning programs.

When professional learning within a district or school lacks coherence, its effects are difficult to measure. In Jenks Public Schools, participation in the study was an opportunity to solidify the district's professional learning system and to create a planning process that ensures professional learning meets criteria of quality from its inception.

Using a planning model, district and school leaders can align professional learning with identified needs, provide adequate implementation support, and monitor implementation to increase the likelihood of results. The system includes a decision-making model for planning and implementing high-quality professional learning, so the professional learning educators experience is specifically designed to produce results both for them and for their students.

Existing and new programs both require evaluation, often for different purposes.

Whether professional learning programs are new in a district or long-standing, evaluation is necessary. For existing programs, such as the mentoring and induction program offered in Greece Central School District through its Teacher Center, and for new ones, such as the coaching program in Shaker Heights City Schools, evaluation provides valuable information for multiple purposes. In Greece Central School District, evaluation results inform decisions about refining the existing program to make it more beneficial to participants, and annual data collection from multiple sources informs continuous upgrades. Evaluation in this case serves to refine an existing program.

In Shaker Heights City Schools, evaluation offers data about the new coaching program for two purposes. First, the data collected guide coaches' work with teachers. Second, as the coaching program matures, the data can help district leaders measure the program's effects and make the adjustments needed to strengthen it. Evaluation in this case serves program management and implementation.

District leaders in Prior Lake Savage Area Schools wanted to implement a districtwide instructional framework to support student learning. They engaged multiple stakeholders in the design process so that staff both had a voice in the framework's development and gained awareness of the need for it. Using an innovative data-gathering process, the district leadership team collected data from teachers, principals, and central office staff and analyzed it for common themes. By engaging staff, the leadership team built a common understanding of the need for the instructional framework and a shared purpose for it. Evaluation in this case serves program planning and establishes a baseline that can be used in a pre- and post-implementation evaluation design.

In Metro Nashville Public Schools, the Collaborative Inquiry Process, a system for collecting, analyzing, and using a variety of data to improve schools, was developed through an alliance between REL Appalachia and the district. After the design and pilot-testing phase, district leaders were ready to implement the process in five district schools. Because the process is used by teams of teachers, the unit of change is not the school but a cluster of individuals within it. The expected unit of change in a professional learning program influences both the program's design and implementation and its evaluation. Evaluation in Metro Nashville focused on measuring the quality of implementation and the impact on each school's student achievement.

Sound, flexible data systems make evaluation more efficient and effective.

Useful evaluations take time, resources, and effort. Districts with data systems that allow them to gather, track, analyze, and access data quickly can use data frequently to monitor the progress of a professional learning program and its participants and to inform decisions about program adjustments.

Data systems that generate analytics from multiple types of educator and student data allowed district leaders to adapt the data collected to measure effectiveness at two levels, first on educators and then on students. Sound data systems shifted leaders' effort from collecting data to using it: they could analyze, interpret, and act on evidence about professional learning more rapidly to strengthen programs and their impact on participants and students.

As district leaders continue to explore effective and efficient ways to evaluate the effectiveness of their professional learning programs, they will realize the benefits of evaluation for program planning, monitoring implementation, and measuring impact on both educators and students. If professional learning is to impact educator effectiveness and student results, district leaders must commit to ongoing evaluation to inform all decisions about professional learning.

Killion, J. (2018). Assessing impact: Evaluating professional learning (3rd ed.). Thousand Oaks, CA: Corwin.