How to Create Effective Reports for Communicating Professional Development Program Evaluation Results

Wouldn’t it be great if we knew when our professional learning programs were successful? What if we knew more than just the fact that teachers liked the presenter, were comfortable in the room or learned something new?

Wouldn’t it be better to know that teachers made meaningful changes in teaching practice that resulted in increased student learning?

I posed these questions in the first of this 7-part series on conducting program evaluation of professional development for teachers. Let’s say you have moved through the first four of the five phases of program evaluation:

  • You’ve engaged stakeholders and promoted a deep understanding of the program.
  • You’ve posed a set of evaluation questions.
  • You’ve collected data based on the evaluation questions.
  • You’ve analyzed all the data.

Undergo the 5 stages of program evaluation for professional development.

At this point, you have a solid understanding of program outcomes. You have a perspective on what teachers learned, if they’re using what they learned in the classroom, and perhaps even how students are responding to changes in teaching practice. Now what? Most likely, you’re not the only one in your district who needs this information. How do you share evaluation results and with whom?

The fifth phase in the cycle of program evaluation is reporting and use of results. In this phase, consider the following:

  • Who needs to know the results of your professional development program evaluation?
  • How will results be used to make decisions?
  • What information should be shared in a report?
  • What format should a report take (e.g., document, presentation)?
  • How and when should reports be shared?

Most importantly, though, consider why you will create and share evaluation reports. The answers to this and the questions above form your communication plan.

Kylie Hutchinson, author of A Short Primer on Innovative Evaluation Reporting, offers this insight:

Reporting your results is one of the most important tasks of an evaluation, but doing it in an engaging way that inspires action is often neglected…There is rarely uptake of recommendations, no lessons learned, and most importantly, no program improvement (2017, p. 11).

The reason to report and disseminate results is tied to key decisions that need to be made about professional development programming. To be meaningful, evaluation reports need to be used to determine whether a program should be continued, expanded, eliminated or changed in specific ways to improve outcomes.

What belongs in a professional development program evaluation report?

The most comprehensive form of an evaluation report might include all the details, traditionally presented in this order:

  • Program description and background
  • Evaluation questions
  • Data collection methods
  • Data analysis
  • Findings and conclusions
  • Recommendations

While this list may appear logical and sequential, the order also makes for a less engaging report. To ensure the use of evaluation results, many evaluators now encourage beginning reports with the exciting part: the findings, conclusions, and actionable recommendations.

Match the report to the audience.

If you take only one lesson from this article, let it be this: Match your report to your audience. Consider who needs to know the answers to your evaluation questions and understand your findings from data analysis. Who needs to use the information you share to make key decisions about professional learning? Who might be interested in results because they are in a position to support professional learning programs?

Creating evaluation reports to meet the needs of specific audiences involves three key steps:

  • Identify your audiences. Are they administrators? Teachers? Board of Education members? Parents? Community members?
  • Understand their information needs. What is important to them with regard to professional learning in general or the specific topic? How does the professional learning program connect to their work and responsibilities?
  • Know what actions they will take with the information in your report. Are they decision-makers? Do they sit at the table with decision-makers? Are they likely to share the information with others? Are they potential supporters or detractors?

Consider multiple forms of a report.

On the surface, it may sound like a lot of work to create multiple reports, but with careful planning it's quite manageable. Creating different versions for different audiences can be an enjoyable and rewarding part of the evaluation process, and it deepens your own learning as you dive into the data and help others make sense of it. Think about what would hold the most appeal for your stakeholders.

Do you have an audience who needs:

  • A 1-page overview with a few highlights?
  • A 3-5 page summary of key findings?
  • A 15-minute live presentation?
  • A comprehensive written report with all the details?

Choose the audience who needs the highest level of detail and create that report first. Then, work to strip away details the other audiences don't need. You can always make the more comprehensive forms of the report available to anyone who wants access to them.

Beware TL;DR.

Few people I know love spending endless hours writing long reports. But if you're one of those people, here is another reason to carefully consider your audience and their information needs. “TL;DR” is internet slang for “too long; didn’t read.” Often the problem isn’t the length of a report by itself, but length combined with a design that isn’t visually appealing, so the report doesn’t draw the reader in and keep them there.

Fortunately, there are many ways to avoid TL;DR in evaluation reporting by creating different versions for different audiences, using creative or innovative formats, and embedding visual elements.

Think outside of the document.

A written report is far from the only way to communicate evaluation results, and it’s perfectly OK to think flexibly and creatively here. I’m not necessarily suggesting a song and dance routine, but believe me, it has been done!

Here are just a few alternatives to written reports or presentations:

  • An infographic
  • A brief video posted to a website
  • An interactive website
  • Social media sharing

A Short Primer on Innovative Evaluation Reporting features many more creative ideas.

Make it visual.

No matter the style, size or length of your report, be sure to include visuals to engage your audience. Use relevant photos, icons, or illustrations along with charts or graphs to draw the audience’s attention to the main points. There are many, many websites where you can find free stock photography, but consider taking your own photos. It isn’t that difficult, requires nothing more than a smartphone and brings a stronger sense of ownership and connection to the report and to the program. Your audiences will see your teachers in your classrooms doing the real work involved in professional learning, and that is more likely to inspire engagement with the report.

Most audiences also want to see data. They want to quickly and easily understand key findings. Charts or graphs can be efficient and powerful ways to communicate data, and they don’t need to be sophisticated or complex to have impact. Simple bar, line, or pie graphs can communicate meaningful data. I’ve been actively honing my data visualization skills in my spare time by reading blogs and books and experimenting. Little by little, I acquire new skills and attach them to prior knowledge to build a robust toolbox and solid repertoire of visualizations I can now use to communicate program evaluation results.
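
If you want to experiment with building a simple chart yourself, here is a minimal sketch in Python using matplotlib. The practice categories, counts, and file name are hypothetical placeholders rather than data from any actual evaluation; swap in your own results.

    import matplotlib.pyplot as plt

    # Hypothetical example: how many teachers were observed using a new strategy
    labels = ["Not yet using", "Beginning to use", "Using routinely"]
    counts = [8, 19, 13]

    fig, ax = plt.subplots(figsize=(6, 3))
    ax.barh(labels, counts)
    ax.set_xlabel("Number of teachers")
    ax.set_title("Classroom use of the new strategy (hypothetical data)")

    # Label each bar directly so readers can see values at a glance
    for i, value in enumerate(counts):
        ax.text(value + 0.3, i, str(value), va="center")

    # Remove chart clutter that distracts from the main comparison
    ax.spines["top"].set_visible(False)
    ax.spines["right"].set_visible(False)

    plt.tight_layout()
    plt.savefig("practice_use_chart.png", dpi=200)

A simple horizontal bar chart like this keeps category labels easy to read and puts the comparison your audience cares about front and center.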

Program evaluation is essential.

Professional development remains a critical component of school success. It is essential that we continue to create and implement high-quality professional learning programs within the constant constraints of budgets and time. A rigorous program evaluation process helps us deeply understand how programs are performing in our school environments and is key to educator professional growth and the continuous improvement of our schools.

 

Sheila B. Robinson

Sheila B. Robinson, Ed.D., of Custom Professional Learning, LLC, is an educational consultant and program evaluator with a passion for professional learning. She designs and facilitates professional learning courses on program evaluation, survey design, data visualization, and presentation design. She blogs about education, professional learning, and program evaluation at www.sheilabrobinson.com. Sheila spent her 31-year public school career as a special education teacher, instructional mentor, transition specialist, grant coordinator, and program evaluator. She is an active member of the American Evaluation Association, where she is Lead Curator and content writer for its daily blog on program evaluation and Coordinator of the Potent Presentations Initiative. Sheila has taught graduate courses on program evaluation and professional development design and evaluation at the University of Rochester Warner School of Education, where she received her doctorate in Educational Leadership and a Program Evaluation Certificate. Her book, Designing Quality Survey Questions, was published by Sage Publications in 2018.