Investing in Evaluation Supports Long-term Program Goals

Evaluation may feel intimidating at first, but it doesn’t have to be overwhelming or stressful. Once you have a few tools to help you get started and some support along the way, evaluation can be rewarding… and maybe even fun! Perhaps most importantly, investing in good evaluation offers critical information to help you understand what’s working and what isn’t. This gives you valuable data for communicating your success to your community, potential funders, and other stakeholders.

NRF recently launched an updated set of program evaluation resources to help our grantees check in on their current evaluation practices or launch a new evaluation effort. The document was created in collaboration with Dr. Kathryn Stevenson and Dr. Lincoln Larson of North Carolina State University. Dr. Stevenson and Dr. Larson are leading researchers in the field and support grantee evaluation through an ongoing partnership with NRF.

The best evaluation acknowledges the specific context of the work being assessed, which means there's no copy-and-paste template. Instead of a one-size-fits-all approach, NRF's evaluation resources focus on best practices that can be applied to many programs and adapted to fit the specific needs of each of our grantees. To get started, we recommend breaking down your program into the following components: inputs, activities, outputs, outcomes, and impact. When thinking about each of these areas, consider the following questions:

  • Inputs: What resources does your organization have? Which are the most valuable? Which are the most limited?
  • Activities: What are the essential components of your program?
  • Outputs: Who is your audience? Who are you missing in your current audience? How does your audience influence your program structure?
  • Outcomes: How are participants changing as a result of your program?
  • Impact: What is your ultimate why?
Understanding how each of these aspects of your work shows up in and influences your program can help you identify where to lean into what you're already doing and where to make changes. Once you've broken your programming down into components, consider the following before creating your evaluation framework:

  1. Home in on a few priority areas. While it can be tempting to try to measure everything, evaluation is often more successful when you choose a few aspects of your program to measure thoroughly rather than gathering a little bit of information about a lot of topics.
  2. Be thoughtful about when and how you collect demographic data. Demographic information can help you learn more about your audience, but it's always important to be considerate when asking sensitive questions around identity. Be sure to think about the experience of the person answering the questions when creating demographic portions of evaluation tools.
  3. Intentionally select your data collection method. Participant surveys are a go-to evaluation method, but there are lots of other options to consider like interviews, personal narratives, and art projects. The most successful methods will be the ones that feel integrated into your existing program activities rather than an add-on to what staff and participants are already doing.
  4. Leverage networks and partnerships when possible. Relationships with peer organizations can support collaborative problem solving and data sharing. Working with organizations like local universities or colleges can also boost evaluation capacity.

Find existing scales, additional information, and more tips in NRF’s Grantee Evaluation Resources document.