Social Innovation Fund Releases Evaluation Resource

By Michael Smith


New publication provides comprehensive guide to evaluation for the SIF  

With the strain on both public and private resources in the wake of the Great Recession, funders started asking: “How do I know my investments are actually making a difference, and are they going to programs that make the most difference?”  The Obama Administration has taken steps to answer these questions through tiered-evidence programs at the Departments of Education, Labor, and Health and Human Services.

Tiered-evidence programs take the approach that there is a continuum of evidence ranging from preliminary to strong and that programs move along this continuum by conducting more and increasingly rigorous program evaluations – moving from tier to tier as they amass evidence that their program works.

The Social Innovation Fund (SIF), one of the first of these tiered-evidence initiatives, is committed to investing in evidence and evaluation that helps prove and improve promising models. Our own data show that 74 percent of interventions funded to date entered the portfolio with preliminary evidence of impact on beneficiaries. Through our investment, grantees are employing evaluation designs that seek to move them up the evidence continuum. To help organizations map their way forward, we have created a guidance document that walks programs, section by section, through a comprehensive evaluation design that will meet SIF definitions and expectations.


As part of our “Knowledge Initiative,” we at the SIF are committed to sharing the resources and lessons learned from our programs to benefit and strengthen the sector as a whole. For this reason, we are pleased to share the SIF Evaluation Plan (SEP) Guidance.

About the SIF Evaluation Program

The SIF relies on a framework that organizes evidence levels into three categories or tiers: preliminary, moderate, and strong. All programs funded directly or indirectly by the SIF must demonstrate a minimum of preliminary evidence of effectiveness.

Once funded, programs follow this evaluation and evidence-building process. For the SIF and other evidence-focused federal programs, the higher a program sits on the continuum of evidence, the more ready it is for scale and the more funding it can receive.

Simply stated, the evaluation program of the SIF aims to:

  • Infuse evidence in programming and grantmaking decisions
  • Advance the evidence base of all funded programs and increase the number of interventions on the upper end of the evidence continuum
  • Increase the evaluation capacity of intermediaries and subgrantees
  • Help improve program models by applying data and outcomes analysis in real time

About the SIF Evaluation Plan Guidance

To ensure that the parties involved develop robust evaluation plans that meet the expectations of the SIF, we have developed the Social Innovation Fund Evaluation Plan Guidance. This document serves as a blueprint for evaluation plans that are required to go through a review and approval process as part of the SIF grant requirements. It provides a common framework and shared understanding of what rigorous evaluation means, the elements and criteria against which plans will be assessed and approved, how implementation will be monitored, and how results will be reported and shared.

Using the SIF Evaluation Plan Guidance

Comments from intermediaries confirm the value of the guidance. For example, Cindy Eby, Director of Evaluation for Mile High United Way in Denver, shares:

“(T)he SEP Guidance provided a very thorough framework for thinking through evaluation plans. Given the level of rigor expected for evaluations through the SIF, and the expectation to fairly quickly move into implementing an evaluation, access to explicit guidance to think through the many details needed for high quality quasi-experimental and experimental designs was helpful.”

Although the SEP guidance was developed to help the SIF grantees and subgrantees meet program expectations, the information may be useful to others seeking to conduct similar types of rigorous studies. As Gabriel Rhoads, Director of Evaluation and Learning at the Edna McConnell Clark Foundation in New York stated:

“We’ve ‘Clarkified’ the SEP guidance for all of our grantees! Through our past work at EMCF, we’ve realized the importance of aligning evaluation planning with program growth. This helps ensure appropriate budgeting, and also that a program is increasing the number of youth served in a way that will meet an evaluation’s sample size requirements. The SIF evaluation plan process supports this alignment well. In fact, EMCF has incorporated parts of the SIF evaluation planning template into our evaluation efforts with grantees in our non-SIF portfolios.”

We hope this guidance will be useful to other funders, nonprofits, and evaluators as well. The design process it lays out is demanding, but it is comprehensive. And the evaluations that result from it meet the exacting standards of the SIF and potentially of other federal programs that emphasize evidence.

Michael Smith is the Director of the Social Innovation Fund at the Corporation for National and Community Service.

