Right-sized Evaluation for Small Foundations

May 29, 2019 – This is a guest post by NNCG Member Prentice Zinn. It is shared here with permission.

Trustees and staff of small foundations know how difficult it is to evaluate the impact of their grantmaking. Data collection can be expensive, and attributing a particular set of outcomes to grants is often unrealistic. Knowing the obstacles to formal evaluation for most private foundations, we prefer an approach that emphasizes monitoring, assessment, and learning. This approach, really a mindset, treats evaluation as an ongoing process embedded in a small foundation’s grantmaking.

Design your approach

How can funders be more systematic in learning about the effectiveness of their grantmaking? Given the challenges of formal evaluation, what kinds of processes can a small foundation adopt that are informative yet practical?

The right-sized approach will help trustees understand how grants are contributing to nonprofit outcomes—and to the foundation’s mission. Start by clarifying intentions and setting goals that are grounded in the mission. Take time for this critical groundwork. Then—to begin to embed ongoing, practical evaluation in the foundation’s grantmaking—customize and deploy a few tools that will work for your style of learning.

Clarify your intentions for evaluation and learning

Evaluation and learning start with a clearly defined strategy and expected outcomes. Getting to that clarity can be a daunting task, requiring difficult conversations and consensus-building. Stopping short with a vague objective is safer and less contentious, but it is a weaker basis for effective evaluation. Finding agreement on a specific set of intentions is a good indication of the board’s readiness for learning.

We have found a practical framework for digging into meaningful conversation and finding clarity. Using elements described in GrantCraft’s guide, Mapping Change, a foundation board creates a simple story or chart that outlines its most basic expectations:

  • What do we expect to see?
  • What would we like to see?
  • What would we love to see?

The narrative logic of these questions creates a container that can capture, qualitatively and quantitatively, a believable story about how change is happening. For example, a New England foundation intent on large-scale landscape preservation funded efforts to increase the scale and pace of land conservation. It expected to see increases in conservation easements donated to grantee organizations as an indication of progress, and it was able to document these transactions.

Consider making your intentions public; this type of transparency will likely lead to stronger grantmaking. For more on transparency and accountability in philanthropy, visit Glasspockets.org.

Approach evaluation with realistic expectations

Once you’ve specified your intentions, customize your own approach to monitoring, assessment, and learning. To right-size this approach, start with an understanding of the foundation’s capacity, and set expectations accordingly.

  • What questions do we need to answer to determine impact, given our particular goals?
  • What data or narratives do we need to collect to answer these questions?
  • Do we have the resources to collect and analyze that data, or should we adjust our questions and expectations?

Use indicators to measure what matters


Having defined its key questions, the foundation is ready to set up a monitoring process. Before accumulating information, take time to define the data that supports your stated goals for learning.

We like the evaluation metaphor of collecting baskets of indicators. Like the shopping basket that determines the Consumer Price Index, a basket of conceptually related indicators is helpful in assessing impact. What are your best indicators?
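To make the basket metaphor concrete, here is a minimal, hypothetical sketch in Python. The indicator names, weights, and values are invented, not drawn from the foundation examples in this post; the point is only that a handful of weighted indicators can be rolled up into a single, CPI-style reading of movement since baseline.

```python
# Hypothetical "basket of indicators," CPI-style. All names, weights,
# and values below are invented for illustration.
basket = {
    # name: (weight, baseline value, current value)
    "acres_under_easement":   (0.5, 1200, 1500),
    "grantee_staff_capacity": (0.3, 4, 5),
    "landowner_inquiries":    (0.2, 30, 42),
}

def basket_index(basket):
    """Weighted average of each indicator's change from its baseline,
    expressed as an index where 100 means no change."""
    return 100 * sum(
        weight * (current / baseline)
        for weight, baseline, current in basket.values()
    )

print(f"Basket index: {basket_index(basket):.0f}")  # 128 -> movement past baseline
```

A spreadsheet can do the same arithmetic, of course; the code form just makes the weighting explicit.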

Find off-the-shelf indicators. Useful measures of nonprofits’ work are available in several online sources of indicators. The Urban Institute’s Outcome Indicators Project, for one, provides a menu of measures for a dozen sectors. In our work with a small foundation that supports community organizing and advocacy, we chose indicators from A User’s Guide to Advocacy Evaluation Planning by the Harvard Family Research Project because they closely matched the foundation’s stated goals. Similarly, in work with a funding initiative promoting land conservation, we borrowed indicators from the Land Conservation Metrics Working Group.

Create do-it-yourself indicators. There may be practical reasons to organize and rate your own set of indicators in an evaluation rubric. Use a “Goldilocks” rubric to rate observable phenomena on a high, medium, or low scale, for instance. This type of ranking, based on narratives of credible evidence, can be more achievable than using statistically derived indicators.
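As a sketch of what such a rubric might look like in practice, the snippet below rates a few indicators on the high/medium/low scale and tallies the result. The indicators, ratings, and evidence notes are hypothetical, not drawn from the engagements described above.

```python
# Hypothetical "Goldilocks" rubric: each indicator is rated high, medium,
# or low based on narrative evidence rather than statistics. All entries
# below are invented for illustration.
from collections import Counter

rubric = [
    # (indicator, rating, evidence note)
    ("New partnerships formed", "high",   "Three coalitions cited in grantee reports"),
    ("Media coverage of issue", "medium", "Two local stories; no statewide pickup"),
    ("Policy adoption",         "low",    "Bill introduced but stalled in committee"),
]

summary = Counter(rating for _, rating, _ in rubric)
for level in ("high", "medium", "low"):
    print(f"{level:>6}: {summary[level]} indicator(s)")
```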

Systematize for ongoing learning

Build an evidence journal. A simple organizing tool, the journal is usually a set of spreadsheets used to gather and analyze data within the chosen basket of indicators. A grantmaker that supports job training, for example, might have a journal category, or spreadsheet column, labeled Post Job Placement Outcomes.
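Here is a minimal sketch of what one page of such a journal could look like as a plain CSV file, using the job-training example above. The grantee names and figures are invented; in practice most boards would keep this in an ordinary spreadsheet program.

```python
# Minimal sketch of an "evidence journal" kept as a CSV spreadsheet.
# The column names echo the job-training example; grantees and figures
# are invented for illustration.
import csv

rows = [
    {"Grantee": "Example Works", "Grant Year": 2018,
     "Post Job Placement Outcomes": "14 of 20 trainees employed at 6 months"},
    {"Grantee": "Pathways Inc.", "Grant Year": 2018,
     "Post Job Placement Outcomes": "9 of 15 trainees employed at 6 months"},
]

with open("evidence_journal.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
```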

Gather data efficiently. Be judicious: request only the information needed to support good decision making, and think creatively about sources so you do not unduly burden the foundation’s grantees. Written reports, media coverage, observation, websites, third-party information, and open-ended interviews can go a long way toward a bountiful harvest of credible data.

Make data digestible with visuals. We naturally prefer the visual presentation of data over written narratives. Ideally, visuals encourage creative thinking and conversation. Take a minimalist approach and deploy the “do it quickly, simply, and flexibly” rule.

Simple graphs and charts are usually enough to show evidence of movement toward the foundation’s goals. Beyond that, visuals can point to grantmaking effectiveness in a number of ways (a minimal charting sketch follows the list):

  • Efficacy: to what degree did the funded program work as planned?
  • Efficiency: how well did the program or initiative work, compared to expectations?
  • Sustainability: to what extent did the program or initiative continue beyond the grant period?
  • Equity: who did and did not get served by the program?
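In the “quickly, simply, and flexibly” spirit, a chart like the sketch below is often all the visual a board needs. The counts are invented, loosely echoing the conservation-easement example earlier; any spreadsheet’s charting feature would serve equally well.

```python
# Quick, minimal bar chart of one indicator over time. The counts are
# invented for illustration.
import matplotlib.pyplot as plt

years = ["2016", "2017", "2018"]
easements_donated = [3, 5, 8]  # hypothetical counts from the evidence journal

plt.bar(years, easements_donated)
plt.title("Donated conservation easements to grantees")
plt.ylabel("Easements per year")
plt.tight_layout()
plt.savefig("easements.png")
```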

Connect actions to the foundation’s mission

The learning processes you choose should be realistic, consistent with the board’s style, and embedded in ongoing grantmaking. Ideally, the right-sized process of monitoring and evaluation at a small foundation will stir up a conversation about what is going on, what is working, and how it is working.

Embedding right-sized evaluation in its grantmaking, the foundation will naturally focus on learning and improvement. Through the related, ongoing conversation, the foundation tells its story, articulating the connections between its actions and its mission.
