Measurement & Evaluation
Measuring impact is part of REDF’s venture philanthropy DNA. For 25 years we have collected, analyzed, and used data to move the field forward, inform smart public policy, and help tell the stories of people whose lives have been transformed by social enterprise, so that more people have the jobs, training, and support they need to succeed.
To advance the field of social enterprise, in the late 1990s REDF pioneered the concept of Social Return on Investment (SROI), which measures an enterprise’s benefits to society through outcomes (an approach further developed in subsequent papers). SROI tracks goals such as greater income and increased housing stability for employees, reduced taxpayer burden through lower reliance on government assistance, and reduced rates of recidivism, allowing us to align our grantmaking with what works.
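At its core, SROI is a ratio: the monetized value of social outcomes divided by the investment that produced them. The sketch below illustrates that arithmetic only; the outcome categories and dollar figures are hypothetical placeholders, not REDF data or REDF’s actual methodology.

```python
# Illustrative SROI calculation. All figures are hypothetical placeholders,
# not REDF data; real SROI work involves careful monetization of each outcome.

def sroi(monetized_outcomes: dict[str, float], investment: float) -> float:
    """Return dollars of social value generated per dollar invested."""
    return sum(monetized_outcomes.values()) / investment

# Hypothetical one-year outcome values, in dollars:
outcomes = {
    "increased_participant_income": 120_000.0,
    "reduced_government_assistance": 45_000.0,
    "avoided_recidivism_costs": 60_000.0,
}

ratio = sroi(outcomes, investment=100_000.0)
print(f"${ratio:.2f} of social value per $1 invested")  # $2.25 of social value per $1 invested
```

A ratio above 1.0 means the enterprise returned more monetized social value than the dollars invested in it.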
In 2011, our commitment to evidence-based grantmaking evolved with the award of a Social Innovation Fund (SIF) grant. This federal grant enabled REDF to partner with Mathematica Policy Research to investigate the impact of grant capital and expert technical assistance on the beneficiaries of the social enterprises in our portfolio.
The Mathematica Jobs Study (MJS), published in 2015, measured whether participants at these social enterprises had higher employment and better life stability one year after they began their social enterprise jobs. Results were encouraging: employment rose from 18% to 51%, and monthly income rose from $653 to $1,246. Additional benefits included reduced reliance on government assistance programs and an increase in continuous, stable housing. A cost-benefit analysis of REDF’s portfolio conducted as part of the study found that every dollar the portfolio enterprises spent generated $2.23 in societal benefits, more than double the initial investment. Read the full report.
Transforming Data into Learning
REDF collects a continuum of evidence to continuously improve our grantmaking and share best practices with the field. From ongoing measurement of the financial sustainability of social enterprises to evaluation of the outcomes of social enterprise employees, REDF is transforming data into learning for social enterprise practitioners and supporters.
Encouraged by the results of the Mathematica Jobs Study, in February 2018 we began to examine the impact of our investments at multiple social enterprises in our national portfolio with a higher degree of rigor and over a longer period. Conducted by REDF’s external evaluator RTI International, this study concluded in 2021 and provided greater insight into employment and life stability eighteen months after beginning a social enterprise job.
In conjunction with the evaluation, REDF created an Evaluation Learning Committee composed of social enterprise staff to ensure that the evaluation is mutually beneficial to REDF and its grantees and provides relevant, actionable insights that the social enterprises can use to support decision-making and strategy.
Enhancing our Evaluation Effectiveness
REDF has become increasingly interested in incorporating beneficiary voice into decision-making and empowering social enterprise employees to have a say in the programs that serve them. To this end, with support from the Fund for Shared Insight, a funder collaborative that encourages incorporating participant feedback on a program’s responsiveness, focus, and degree of support, REDF added a perceptual feedback study to our current evaluation to test whether participants’ perceptions can predict subsequent job placement, retention, and other outcomes.
Measuring our Regional Impact
REDF participated in the evaluation of the Los Angeles Regional Initiative for Social Enterprise (LA:RISE), a public-private partnership REDF manages in Los Angeles.
Los Angeles’s Economic and Workforce Development Department (EWDD) contracted with Social Policy Research Associates (SPRA) to evaluate the pilot phase of LA:RISE. The evaluation included three studies: implementation, impact, and cost-effectiveness. Topline findings include: implementation and enrollment were successful; the impact on employment and earnings was larger and longer lasting among the “high-contrast” group of social enterprise employees; and linkages to housing programs improved. Compared to the “best alternative,” the lower-touch services provided by WIOA, LA:RISE was found to be less cost-effective. However, LA:RISE was created to help harder-to-serve populations who were not succeeding with existing public services, so this result is not surprising.
This evaluation included over 1,000 participants, placing LA:RISE in the context of a larger body of research testing enhancements to transitional employment programs.
View the final report here. The SPRA evaluation will contribute to the evidence base and help us and the social enterprise field continue to improve.
New Research From the Field
Georgetown University’s Business for Impact released “Jobs for All: Employment Social Enterprise and Economic Mobility in the United States,” a groundbreaking report that documents the unique role that employment social enterprises (ESEs) play in creating economic mobility for people who have largely been excluded from the workforce. The research highlights employment social enterprise solutions that have the potential to scale up to address some of America’s most persistent structural employment barriers. REDF shared data and case studies on the hundreds of companies it supports across the US to inform and enrich the findings. The research was funded by the Bank of America Charitable Foundation.
“Research Evidence and Recommendations for an Employment Initiative to Serve Jobseekers Experiencing Homelessness”
Heartland Alliance conducted this overview of the research literature on employment services for people experiencing homelessness for REDF and All Home, along with program, systems, and policy recommendations to inform the planning of an initiative connecting people experiencing homelessness to employment. The review identified a number of program models and practices supported by the most rigorous research method, randomized controlled trials, as well as additional evidence-informed and promising program design features.