"I'm thinking of folks behind initiatives like the Sunlight Foundation, or Everyblock, or public.resource.org. Heck, I'd even reach out to the geniuses behind mysociety.org in the UK."Is there a parallel list of philanthropic accountability, nonprofit innovators? Months ago I posted these "maps" to show some of what was out there in the information market:
This paper* from the Hewlett Foundation takes a different look at the marketplace for information in the sector. This story in Monday's Washington Post looks at a nascent discussion on the need for metrics and transparency - although I have to say I think the story is wrong on almost every important issue, from the purpose of the group to the problem in the field. There is no shortage of metrics; there is a shortage of comprehensive ones. The problem is a lack of consistency or comparative understanding of which measures matter - which measure do I use if I care about program effectiveness? What about leadership quality or sustainability? Is there a way to find out about "customer satisfaction"?
We have measures and measurers - here's an incomplete list I've been keeping (feel free to add to it):
- New Philanthropy Capital
- GiveWell*
- MissionMarkets
- Venture Philanthropy Partners
- Midot
- Edna McConnell Clark Foundation
- Guidestar
- Charity Navigator
- SocialSolutions
- Nonprofit Finance Fund
- REDF
- Center for What Works
- Keystone
- Great Nonprofits
- Connectipedia
- SocialMarkets
- Xigi.net Social Capital Index
- Keystone's "Impact" sessions
- RootCause
- Urban Institute Outcome Indicators
- Acumen Fund/Google/Salesforce's Portfolio Data Management
- From Fragmentation to Function: Critical Concepts and Writings on Social Capital Market’s Structure
- Performance data commons
- UPenn's Center on High Impact Philanthropy
- Social Venture Technology Group
- HIP Investor
- Fast Company's Social Capitalist Awards
- IssueLab
- The Nonprofit Reporter
- SmartLink
- Think Social Impact
*FULL DISCLOSURE: I participated in the Alliance for Social Investing meeting discussed in the Washington Post story. I am no longer on the board of GiveWell. I am an advisor to The Nonprofit Reporter. I am a member of the group Think Social Impact. I know people at and have worked with many of the other organizations listed above. My company and my work are mentioned in the Hewlett report.
6 comments:
Thank you for posting the latest information on non-profit evaluators. I agree that there isn't a shortage of metrics but there is a lack of comprehensive ones; it seems as though many non-profits are not seeing the value in long-term measurement of effectiveness. How do you suggest a standard of consistency in metrics be established among non-profits? Your list is up to 31 players now; how does a regular donor weigh the importance of the information they are provided (good v. bad non-profit based on the evaluator's definition of effectiveness) among all these groups without spending hours searching on each page?
Kristin
Thanks for counting - and I agree absolutely. At this point, with 31 players, the questions become ones of comparative value, strengths, weaknesses, and gaps. We need the sources to be accurate, meaningful, and useful - and I think the useful part is getting lost.
Lucy
Lucy,
This is a helpful chronicle of what is out there. It is still pretty messy and will be until someone - or a group of institutions and individuals - hits on a set of metrics that helps inform real decision making, whether that is allocation of capital within an organization or funding across organizations. We at Acumen Fund have been trying in our small way to build a tool that allows our program team of social investors to make more informed decisions within our portfolio, but with an aspiration for allowing our peers to share some of the same data to improve the quality of their decisions. If we build from the ground up at the enterprise level, with the data (both quantitative and qualitative) that many of us are already capturing in spreadsheets within our organizations, I think we will learn a lot about the patterns of effectiveness. We are definitely interested to see how this evolves and are constantly watching and learning from our many peers in the space. I personally don't lament the duplication, redundancy, divergence, etc., as I think that is the sign of a healthy and creative time during which new ideas will hopefully improve on their predecessors.
Brian
Thanks for your comment. And I agree with your point about duplication; in fact, I'd go beyond not lamenting it to modify it as follows: in any industry, in any product, in any field, the "...duplication, redundancy, divergence, etc..." is "[a] sign of a healthy and creative time during which new ideas will hopefully improve on their predecessors."
There is, however, a difference between serving the needs of individual organizations that make funding decisions and serving the needs of donors and others trying to make funding decisions. In philanthropy - where sharing ideas about what works is crucial to 1) influencing where dollars go and 2) addressing any of the social concerns we care about on a scale that can be meaningful - the investments in measures/data/indices etc. are almost the "backbone" of a more informed and effective capital system.
What I think is possible here is something between independent "creativity and exploration" and "top down" directives - something that will allow for multiple meaningful measures to be developed AND for the world of potential users of those metrics to see that menu as it develops, choose from among its many items, and, in fact, inform the development and application of new measures/metrics/data systems/indices. Getting the "crowd" involved is critical - as informants, decision makers, and users. And right now, the only systems we have that reach any kind of crowd are Guidestar and Charity Navigator, both of which have proven there are millions of people who want better information and will use it, and neither of which has yet been able to offer the first slice of such information. So we have better information being developed and tested in small batches (effectively in secret from the market of potential users) and delivery systems that do reach the broader market but are not yet delivering the best possible thinking on the subject of measuring social impact. What an incredible opportunity for the field!
My time and passion are now devoted to helping small, worthy organizations develop marketing capability, but my professional credentials come from executive strategy and marketing roles in large healthcare organizations. At my last stop, Blue Shield of California, I was responsible for the dashboard indicators monitored by the Board of Directors. We had more than 500 measures we could choose from, but the trick was to understand which outcomes were considered critical, and the drivers of those outcomes. I just introduced a large non-profit mental health agency to Kaplan and Norton's strategy maps, and the executive director and executive committee found that the process of creating the maps helped to clarify their thinking. The maps also provide guidance when identifying and prioritizing change initiatives. Years will go into the dialogue about which key "benchmarkable" indicators should be transparent. In the short run, spending time on what drives the desired outcomes, and how THOSE inputs are measured, may do more to impact performance. Great read: Strategy Maps: Converting Intangible Assets into Tangible Outcomes by Robert S. Kaplan and David P. Norton (authors of Balanced Scorecard). And, no, I don't have any connection to the authors - I just like their stuff.
New Tools
We have developed several new social or targeted investment instruments and methods. Given recent developments in information technology, it is much easier to create very powerful analysis tools to assist investment decision making. These tools and techniques allow institutional investors to allocate funds for targeted social purpose investing in an ERISA-compliant, market-based manner.
We pioneered the systematic use of financial and social data in the selection and evaluation of financial institutions when we created the first social and financial impact methodology, the Fully Adjusted Return® index, in 1991.
Our methodology helped us predict, in 2003 and 2006, the current troubles in the marketplace:
See: http://twisri.blogspot.com/2008/11/what-we-said.html
See: http://www.creativeinvest.com/about.html