Sunday, July 18, 2010

Measuring success

Several thoughts in the last few days have come together to prompt this post. I was thrilled to see that Disrupting Philanthropy was recommended as summer reading, along with Monitor's report What's Next for Philanthropy. What a nice shout-out from Phil Buchanan, and great company to be in - "What's Next" is a wonderfully well-written set of recommendations for foundations.

Phil's blog post was flattering, and it made me think about how we decide what matters. It came on the heels of my being asked to tell one of the nation's largest foundations about the impact of my work - some of which was once done under contract to that foundation. In other words, I've been called on to measure my own impact.

I've been asked many times over the years to advise efforts on measuring social impact and I've served on countless advisory boards to all kinds of efforts from commercial software firms to nonprofit alliances to government selection committees to who knows what all. Most recently I've been brainstorming with an odd bunch of tech entrepreneurs, web developers, and social businesses on precisely these kinds of questions of meaningful influence and impact.

Since I have, on many an occasion, opined about the importance of data, I often get (mis)labeled as a "data wonk" or "an outcomes geek," or something along those lines. I always find this a little funny, because in the very same paragraphs in which I've written that "data are the new platform for change," I've also pointed out that philanthropy is inherently irrational and data are only likely to be used within certain limits. I am a big believer in data, but I also know their limits. I believe in asking the question about influence and impact more than I believe we can find one easy, always-useful answer.

Finally, my company, Blueprint Research & Design, is up to its eyeballs in projects where "social analytics"* meet community engagement. So even as we're being asked to account for our "impact," we're helping our clients help their grantees do the same thing. This keeps us up to date on a wide range of social media impact work, ranging from Jessica Clark's work on public media to the new "21st century statecraft" discussions personified by Alec Ross and Jared Cohen.

Here's my prediction for the future of understanding community change: we will need to integrate what can be learned from social analytics into how we measure community engagement.

The ways in which "data trails on the web" - click-throughs, "likes," tweets, forwards, favorite lists, recommendations, Diggs, and so on - relate to action on the ground are key new measures of activity. Cracking the code between actions on the web and actions in the real world - what predicts what, what leads to what, what foreshadows or follows certain kinds of activities, by whom, how, and when - is a key set of skills and methods for project designers, managers, funders, evaluators, policy makers, and community activists.

Why? Because more and more of our social change efforts are being designed and delivered on the web. Those that aren't fully digital (web or mobile) almost invariably have some kind of web/mobile component - whether it be for outreach, awareness, monitoring, or partnership building. To paraphrase Shel Israel, change efforts are at a "flex point" when it comes to digital media.

Just as click-through rates changed advertising forever - and with it the news business, the recording industry, publishing, and real estate - the deep connections between the generation and use of data on the web and actions taken in communities are going to change how we make change happen. Being able to apply data on page views or unique visitors in a meaningful way to how neighbors organize, communities protest, vaccines get delivered, or votes get cast will be as important going forward as the old standbys of community organizing (serve pizza at the meeting) have been in the past.
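To make the "what foreshadows or follows what" question above a bit more concrete, here's a toy sketch of one way to start looking: line up a weekly count of some online signal against a weekly count of some offline action and see which time lag fits best. Everything in it - the numbers, the idea of pairing tweet mentions with hearing turnout - is invented for illustration, not a finding.

```python
# Toy sketch: does a spike in online activity foreshadow an offline action?
# All of the weekly counts below are invented for illustration only.

def correlation(xs, ys):
    """Pearson correlation of two equal-length lists of numbers."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y) if sd_x and sd_y else 0.0

online_mentions = [12, 15, 40, 90, 60, 30, 20, 18]  # e.g., weekly tweets about an issue
hearing_turnout = [5, 6, 7, 10, 35, 28, 12, 9]      # e.g., weekly attendance at hearings

# Shift the online series earlier by 0-3 weeks and see which lag lines up best.
for lag in range(4):
    paired = len(online_mentions) - lag
    r = correlation(online_mentions[:paired], hearing_turnout[lag:])
    print("online activity leading by %d week(s): correlation %.2f" % (lag, r))
```

Even a strong correlation at some lag would only be a clue about what foreshadows what, not proof that the online activity caused the turnout.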

There are whole methods and sciences of "social analytics"* being developed that look at how we travel through our web, mobile, and social network worlds - but most of what is known comes from a marketing and sales standpoint. This means that major advertising companies (like Google) and corporate marketing departments (like those at Amazon or Starbucks) know how to answer questions like these:
  • How many times did I abandon my shopping cart on the web site?
  • How many times was I shown a certain recommendation before I bought the book?
  • Did I use the coupon that was texted to my mobile phone to get a free cup of coffee?
We need to take this kind of intelligence and apply it to other questions, such as:
  • If I join a disease-oriented social network, do I manage my medications better, and am I healthier because of it?
  • If I read and comment on a story about my neighborhood on a local blog, am I more or less likely to show up at the supervisors' hearing on that subject than if I read about it in the print paper?
  • What about if I submit a story to that same blog?
  • If I follow the tweets from a nonprofit, am I more or less likely to donate or volunteer for that organization? (One rough way to start testing this is sketched below the list.)
  • If I become a "fan" of an organization on a social network site, will I do anything else to raise awareness of the group? Will I take any offline action to support its work?
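Here is that rough sketch of how a nonprofit might start on the Twitter question. The file names, the "email" column, and the idea of matching a follower export against donor records by email address are all my assumptions for the sake of illustration - a starting point, not a method I'd call sufficient.

```python
# Hypothetical sketch: do our Twitter followers donate at a different rate
# than the rest of our mailing list? File names, the "email" column, and
# matching people by email address are assumptions for illustration only.
import csv

def emails_in(path, column="email"):
    """Return the set of lower-cased values in one column of a CSV export."""
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f) if row.get(column)}

followers = emails_in("twitter_followers.csv")   # hypothetical export
donors = emails_in("donations_2010.csv")         # hypothetical export
mailing_list = emails_in("mailing_list.csv")     # hypothetical export

follower_cohort = mailing_list & followers
other_cohort = mailing_list - followers

def donation_rate(cohort):
    """Percentage of a cohort that also shows up in the donor file."""
    return 100.0 * len(cohort & donors) / len(cohort) if cohort else 0.0

print("Donation rate among followers:     %.1f%%" % donation_rate(follower_cohort))
print("Donation rate among non-followers: %.1f%%" % donation_rate(other_cohort))
```

Even then, a gap between the two rates is a correlation, not proof that following causes giving - the people who already care most may simply be the ones who both follow and donate.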
Social analytics don't hold all the answers. They may not hold many of them - anyone with a blog knows that page views and unique visitors are highly variable and easily manipulated stats. At the same time, everyone with a Facebook account knows how easy it is to count their friends, every Twitter user can count retweets, and all of us sigh at the number of emails in our inboxes or texts on our phones. We've all gotten good at dousing ourselves in our own web-wash of data.

But we don't yet know how to connect those numbers (or which numbers to use) to our actual goals for our work or to our other methods of measuring our success. Do I tell the foundation that asked about my success measures about the number of blog readers and Twitter followers? Do I run a citation search on the articles and books I've written to see who references my work and how often? Do I count the number of organizations around the globe that have hired my firm to do strategy work? Should I take the results of our client satisfaction surveys, multiply the positive responses by the number of blog readers, Scribd downloads, and Blueprint 2010 purchases, and then divide by the number of negative responses? What about interviews with organizations we worked with years ago that have since referred other organizations to us? Measuring the success of my firm and my work hasn't necessarily gotten any easier or better now that we have all these potential data points - but it sure gives us lots of mathematical possibilities.


How many Twitter followers you have may not matter as much as who they are and who follows them. We still need constituency voice surveys, focus groups, interviews, and ethnographic analysis to really understand how and where change happens. The web is full of data - everything you do on the web generates more of it. Some of it is going to be useful. What an opportunity to learn how digital data can help us all do our work better.
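One last illustration of the "who follows them" point, before the footnote: a toy sketch, with made-up follower lists, of why two accounts with the same follower count can have very different reach one step out.

```python
# Toy sketch: direct follower counts versus "followers of followers."
# Every follower list here is invented for illustration only.
follower_graph = {
    "our_org": {"alice", "bob", "carla"},  # who follows each account
    "alice": {"dana", "ed"},
    "bob": {"dana", "fay", "gus", "hana"},
    "carla": set(),
}

direct = follower_graph["our_org"]
second_degree = set()
for follower in direct:
    second_degree |= follower_graph.get(follower, set())
second_degree -= direct | {"our_org"}

print("Direct followers:", len(direct))                        # 3
print("Distinct followers-of-followers:", len(second_degree))  # 5
```

A reach count like this still says nothing about whether any of those people will act - which is exactly why the surveys, focus groups, and interviews still matter.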


*Social analytics is defined by Jeremiah Owyang of Altimeter Group as "the practice of being able to understand customers & predict them using data from the social web."

Posted from 37,000 feet above sea level - I blame the typos on turbulence.

2 comments:

Phil Buchanan said...

Lucy: Thanks for this very important post. At the Center for Effective Philanthropy, we are focused on helping foundations assess and improve their effectiveness: we are grappling with an environment in which data is everywhere, but meaning from that data more elusive. So much of the challenge of performance assessment in philanthropy has to do with understanding the connections along the way to goal achievement: what leads to what. As you point out, we do not yet understand well what leads to what in the social media realm. The only way we will get there is by testing hypotheses – collecting the data necessary to explore whether the correlations we are hoping for actually pan out. The questions you raise are fascinating ones and identifying the answers could have a profoundly powerful impact.
-- Phil Buchanan

Anonymous said...

Lucy, I like your data optimism.

Unfortunately there is a dark side to data... http://bit.ly/o5bBr

Somewhere in these databases is probably the info you need to answer some of your questions ... http://bit.ly/btoHHO

Who will make sure your personal behavior data will not be misused, misinterpreted, or mis-distributed???