Wednesday, May 05, 2010

Data deficits and surfeits


This morning I heard this (totally typical) set of reports on the radio about the doings in financial markets:

Around 6 am PDT: "The Dow is up 42 points, on news of a report that the Eurozone economy is still expected to expand at 0.9% in 2010, despite the heavy debt burdens of Greece and other member countries."

Approximately 7 am PDT: "The Dow is down 15 points as investors worry about debt in EU countries."

These kinds of swings in stock indices are daily events. What always catches my attention is the attempt to link the market's gyrations to some cause - "investors' worries" or "data from a new report." This effort to link the ups and downs of the market, which result from millions of independent transactions, to any single cause has always struck me as...well...ludicrous.

Yesterday TechSoup Global* posted this roundup of news on giving online. It cites three different analyses (Convio, Chronicle of Philanthropy, and NTEN) of online action. So far this week I've received three press releases, one from FoundationSource, one from Fidelity Charitable Gift Fund, and one from Convio, about their analyses of 2010 giving. Each study looks at different data sets. Convio, Fidelity, and FoundationSource are mining the data from their own proprietary transaction platforms.

None of these sources let anyone else use their data, though the Chronicle posts its database of grants online for users to search.

This is the state of data in the sector right now: proprietary data sets used to generate publicly shared analyses. There is no way to check the data or the analyses. There is no way to compare, mash up, or integrate the data sets (or even to really cross-reference the analyses). I'll sketch below what that kind of mashup could look like.

To get a little techy here (and somewhat oversimplify the problem), the holders of the data are sharing PDFs with us instead of sharing data in RSS feeds. Anyone who's ever tried to edit a PDF knows how hard it is. Anyone who's ever had to re-enter a bunch of numbers from someone else's PDF into their own spreadsheet, just to ask the questions that interested them, knows what a pain it is.
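For the wonks, a minimal sketch of the difference. The grants feed below is entirely hypothetical - no funder publishes one in this format today, and the "recipient/amount/year" schema is made up for illustration - but notice how little code a reader needs once the data arrives structured instead of locked in a PDF:

import xml.etree.ElementTree as ET

# Hypothetical sample feed -- nobody actually publishes grants this way (yet).
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example Funder Grants (hypothetical)</title>
  <item>
    <title>Grant: Clean Water Initiative</title>
    <description>recipient=Example Org; amount=50000; year=2010</description>
  </item>
  <item>
    <title>Grant: Literacy Program</title>
    <description>recipient=Another Org; amount=25000; year=2010</description>
  </item>
</channel></rss>"""

root = ET.fromstring(SAMPLE_FEED)
total = 0
for item in root.iter("item"):
    # Each description carries simple key=value pairs in this made-up schema.
    fields = dict(pair.split("=") for pair in
                  item.findtext("description").split("; "))
    total += int(fields["amount"])
    print(fields["recipient"], fields["amount"])
print("Total granted:", total)

No re-keying, no cracking open a PDF: the reader's questions, not the publisher's formatting, drive the analysis.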

We have a surfeit of proprietary analysis and a deficit of publicly available data.
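And here's the mashup I promised above, sketched in the same hypothetical spirit: if even two giving data sets were open - say, one funder's grants and one platform's online donations, both invented here for illustration - combining them per recipient would take a few lines:

# Joining two hypothetical open data sets by recipient organization.
funder_grants = [  # made-up sample rows
    {"org": "Example Org", "grant_total": 50000},
    {"org": "Another Org", "grant_total": 25000},
]
online_gifts = [  # made-up sample rows
    {"org": "Example Org", "gift_total": 12000},
    {"org": "Third Org", "gift_total": 3000},
]

gifts_by_org = {row["org"]: row["gift_total"] for row in online_gifts}
for row in funder_grants:
    # Combine both funding streams per organization -- trivial once the
    # underlying data is shared, impossible while each lives in a PDF.
    combined = row["grant_total"] + gifts_by_org.get(row["org"], 0)
    print(row["org"], combined)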

Now, not everyone is a wonk who's going to want to mash up the data or check anyone else's analysis. But some folks might: those managing major public funding programs, foundations looking for funding partners, or individual donors looking to do some deeper analysis of giving trends on their issues. With these kinds of closed analyses, the only thing we have is what the analysts tell us. This may not be as ludicrous as linking millions of stock trades to one piece of news...and then changing your analysis an hour later when the index swings the other way...but we can do better.

If you'd like to imagine a different way of making sense of and using giving data, please join us, in person or via livestream, for the first-ever Philanthropy DataJam - Monday, May 10, from 12:30 to 2:00 EDT.

Presenters include experts from the White House Open Government Initiative, The World Bank's Open Data bank, The Foundation Center and its new GrantsFire Project, The Sunlight Foundation, The Charles Stewart Mott Foundation, GlobalGiving, AID DATA, and the International Aid Transparency Initiative. Thinkers and doers making and using these data tools include experts like you!

Join us at 12 for sandwiches at The New America Foundation in Washington, DC. Join us at 12:30 from wherever you are on this site and on Twitter at #GiveData. Stay tuned for other events in the coming months.

The DataJam is made possible in partnership with The HAND Foundation, The New America Foundation, The Sunlight Foundation, and Blueprint Research & Design.

Full disclosure: I served on the Board of Directors of TechSoup Global's predecessor, Compumentor, from 2000-2008.

3 comments:

Karl said...

Lucy - I work in the UK for NCVO, an organisation that produces data on charitable giving (we met at OII!). We don't make the data directly available, but I basically agree with your argument. We've been thinking about this for a bit: the biggest challenge is getting our head around what/how we do something. We also suffer from this conflicting analysis - maybe if we make our data available others might follow...

Anyway, our giving stuff is at www.ncvo-vol.org.uk
Cheers
Karl

Lucy Bernholz said...

Karl
Thanks for the link - great to know of your data on UK giving.

I'm thinking of calling my next blog post on this topic "From PDFs to RSS feeds." There are the little changes (formats) and the big changes (working differently to share the stuff).

Thanks

Lucy

Gabriela Fitz said...

Thanks for drawing people's attention to this!

At IssueLab we have nearly 4000 reports, white papers, and case studies from almost 500 different nonprofits. But 98% of these are PDFs (and a much smaller percentage are openly licensed).

We are sitting on a (rapidly growing) mountain of analyses, with very little access to data. Why does this matter? Because the whole purpose behind publishing research about social change efforts, archiving it, and sharing it is so that it can actually help people to better understand and perhaps even improve their social interventions.

In theory, IssueLab is in a unique position to do cross-sector analyses and to identify where findings within particular fields match up and where they differ, but this is incredibly labor-intensive and not always possible without access to the underlying data.

We meet a lot of resistance from nonprofits when we ask them to share their data. Their reasons are often understandable. But the fact is that until we can show organizations the value of making their data more freely available, we will just have to keep cracking open those PDFs and digging out the nuggets!

Sounds like Monday's event may provide some examples that can persuade more nonprofits to share more.

Gabi