Data deficits and surfeits

(Photo courtesy of LogoMan at logomorphosis)
This morning I heard this (totally typical) set of reports on the radio about the doings in financial markets:
Around 6 am PDT: "The Dow is up 42 points, on news of a report that the Eurozone economy is still expected to expand at 0.9% in 2010, despite the heavy debt burdens of Greece and other member countries."
Approximately 7 am PDT: "The Dow is down 15 points as investors worry about debt in EU countries."
These kinds of swings in stock indices are daily events. What always catches my attention is the attempt to link the market's gyrations to some cause - "investors' worries" or "data from a new report." This effort to link the ups and downs of the market, which result from millions of independent transactions, to any single cause has always struck me as....well...ludicrous.
Yesterday TechSoup Global* posted this roundup of news on giving online. It cites three different analyses (Convio, Chronicle of Philanthropy, and NTEN) of online action. So far this week I've received three press releases, one from FoundationSource, one from Fidelity Charitable Gift Fund, and one from Convio, about their analyses of 2010 giving. Each study looks at different data sets. Convio, Fidelity, and FoundationSource are mining the data from their own proprietary transaction platforms.
None of these sources let anyone else use their data, though the Chronicle posts its database of grants online for users to search.
This is the state of data in the sector right now. Proprietary data sets used to generate publicly shared analyses. There is no way to check the data or the analyses. There is no way to compare or mash up or integrate the data sets (or even to really cross-reference the analyses).
To get a little techy here (and somewhat oversimplify the problem), the holders of the data are sharing PDFs with us, instead of sharing data in RSS streams. Anyone who's ever tried to edit a PDF knows how hard it is. Anyone who's ever had to re-enter a bunch of numbers from someone else's PDF into their own spreadsheet so they could ask the questions that interested them knows what a pain it is.
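To make the contrast concrete, here is a minimal sketch of what a machine-readable feed buys you. The feed structure below is entirely hypothetical (no real organization publishes this exact format): a few lines of standard-library Python can parse it and answer a question directly, with no manual re-typing from a PDF.

```python
# A hedged sketch: a hypothetical RSS-style feed of gifts, with one
# <item> per donation and the dollar amount in a made-up <amount> tag.
# The point: structured data can be queried in code; a PDF cannot.
import xml.etree.ElementTree as ET

FEED = """<rss><channel>
  <item><title>Gift to Food Bank</title><amount>250</amount></item>
  <item><title>Gift to Clinic</title><amount>1000</amount></item>
  <item><title>Gift to Library</title><amount>75</amount></item>
</channel></rss>"""

def total_giving(feed_xml):
    """Sum the gift amounts across every item in the feed."""
    root = ET.fromstring(feed_xml)
    return sum(float(item.findtext("amount")) for item in root.iter("item"))

print(total_giving(FEED))  # 1325.0
```

Swap in a different question (gifts per cause, median gift size, a mashup with another feed) and the same few lines adapt; with a PDF, every new question starts with re-keying numbers into a spreadsheet.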
We have a surfeit of proprietary analysis and a deficit of publicly available data.
Now, not everyone is a wonk who's going to want to mash up the data or check anyone else's analysis. But some folks might: those managing major public funding programs, foundations looking for funding partners, or individual donors who want a deeper look at giving trends on their issues. With these kinds of closed analyses, all we have is what the analysts tell us. This may not be as ludicrous as linking millions of stock trades to one piece of news...and then changing your analysis an hour later when the index swings the other way...but we can do better.
If you'd like to imagine a different way of making sense of and using giving data, please join us, in person or on livestream, for the first-ever Philanthropy DataJam - Monday, May 10, from 12:30 to 2:00 pm EDT.
Presenters include experts from the White House Open Government Initiative, The World Bank's Open Data bank, The Foundation Center and its new GrantsFire Project, The Sunlight Foundation, The Charles Stewart Mott Foundation, GlobalGiving, AID DATA, and the International Aid Transparency Initiative. Thinkers and doers making and using these data tools include experts like you!
Join us at 12 for sandwiches at The New America Foundation in Washington, DC. Join us at 12:30 from wherever you are on this site and on Twitter at #GiveData. Stay tuned for other events in the coming months.
The DataJam is made possible in partnership with The HAND Foundation, The New America Foundation, The Sunlight Foundation, and Blueprint Research & Design.
Full disclosure: I served on the Board of Directors of TechSoup Global's predecessor, Compumentor, from 2000-2008.
Tags: philanthropy, data, opendata, #givedata, iati, aiddata, globalgiving, foundationcenter, mott, gov20, gov2, sunlight, worldbank