A new calculus for civil society

In the analog era, when you took action for social good, the math was relatively straightforward:
1 action = some result.

It might have been a 1:1 relationship: 1 donation = 1 meal.

It might have been more than that: 1 volunteer act = multiple additional donations = multiple benefits.

It might have included a multiplier effect: action sustained over time = new policies = multiple benefits.

It might have backfired: 1 action = negative result.

What it didn't have, and part of what makes digital action different, is a "digital differential."

Here's what I mean. In the digital environment, every action creates a digital trail - data, metadata, or both. So if you click to like the ALS Ice Bucket Challenge (to be au courant, for example), your click supports ALS research. It also tells your friends you care about ALS (or that you are very much in the know), it adds to the dataset of information about you being built by several enterprises and held in several places, and it adds to the dataset of the ALS campaign itself.

Your single click becomes a digital data point with lots of potential other uses (marketing, donation solicitation, friendship building). One action = lots of derivative uses and interpretations, some by you, most by others.

This digital differential holds for all digital data. Our actions in digital space create an additional "resource" (data) that can be used in lots of ways. These digital differentials may be used for positive or negative ends. What happens with them is not inherent in the data; it depends on what we do with the data and how we do it.

CrisisTextLine is a great example of this. It helps crisis counseling centers reach teens via text. In the analog age (last year), when this was done by phone, the math was straightforward:

ANALOG: 1 call = 1 teen helped.

Today, the math is different:

DIGITAL: 1 text message = 1 teen helped + a dataset of digital text messages (with more than 3 million records to date).
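If you like to see the arithmetic spelled out, here is a minimal sketch in Python (my own illustration, with invented field names - not CrisisTextLine's actual system) of how one incoming text produces both the analog result and the digital differential:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IncomingText:
    """One digital action: a teen texting a crisis line (hypothetical fields)."""
    sender_id: str   # a pseudonymous identifier, not a phone number
    body: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def handle_text(msg: IncomingText, corpus: list) -> dict:
    """The analog result: route the message to a counselor (stubbed out here)."""
    primary_result = {"action": "counselor_session_opened", "sender": msg.sender_id}

    # The digital differential: the same single action also leaves records behind.
    corpus.append({                     # 1. a research/policy dataset grows
        "sender": msg.sender_id,
        "length": len(msg.body),
        "received_at": msg.received_at.isoformat(),
    })
    # 2. metadata (timing, volume) accumulates with the platform and carriers
    # 3. each derivative record raises its own storage, sharing, and consent questions
    return primary_result

corpus = []
handle_text(IncomingText(sender_id="anon-001", body="I need help"), corpus)
print(len(corpus))  # 1 teen helped -- and the dataset grew by one record
```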

This is one way (there are others) that digital changes the calculus of civil society.

What do you do with that dataset? How do you protect it and the rights of the people represented within it? CrisisTextLine hopes to make it useful to scholars and policymakers. You can see their work - and their ethical decision-making, struggles, and open questions - here.* The upcoming Ethics of Data conference will look at these questions and many others in a broader civil society context. Some of the thinking on data philanthropy also addresses these issues.



*At the Digital Civil Society Lab at Stanford, we held a charrette around CTL's work - because their opportunities and challenges are all of ours, they're willing to share them publicly and ask for help, and we all stand to learn a great deal from what they are trying to do and how they are trying to do it.





The last 20 years and the next 20 years

I had a great conversation this morning with Sara Davis at the Hewlett Foundation. It led to these two napkin sketches of the past and the future of organizational structures - what do you think?*

The last 20 years in nonprofits and foundations:

The next 20 years in nonprofits and foundations:
*The drawings are mine and should not implicate the Hewlett Foundation in any way.

Opting in in digital civil society

One of the defining features of civil society, the independent sector, the nonprofit and philanthropic sector or whatever you want to call it...is that participation is voluntary. Giving your time or money is your own choice. Partaking of services offered, for the most part, is by choice (less so where government services are contracted through nonprofits, but stay with me here). One of the most common names for nonprofits is the "voluntary" sector. We are neither required by market forces nor obligated as citizens to participate. This notion of "voluntary" access is complicated when we are talking about people with few choices, those marginalized by prejudice or finances (or both). Those nonprofit organizations that serve these communities are often actively concerned with respecting the individuality and power of the people they serve - a concern that needs to be attended to every day in every action, small and large.

Voluntary participation is what allows civil society to be the home of expressive acts, protests, and the rights of minority groups in a majority-run democracy. A space, such as civil society, that is defined by our individual ability to participate voluntarily is also defined by our ability to choose not to participate. I don't have to give to causes I don't support or to organizations whose agendas I disagree with. In the language of today's tech world, the space in our democracy that allows me to "opt in" also provides me a space from which I can "opt out."

Civil society needs to take these qualities of individual power and "opting in" seriously as it moves into the digital age. Our work with digital data needs to align with our organizational values and missions.

If civil society claims a role in pursuing social justice, then it has a special obligation to do two things: protect people's power to act, and make sure that digital data aren't used to exacerbate existing power differentials.

Most of what is offered online, certainly by commercial operators, is "opt out." By default, your information is gathered, stored, shared, mined, spliced, diced, and minced. Anyone who has tried to cancel a Facebook account, build an app that doesn't automatically collect location information, or even make a donation without giving their email address knows that the default choices in the digital world are not "opt in." Rather, the most common default online seems to be "make it as difficult as possible to opt out."
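To see how small the technical difference is, here is a minimal sketch (a hypothetical signup flow of my own, not any real platform's API). The only thing that changes is the default value of a consent flag, yet it decides whether collection happens unless you act, or only if you act:

```python
from dataclasses import dataclass

@dataclass
class Signup:
    email: str
    share_data: bool  # the consent flag; its default is the whole fight

def signup_opt_out(email: str, share_data: bool = True) -> Signup:
    """Common commercial default: data is shared unless you hunt down the setting."""
    return Signup(email, share_data)

def signup_opt_in(email: str, share_data: bool = False) -> Signup:
    """Opt-in default: nothing is shared unless the person affirmatively chooses it."""
    return Signup(email, share_data)

# Same user behavior (accepting the defaults), opposite outcomes.
print(signup_opt_out("a@example.org").share_data)  # True  - collected by default
print(signup_opt_in("a@example.org").share_data)   # False - collected only by choice
```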

We risk losing a great deal as a society if we lose the power to "opt in" and its "opt out" corollary. It's not just about tools like social media or text messaging - more and more policy decisions, corporate pricing choices, and infrastructural access decisions are being made from digital data collected via remote sensors, satellite imagery, and massive datasets built from the traces we leave behind in our daily digital activities. These datasets drive decisions on pricing and access to services such as transportation, housing, child care, and clinic access - and they're being built with our data and without our consent (or even awareness).

People need the power to choose. Precisely because a nonprofit may be the only service provider in an area, or the only culturally competent association, or the only option of any kind - it needs to be attentive to the power arrangements it puts in place, seeks to mitigate, or actively fights against.

How data are collected, analyzed, stored, shared, secured, or destroyed - these are acts of power. Commercial firms (for the most part) have made it clear that when it comes to digital data, they hold the power. I'm willing to bet you didn't read the Terms of Service for that app or website before you logged in, but that's what they say.

Civil society needs to attend to the expressions of power embedded in digital data. It's not just how the sector uses digital data, but how digital data are used to define the sector.




Ethics and the Age of Digital Assumption





It turns out that Facebook is manipulating your news feed and OK Cupid is playing with your heart.


Neither of these digital algorithmic head games came as a surprise to folks in the digital know, but that doesn't make them OK.

And they did come as a surprise to most of us - who don't quite understand how software works and who still make assumptions about personal agency, control, and validity based on analog experiences. This is a world of "black box algorithms" - where software code determines what information we see and don't see. It's a world where consent means "not really" - in which, as one wag puts it, the biggest lie on the internet is the two words "I Agree." There's a new book out, Virtual Unreality, that makes it clear just how prevalent this "manipulation by software" really is. To be clear, though - the phenomenon is not new, just the broader understanding that it is happening all around us.

At a recent innovation conference hosted by Stanford PACS and the International Red Cross and Red Crescent Society, I called this moment in time "The Age of Digital Assumption." By that I meant a couple of things:
  • For nonprofits - assume your volunteers will use digital tools (probably their own, as in "Bring your own device"). Assume someone, somewhere will create a new digital tool or apply existing ones to whatever crisis you are handling (see #DetroitWater). Assume you will not be in control of the way digital information is used to respond to any particular event or disaster.
  • For people - assume your digital information is being collected, and that you don't really know what is being done with it. Assume that your privacy does not rank as high on the list of concerns of the organization collecting your data as it might on your own list. Assume that if you are giving information (or permission to gather information from your phone) - your name, email, location data, etc. - the organization asking for it is, in fact, collecting it and doing something with it (maybe just storing it for later).
People have all kinds of feelings about Facebook or OK Cupid using their data. One good outcome of these high-profile corporate acts is that they've brought ethics into the broader public discussion about digital data and platforms in society.

Moving the issue of ethics from the fringe to the center is also part of the "Age of Digital Assumption." This age comes after what I might call the "Age of Digital Exceptionalism," in which we were still agog at all the things our "shiny objects" could do. Now we get to the harder questions, namely, "Just because we can, should we?"

We're putting ethics at the center of the conversation in an upcoming conference hosted by the Stanford PACS Digital Civil Society Lab, the Harvard Humanitarian Initiative, and the Brown Institute at the Columbia School of Journalism, along with DataKind, PopTech!, the Responsible Data Forum, and a growing list of other partners.

Here's the basic question - What are the ethical parameters for civil society organizations using digital information and infrastructure?

Nonprofits often talk about trust and integrity as being central to their work. If so, do we want (expect) nonprofits to act differently with our data than we expect Facebook to act? Do we want (expect) nonprofits to communicate more clearly to us about what they do with our data? Do we want (expect) to be able to access civil society organizations and their services without compromising our own data (or the digital whereabouts of all the folks whose addresses and phone numbers are stored in our phone's address book)?

Similarly, do we want (expect) nonprofits to access and use remote digital data in different ways than other enterprises? If wealth managers are using satellite imagery for portfolio management, does that mean nonprofits can or should use the same data to determine levels of poverty? Or to find ways to better deliver medicines to remote areas with bad roads? How should funders think about ethical issues, and what role do they play in shaping behavior?

What, if any, incentives should be provided to encourage civil society's use of digital data for good? What, if any, limits should be put on civil society uses of digital data?

More about the conference is online here. It's by invitation only because of space limitations, but we'll be sharing resources on the website and conversation on Twitter via #EoD14.




#DetroitWater - Part Two

A few days ago I posted about #DetroitWater. I had sent off an email to the creators before posting, and when I didn't hear back in a day I went ahead and posted the blog with the questions.
 
I think my emails got buried in the flood of attention they've been receiving. After a little Twitter outreach (and the blog post itself), they circled back to me ASAP with answers to my earlier questions. To save you from having to click around, here's my old post again, with my questions - this time with their answers.

Text of my PREVIOUS POST follows:

Here's a website that allows anyone to pay the water bill (or part of the bill) owed by someone in Detroit. Lest we grow complacent, allow me to point out several noteworthy - dare I say, almost astonishing - things about this from the perspective of (what else) digital civil society:

  1. The city of Detroit is so broke it's cutting off water to residents.
  2. Many people in Detroit can't afford to pay for water. 
  3. People are taking to the street and online to protest - and the fight is on between privatized water systems and water as a common good (this is a common fight in many parts of the world, and there is a strong "water is a human right" movement).
  4. Web technology built by a handful (two, I think) of people is up and running to help strangers help strangers.
  5. There are no intermediary organizations in this mix - donors pay bills directly to and through Detroit's public water department. The creators built the site, money goes to the water authority, donors and residents don't interact.
  6. PRIVACY is addressed as a key issue on the site - both from the residents' and the donors' standpoints. And no grand promises are made.
I tried to get some more information about #DetroitWater but haven't heard back from the folks that I emailed. Here's what I asked them (my questions in bold; their answers follow).

My original Questions with their Answers

1. You mention in the privacy section that Detroit Water might identify residents. How likely is this, how would it happen, has it happened?

Tiffani and I have gone through the payment process several times and have not had names revealed. The information presented on the DWSD's site is not under our control. We collect names as a part of the submission process to get assistance and have each resident indicate whether we can give their names to the donor or not. As of yet we haven't released any names, and information collected and stored by us will not be released without the permission of the resident or the donor.

2. What kind of permission did you need to get from Detroit Water Auth to use this info? How did you get the info?


We did not get permission from the Detroit Water and Sewerage Department. The website has several channels for entry, and one of those is simply an account number.

Residents are providing us with their account numbers, phone numbers, addresses, and amounts owed, which is what we need to verify that they are actually in need of assistance. We then pass along the account number to the donor, who makes a payment directly to the DWSD on the resident's behalf.

3. Are residents opting in or opting out of this? Who is deciding, and how are they deciding whose bills get paid?

Residents are opting in.

Tiffani, I, and a team of volunteers have been manually matching donors to accounts. As we iterate and demand grows, we are adding rigor to determining who is prioritized for help. For example, we now have a check box for people who are over 65. Other guidelines include: the account must have a past due balance, must be residential and not commercial, and must show activity in terms of use. At this point we have enough donors that we aren't having to deny one account over another; each just has to meet our standards for assistance. This is subject to change.
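For readers who want to see those guidelines written down as rules, here is a rough sketch of mine in Python (with invented field names - this is not the Detroit Water Project's actual code) of the eligibility and prioritization criteria they describe above:

```python
from dataclasses import dataclass

@dataclass
class AssistanceRequest:
    """Invented fields standing in for what a resident submits."""
    past_due_balance: float
    is_residential: bool
    has_recent_usage: bool   # "activity in terms of use"
    over_65: bool            # the check box they mention

def is_eligible(req: AssistanceRequest) -> bool:
    """Guidelines described above: past due balance, residential, active account."""
    return req.past_due_balance > 0 and req.is_residential and req.has_recent_usage

def prioritize(requests: list) -> list:
    """Keep eligible requests and move residents over 65 to the front of the queue."""
    eligible = [r for r in requests if is_eligible(r)]
    return sorted(eligible, key=lambda r: not r.over_65)
```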

4. Where is transaction data being stored? How secure is it?


The transactions take place on the DWSD website, so the level of security is what you would expect from a site that processes payments. We do not collect payments or payment information.

5. Is CFA involved in this (one of the founders has a CFA email address)? Did you create this totally on your own? Who else helped? How long did it take?

This is not a Code for America project.

The idea and iterations of the site and our processes were created by Tiffani and me. It took us a weekend to get it up and running. We are constantly iterating as needed, and we are on the brink of having everything automated via algorithms written by Tiffani.

We've had help with operations from a team of volunteers who are listed on our site here: http://detroitwaterproject.org/#team