Friday, September 12, 2014

Apple's Watch and the Ethics of Data

Apple made big news this week with the announcement of its new phones with mobile payment built in and its new watch, which seems to be a platform for everything from telling you the time to heartbeat sharing (?!).

Much has been written by privacy experts, tech geeks, health policy wonks, and financial gurus about all the new stuff this OS and device are bringing.

Personally, I appreciated the Cupertino company's doing such a fabulous job of sending a signal that digital civil society has arrived. (And just in time for our conference on the Ethics of Data - how perfect!)

Here's the signal, in case you missed it. Apple made a big deal about how it can protect your private financial and health data, which the watch and phone will help you collect and manage. They explained how the phone/watch will/will not store your data, how it can be shared, and who will have access to it.
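To see what that gatekeeping looks like from the developer's side, here is a minimal sketch in Swift of an app requesting read access to a single HealthKit data type. The choice of heart rate and the bare-bones flow are illustrative only, not Apple's recommended pattern:

```swift
import HealthKit

// HealthKit's permission model is per-type: the app must name each
// kind of data it wants, and iOS asks the user to grant or deny
// each type individually.
let healthStore = HKHealthStore()

guard HKHealthStore.isHealthDataAvailable(),
      let heartRate = HKObjectType.quantityType(forIdentifier: .heartRate) else {
    fatalError("HealthKit is not available on this device")
}

healthStore.requestAuthorization(toShare: nil, read: [heartRate]) { completed, error in
    // 'completed' only means the permission request ran without error.
    // HealthKit deliberately does not tell the app whether the user
    // granted read access, so a refusal is not leaked back to the
    // developer.
    print("Authorization request finished: \(completed) \(String(describing: error))")
}
```

Notice that HealthKit won't even reveal to the app whether the user said yes to a read request - the refusal itself is treated as private information.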

The following language is from Apple's own developer guide (the rules of the road for the people creating the apps that will make the watch do more than its Casio forebears):
"27.4 Apps may not use user data gathered from the HealthKit API for advertising or other use-based data mining purposes other than improving health, medical, and fitness management, or for the purpose of medical research (emphasis added)"
My question? Medical research by whom? Harvard? (nonprofit) The Centers for Disease Control? (government) Pfizer? (commercial company) Citizen scientists in their garages? (none of the above)

Medical research is done by businesses, government, and nonprofits. They each operate under separate rules. Some have clear institutional structures and review processes for doing research on humans. Some make sure they have your consent to be included in a study or a publicly reported "finding"; others, well, not so much.

The New York Times read the above developer guide and interpreted it as if reserving use of the data for medical research were somehow protecting our privacy. From their Thursday story:
"Apple has made it clear to developers of health apps that it wants to protect privacy. Last week, it updated its guidelines for app developers, stating that apps working with HealthKit, Apple’s new set of tools for tracking fitness and health statistics, may not use the personal data gathered for advertising or data-mining uses other than for helping manage an individual’s health and fitness, or for medical research."
But here's the thing - letting the data be used for "medical research" without specifying by whom and under what conditions doesn't protect you in the least.

It's like saying only "book lenders" will have information about your reading habits. Book lenders include your local library, which has been protecting reader information from prying eyes for decades, and Amazon, which uses your data...differently.

The point is we have different expectations for different kinds of organizations - public, commercial, and nonprofit - and we hold them, socially and legally, to different standards of transparency, accountability, and trust. Data cross all those lines (and those lines are already rather blurry). As they celebrated in Cupertino, it's clear that we have entirely new tools for collecting, storing, and sharing our data. We need new rules - especially if we want to maintain the trust and integrity of the nonprofit sector.




Monday, September 08, 2014

Why focus on civil society and data?

This is the third of a three-part series leading up to the Ethics of Data in Civil Society Conference at Stanford on September 15-16.

Big data and government. Big data and business. Big data and consumers.
Why are we focused on data and civil society?

First, what is civil society? This is a question that scholars can debate for years, so let's shortcut that and use this definition - "Civil society is where we use our private resources (money, time, data) to benefit a public (someone other than ourselves)."

In many places (including the U.S.), the nonprofit sector and philanthropy are often used as synonyms for civil society. This is only partly true, because civil society also includes all those informal networks, neighborhood groups, beach cleanups, and online volunteering efforts - all the mutually beneficial, associational activities that take place all over the world (and, increasingly, all over the Internet). Yes, civil society includes those formal institutions, but it is much more.

So why focus on the role of data and ethics? I could make the usual economic argument here and tell you about the size of nonprofits and philanthropy, the number of people employed in the sector, the trillions of dollars in revenue and assets, and so on - and those numbers are big and impressive.

But the real reason to focus on digital data and ethics runs much deeper than contributions to the GDP. It has to do with the intended role of civil society in a democracy. Here's how we've been breaking this thinking down at the Digital Civil Society Lab (part of Stanford PACS and one of the hosts of the conference):
  1. Civil society is essential to democracy.
  2. Private action for public benefit (civil society) requires that individuals can act independently – apart from government or the marketplace – and voluntarily – free from coercion (freedom of association).
  3. Civil society also depends on the permission to speak and communicate freely, to broadcast one’s views both internally to other associational members and externally to other citizens (freedom of speech).
  4. Our world is getting more digital – networks of digitized data undergird more and more of our communications and connections, and more of our analog assets are being digitized (text, video, audio, DNA, physical objects).
  5. We are in a transition period where civil society actors are adapting digital tools and practices and where digital innovators are creating tools with civil society purposes – “the social is going digital and the digital is going social.”
  6. We need to see if and how the digital environment is affecting or altering the rights and abilities of individuals to voluntarily act privately for the public good; to consider the mechanisms for valuing, owning, and donating digital assets; and to consider the governance, organizational, and policy implications of this sphere.
Given the ethical dilemmas raised by digital data in all aspects of our lives, a focused inquiry on the implications for civil society is a timely complement to the research and policy considerations in business and government. 

Monday, September 01, 2014

What codes inform our work with digital data?

This is the second of a three-part series on the ethics of data in civil society, leading up to the Stanford Conference on September 15-16.

Ethical codes abound - here's a list of just a few of them with relevance to nonprofit organizations and private action for the public benefit.

Asilomar Convention: Ethical decision making in higher education research

A manifesto for the future of the 'right to be forgotten' debate

The Consumer Privacy Bill of Rights (USA)

Compendium of Ethics and Standards for Nonprofits

The Data Bill of Rights


(A longer, but still not comprehensive, list of examples can be found here)

What, if anything, is different about digital data that requires us to rethink our work in civil society? Here are some characteristics of digital data that may require new ethical choices. 
  • Digital data can be collected passively, without the knowledge or consent of those from whom the data are collected (see the sketch after this list).
  • Digital data enable predictive analytics whose accuracy and validity are still unknown.
  • Digital data can be stored remotely, and for unknown lengths of time, by third parties who have collected the data or purchased the information.
  • Digital data alter the time frame for our actions. For example, real-time satellite imagery lets us adjust our interventions with information never before available. Alternatively, stored data collected from a person today may be used to define actions on behalf of or against that person's children or grandchildren. It can both shorten and lengthen the relevant time frame.
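To make the first of these points concrete, here is a small hypothetical sketch in Swift - the event type and its fields are invented for illustration - showing how little code it takes to capture data a user never typed in or agreed to share:

```swift
import Foundation
import UIKit

// Hypothetical analytics event: every field below can be read without
// prompting the user - this is what "passive collection" looks like
// in practice.
struct PassiveEvent: Codable {
    let timestamp: Date
    let deviceModel: String
    let systemVersion: String
    let screenVisited: String
}

// Can be called from anywhere in an app; the user sees nothing.
func recordScreenView(_ name: String) {
    let event = PassiveEvent(
        timestamp: Date(),
        deviceModel: UIDevice.current.model,
        systemVersion: UIDevice.current.systemVersion,
        screenVisited: name
    )
    // A real analytics SDK would queue this and upload it to a remote
    // server, where it may be stored indefinitely (the third point above).
    if let json = try? JSONEncoder().encode(event),
       let text = String(data: json, encoding: .utf8) {
        print(text)
    }
}
```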
Several scholars have written about ethical provocations of big data, the need for ethics related to big data, and the need for legal due process where big data, algorithmic predictions, or other software applications are concerned. Here are two resources by individuals participating in the Stanford conference.

Kate Crawford and Jason Schultz, Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms.

Neil Richards and Jonathan King, Big Data Ethics.