Apple made big news this week with the announcement of its new phones with mobile payment built in and its new watch, which seems to be a platform for everything from telling you the time to heartbeat sharing (?!).
Much has been written by privacy experts, tech geeks, health policy wonks, and financial gurus about all the new stuff this new OS and device are bringing.
Personally, I appreciated the Cupertino company's doing such a fabulous job of sending a signal that digital civil society has arrived. (And just in time for our conference on the Ethics of Data - how perfect!)
Here's the signal, in case you missed it. Apple made a big deal about how it can protect the private financial and health data that the watch and phone will help you collect and manage. They explained how the phone and watch will (and will not) store your data, how it can be shared, and who will have access to it.
The following language is from Apple's own developer guide, the rules of the road for the people creating the apps that will make the watch do more than its Casio forebears:
"27.4 Apps may not use user data gathered from the HealthKit API for advertising or other use-based data mining purposes other than improving health, medical, and fitness management, or for the purpose of medical research (emphasis added)"

My question? Medical research by whom? Harvard? (nonprofit) The Centers for Disease Control? (government) Pfizer? (commercial company) Citizen scientists in their garages? (none of the above)
Medical research is done by businesses, governments, and nonprofits, and they each operate under separate rules. Some have clear institutional structures and review processes for doing research on humans. Some make sure they have your consent before including you in a study or a publicly reported "finding"; others, well, not so much.
The New York Times read the above developer guide and interpreted the reservation of data use for medical research as somehow protecting our privacy. From their Thursday story:
"Apple has made it clear to developers of health apps that it wants to protect privacy. Last week, it updated its guidelines for app developers, stating that apps working with HealthKit, Apple’s new set of tools for tracking fitness and health statistics, may not use the personal data gathered for advertising or data-mining uses other than for helping manage an individual’s health and fitness, or for medical research."

But here's the thing - letting the data be used for "medical research" without specifying by whom and under what conditions doesn't protect you in the least.
It's like saying only "book lenders" will have information about your reading habits. Book lenders include your local library, which has been protecting reader information from prying eyes for decades, and Amazon, which uses your data...differently.
The point is that we have different expectations for different kinds of organizations - public, commercial, and nonprofit - and we hold them, socially and legally, to different standards of transparency, accountability, and trust. Data cross all those lines (and those lines are already rather blurry). As they celebrated in Cupertino, it's clear that we have entirely new tools for collecting, storing, and sharing our data. We need new rules - especially if we want to maintain the trust and integrity of the nonprofit sector.