Are you ready?
Blueprint 2015, my sixth (!) annual industry forecast, will be available from the GrantCraft website on December 10, 2014. With six of these under my belt I'm proud to say we've cycled through the primary colors (blue, red, yellow) and the secondary colors (green, orange, and now, purple).
Trust and integrity are key to nonprofits. They trade on these virtues. It's no accident that "trust" is the name of one type of nonprofit enterprise. The defining aspect of the nonprofit corporate structure - the non-distribution clause relating to the use of financial assets - codifies the use of financial assets for mission, allowing the public to trust that the organization will be true to its social purpose.
In the 21st Century, nonprofits are going to need to engender that same kind of trust regarding their use of digital assets (otherwise known as digital data).
This is a tremendous opportunity for the sector. Earning and keeping the trust of all (data) donors could become a defining quality for civil society organizations and help distinguish them from commercial enterprises and public agencies. Currently, many commercial operations and government agencies are trampling on the trust of their customers and constituents. Headlines from just this week:
Uber: "Whose Privacy Will Uber Violate Next?"
Class Dojo: "Privacy Concerns for Class Dojo and Other Tracking Apps for Schoolchildren"
"Data trusts" will emerge as a new type of enterprise - but all civil society organizations should be working to maintain trust regarding data.
My laugh is not nearly as engaging as that of Car Talk's Tom Magliozzi, but I'll do my best on December 14, when I'll be talking about Digital Civil Society on Philosophy Talk. Here's the write-up about the show:
"Cyber-Activism" with Lucy Bernholz

Tickets are available for the live show here. If you're not in the Bay Area, Philosophy Talk airs on public radio stations around the country and is available on the web.
Whether it's making donations and signing petitions online, or using social media to highlight political causes, cyber-activism has never been easier. With a few clicks, we can make our voices heard around the globe. But who's listening, and is anything actually changing? Does cyber-activism mobilize real-world action on the ground? Or does it reduce political engagement to simple mouse-clicking, and ultimately threaten the subversive nature of change? John and Ken get active with Lucy Bernholz, co-author of "Disrupting Philanthropy: Technology and the Future of the Social Sector."
Henry Timms (founder of #givingtuesday and my colleague via Stanford PACS) and Jeremy Heimans have a new article in the December issue of Harvard Business Review called "Understanding New Power." In it they discuss characteristics such as co-ownership and participatory governance, and they highlight values of the new power such as "opt-in decision making" and "open source collaboration."
In the requisite 2 x 2 matrix (this is HBR, after all), the precious terrain of the upper-right quadrant includes a mix of movements (Occupy), nonprofits (Wikipedia), benefit corporations (Etsy), and commercial enterprises.
In other words, several of the institutional forms that constitute what we've been calling the social economy embody the characteristics and values that Timms and Heimans pinpoint as a new type of power. Go read it - see what you think.
I just got an alert from a trusted friend* to the existence of an app - Radar - which is designed to alert you if social media accounts start showing signs that your friends are in distress. The app is intended to help friends help friends in need. It was launched by Samaritans, a suicide crisis line in the UK.
But it's set off a (rightful) alarm about surveillance, privacy, and algorithmic alerts. In order to work, the app needs to constantly monitor all your accounts, infer emotions from content, and alert you if someone you follow is determined to be "in need." Problems abound - let's look at a few:
- Not everyone who might follow you is necessarily your "friend." Many are probably bots. Worse, some may be stalkers.
- Algorithmic determination of emotional states? The risk of false positives or negatives seems rather high. The app notes on its own website that it's in beta and "won't get it right every time." Suicidal ideation on social media platforms full of trolls and troublemakers hardly seems like the place to take this chance.
- Constant monitoring of all the accounts you follow means that consent is never asked of those whose accounts the app is reading. And the app is storing data - does it need to?