Thursday, November 21, 2019

Artificially Intelligent Associations

My next big research project (I hope) will be coordinating a distributed team of researchers from many disciplines to better understand how our digital dependencies influence our associational opportunities and rights. For example:
  1. How do the personas that data-driven algorithms create for us align (or fail to align) with how we see ourselves and how we associate? 
  2. How does platform control of information visibility bound our associational opportunities? 
There are many other questions, and I'm working on putting together both a working group of scholars and a more complete outline of the project (feel free to contact me if you're interested).

As with all of my research, I hope to do as much of this as possible "in public": gathering, sharing, thinking, and revising with interested parties. Here's a video of one recent conversation I moderated on what I'm calling (because of the lovely alliteration) Artificially Intelligent Associations. It was recorded at Stanford University's HAI (Human-Centered Artificial Intelligence) Conference on Ethics, Policy and Governance, October 29, 2019.

Featured participants are Eileen Donahoe, Terah Lyons, Sean McDonald, and Jasmine McNealy.



Wednesday, October 16, 2019

Job boards for social sector, public sector, nonprofit, tech and policy jobs

Students often ask me for information on jobs. There's a noticeable (and welcome) uptick in interest in jobs in the public or social sectors at the intersections of policy, analysis, technology, and data. I asked Twitter for relevant job boards and here are the resources I received.*

Internet Freedom Festival Job Board

Tech Jobs for Good https://www.techjobsforgood.com/

Open Gov Jobs

Internet Law and Policy Jobs

Slack channel for jobs via SimplySecure

Giving Compass

Giving Tech Labs

Ben Green's List - http://www.benzevgreen.com/jobs/

New America Public Interest says it has resources

Code For America Public Interest Job Board

The Justice Codes newsletter often has great job announcements.


I'll update as I can. Let me know what else I missed.

*(No verification, endorsement or even claim that there are shared definitions at work here) 

Monday, June 17, 2019

"Hey (random woman's name applied to a connected digital speaker), donate money to...."

I heard this story last week, from the mother of a toddler.

The kid is home, playing with Grampa. Kid is just past the peek-a-boo stage, now experimenting with hide and go seek. Like most kids this age, she hides by standing on the other side of a chair or putting a piece of paper on her head. Not really hidden. But Grampa didn't get the message. When it's his turn to hide, he goes into the other room. Toddler takes hands off eyes, looks around. Doesn't see Grampa. Looks a little bit worried but doesn't move. Waits another minute. Shouts out, "Alexa, where's Grampa?"

I'm going to let you sit with that.

It led us to all kinds of questions, including about advertising on these devices. Others in the group said it's small but growing. This article says it's already here and that we (the people) like it. All of us agreed it seems inevitable.

Question for nonprofits: are you ready to pay whatever it will cost to make sure you are the one (and only) response when someone asks, "Alexa (or one of the others), who should I donate money to?"

Question for the rest of us: You really want some engineered algorithm (no doubt based on who paid the most) telling you where to give your money?

Sigh.



Monday, April 08, 2019

Nonprofits and artificial intelligence

I’ve participated in a lot of conferences, panels, discussions etc. about “nonprofits and AI,” “foundations and AI,” “AI for good”* and so on. The vast majority of them miss the point altogether.

It’s not really a question of these organizations using artificial intelligence, which is how every one of these panels approaches it. Most civil society organizations, at most, may be buying software that applies algorithmic analysis and some AI to a large dataset, perhaps through their vendors of fund development data or software. And then, yes, there are legitimate questions to be asked about the inner workings, the ethical implications, the effects on staff and board, and so on. Those are important questions, but hardly worth a conference panel (IMHO). They are software vendor considerations, and it is important for all organizations to understand how these things work, but they are not the “black magic” or “sector transforming phenomenon” that a conference organizer would have you believe.

The REAL issue is how large datasets (with all the legitimate questions raised about bias, consent and purpose) are being interrogated by proprietary algorithms (non-explainable, opaque, discriminatory) to feed decision making in the public and private sectors in ways that FUNDAMENTALLY shift how the people and communities served by nonprofits/philanthropy are being treated.
  • Biased policing algorithms cause harm that nonprofits need to understand, advocate against, deal with, and mitigate. 
  • AI-driven educational programs shift the nature of learning environments and outcomes in ways that nonprofit after-school programs need to understand and (at worst) remediate or (at best) improve upon. 
  • The use of AI-driven decision making to provide public benefits leaves people without clear paths of recourse to receive programs for which they qualify (read Virginia Eubanks’s Automating Inequality). 
  • Algorithmically optimized job placement practices mean job training programs and economic development efforts need to understand how online applications are screened, as much as they help people actually add skills to their applications.
This essay on “The Automated Administrative State” is worth a read.

The real question for nonprofits and foundations is not HOW they will use AI, but how AI is being used in the domains in which they work, and how they must respond.



* I try to avoid any conversation structured as “_____ for (social) good,” especially when the blank is the name of a company or a specific type of technology.

Friday, March 22, 2019

Software Kills

Yep, deliberately trying to provoke you with the headline. Here's what provoked me:

The news that two airplane crashes killed a total of 346 people, in part due to a software upgrade that was "optional" (read: cost more).

This story about electronic health records (software) and the deaths that resulted from the poor medical care they enabled.

What does this have to do with philanthropy and civil society?

Philanthropic and civil society organizations are as dependent on software as are businesses and governments. Do you know how your software works? What its vulnerabilities are?

Your work may not involve the difference between life and death, but if you're collecting information on lots of people and not respecting their rights when you collect it, not protecting it once you have it, or not managing it (and the software you use to hold and analyze it) in line with your mission, how much good are you really doing? Are you making the people your organization serves, or the nonprofits you fund, more vulnerable with your data practices even as you try to do good with your dollars?

Friday, February 08, 2019

Responsible data in civil society

The World Food Programme recently announced a partnership with Palantir. There’s a lot to be concerned about here - in this specific instance and in the broadest possible sense for understanding civil society’s role regarding digital data use. Please read this open letter:

https://responsibledata.io/2019/02/08/open-letter-to-wfp-re-palantir-agreement/


Does your organization know how the tech companies you partner with use the data you hold on your constituents? It’s not just about “ownership,” but about the insights and algorithms and predictive tools that may be built on that data. Are you “locked in” to a platform? If you wanted to switch vendors for your CRM or mailing lists or social media, could your organization get its data back?

How is your organization managing data? With any more respect for the individual people from whom it comes than these big software/analytics/insight companies show? If not, why should anyone trust you with their data?

These are pivotal questions - and we need answers and processes and protocols and regulation. Civil society is not meant to be a poor cousin of big business, an outsourced arm of government, or the “data washing” arm of either.

Wednesday, January 02, 2019

The year (and decade) ahead

The tenth annual Blueprint - Blueprint 2019 - went live in December. You can find it and the entire decade's archive here.
On January 23rd we'll be hosting our annual "prediction-palooza" (free, online) discussion about philanthropy predictions. Information on that is available here.
In the meantime, I've just come off a conversation with a research group preparing a report for a big foundation on the future of philanthropy. I get asked to do a lot of these. I only agree to these conversations if there is going to be a public version of the report. I'm told that's the case - this report should be available in April.

Some thoughts as I was asked to reflect on the last 5 years and then look ahead 10 years.

Looking Back:
All the new products, platforms and ways to give (DAFs, impact investing, crowdfunding platforms, text donations, cause marketing, etc.) are not adding up to more giving by more people. As Indiana University notes, since 2000 we've lost 20 million givers - at least as recorded by data on charitable donations to tax-exempt nonprofits. This is over the same 19-year time frame that brought us online/mobile donations, online data resources about nonprofits, and more.

--> Perhaps we can stop assuming product innovation equals growth?

Where have the "missing millions" gone? I doubt people have given up on giving; after all, we've been doing it for thousands of years. I think we have an "instrumentation" problem, which is to say, we're measuring the wrong things. Changes in the tax code are expected to result in changes in giving to nonprofits (and in how useful itemized tax forms will be for measuring giving behavior).

--> Perhaps we can ask whether we're measuring the right things?
We need new instruments to measure how Americans give and to whom. They should include measurements of person-to-person giving (e.g., as happens on crowdfunding platforms), political donations, contributions of time, investments intended to produce social/environmental outcomes, and money raised via product purchases (BOGO or cause marketing). I've been calling for this since at least 2015 (see here), and I had intimations about it back in 2008 (see here).

Looking Ahead:

Commercial platforms and nonprofit data:
Does anyone really think Facebook is going to be the same thing in 2029 that it is in 2019? Not even the folks at FB would say that. Every nonprofit and philanthropist that is managing their giving, their outreach, their donors, etc. on this single platform should beware. The rules are going to change, the compan(ies) will push back, there will be all kinds of ups and downs between now and 10 years from now - but in no imaginable future is doing right by nonprofits (in terms of their data and long-term relationships with donors) part of the growth plans of that company. If you can imagine either a different FB in 10 years or no FB 10 years from now, then it seems like a good idea not to put all your data eggs in a FB basket (or any commercial platform driven by a business model unrelated to how well it serves the social sector).

Charity and Politics:
The legal line between these two domains is man-made. It's changed over time. It's thin and it's fragile. The current presidential administration is determined to destroy it (see the Johnson Amendment fight, as well as court cases on donor disclosure in politics). There's not enough manpower at the IRS to defend any boundary that might exist. Digital data and systems make defending the line even more difficult than it was in the analog age. Many advocates and advocacy organizations would like to see the line gone. Individual people may not care as much about separating political and charitable action as policy makers and nonprofit leaders want them to. Assuming the old boundaries between these domains function as intended is fooling oneself. We should put our attention into writing rules that protect our (often conflicting) rights (association, expression, privacy), shed light on political corruption and the excessive influence of wealth, and assume digital data collection and exposure, rather than nostalgically assuming that protecting a legal distinction established in the 1950s is the best (or only) way forward.

Shiny Objects
Anyone but me noticing that the marketing hype about blockchain is starting to quiet down, just as digital systems built around security and encryption are growing in actual use? This is a good thing - stop gawking at the new packaging and let's focus on the values that particular activities require. In some cases, permanent records of transactions are a good thing (supply chain verification of objects, possibly). In other cases, distributed, immutable records may not be such a good idea (human ID, for example).

Artificial intelligence (AI), predictive algorithms, and machine learning are three more shiny objects. Most people ask me, "How will nonprofits and/or philanthropists change by using AI?" I think this question has the subject and object in the wrong order. The more relevant question for most donors and social sector organizations is "How will the use of AI (etc.) change what organizations and donors are doing?" Government bodies and commercial companies are already using these tools - they shape what you see online, what benefits you qualify for, your chances of being audited by the tax authority, your chances of getting a speeding ticket, the keywords you need to enter in your job application, and so on. They are changing the nature of the problem space in which social sector organizations and philanthropists do their work. This is not the future, this is now. And this is not an edge case for a few organizations with some good data and data scientists; this is the great mass of organizations and donors. I'd love to see some real discussion of how philanthropy and social sector organizations can and should change to be effective in a world already being shaped by AI (etc.). Then, for dessert, we can talk about the exceptions to this rule.

It's the nature of the game that we'll chatter about the latest shiny object. What's much more interesting is how we embody shared values in new ways.