Wednesday, December 06, 2023

What does open mean?

                                                    Photo by Enrique Macias on Unsplash

Open source technology has a long history of being a counterbalancing force to closed, proprietary systems. For decades it was open source versus corporations. Then Microsoft (closed, proprietary) bought GitHub (the most-used repository of open source code). Today, in the AI battles, Facebook/Meta, IBM and Oracle, along with universities and the National Science Foundation, announced the AI Alliance - dedicated to open AI models. This is part of the larger debate about building responsible/trustworthy/safe/ethical AI. 

So some of the world's biggest tech companies, many of which have thrived on proprietary, patented, trademarked, closed-source code, are now arguing that an open community of developers is the way forward to protect us from the harms of AI.

This is one more step in both the commercial battles for market dominance and the definitions of words such as safety, ethical, trustworthy, and responsible (in the context of AI). For example, effective altruists and longtermists focus on the word "safety." Their bogeyman is the potential for AI to destroy humanity. This group, the AI Alliance, uses the terms "open" and "responsible." Their bogeyman appears to be the other companies who've already launched proprietary models - like Google and Microsoft.

The mix of organizations and funding in these AI debates includes corporations, governments, and numerous nonprofits - not only universities, but also groups of developers and advocacy organizations. Philanthropic funding is very much in the mix. The direction of AI development is not simply an external force acting upon the nonprofit/philanthropic sector; it is being shaped by numerous actors within the sector. The meaning and purpose of "open" in this context is neither static, nor simple.

Thursday, November 30, 2023

Maybe nonprofit governance ain't what it needs to be?

                                                                            M.C. Escher, Relativity Stairs    

Imagine a large - no, bigger, much bigger - nonprofit hospital, university, housing developer, or after-school program. Bigger by assets than any other. Right now, there are 13 universities in the U.S.A. with endowments of more than $10 billion (one of which is a "public" university), with the largest topping $50 billion. Bigger than that. 

There is one. OpenAI. Though its size is not based on endowed assets but rather on speculative stock value, the organization, which is still as of this writing a nonprofit, is valued at $86 billion. It's not clear that the organization will continue with its current structure - the events of the last few weeks resulted in a new board and promises to revisit the structure.

Others have written about what the past few weeks' events mean for the development of AI going forward, the effective altruism (paywall) movement, tech bros, and capitalism. I want to think about what it means - if anything - for civil society. 

First, it seems that no one in civil society or the U.S. nonprofit sector really sees the organization as anything other than a commercial firm. (It has a capped-profit structure, which limits the amount of profit to be returned to shareholders, but it only designates profits to be reinvested in the organization (as nonprofits do) after investors are paid out.) 

I can understand this view, sort of. The sector in the U.S. (as represented by its lobbying/advocacy/infrastructure groups) is still hung up on a certain kind of charitable corporation, designated as 501(c)(3) (OpenAI is such), and doesn't pay much attention to the dozens of other structures that identify as nonprofits. Heck, it's hard to get these groups to address the problematically porous nature of c3s and c4s; they're way behind the eight ball in understanding that they swim in a sea filled with informal associations, "Slack"-based "organizations" for mutual aid or volunteering, B corporations, and hybrids. So perhaps it's way too much of an ask to expect them to recognize, among their own, the behemoth of technology development. 

Second, the OpenAI events show that the nonprofit governance model is not "strong" enough to outweigh the interests of investors. Given the model's purpose in this situation, and the information that's public, the nonprofit board that fired the CEO was acting as it was intended. I guess no one thought they'd actually do what they were set up to do. 

Third, while the argument for data trusts has largely focused on the difference between digital assets and analog ones as the reason for a new organizational form, they're still rare and probably outnumbered by hybrids of profit/non-profit forms. The AI world - especially that which professes some commitment to "ethics," "safety," "responsibility" or "trustworthiness"* - is rife with hybrids, not trusts. But they're not limited to this field - they're plentiful in journalism, for example. I highlight this in the forthcoming Blueprint 24.

Fourth, it's not just the structure of the organization that matters, it's also the structure of the funding. Many supporters of the AI organizations we captured for our dataset (live link on December 15, 2023) are contributing via deductible donations and commercial investments. The more the donor class uses LLCs and family offices, the harder it is to determine what kind of funding they're putting where. While those who invested for a financial return in OpenAI may be happy with the result of the last few weeks, what about those who donated with an eye on the mission? 

Fifth, philanthropy is playing a not insignificant role in these developments. Individuals and organizations associated with effective altruism fund at least 10% of the 160+ AI organizations we track in Blueprint24. Their funding for AI policy fellowships and internships is particularly notable, as these individuals are now well-represented inside policy making bodies. In a very short time, philanthropy has had a significant impact on the development of a major industry, its regulatory overseers (at least in the U.S.A.), and the public discourse surrounding it. Had this happened in education, healthcare, or other domains where philanthropy is active we'd see the industry press and professional associations paying close attention (and claiming all kinds of credit). Yet, as noted in the intro, voices in civil society and philanthropy have been awfully quiet about this "impact" on AI.

As someone who has been tracking and explicating the changing nature of organizations in civil society, I see OpenAI as a huge, well-publicized example of something that's been going on for awhile. The nonprofit sector ain't what you think it is. And its codified boundaries - the legalities that distinguish nonprofit corporations from commercial ones - may not be up to the task of prioritizing mission over financial returns when the assets are digital, the potential for profit so hyped, and the domain (AI development) easy for insiders to make seem arcane and "too hard for you to understand."

*These are some of the phrases that are being used in the debates over AI development. It's critical to keep an eye on these terms - they don't all mean the same thing, they are used interchangeably though they shouldn't be, and some of them are being used to deliberately gaslight the public about our options when it comes to developing these technologies. Just as political propagandists excel at hijacking terms  to denude them of power (see, for example, "fake news"), so, too, do commercial marketers or ideologues excel at using phrases like "safety" to seem universally meaningful, thus providing cover for all kinds of definitions. See Timnit Gebru and Émile Torres on TESCREAL.

Monday, November 20, 2023

Ideology, identity, and philanthropy (PLUS! bonus Blueprint 24 Buzzword)

                                                                        Photo by Brett Jordan on Unsplash

Has a philanthropic strategy ever before become an identity? I'm confident that neither John D. Rockefeller nor Andrew Carnegie ever referred to themselves as scientific philanthropists - a name which historians have applied to them. I've heard organizations tout their work as trust-based philanthropy, but I have yet to hear anyone refer to themselves that way. Same with strategic philanthropy. And even if you can find one or two people who call themselves "strategic" or "trust-based" philanthropists, I'm confident you can't find me thousands.

Effective altruism, on the other hand, is all three - ideology, identity, and philanthropic approach. 

Given the behavior of Sam Bankman-Fried and his pals at FTX, it's also a failed cover for fraud. But I digress. 

In the upcoming Blueprint24 (due out on December 15 - will be free and downloadable here) - I look at the role of Effective Altruism in the burgeoning universe of AI organizations. I had two hypotheses for doing so.

H1: There are 100s of new organizations focused on "trustworthy" or "safe" AI, but behind them is a small group of people with strong connections between them. 

H2: These organizations over-represent "hybrids" - organizations with many different forms and names, connected via a common group of founders/funders/employees - for some reason.

The Blueprint provides my findings on H1 and H2 (yes, but bigger than I thought, and yes, and I give three possible reasons) and will also make public the database of organizations, founders, and funders that a student built for me. So the weekend drama over at OpenAI certainly caught my attention.

By now, you've probably read about some of the drama at OpenAI. As you follow that story, keep in mind that at least two of the four board members who voted to oust the CEO are self-identified effective altruists, as is the guy who was just named interim CEO. These are board members of the 501 (c)(3) nonprofit OpenAI, Inc.

Effective Altruism's interests in AI run toward the potential for existential risk. This is the concern that AI will destroy humanity in some way. Effective altruists also bring a decidedly utilitarian philosophy to their work - to the point of having calculated things like the value of a "life year" and a "disability-affected life year" and use these calculations to inform their giving.* 

The focus on existential threats leads to a couple of things in the real world in real time. First, it distracts from actual harms being done to real people right now.  Second, the spectre of faraway harms isn't as motivating to action as it should be - see humanity's track record on climate change, pandemic prevention, inequality, etc. Pointing to the far away future is a sure way to weaken attention from regulators and ensure that the public doesn't prioritize protecting itself. Third, far away predictions require being able to argue how we get from now to then - which bakes in a bunch of steps and processes (often called path dependencies). Those path dependencies then ensure that what's being done today comes to seem like the only things we could possibly be doing.

Think of it like this: suppose I tell you we're going to get together on Thursday to give thanks and celebrate community. From this, we'd decide, OK, we need to buy the turkey now. Once we have a turkey, we're going to have to cook it. Then we're going to have to eat it. Come Thursday, we will have turkey, regardless of anything else. We've set our direction and there's only one path to Thursday.

But what if instead, I tell you we want to get together on Thursday to celebrate community and give thanks. But we want to make sure that everyone who we will invite has enough to eat from now until Thursday as well. We'd probably not buy a turkey at all. Instead, we'd spend our time checking in on each other's well-being and pantry situation, and if we found people without food we'd find them some. We can still get together on Thursday, comfortable in knowing that everyone has had their daily needs met. In other words, if we focus on the things going wrong now we can fix those, without setting ourselves down a path of no return. And we still get to enjoy ourselves and give thanks on Thursday.**

The focus on long term harms allows for the very people who are building the systems to keep building them. They then model themselves as "heroic" for raising concerns while they simultaneously shape (and benefit from) the things they're doing now. Once their tools are embedded in our lives, we will be headed toward the future they portend, and it will be much harder to rid ourselves of the tools. The moment of greatest choice is now, before we head much further down any paths. 

It's important to interrogate the values and aspirations of those who are designing AI systems, like the leadership of OpenAI. Not at a surface level, but more deeply. Dr. Timnit Gebru helps us do this through her work at DAIR, but also by doing some of the heavy lifting on what these folks believe. She provides us with an acronym, TESCREAL, to explain what she's found. TESCREAL (the bonus buzzword I promised) stands for "Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism." Listen here to hear Dr. Gebru and Émile Torres discuss where these terms come from. And don't skip over the part about race and eugenics.

Effective Altruism is much more than a way to think about giving away one's money. It's an ideology that has become an identity. A self-professed identity. That reveals a power, an attraction in the approach that is unmatched, as far as I can tell, in the history of modern philanthropy. At the moment, this identity and ideology also seems to have a role in the development of AI that is far greater than many have realized. It's critical that we understand what they believe and what they're building.


*As someone with a newly acquired disability, I'd be curious about their estimation of the difference between a "life year" and a "disability-affected life year" if I weren't already so repulsed by the idea of putting a value on either.

**Agreed, not the best metaphor. But maybe it works, a little bit?

Monday, November 13, 2023

Civil society, polarization and pluralism

Photo by Ruy Reis on Unsplash

A headline in today's Chronicle of Philanthropy reads:

"Philanthropy’s Job in Polarized America: Make Partners of Enemies, a New Poll Says"

Which raises an obvious question, 

"Why do you think philanthropy is the solution and not part of the problem?"

We often talk about civil society and philanthropy as if they only do good. And then we go on to debate the meaning of good. While that can be hard, we're often pretty clear we know what it isn't when we see it. 

So when I see headlines about Project 2025 - a coordinated effort by more than 80 nonprofit organizations (both c3s and c4s) to put loyalists to Donald Trump in positions up and down government and across state and federal jurisdictions - I don't just doubt the willingness of these groups to "make partners of enemies." I doubt the willingness or ability of groups on the democratic side of the ledger to do so either. I also doubt the willingness of most media outlets, almost all of which seem to have become aligned with one political side or the other. 

I've written a lot over the years about the blurring of the lines between charity and politics. This is most clear in the way funding now works - flowing between c3s and c4s, coming out of donors' LLCs and DAFs. The money moves in ways that remove donors' names from donations, and it goes in and out of organizations between reporting dates, which often come long after the money has been used. As I first wrote following the Citizens United decision in 2010, the scale and appeal of political money will be too much for charitable nonprofits to ignore. In taking such money, and even perhaps in trying to ignore such funds, nonprofit activities are increasingly aligned with one political side or the other.

We need better mechanisms for tracking money through nonprofits and into political activities. We need to be able to follow dollars into politics, no matter what kind of organization they flow through. We need to be able to track and report this funding in more useful time frames than oft-delayed tax filings. And, we need to be more honest with ourselves and in our writings about civil society and philanthropy. Which requires acknowledging that some (measurable, but not yet measured) percentage of both funders and nonprofits are deliberately pursuing political ends while masquerading as nonpolitical entities. Only when we acknowledge this reality can we begin the process of writing new rules for reporting, transparency, legitimate activities, and meaningful accountability. Which, of course, helps explain why the sectors themselves aren't necessarily interested in acknowledging this reality.

Philanthropy and nonprofits are small p political. Your theory of change, the problems you choose to address, and the ways you seek to solve them reveal political assumptions and allegiances. This has long been true. Now, as the ideologies and paths to change proposed by the country's two political parties grow ever further apart from each other, these associations become more obvious, more visible. Add to this the constant growth in political giving, and it seems that civil society is growing increasingly capital P political, and that at least some of that is due to the preferences of funders. It's hard for me to see how any of this positions civil society or philanthropy as the recourse to social and political polarization.

There are things that we can do to bridge our differences. But we should first recognize just how broadly our political differences influence things like where we live, work, shop, read, worship, play, travel, and donate our time and money. And not assume that every philanthropic or nonprofit organization is interested in or equipped to help with that bridging. It seems that some portion of them are quite invested in exactly the opposite.

Thursday, November 09, 2023

AI and the social sector


                                                          Photo by JJ Ying on Unsplash

Ah, AI. Can't avoid it. 

I've been to the conferences and workshops, read the listservs, talked to the researchers and read some of the research, played with the public tools. The Blueprint 2024 lays out my thoughts on nonprofits, philanthropy and AI for 2024. 

This coming Blueprint (available live and free on 12/15/23) skips the prediction section - and explains why. But I have some thoughts on how AI is going to unfold in the sector, especially after checking out this new resource from Giving Tuesday - the AI Generosity Working Group.

Year 0-1:  November 1, 2022 - 2023 - hype, fear, webinars, and conference talks. Lots of press. Lots of handwaving. Gadgets. Lots of executive orders and unfunded government mandates and policy proposals pushed by tech companies.

Year 2 - 2024:  More hype, lots of feel-good examples (the Red Cross is using it! AI for disasters!) and a few scandals (lawsuits over data use, data loss, etc) will fill the news. Lots of nonprofits will try things and realize they don't have the expertise on staff, are diverting resources from mission, and will go back to ignoring the topic. By this time, we'll all be using AI all the time, however, as AI capacities will be fully baked into every software product you already have - every Microsoft product, Canva, Zoom, Salesforce. We're already there, actually.

Years 3 - 5: Certain domains will achieve breakthroughs with AI. These are most likely to be medical research, tech development itself, environmental analysis (including analysis of the damage AI does to the environment in terms of water usage and power consumption). Advocacy organizations working on human issues from migration to healthcare, education to food benefits, will be up to their eyeballs in litigation and integrated advocacy efforts with digital and civil rights groups for harms caused by AI. My hopeful self says nonprofits and foundations will get fully on board with data governance needs (finally) as either litigation, regulation, or insurance premiums require them to manage their data better. AI - as the scary bogeyman/breakthrough opportunity - will help organizations finally understand what data governance is about. 

Years 3 - 5: AI nonprofits and philanthropy will be "things." Product launches of AI-driven giving advisors, AI-driven advocacy campaigns, AI+Blockchain smart contract organizations in the social sector. Most, if not all, will be hype and bust. 

Year 4 +: AI will be so thoroughly baked into every commercial product on which the social sector and philanthropy depend that we'll no longer talk about it much. It would be like discussing cell phones - everyone will have it somewhere in their organizations, new expectations will emerge because of its prevalence, and we'll not be talking about it as much.

As individual organizations become dependent on AI-powered software tools, we'll reach the next level of concern - the existing regulatory regime for nonprofits and foundations will be leaking and breaking, and proposals for new structures and laws will be circulating. The sector's policy advocates will bemoan their missed opportunities, back in 2023 and 2024, to influence the regulations on AI itself. The blend of nonprofits and commercial activity and/or nonprofits and political activity, will complicate such new debates. By this time, the academy and independent research groups like AJL or DAIR will have repeatedly documented harms caused by AI and have proposed numerous remedies.

Having been ignored by industry for 4+ years, we'll see new attention to these ideas. We'll also see a burst of former AI company employees "whistleblowing" or "following their consciences," leaving industry and setting forth to solve the problems they helped create while on the inside. By the time this happens, everyone will be used to and dependent on their AI-enabled tech, and even those who are eager to stop using it will find it "too difficult" to change their tech.

Some of the above is tongue in cheek. But, like the Gartner hype cycle, this loose set of predictions is based on the experience of other breakthrough technologies. It's probably too linear - and doesn't take into account the innumerable "wild card" events that are likely to occur between now and 2028. In other words, by 2028 we'll be having the debates about AI that we had about social media in the 2016 election. Some of these we're already having - especially with regard to elections - and that's a good thing. But it's not going to stop, or even redirect, this flow of events.

It doesn't need to unfold this way at all. Sadly, I don't see enough activities, organizations, advocacy, push back, regulatory oversight out there to prevent this (all too familiar) pattern from playing out. And certainly not compared to the dollars that are being spent now by corporate marketing departments to hook nonprofits.


Wednesday, October 25, 2023

On tech and giving

                                                        Photo by Uriel Soberanes on Unsplash

I had a chance to speak on a zoom panel today. In the before times, this would not be worthy of a comment - it's what I did. Dozens in a year. However, since getting covid that became long covid, I haven't been able to do...much. Between managing the illness, doctors and tests, and staying employed I'm at my max. 

It was fun to chat with folks. The event was hosted by Stanford Alumni Association, Stanford Alumni in Public Service, the Latino Alumni Association, and a few others - there is a recording but I don't know if it will be available beyond the hosts' networks. 

During the course of the hour, I got asked about AI and philanthropy. The Blueprint 2024 (coming on December 15, 2023) has much to say about this. But during the panel I realized something I've thought about for years, but don't think I've said before. 

Here are two true things:

1. In the past 20 years there has been a lot of innovation in digital tech and a lot of tech has been applied to giving - crowdfunding, text giving, online donations, information sources, giving platforms, etc. 

2. Participation rates in charitable giving in the U.S.A. over the last 20 years have gone down. (Total giving keeps going up, but that's from more rich people making more big dollar gifts.)

More tech. Fewer givers.* No one invested in giving innovation wants to hear that. Because, sadly, innovation in giving has become synonymous with throw some tech at it. If "innovation" is supposed to lead to "more," then it's not working. 


Another thought - one I also share in more detail in the Blueprint 2024.

Lots of people are focused on AI. There are vendors and others in the social sector who are eager to sell you some AI-powered gizmo to improve your fundraising (this is the biggest market at the moment). Here's the catch:

  1. AI gets trained on data
  2. The best data we have, on financial gifts to nonprofits, is wildly incomplete, misses out on many kinds of giving, is culturally misaligned for many givers, misses everything having to do with political giving, and ignores a great deal of the giving that other tech makes possible - for example, most crowdfunding or direct gifts to individuals 
  3. So today's AIs are being trained on yesterday's bad data
  4. This is a good thing, how?  How will it help you in the future?

More on this in the #Blueprint. But happy to talk about it before then. I don't hang out on the site formerly known as twitter anymore - it's a bit too much like late night at a KKK frat party for me. You can find me on LinkedIn, Bluesky and Mastodon - c'mon over, we'll chat.

*Yes, there are problems with the data. I know that you know that I know this. I write about it all the time. In the "truth" above I'm drawing from GivingUSA data and analyses thereof over time. So the most basic data we have on giving - financial contributions to 501(c)(3) nonprofits. It's not the whole story by any means, but it's pretty comprehensive and accurate for what it is.

Friday, September 15, 2023

Digital Civil Society Lab Speaker Series 2023-2024

 I am so excited about this. Join us!

The speaker series is open to anyone who wishes to engage with critical insights on the intersections and implications of digital dependencies with democratic norms and civil society values and actors. It is structured as a hybrid experience, allowing you to choose whether you’d like to attend in person or join us virtually. Light refreshments and snacks will be provided.


Register here to join in-person or register here to join the event virtually. Click here to learn more about our upcoming events.

Wednesday, July 05, 2023

If AI is so new, why does the message about it sound so familiar?

                                                    Photo by Jon Moore on Unsplash 

If you've been wondering why the rhetoric around AI sounds so familiar, I have some thoughts. 

If you read Nancy MacLean's 2017 bestseller, Democracy in Chains, and then pick up a newspaper (or open a news company's app) and read this story on funding for AI scholarship at elite universities across the country, you will notice that the funders/philanthropists in the news story are using the playbook developed by those in the historical study. 

Democracy in Chains is about the fueling of libertarianism and a political economy that favors the wealthy few - an undemocratic project based on perverting majority-based systems to serve a very rich, very determined self-interested few. It goes further than Jane Mayer's brilliant Dark Money to show the intellectual history and the broad reach of the nonprofit/think tank/university (in other words, nonprofit) infrastructure for turning ideology into public policy. MacLean's book was published in 2017 and it centers on the Koch brothers - an updated version could factor in a wide range of philanthropic/funder/investor actions from younger billionaires and include otherwise-inexplicable actions such as Musk's purchase of and destruction of Twitter, and the general weirdness (horror) of First Amendment jurisprudence (FAIR v Harvard, UNC). When we are searching to make sense of a present moment it is helpful - extremely so, in this case - to look to both short and long-term historical precedents.

When it comes to our current moment (in the U.S.) - in which Supreme Court decisions seem to abandon procedural and substantive norms from one day to the next, we're all rapidly trying to learn to distinguish AI-generated text/photos/videos from those made by humans, and everything from the weather to the role of elections in this democracy seems up for grabs - these historical events are helpful. It's not quite rhyming (as historians will remind us), but there are patterns to see that can be helpful. MacLean shows a 50+ year arc of an ideological project built around a minority viewpoint that has yielded extraordinary, stealthy success. It's worth understanding those past patterns to understand our current setting.

It's no coincidence that today's funders focused on existential risks of AI are using the playbook of scholarships, fellowships, and academic centers to build cadres of like-minded thinkers.  It focuses your attention downstream, away from the present. This funding model works - especially if you take a multi-decade time frame.

Just because it "works," however, doesn't mean it is in the best interest of anyone but those funding and being funded. The Kochs and their allies were very clear that their project benefitted a minority (wealth owners). What they needed to do was bend the systems of a majority-based democracy to serve minoritarian ends. This was not hard to do, since the U.S. Constitutional system has numerous minoritarian run-arounds (e.g., Senate apportionment, the electoral college, voting rules) built into it. We should be on the lookout for similar motivations and efforts as we think about our now AI-dominant online information sources, systems, and messaging.

Some of those engaged in discussions and training about existential AI risks will note that human extinction is likely to come faster from climate change, weaponized nuclear facilities, and perhaps the next pandemic than from man-hating robots. Focusing scholars' and the media's attention on the potential long-term harms to all of humanity is a slick way of distracting those same communities and others from the here-and-now harms of AI-enabled disinformation, discrimination, and economic harms for people already marginalized by race, religion, identity, and/or income. Each moment that goes by in which near-term harms are ignored is another chance for the current powers to further implant, strengthen, and reap the rewards of the very path dependencies that lead to the future they claim to be fighting against. 

In short, beware the arguments of those who direct your attention to far-away catastrophes while they benefit by building those very systems now. Better to refuse, redirect, or rebuild systems that cause no harm now, for they will also cause less harm later.

Thursday, June 08, 2023

Nonprofits and political influence: it's not about golf

Screenshot from

You know that the PGA Tour is a nonprofit, don't you?*

I'm also sure you've heard the news that the Saudi government (via its public investment fund, with $600+ billion in assets) launched a new tour (called LIV) which has announced a merger with the PGA Tour. Details are being worked out (and investigated.)

Why does this deal stink so much? Sport washing by a country with a dismal human rights record is pretty obvious - especially as the country is unabashedly also trying to buy soccer talent. Certainly, families of people who were killed on September 11, 2001 are disgusted (my own included). There's a lot of media on this story about the players, the fans, the public, the sport-washing, human rights, and, of course, Trump Sr., Kushner Jr., and Mnuchin. I'll let you read all that elsewhere. 

Let's go back to the role of the Tour as a nonprofit organization. If you check on (screenshot above) you'll find the PGA Tour with its $4 billion in assets, as well as about a dozen other PGA-named nonprofits, including a 501(c)(3) foundation with $10,000 in assets and an organization for and by the wives of PGA players.

This comes along as the United States has lost control of its system for financing campaigns and the regulatory body in charge (the FEC) is hogtied by politics. Money flows from individuals and corporations to nonprofits, where the names of the donors are "washed off" and the money is passed through to politically-active affiliated organizations. Sometimes, people just "move" nonprofit funds to their own pockets. As I predicted in 2010, when the Citizens United decision was handed down, large swaths of nonprofit organizations have become money laundering mechanisms for politics. This structure - foreign government "investment" in a nonprofit that holds extravagant and expensive events at properties owned by an indicted former president running again for office - looks and smells like the making of a money washing scandal from here, before the deal is even done. 

The new entity ("NewCo" to be born from PGA + LIV) will be a commercial enterprise. Owned by the nonprofit PGA. I'm not a lawyer but I can read these signs - that means no conversion foundation or tax payback from the nonprofit. Massive commercial investments plus a nonprofit structure that will enable anonymous financial flows. A set of nesting doll organizations ripe for funding abuse by anyone, anywhere interested in political influence, but particularly convenient for foreign governments. Given the timing, expect big concerns about funding and influence in the 2024 Presidential election.

Given the cast of characters involved, I'll say it out loud now: this deal looks like the biggest money laundering machine yet to be carved out of the nonprofit tax code. I'll put my bet down now - If the deal goes through, this will become a story of campaign finance violations. And we're watching it being put together right in front of us. It may never happen due to antitrust and other reasons, but still, it's important to see what this deal intends, and realize if not this, then somewhere else.

*I'm sure you remember that the NFL was a nonprofit until 2015 - when it reorganized as a commercial entity. That happens under section 501(c)(6) of the IRS Code.

Thursday, May 18, 2023

A predictable problem with predictions

Photo from Possessed Photography on Unsplash

Dateline: May, 2027

Location: Pretty much anywhere on earth

“Miriam was one of those rare people who could remember reading about her cause of death before it happened. It wasn’t the reading that was rare - the warning had been printed in The New York Times, page A9. It also wasn’t the dying that was rare - hundreds of thousands of people would die of the same cause. It was the remembering that was rare.”

Yes, that’s fiction. I just made it up. Because I just read this story in today’s New York Times: record heat between now and 2027 due to climate catastrophe and El Niño weather patterns. It’s likely that one of the years between now and then will cross the mark of 1.5 degrees Celsius hotter than the 19th-century average.

So, there’s the science. The article goes on to do the work - “This will have far-reaching repercussions for health, food, water management and the environment.” 

Keep going - do the rest of the work: Those far-reaching repercussions mean fires, droughts, floods, food shortages, hunger, water wars (term used deliberately). These things mean death. I made up Miriam and I interpolated from the global recent past to get to “hundreds of thousands of deaths.” (We’ve passed the tens of thousands marker). Here’s what’s happening now - four years after devastating 2019 Australian summer. 

If you have children starting elementary school this Fall, 2027 will be here before they go on to middle school. If your child was accepted to a four-year college this Spring, they’ve just been welcomed into the class of 2027. If you’re writing a five-year (?) strategic plan for your foundation/nonprofit, you’re planning across this precise timeline of disasters - how are you fitting them into those plans?

I wrote a wee bit of fiction from this news. (I’ve also done some actual prep, which, given the global nature of the prediction, is challenging.) How do we respond to predictions like this - Action? Stasis? What are you doing? What can we do together? 

Thursday, May 11, 2023

Transitional philanthropy

We're in an incredible moment. After decades of research and advocacy and warnings, we are now living through the weather and natural disaster effects of climate collapse. We're also more than a few meters down the pitch of living with pervasive artificially intelligent systems. 

Ways of life from agriculture to writing, architecture to transportation are transitioning. The practices for adapting to more sustainable, more energy efficient, lower impact methodologies are being refined, shared, modeled and implemented at scale in some places. 

 And then there's this (which I reprinted with permission in the Blueprint 2022)

My question is: are there examples of philanthropy that are clearly rooted in a sense of transition from one state to another? There are funds named for transitions - or at least there is the Just Transitions Fund - but are there others? If there are, what defines them? What are they transitioning to? Where are the experiments, innovations, regulatory reconsiderations, imaginaries, and alternatives in philanthropy and civil society that make use of (but don't venerate) our current capacities (for almost instant global communication, for example) and that pursue a vision of human thriving on a climate-damaged planet? How would such philanthropy work, what would it look like, what would it do differently from now, and how would it change itself in order to justify its continued existence? 

That last question is not meant to be rhetorical. The time frame for irreversible climate collapse is now about the length of time an American child spends in elementary school or just barely longer than the term of an elected Senator. The time frame for harms from badly designed AI to manifest has passed, it's already underway and we're well down that path.

We're on the path to both realities. We can see them up ahead and are already experiencing the harms we know will grow. It's illogical to do things the way they've been done during a transitional moment, unless your goal is to maintain the status quo. I've yet to meet the foundation or philanthropist who (explicitly) states that as their goal, so this should be a time of tremendous experimentation and hopeful innovation. I'd love to see it - please point me in the right direction.

Tuesday, May 09, 2023

Deepfake nonprofits

                                                                                    Eileen Pan on Unsplash

There's been a lot of writing over the last three decades about the blurring of boundaries between nonprofits, governments, and markets. 

These analyses usually focus on the use of profit-generating tactics by nonprofits (blurring them with market institutions), the growing involvement of nonprofits in public policy (usually discussed either in terms of dark money or organizations with multiple tax statuses such as c3s and c4s), and the use by governments of philanthropy-style incentives (e.g. prizes or matching grants) or direct government involvement in supporting specific companies the way investors do. The whole social enterprise movement is an example of blurring lines between philanthropy and business.

In this context, this story of a government "watchdog" group is fascinating. The article describes how every inquiry into the group by a reporter is met with a different classification claim. Starting in 2021, it claimed to be a 501(c)(3) nonprofit, then it dropped the 501(c)(3) designation, then in 2022 it referred to itself in a lawsuit as “an unincorporated association of retired and former public servants and concerned citizens that is dedicated to restoring public trust in government.” And then, in January 2023, it labeled itself simply “a collection of individuals.”

Needless to say, whatever it is, it has not been filing any paperwork or tax documents that might explain who is involved and where the money is coming from. 

Just in time for the 2024 U.S. Presidential election, which we can expect to be defined by AI generated information warfare online, we should also be on the lookout for more of these "deepfaked" IRL organizations.

Tuesday, May 02, 2023

Who will hack whom?

Nonprofits and foundations have been slow to realize that they live in a world of dirty tricks, bad faith messaging, trolls, DDOS attacks, and data breaches. In other words, they inhabit the same internet the rest of us do - one where determining what information to trust is almost a full-time job.

Industry intermediaries - organizations that provide information about nonprofits and foundations to the public - live in this same world. Not only are political groups setting up nonprofits as fronts for political money laundering, they are using the nonprofit information infrastructure to help them spread falsehoods, to slap back at organizations opposed to their views, and to run basically every other trick of online information harassment.

Here's a picture (a screenshot on my end) of what I assume is a hacked and defaced organizational page on Candid - the biggest provider of data on nonprofits and foundations. 

                                Screenshot taken from Candid website, 3:30 pm pst, May 2, 2023

That's the home page of an organization called the American College of Pediatricians - an anti-gay, anti-trans lobbying group. Clearly, someone doesn't agree with their views.* The group also lost 10,000 records from a Google Drive it left online and unprotected - the leaked information includes all kinds of donor and member data.

The internet and world wide web are trash piles of information. AI systems, such as ChatGPT, spew statistically-produced baloney. All of them readily facilitate harm and lies. It didn't have to be this way, but it is. Many people are clutching their pearls, having ignored the insights and warnings of those who've been pointing out these harms for decades. 

It doesn't have to continue this way, though it seems to be doing just that. Glory and hysteria go in cycles - from app to app, crypto to GPT.  Despite all the warnings of a cliff ahead, we seem to be driving faster and faster toward it.

*FWIW, I don't agree with their views. I did not deface their page on Candid.

Wednesday, April 19, 2023

X, Tw*tter, and nonprofits

Today, reporters tell us that staff at X Corp. (the company now responsible for Tw*tter) are going through its database and removing - one by one - emergency services that subscribe to the platform's API - thereby cutting off these departments (fire, emergency services) from using the social media service in emergencies.

GLAAD discovered yesterday that Tw*tter had suspended its efforts to protect transgender people, deliberately removing language in its Hateful Conduct Policy that penalized misgendering and dead-naming.

The Chronicle of Philanthropy reported today that nonprofits are sticking with Twitter, despite....everything going on over there at the company and on the platform.

Hmmm. Why are Niemöller's words ringing in my ears? You know the ones: "First they came for the..."

Thursday, April 13, 2023

Nonprofits and privacy laws

                                                                                                    Photo by Mike Scheid on Unsplash

Colorado's privacy law INCLUDES nonprofits.

This is as it should be. Nonprofits gather, hold, and rarely protect an enormous amount of very sensitive data of very high value. Think about it - your donations to, volunteer time at, and service from a nonprofit says a LOT about you - much more personal information than your favorite ice cream flavor. Marketers and politicians LOVE this information. And they use it to even further segment and divide us.

Philanthropists need to step up and help nonprofits protect their data, and the whole sector needs to massively improve its data governance and protection processes OR stop collecting data OR stop lobbying its way out of accountability.

Thursday, March 30, 2023

Nonprofits, campaign finance and more blows to democracy

                                                        Photo by Jason Dent on Unsplash 

I changed my job in response to the Citizens United decision by the Supreme Court of the United States (SCOTUS). I was convinced at the time (2010) that the Court's decision would lead to the transformation of many nonprofits from advocacy organizations to money laundering tools for political donors. I was right.

It's been hard to prove the scope of this for the very reason it's happening. Nonprofit law allows for donor anonymity; campaign finance law does not. By using nonprofits to "wash" their names from political donations, donors make it very hard to track the money back to its source. The amazing web of connections that Jane Mayer drew out in her book Dark Money and that ProPublica documented here shows how hard this can be. These concerns were part of what led Rob Reich, Chiara Cordelli and I to write Good Fences: The Importance of Institutional Boundaries in the New Social Economy (2013).


The rules on donor anonymity that come from the nonprofit sector have proven to be remarkably adaptable tools for "washing" donors' names from political contributions. This can be done by moving money from a c3 to a c4. It can be done by opening and closing a c3 or c4 in-between the required reporting periods. It can be done by creating layers of relationships between c3s and c4s and crowdfunding platforms. It can be done - and is being done - because the laws about nonprofits (and the regulators of them - state attorneys general, the Internal Revenue Service (IRS), and, in the case of Florida, the state Department of Agriculture & Consumer Services) intersect somewhat orthogonally with the laws about elections and political donations (and with the FEC and state level oversight bodies).

What's worse is that Citizens United was only a point on a path. There are trend lines to be spotted and forces at work trying very hard to further dilute any distinctions between charitable anonymity and political anonymity. Today, in an article by Rick Hasen, an election law expert, I read that we are heading toward:

"..a world in which many of the remaining regulations of money in politics could well be struck down as unconstitutional or rendered wholly ineffective by a Supreme Court increasingly hostile to the goals of campaign finance law and extremely solicitous of religious freedom."(fn)

I can't quote more of the article - and shouldn't have quoted that much - as the article is in draft form and was discussed at a conference celebrating Professor Ellen Aprill. (Grateful to the blog post by Gene Takagi that led me to the event.) You can download the draft paper here.

In a nutshell, Professor Hasen uses Professor Aprill's work to show the intellectual and legal history that will likely use religious freedom to deregulate political donations. How? Via the deregulation of political activity in churches and houses of worship. There's much more to it (read the paper) but that gets us started. 

What does this mean for nonprofits? More politics. More money laundering. Less trust. 

What does it mean for democracy? More blurring of boundaries between nonprofit and commercial corporations. More anonymous money in politics. Less trust. More plutocratic control. 

It's not a positive tale. But thanks to Professors Aprill and Hasen, we've been warned. So, what are we going to do about it?

(fn) Richard L. Hasen, "Nonprofit Law as the Tool to Kill What Remains of Campaign Finance Law: Reluctant Lessons from Ellen Aprill," forthcoming, 56 Loyola of Los Angeles Law Review (2023) (special festschrift symposium honoring Ellen Aprill)

Wednesday, March 29, 2023

Civil society and the splitting of the US


                                                                    Photo by Viktor Talashuk on Unsplash

Civil society organizations are on the front lines of advocating for or against the most divisive issues in the United States. The following list is organized by rights. The links are almost entirely to civil society organizations fighting to protect the rights to free expression, free assembly, voting, reproduction, and work. Their civil society opponents on these issues are noted under each section.

(I'm sure there's more to add here - feel free to send additions to or comment below)

Book bans, educational censorship and attacks on free expression

PEN America reports there have been 86 state bills proposed that would censor a wide swath of educational materials and ban books, mostly about Black people, LGBTQ+ people, and discussions of critical race and queer theory (college level). An increasing number of these bills allow a single person to request removal of any number of books, and for those books to be removed before any kind of review. Thirty-two states and more than 150 school districts have implemented book bans.

        Notable nonprofits for book bans:

Moms for Liberty, formed in 2021, has 200 local chapters. It is both a c3 and a c4. Other national groups with branches include US Parents Involved in Education (50 chapters), No Left Turn in Education (25), MassResistance (16), Parents’ Rights in Education (12), Mary in the Library (9), County Citizens Defending Freedom USA (5), and Power2Parent (5).

Another 38 state, regional, or community groups advocating for book removals appear unaffiliated with the national groups or with one another.

        Notable nonprofits against: PEN America, American Library Association, many others

Protest bans and attacks on free assembly

Thirty-nine states have passed laws limiting protest. While a handful of jurisdictions have passed laws limiting the use of facial recognition by police, most places have not done so. In 2021, half of the 42 US federal agencies that are part of law enforcement owned or used facial recognition technology. Corporate use of SLAPP lawsuits against individual protestors is rising in number. Open carry laws for handguns exist in 36 states, and you can carry a long gun openly in 44 states. Guns at protests are hard to square with the idea of peaceable assembly. 

Notable organizations promoting protest bans: Police associations, Republican officials

Notable organizations fighting against them: Civil Liberties Defense Center, ACLU, BLM


Voting rights

As of 2021, nineteen states had passed laws making it harder to vote. Eighteen states carried over 152 bills to restrict voting into 2022. 

Notable organizations promoting voting restrictions: Americans For Prosperity, Heritage Foundation, ALEC

Notable organizations fighting against restrictions: Voting Rights Alliance, ACLU, some election administration groups, Fair Fight, Brennan Center, Lawyers Committee for Civil Rights

Reproductive rights

These have split the U.S. in two, with 24 states banning access to abortion. Those states are also adding vigilante bounties and surveilling communications and travel.

Notable organizations promoting reproductive restrictions: see this list

Notable organizations fighting for access to healthcare: see this list

Right to work

These laws, whose name implies one thing but which actually focus on restricting the right for labor to organize, exist in 27 of the 50 states.

Notable organizations promoting right-to-work laws: Americans For Prosperity, Heritage Foundation, ALEC, Republican Party

Notable organizations fighting against restrictions: AFL-CIO, SEIU, Center for American Progress, Democrats

Behind all of these organizations are donors. Some are heavily supported by individuals, others by foundations, others by corporations. Many rely on crowdfunding or on a mix of all of these funding structures. Behind each issue, on each side, is a mix of 501(c)(4) and 501(c)(3) organizations - an approach that makes it easy to hide the identities of donors whose interests are primarily political but who desire anonymity. New case law on donor anonymity in such situations, and conservative groups' efforts to enable even greater anonymity for political donors, further complicate our ability to know who is funding what. 

I don't have a conclusion to offer to this post. Yet. Instead, view this as "first draft thinking" for Blueprint 2024. I welcome your feedback.