Monday, September 02, 2024

New (nonacademic) books on philanthropy

Here are two new books - one by one of my favorite novelists - that I'll be reading in the next few weeks. Thoughts and feels will be noted in #Blueprint25 (Yes, I'm working on it. My health makes it harder. Stay tuned - doing my best to get the 16th annual one done.)

Rumaan Alam, author of Leave the World Behind, has a new book out called Entitlement. The LA Review of Books says: 

"We follow Brooke Orr, a dynamic woman handling a massive responsibility—managing an octogenarian billionaire’s earthly fortune and assisting him in giving it all away. Taut, unsettling, and alive to the seductive distortions of money, Entitlement is a riveting tale for our new gilded age, a story that confidently considers questions about need and worth, race and privilege, philanthropy and generosity, passion and obsession. It is a provocative, propulsive novel about the American imagination." Sign me up.

And, in the "tradition" of Anand Giridharadas's Winners Take All, Thierry Malleret, a World Economic Forum insider, takes on the globalist crowd with his self-published work, Deaths At Davos. Semafor Media describes the book this way:

"The self-published thriller centers on The Circle, a WEF-like institution consumed by self-interest whose cardinal rule is that 'money always has the last word.' The Circle is 'a handsomely sophisticated comfort zone for people who had already changed the world, not necessarily for the better, and wanted to cover their tracks.'"

If you prefer TV, Maya Rudolph, whom I adore, is back with more of Loot, the TV show about a billionaire's widow and the fortune she tries to put to good use. At least she'll be doing this when she's not being Kamala Harris on SNL.

Tuesday, February 13, 2024

The GOP threat to civil society

Photo by Richard Stovall on Unsplash

Democracy in the USA is not "naturally" withering; it is under attack. And the calls are coming from both inside and outside the house, domestic and foreign. One source of attack is the Republican Party. Threats can't be beaten if they aren't named. I strongly suggest foundations, their associations, and their media stop "both-sidesing" this and call out the threats to the sector that are coming from their own.

First and foremost, Donald Trump's campaign has declared it will be "taxing, fining, and suing excessively large private university endowments" to fund its own propaganda-driven alternative university. Now, big private universities don't usually inspire a lot of sympathy, I get that. I'm an alum of them and they don't make me all warm and fuzzy. But be clear, none of this has anything to do with antisemitism (which gets a quick shout-out in the document linked above). It's part of a sustained campaign against perceived liberal or left(ish) civil society. The presumed candidate of the Republican Party is promising/threatening to seize endowment assets from universities it doesn't like. I'll say it again: the GOP is running on a platform that involves taking funds away from nonprofits it doesn't like. If that doesn't make the philanthropy industry stand up and take notice (and, one might hope, action), I can't think of a bigger threat that the sector would be ignoring. And this from a candidate who's been repeatedly sued for the way he ran his nominal foundation.

All nonprofits and foundations, their professional and lobbying associations, and the media dedicated to them should decry a platform such as that proposed in Agenda47. And, what's that I hear? Yup, crickets.

Or worse, Inside Philanthropy worked hard on this rundown of funding for democracy (behind their paywall; yell at them, not me). It's good reporting on a survey done by the Democracy Fund that focuses on giving to democracy efforts and causes related to it. But it counts funding on just one side of the equation. It counts funding by funders in the political center or on the left. It doesn't count the other side - there is no accounting of efforts to undermine democracy. The story mentions book bans, school board fights, and transgender bathroom hysteria as examples of undemocratic philanthropy. But it neither tallies the amount of philanthropic dollars spent on these issues nor names any of the funders. That's not helpful. Those are philanthropic dollars going to efforts that undermine democracy - and they're by no means all the ways such money is being spent. (Supreme Court favors, anyone? Social media trolls, disinformation, and campaigns such as that run by Christopher Rufo with help from Congresswoman Stefanik to oust female college presidents of color? The list is long.)

Attacks on democracy are secretively well-funded even as they appear to be led by grassroots individuals. Counting the funding on the pro-side and not on the attack-side makes it seem as if the attacks are just part of the process of democracy. And that may be true. But if it's true, it's true in the sense that democracy will always have critics, and some of those will be doing their best to destroy democratic participation by those they don't like.

One of the two political parties in this country is running on a platform that includes seizing endowment assets. Yes, the campaign platform of the GOP is "vote for us and we'll put government in charge of higher education and destroy some of the nation's longest-lived independent institutions." (For all the vitriol these universities attract, there's a helluva lot of rich people trying hard to get their kids admitted to them.) You may not feel sorry for Harvard, but you'd be a fool for thinking this is just an attack on the Crimson. That's what the GOP wants you to think, but it's not (all) they want to do.

If foundations, philanthropy, and nonprofits don't stand up to defend civil society from Agenda47 before November, they'll deserve what happens, post-election.

Wednesday, December 06, 2023

What does open mean?

Photo by Enrique Macias on Unsplash

Open source technology has a long history of being a counterbalancing force to closed, proprietary systems. For decades it was open source versus corporations. Then Microsoft (closed, proprietary) bought GitHub (the most-used repository of open source code). Today, in the AI battles, Facebook/Meta, IBM, and Oracle, along with universities and the National Science Foundation, announced the AI Alliance - dedicated to open AI models. This is part of the larger debate about building responsible/trustworthy/safe/ethical AI.

So some of the world's biggest tech companies, many of which have thrived on proprietary, patented, trademarked, and closed-source code, are now arguing that an open community of developers is the way forward to protect us from the harms of AI.

This is one more step in both the commercial battles for market dominance and the battles over definitions of words such as safety, ethical, trustworthy, and responsible (in the context of AI). For example, effective altruists and longterm(ers/ists) focus on the word "safety." Their bogeyman is the potential for AI to destroy humanity. This group, the AI Alliance, uses the terms "open" and "responsible." Their bogeyman appears to be the other companies who've already launched proprietary models - like Google and Microsoft.

The mix of organizations and funding in these AI debates includes corporations, governments, and numerous nonprofits - not only universities, but also groups of developers and advocacy organizations. Philanthropic funding is very much in the mix. The direction of AI development is not simply an external force acting upon the nonprofit/philanthropic sector; it is being shaped by numerous actors within the sector. The meaning and purpose of "open" in this context is neither static, nor simple.

Thursday, November 30, 2023

Maybe nonprofit governance ain't what it needs to be?

M.C. Escher, Relativity Stairs

Imagine a large - no, bigger, much bigger - nonprofit hospital, university, housing developer, or after-school program. Bigger by assets than any other. Right now, there are 13 universities in the U.S.A. with endowments of more than $10 billion (one of which is a "public" university), with the largest topping $50 billion. Bigger than that.

There is one. OpenAI. Though its size is not based on endowed assets but rather speculative stock value, the organization, which is still as of this writing a nonprofit, is valued at $86 billion. It's not clear that the organization will continue with its current structure - the events of the last few weeks resulted in a new board and promises to revisit the structure.

Others have written about what the past weeks' events mean for the development of AI going forward, the effective altruism (paywall) movement, tech bros, and capitalism. I want to think about what it means - if anything - for civil society.

First, it seems that no one in civil society or the U.S. nonprofit sector really sees the organization as anything other than a commercial firm (it has a capped-profit structure, which limits the amount of profit to be returned to shareholders, but only designates profits to be reinvested in the organization (as nonprofits do) after investors are paid out).

I can understand this view, sort of. The sector in the U.S. (as represented by its lobbying/advocacy/infrastructure groups) is still hung up on a certain kind of charitable corporation, designated as 501c3 (OpenAI is such), and doesn't pay much attention to the dozens of other structures that identify as nonprofits. Heck, it's hard to get these groups to address the problematically porous nature of c3s and c4s; they're way behind the eight ball in understanding they swim in a sea filled with informal associations, "Slack"-based "organizations" for mutual aid or volunteering, B corporations, or hybrids. So, perhaps it's way too much of an ask to expect recognition among their own of this behemoth of technology development.

Second, the OpenAI events show that the nonprofit governance model is not "strong" enough to outweigh the interests of investors. Given the model's purpose in this situation, and the information that's public, the nonprofit board that fired the CEO was acting as it was intended. I guess no one thought they'd actually do what they were set up to do. 

Third, while the argument for data trusts has largely focused on the difference between digital assets and analog ones as the reason for a new organizational form, they're still rare and probably outnumbered by hybrids of profit/nonprofit forms. The AI world - especially that which professes some commitment to "ethics," "safety," "responsibility," or "trustworthiness"* - is rife with hybrids, not trusts. But they're not limited to this field - they're plentiful in journalism, for example. I highlight this in the forthcoming Blueprint 24.

Fourth, it's not just the structure of the organization that matters, it's also the structure of the funding. Many supporters of the AI organizations we captured for our dataset (live link on December 15, 2023) are contributing via deductible donations and commercial investments. The more the donor class uses LLCs and family offices, the harder it is to determine what kind of funding they're putting where. While those who invested for a financial return in OpenAI may be happy with the result of the last few weeks, what about those who donated with an eye on the mission? 

Fifth, philanthropy is playing a not insignificant role in these developments. Individuals and organizations associated with effective altruism fund at least 10% of the 160+ AI organizations we track in Blueprint24. Their funding for AI policy fellowships and internships is particularly notable, as these individuals are now well-represented inside policymaking bodies. In a very short time, philanthropy has had a significant impact on the development of a major industry, its regulatory overseers (at least in the U.S.A.), and the public discourse surrounding it. Had this happened in education, healthcare, or other domains where philanthropy is active, we'd see the industry press and professional associations paying close attention (and claiming all kinds of credit). Yet, as noted in the intro, voices in civil society and philanthropy have been awfully quiet about this "impact" on AI.

As someone who has been tracking and explicating the changing nature of organizations in civil society, I see OpenAI as a huge, well-publicized example of something that's been going on for a while. The nonprofit sector ain't what you think it is. And its codified boundaries - the legalities that distinguish nonprofit corporations from commercial ones - may not be up to the task of prioritizing mission over financial returns when the assets are digital, the potential for profit so hyped, and the domain (AI development) easy for insiders to make seem arcane and "too hard for you to understand."

*These are some of the phrases being used in the debates over AI development. It's critical to keep an eye on these terms - they don't all mean the same thing, they are used interchangeably though they shouldn't be, and some of them are being used to deliberately gaslight the public about our options when it comes to developing these technologies. Just as political propagandists excel at hijacking terms to denude them of power (see, for example, "fake news"), so, too, do commercial marketers and ideologues excel at using phrases like "safety" to seem universally meaningful, thus providing cover for all kinds of definitions. See Timnit Gebru and Émile Torres on TESCREAL.