Today's New York Times features an article on how tech companies are dismantling their Trust and Safety teams (free article). This strikes me as akin to price gouging by oil companies during the current inflationary moment - taking advantage of the great tech-layoff contagion to get rid of something they never seemed to really want in the first place.
Let's just acknowledge that we can't trust anything posted on social media (and the most vulnerable, the most outspoken, and the rest of us are all facing more harm as a result). We can't trust the answers from ChatGPT, and the tech companies are racing each other to integrate similar AI systems into their search products and elsewhere. At the same time, the companies are less and less interested in making any data available to independent researchers who might check the companies' own claims. There are lots of efforts to ensure access - the EU's Digital Services Act, proposed legislation called the Platform Accountability and Transparency Act in the US, and the Coalition for Independent Technology Research - but none are perfect, and all must reckon with serious concerns about people's privacy.
I've noticed an uptick in my email of research reports from nonprofits and advocacy groups. I suppose this makes sense in a time of continued pressures on journalism and the swamp of bad information that is the internet. But how should we know whether to trust these reports? Chances are each of us will only receive such reports from organizations with which we're already aligned, or from organizations that have bought email lists from other organizations with which we're aligned. That sets us all up for an ever-growing pile of one-side-ism.
I'm pretty sure I've never received a report from an organization that criticizes itself or its outcomes. Occasionally, I receive one after a scandal, in which the report assures me the problem has been solved. I have received some self-searching emails about the claims of sexual harassment in the Effective Altruism community, written by people in the community (I am not in it), but those are about "culture" and "governance," not so much the work itself.*
Here's my question for nonprofits and foundations and activists and associations - for civil society, basically: how do we trust you and your research?
This is a sector-wide issue. What mechanisms, credentials, cross-checks, editorial practices, and industry norms need to be developed and implemented before civil society's signals become indistinguishable from the incessant noise?
*Kelsey Piper, who identifies as an effective altruist, offers a decent example of soul-searching about EA and harassment in her February 15, 2023 newsletter for Vox. Although she nods to the homogeneity of the EA community, she doesn't draw any further inferences about problems in its giving approach, governance, or harassment.