Monday, June 11, 2018

The nonprofit/volunteer layer of the internet's content infrastructure

I've been thinking about the role that volunteers and nonprofits play in providing curatorial and editorial support to the internet ever since 2014, when I learned that Twitter was going to "partner" with several women's organizations following the murders in Santa Barbara, California, carried out by a man in a self-declared "war on women."

Facebook's dependence on NGOs to combat hate speech in Myanmar brought this up again, especially when the NGOs told Zuckerberg, "No, you didn't live up to your end of the bargain."

And then I heard Tarleton Gillespie speak about his new book, Custodians of the Internet, which put to rest any naive fallacies I once held about these companies not actively curating their platforms.

There was also YouTube's announcement earlier this year that it would rely on Wikipedia entries to help it deal with conspiracy theories. The company didn't even bother to tell the nonprofit in advance (let alone consult it as if it might have a say in the plan). This hasn't worked out that well for either YouTube or Wikipedia.

Let's think about this. Wikipedia is run by a nonprofit, but the work is done by a global network of volunteers who - everyone knows - are by no means representative of the global population. YouTube is part of Alphabet, one of the world's wealthiest companies, and is itself one of the world's biggest social networks. It has its own curatorial teams. And yet, as Wired notes, both Facebook and YouTube are outsourcing their responsibilities to nonprofits.

This seems unseemly even if you think about it only from an economic standpoint - a big company relying on unpaid labor? Sounds like exploitation. When you start thinking about it in terms of the health of nonprofits or civil society, the exploitation seems even worse.

Just as the open source community has built all kinds of technology that companies rely on, nonprofits are providing a kind of critical digital infrastructure: their community voice, commitment to a set of ideals, expertise, and concern for the vulnerable. Yet the current set of "partnership" arrangements seems destined to throw the nonprofit under the bus - the company saves money, gains reputation, and offloads both costs and liability. The nonprofit gets...what?

Tuesday, April 17, 2018

Digital identity and civil society

I posted this reflection over on DigitalImpact.org. Regular readers of the Blueprint - send me your notes!

For those who don't want to click over (though you should), the piece discusses the technological work being done on digital identities - identities that you yourself would control - and its implications for civil society and philanthropy. Go on, read it.

Thursday, April 12, 2018

Civil society and platform curation

One of many things that have been made more public during this week's congressional hearings with Mark Zuckerberg is the way in which the platform curates content. Zuckerberg bemoaned the reality that it's his job to decide who sees what when.
For those who study curation, platforms, and internet law, this is not new. I'm writing this while listening to Tarleton Gillespie discuss his forthcoming book, Custodians of the Internet (recommended). He's describing the rules, technologies, and people that make up the "moderation apparatus" - the systems that determine who sees what information, when, and from whom. Gillespie argues that this moderation is essential to what the platforms do - it is their value proposition. This runs counter to the longstanding mythos of the open web.

One element of this "moderation apparatus" that Gillespie describes, and that catches my eye, is the role of civil society organizations and nonprofits. Big companies - Facebook, but probably not only Facebook - rely on civil society to do their dirty work.

In Myanmar, civil society groups that were working with Facebook to take down hateful and violent postings pushed back when Zuckerberg claimed that the company was doing all it could to address these issues. The civil society groups noted that the company was essentially relying on them to voluntarily moderate the site and wasn't providing them with the engineering resources that were needed to do this. They secured a verbal commitment from Zuckerberg to improve the process.

Here's what this means:
  • Facebook was shifting its responsibilities to civil society.
  • Civil society groups aren't equipped for, or paid for, this role. 
  • Civil society groups - by design - are fragmented and contentious. Choosing some of them to do moderation is a value-laden, editorial decision.  
  • Civil society is - from Facebook's perspective in this example - just a low cost, outsourced labor source.  It also, no doubt, shifts liability from Facebook to civil society (not least for the human psychological effects of moderating photos and posts about harm and violence).
Here's what I want to know:
  • How widespread are these kinds of commercial/civil society moderation/curation relationships?
  • How do they work? Who's contracted for what? Who's liable for what? What recourse exists when things go wrong?
  • What do civil society groups think of this? When might it be a good solution, from civil society's perspective?
  • Some civil society groups - such as Muslim Advocates and Color Of Change - are calling for a civil rights audit of Facebook. Senator Cory Booker took this idea into the hearings. This sort of advocacy - making accountability demands of the platforms - makes more sense to me as the role of civil society: not doing the work, but demanding the work be done. Your thoughts?
Seems to me this starts to raise some really interesting questions about the roles and relationships of nonprofits, companies, and government in digital space.


Wednesday, April 11, 2018

The burden of digital data

This article from India Development Review captures some of my thoughts on civil society and digital data.

http://idronline.org/civil-society-and-the-burden-of-data/