I've been thinking a lot about values, society, and technological design. I'm trying to articulate what, when, and how the values that shape civil society should be expressed with and within technology. What I do know is that it's not just at the point of use. We need tools that default to the values we care about, not just those that serve the company that built the tool or the government that regulates its use. We can't continue to duct-tape our tech tools into our social sector work or political protest.
So far, most of what I've read about this, and the thinking I participated in at the Ethics of Data Conference, talks about ethical decision making across the data lifecycle. This is an important start. But it doesn't seem to start early enough in the tech/data development process. The values of the end user and the end uses need to be designed into the tools from the beginning.
Think about this. In civil society, for example, voluntary participation is not just a key value, it's a defining premise of the sector. In the tech world, the idea of voluntary participation translates into "opting in" (or, more often, "opting out"). This is usually managed through a deliberately opaque "consent process," which hardly describes what's happening when you're barred from the service unless you sign on. That's not voluntary or consent-driven engagement; that's force.
Civil society should be thinking about processes and using tools that require informed consent and provide the informing. Active consent. What I recently called "gung-ho consent, with an opportunity to stop consenting at any point." As data collection becomes part of everything we use, ethicists are already thinking "beyond consent." As new technological infrastructures like the blockchain come along, there's a pressing opportunity to consider the values that shape them and how civil society can shape their use. And if an individual doesn't want to consent to your data collection processes, do you really want to cut him or her off from your food bank, homeless shelter, museum exhibit, or vaccination? In the interest of the greater good, do we really want access to core services to depend on someone's consent to an organization's data practices? That's not what I call voluntary.
When it comes to new technologies or new applications of existing ones, we don't need to repeat the pendulum swing of "exuberant hype, omnipresent use, creeping doubt" that has become familiar courtesy of search engine data collection, social media data exhaust, peer-to-peer car sharing "God views," and wearables that creep us out. There are other approaches.
One way to think about this is to treat tech and data design like electoral district boundary setting. It may not seem as sexy as running for office, but boy does it matter. Gerrymander the district lines and you exert a lot of control over who runs and who wins. Over time and across places, how these lines are drawn comes to matter at the macro level - shaping, for example, the makeup of the US House of Representatives, a body that can ultimately have global influence. All because of how the lines were drawn.
Design defaults in hardware and software are as value-laden as district boundaries. What stays in, what gets collected, what gets stored and where - these decisions nudge our behaviors, at the individual level and at the macro level.
It may seem tough to align our values with the design principles that shape our technologies, but it can be done - on both the supply side and the demand side. The Responsible Data Forum is one example of a "supply-side" approach - designing tools with and for communities of activists so that the technological defaults are those of the communities. On the demand side, we as individuals (consumers) also have power. As Dan Gillmor, a noted tech journalist, says, when he chooses tools he wants to put his faith in the values of a community, not in a company. If all of civil society thought this way, it would dramatically shift the power dynamic between the user of a tech device, the person whose data is being captured by or stored on the device, the device itself, and the designers of the code and hardware. Users would understand what the devices were doing, the devices would default to users' choices, and the world of choice in devices would expand.
Bringing these values to bear in the design and development of data processes or new tech tools isn't easy. I was inspired (somewhat ironically) by an observation about how industrial design has changed tech innovation, courtesy of Jony Ive at Apple. In a 2015 New Yorker profile of the company's lead designer, Ian Parker writes:
"Typically, Robert Brunner explained, design had been “a vertical stripe in the chain of events” in a product’s delivery; at Apple, it became “a long horizontal stripe, where design is part of every conversation.”This is how we need to think about societal values and technology devices. Something that is considered throughout the technology creation process. Not the values of the device makers but the values of the end users and the sector in which the tools will be used. That would lead to technology that serves us, not situations where we need to jury-rig the tools for our ends. Values aligned technology - something to aspire to, especially as digital sensors and networked connectivity become part and parcel of everything.
This is part three of three in a series on digital values and civil society. Part one is here and part two is here.