Most digital technologies are designed by, or at least
brought to mass adoption by, commercial enterprises. This is often, though not
always, done on the back of government-funded infrastructure or research. Civil
society, which exists as a counterbalance to, and with some intended separation
from, both markets and governments, often adopts new technologies without first
considering how the tools might conflict with the sector’s own values.
Some technological approaches, such as artificial intelligence
(AI), have attracted enough public detractors that industry is responding
with its own policy-association “principles of practice,” bearing an undeniable
stamp of regulatory pre-emption. AI, of course, has been in the public’s mind for decades, courtesy
of robot cartoons and science fiction, and its future is now the subject of robust debate among tech leaders.
For other technologies, where the need for placating the
public is less urgent, the typical deployment strategy goes something like
this:
·       R&D, either in universities or industry
·       experimentation in specific applications
·       commercial take-up where experimentation is successful
·       rampant application to broader business opportunities
·       crossover experimentation on social issues
·       belated social sector response when application generates “unintended” (though not necessarily unpredictable) consequences
Is there a better way to do this? Can the social sector
pre-emptively develop a set of guardrails for the application of new
technologies so that predictable harm (at least) can be minimized or prevented?
Consider the blockchain. The technology's protocols were originally developed
as a means of enabling trust and accountability in a decentralized manner. The
first applications to gain popular attention were currencies and currency
exchanges. Now, the blockchain is being used (or proposed to be used) for other
types of trusted exchanges that require some form of independent
accountability.
In order to function without a central repository, the
blockchain requires the creation of a permanent record of each action, one that
is verifiable by the larger network.
Those technological requirements result in a few features
that have come to dominate public discussion of the blockchain. These include,
but are not limited to:
·       It is immutable. Once a piece of information is added to the chain, it cannot be changed (see the sketch below).
·       It is decentralized, and verification is built into the technology. There is no single point of control.
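To make the immutability claim concrete, here is a minimal Python sketch. It is not a real blockchain: it assumes a single in-memory list with no networking, consensus, or currency, and the entry_hash, append, and verify names are illustrative only. The point it demonstrates is the hash-linking itself: each entry commits to the hash of the entry before it, so anyone holding a copy can detect later tampering.

    # Illustrative only: a hash-linked log, not a production blockchain.
    import hashlib
    import json

    def entry_hash(record: str, prev_hash: str) -> str:
        # Hash a canonical encoding of the record plus the previous hash.
        payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def append(chain: list, record: str) -> None:
        # Each new entry commits to the hash of the entry before it.
        prev = chain[-1]["hash"] if chain else "0" * 64
        chain.append({"record": record, "prev": prev,
                      "hash": entry_hash(record, prev)})

    def verify(chain: list) -> bool:
        # Anyone with a copy of the chain can re-walk every link.
        prev = "0" * 64
        for entry in chain:
            if entry["prev"] != prev or entry["hash"] != entry_hash(entry["record"], prev):
                return False
            prev = entry["hash"]
        return True

    chain: list = []
    append(chain, "shipment 42 left the warehouse")
    append(chain, "shipment 42 cleared customs")
    print(verify(chain))   # True: the history is internally consistent

    chain[0]["record"] = "shipment 42 never existed"
    print(verify(chain))   # False: editing history breaks every later check

In an actual blockchain, that re-checking is performed by many independent nodes rather than a single script, which is the decentralized verification named in the second bullet, and why no single party can quietly rewrite an entry.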
It is these technological features that need to be assessed
against the values and purpose of a particular task or action. Is immutability
of record a good thing? Is it in line with the goal being sought? If
the action involves tracking material goods in a supply chain, then
the answer may be yes. If the action involves tracking a human
being through space and time, then the answer is not as straightforward. It’s
easy to imagine cases where a person might not benefit from a permanent record
of their presence – escaping violence, seeking assistance to which stigma is
attached, or peaceably protesting injustice, to name a few.
Now let’s consider the other commonly pitched feature of
the blockchain: decentralized verification. If there is no single point of control
for governing the system, then there is also no point of redress for an
individual who may be wronged by it. Since “social good” often centers around
changing dynamics between individuals and systems (think education, health
care, disaster relief, migration rights for just a few examples), applying a
system that provides no redress for individuals is unlikely to be seen as an
improvement (at least by those individuals supposedly being helped).
Social sector applications of new technologies need to
consider the tradeoffs between the values of the mission being pursued and the encoded values of the technology itself. Business applications of new
technologies are often focused on the commercial prerogatives of efficiency,
scale, or cost, and the primary perspective is that of the implementing
organization. Social good applications must align with a significantly more
diverse, complex, and structural set of values, while not compromising the
rights of the people theoretically being assisted.
Civil society needs to adopt and adapt to the digital age we
live in. Many technological applications are appropriate. But in assessing these
opportunities, we must consider not just each new and shiny technology but also
the values it encodes. And the social sector should assess this alignment in
relationship to the rights and opportunities of the intended beneficiaries, not
to the organization implementing the technology.