Free is good, right? It's certainly a very attractive price to nonprofit organizations, which are always, shall we say, "resource constrained." (read: broke)
There are lots of reasons to be wary of free, but I'm not going to go into all of them here (again).
Let's just focus on why it's a bad idea to become dependent on any single tech service provider - be it a social media platform, a storage service, or a shared document host.
It boils down to one simple reason - you're subject to their rules, at all times.
Here's a headline from today: "Why is Google Docs Terrifyingly locking people out of their Documents?" The examples listed include research on "wildlife crime" and work on "post socialist Europe." Users tried to log in this morning only to find out that their work was suddenly in violation of Google's Terms of Service. Their documents were now off limits.
What happened? At least according to the story above, Google updated its software code, which may have made "its spam detection go rogue." Or not. We'll only know what Google chooses to tell us.
Those who were working on those documents and are now locked out can't get any work done, and who knows what they may have lost. This in and of itself ought to scare you into 1) backing up and 2) backing up. But isn't that the plus of these online documents - you don't have to back up? Hmm, maybe not.
More importantly, if the examples of work I cited above had included "documenting White House lawyers hired since January," or "lists of immigration assistance centers," or "a table of registered gun owners' addresses sorted by distance from the nearest elementary school," you might be less likely to believe a software glitch and more concerned that something else was going on.
Either way, you'd still be stuck. Google's first response to inquiries about today's "mishap" - "We'll provide more information when appropriate" wouldn't be very comforting.
Later in the day, Google issued this response: "This morning, we made a code push that incorrectly flagged a small percentage of Google Docs as abusive, which caused those documents to be automatically blocked. A fix is in place and all users should have full access to their docs."
Which raises yet another question - before you and your team start working on a shared document, do you check your work against Google's Terms of Service? Remember, you have been warned - the system is scanning your documents at all times. There is nothing private or protected about the information you're putting there, and its continued existence depends on the ToS, which you probably haven't read.
Free is a tough price to beat. But it does mean you get what you pay for, plus the potential for censorship.
*Yes, I know I'm writing this on blogger, owned and hosted by Google. I back it up, offline.
Tuesday, October 31, 2017
Friday, October 27, 2017
What did Silicon Valley do to democracy and the media?
I'm delighted to be moderating this conversation with Nate Persily and Franklin Foer.
Join us - November 13, Stanford University.
More information is here.
Wednesday, October 25, 2017
Ethical tech adoption in civil society
Most digital technologies are designed by, or at least brought to mass adoption by, commercial enterprises. This is often, though not always, done on the back of government-funded infrastructure or research. Civil society, which exists as a counterbalance to, and with some intended separation from, both markets and governments, often adopts new technologies without first considering how the tools might conflict with the sector's own values.
Doing so requires articulating a set of sector values that would apply to multiple technologies, or at least a means of checking technologies for fit not at the "shiny object" level but at the mission-alignment, core-values level. There are some such efforts underway - at least one in AI and public services - and the responsible data principles could be seen as a digital-data-level version of this.
Some technological approaches, such as artificial intelligence (AI), have attracted enough public detractors that industry is responding with its own policy association and "principles of practice," bringing an undeniable whiff of "regulation pre-emption." AI, of course, has been in the public's mind for decades, courtesy of robot cartoons and science fiction. There's a robust debate among tech leaders about the future of AI.
For other technologies, where the need for placating the public is less urgent, the typical deployment strategy goes something like this:
· R&D, either in universities or industry
· experimentation in specific applications
· commercial take-up where experimentation is successful
· rampant application to broader business opportunities
· crossover experimentation on social issues
· belated social sector response when application generates "unintended" (though not necessarily unpredictable) consequences
Is there a better way to do this? Can the social sector pre-emptively develop a set of guardrails for the application of new technologies so that predictable harm (at least) can be minimized or prevented?
Consider the blockchain. The technology's protocols were originally developed as a means of enabling trust and accountability in a decentralized manner. The first applications to gain popular attention were currencies and currency exchanges. Now, the blockchain is being used (or proposed to be used) for other types of trusted exchanges that require some form of independent accountability.
In order to function without a central repository, the blockchain requires the creation of a permanent record of an action which is verifiable by the larger network.
Those technological requirements result in a few features that have come to dominate public discussion of the blockchain. These include, but are not limited to:
· It is immutable. Once a piece of information is added to the chain, it cannot be changed.
· It is decentralized, and verification is built into the technology. There is no single point of control.
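For readers who want to see how those two features follow from the technology itself, here is a minimal sketch of a hash-chained ledger. It is an illustration with invented names, not how any production blockchain is built (real systems add consensus protocols, networking, and far more), but it shows the core idea: each record is bound to the hash of the record before it, so anyone holding a copy of the chain can verify it independently, and any edit to a past record is immediately detectable.

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Create a block whose hash covers both its record and the previous block's hash."""
    payload = json.dumps({"record": record, "prev_hash": prev_hash}, sort_keys=True)
    return {
        "record": record,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    }

def verify_chain(chain):
    """Recompute every hash; an edit to any past record breaks the chain from that point on."""
    prev_hash = "0" * 64  # conventional all-zero hash for the first ("genesis") block
    for block in chain:
        payload = json.dumps(
            {"record": block["record"], "prev_hash": prev_hash}, sort_keys=True
        )
        expected = hashlib.sha256(payload.encode()).hexdigest()
        if block["hash"] != expected or block["prev_hash"] != prev_hash:
            return False
        prev_hash = block["hash"]
    return True

# Build a three-entry chain, then tamper with the middle record.
chain = []
prev = "0" * 64
for entry in ["shipment received", "shipment inspected", "shipment delivered"]:
    block = make_block(entry, prev)
    chain.append(block)
    prev = block["hash"]

assert verify_chain(chain)            # the untouched chain verifies
chain[1]["record"] = "shipment lost"  # "immutability": any change is detectable
assert not verify_chain(chain)
```

Note that "immutable" here really means "tamper-evident": the record can be altered, but every other holder of the chain can prove it was. That is exactly the property whose desirability depends on what is being recorded.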
It is these technological features that need to be assessed against the values and purpose of a particular task or action. Is immutability of record a good thing? Is it in line with the goal being sought? If the action being taken involves tracking material goods in a supply chain, then the answer may be yes. If the action being taken involves tracking a human being through space and time, then the answer is not as straightforward. It's easy to imagine cases where a person might not benefit from a permanent record of their presence – escaping violence, seeking assistance to which stigma is attached, peaceably protesting injustice, to name a few.
Now let’s consider the other commonly pitched feature of
blockchain - decentralized verification. If there is no single point of control
for governing the system, then there is also no point of redress for an
individual who may be wronged by it. Since “social good” often centers around
changing dynamics between individuals and systems (think education, health
care, disaster relief, migration rights for just a few examples), applying a
system that provides no redress for individuals is unlikely to be seen as an
improvement (at least by those individuals supposedly being helped).
Social sector applications of new technologies need to
consider the tradeoffs in values between the mission being pursued and the encoded values of the technology itself. Business applications of new
technologies are often focused on the commercial prerogatives of efficiency,
scale, or cost, and the primary perspective is that of the implementing
organization. Social good applications must align with a significantly more
diverse, complex, and structural set of values, while not compromising the
rights of the people theoretically being assisted.
Civil society needs to adopt and adapt to the digital age we
live in. Many technological applications are appropriate. But in assessing these
opportunities, we must consider not just each new and shiny technology but also
the values they encode. And the social sector should assess this alignment in
relationship to the rights and opportunities of the intended beneficiaries, not
to the organization implementing the technology.
Friday, October 06, 2017
U.S. nonprofits under attack
Are you still trying to make sense of how digital tools facilitate efforts to shut down civil society? Read this for my quick primer on how this works. There will be a lot more in the Blueprint 2018 - coming on December 14.
And here's a story from the Electronic Frontier Foundation that adds to the list of how and who (tl;dr: email spear phishing).
(Photo from EFF: https://www.eff.org/deeplinks/2017/09/phish-future)