"There are two ranges in the growth of tools: the range within which machines are used to extend human capability and the range in which they are used to contract, eliminate, or replace human functions. In the first, man as an individual can exercise authority on his own behalf and therefore assume responsibility. In the second, the machine takes over—first reducing the range of choice and motivation in both the operator and the client, and second imposing its own logic and demand on both. Survival depends on establishing procedures which permit ordinary people to recognize these ranges and to opt for survival in freedom, to evaluate the structure built into tools and institutions so they can exclude those which by their structure are destructive, and control those which are useful. Exclusion of the malignant tool and control of the expedient tool are the two major priorities for politics today" — Ivan Illich, Tools For Conviviality (1973)
This year Frankenstein turned 200. Two centuries ago, Mary Shelley published the first version of a story that would go on to achieve canonical and archetypal status in the world's imagination. In the last 100 years or so, the story has come to be known chiefly through film and television. While the story may endure, in part, because of its transmutations in popular culture, what has endured is a sadly diminished and impoverished version of Shelley's narrative. Perhaps all that has survived is the mere image of the monster and a vague sense of foreboding about scientists who dare to, as we say, "play god."
This is unfortunate because Shelley bequeathed to us a subtle, though at points slightly overwrought, tale that embodied a wise and critical account of the nature of human civilization. While it certainly approached the question of forbidden knowledge, that was not, as I read it, its main theme. At the heart of the book was the question of responsibility. Elsewhere this week, I offered the following observations: Victor is responsible for bringing a creature into the world. Victor abandons the creature and out of this abandonment, out of this monstrous refusal of responsibility, the creature's monstrosity begins to emerge. It's not really the case that the creature "slips" Victor's control. Rather, Victor's refusal of responsibility enables and fuels the creature's rampage. Total mastery of his creation was never possible, but the worst of what follows flows from his refusal of responsibility. Victor's initial forfeiture of his duty toward his creation is followed by subsequent evasions of responsibility for his creature's crimes, crimes with which he is entangled because of his initial dereliction. Refusal of moral responsibility, in his case, yields criminal guilt.
Someone commented that these observations were really about Facebook, Google, Uber, etc. Indeed. The resonance is plain to see. At every turn, we encounter evasions and refusals of responsibility for technology, from those who make it, those who profit from it, and those who use it. Rhetoric, ideology, habits, practices, and the architecture and scale of technology facilitate these evasions and refusals.
As Hans Jonas noted more than forty years ago, we need to rethink and reimagine the nature of responsibility in light of the challenges presented by modern technology. Few things are more needful, yet it would appear that emerging technology is often characterized precisely by a tendency to obscure, disperse, and mask responsibility. Writing about humanist technology criticism back in 2015, I suggested that a humanist critique of technology would entail a preference for technology that, among other characteristics, does not obfuscate moral responsibility. This seems all the more urgent with each passing year and each new revelation about the human and social costs of emerging technology. Some material to think with along those lines follows.
News and Resources
"Even their designers cannot exactly explain how self-learning parts of their algorithms make their final decisions. This offers a clever path to avoiding responsibility." Primer on Mathwashing. On the same theme, consider this 2015 post: Resisting the Habits of the Algorithmic Mind.
"I’ve been arguing for years that the integration of digital media devices and psychological techniques is one of the most underappreciated developments in the history of computing." Luke Stark on the Cambridge Analytica affair in light of the history of computer science.
"As a thought experiment, try to think of the most absurd, invasive, extractive thing you can imagine people might do with the available technology." David Golumbia and Chris Gilliard have assembled a litany of real world examples in "There Are No Guardrails on Our Privacy Dystopia." They get bonus points for invoking Jacques Ellul.
"What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general." Zeynep Tufekci on "YouTube, the Great Radicalizer."
David Zweig considers a browser extension that strips the Twitter interface of its visible metrics. "After three weeks of using the Demetricator," Zweig writes, "the nature of Twitter, for me, changed completely."
At the Librarian Shipwreck blog, a consideration of the efficacy of deleting Facebook. In a New York Times op-ed, Siva Vaidhyanathan urges readers not to delete Facebook but to "do something about it." I'm uneasy with this counsel.
In "Tending the Digital Commons: A Small Ethics toward the Future," Alan Jacobs writes about the virtues of having an online domain of one's own. "To the extent that people accommodate themselves to the faceless inflexibility of platforms," Jacobs warns, "they will become less and less capable of seeing the virtues of institutions, on any scale."
Bruce Sterling challenges the regnant ideology of the "smart" city: "Instead of being speed-of-light flat-world platforms, all global and multicultural, they’ll be digitally gated communities, with 'code as law' that is as crooked, complex, and deceitful as a Facebook privacy chart."
Re-framings
Jacques Ellul on responsibility [video]:
“In a society such as ours, it is almost impossible for a person to be responsible. A simple example: A dam has been built somewhere, and it bursts. Who is responsible for that? Geologists worked on it. They examined the terrain. Engineers drew up the construction plans. Workmen constructed it. And politicians decided that the dam had to be in that spot. Who is responsible? No one. There is never anyone responsible. Anywhere [...]
“But no one is free either [...] Just consider, for example, that atrocious excuse. It was one of the most horrible things I have ever heard. The person in charge of the concentration camp Bergen-Belsen was asked during the Auschwitz trial: But didn’t you find it horrible? All those corpses? He replied: What could I do? The capacity of the ovens was too small. I couldn’t process all those corpses. It caused me many problems. I had no time to think about those people. I was too busy with that technical problem of my ovens.
“That was the classic example of an irresponsible person. He carries out his technical task and he’s not interested in anything else.”
__________________
The late Hans Jonas, one of our leading theorists of responsibility in a technological age, in "Toward a Philosophy of Technology." Jonas is cited in the essay by Alan Jacobs above. His main work on the subject is The Imperative of Responsibility: In Search of an Ethic for the Technological Age. You can read an engaging exchange between Jonas and Hannah Arendt on technology and ultimate values here.
"The same holds of the different kind of questions raised for ethics by the sheer fact of the formal dynamics of technology. But here, a question of another order is added to the straightforward ethical questions of both kinds, subjecting any resolution of them to a pragmatic proviso of harrowing uncertainty. Given the mastery of the creation over its creators, which yet does not abrogate their responsibility nor silence their vital interest, what are the chances and what are the means of gaining control of the process, so that the results of any ethical (or even purely prudential) insights can be translated into effective action? How in short can man's freedom prevail against the determinism he has created for himself?"
__________________
Daegan Miller on the liberation cartography of Henry David Thoreau:
"Perhaps wildness is always a trespasser disrespecting the artificial boundaries of power. 'They who laid out the town should have made the river available as a common possession forever,' Thoreau wrote near the end of his life, in his great anti-capitalist essay 'Huckleberries'; and he used all his surveying skill to stake a claim for the bulrush and the ash tree, for the bathers and those who hunted for freshwater clams, for himself and for all Concordians in his map. 80 The river persuaded him to use his cartographic training as a means to protest the privatizing of the public goods that ought to benefit every living thing, to disavow the cheap utopian assurances of individual gain so dearly bought. Call it liberation cartography: 'I find that I have a civil right in the River,' Thoreau wrote in 1853."
Recently Published
DNA Kits, Alchemy, and the Essence of Technology
Why We Can’t Have Humane Technology
In the first installment of this newsletter, I reflected on Ursula Le Guin's short story, "The Ones Who Walk Away From Omelas." I suggested that this story approached the truth of our situation and the choices that confront us. It also seems evident, however, that the choice to walk away is not equally open to everyone. Moreover, if and when we choose to walk away, we should ask ourselves who we leave behind. As we press for others to take responsibility for technology, we should also examine what it might mean for us to take responsibility not only for our own use of technology but also for our neighbor.
Curiously, it was Cain in the book of Genesis who famously sought to evade responsibility for the murder of his brother with the rhetorical quip, "Am I my brother's keeper?" It was also Cain who was subsequently the first to build a city, and it was among his descendants, according to the author of Genesis, that the techne of civilization—agriculture, metallurgy, and the arts—first flourished. The narrative seems to evoke a primordial affinity between human making and the proclivity to sidestep responsibility. May we strive for better.
Cheers,
Michael