"Faulty technology can render the environment uninhabitable. Radical monopoly can force the demand for affluence to the point of paralyzing the ability to work. Overprogramming can transform the world into a treatment ward in which people are constantly taught, socialized, normalized, tested, and reformed. Centralization and packaging of institutionally produced values can polarize society into irreversible structural despotism. And, finally, engineered obsolescence can break all bridges to a normative past. In each or several of these dimensions a tool can threaten survival by making it unfeasible for most people to relate themselves in action to one of the great dimensions of their environment."
— Ivan Illich, Tools for Conviviality
I don't write much about gaming. It's not exactly my beat. I'm not entirely sure what my beat is, but it's not gaming. That said, there was a piece in the Times this weekend by Jennifer Senior on the immensely popular online game, Fortnite, that prompted a few thoughts. I suspect you've heard of Fortnite, the game that, by one count, is closing in on 250 million users. If you're the parent of a teenage boy, you're likely all too familiar with the game and more than a little vexed by it. Perhaps you know the parents of teenage boys, and you've had to endure their lamentations. If you've somehow remained blissfully ignorant of the Fortnite phenomenon, here is a quick primer from the Times article:
A brief tutorial for the Fortnite unlettered: Think “The Hunger Games,” but with less gore and more contestants. Each player is dropped onto a candy-colored island and armed with only a pickax. He or she joins 99 others — some friends, perhaps, but mostly strangers (and mostly adults) — and spends the rest of the game scavenging for weapons, building fortifications, hiding, exploring and laying waste to everyone in sight. The last person standing wins.
Many parents are concerned about the game and the impact it is having on their children. I should say, as a parent, that I'm not about to belittle the worries of other parents. I get that these concerns are sometimes misguided and overblown, but I also think that it's bad practice to assume that present fears are wholly illegitimate simply because you can think of analogous concerns in the past that in retrospect appear "ridiculous" to you. The moral calculus is often more complicated. More about that momentarily.
Senior observes that what has made Fortnite so popular, and what is often missed by parents, is that playing Fortnite is a social event. Even when a player is alone in a room, they are connected to their friends on the platform. This is true, and she is not the first to notice it. As her son puts it when he's off to play Fortnite, "I'm going to see my friends now." In fact, as Senior suggests toward the end of the piece, it may be useful to think of Fortnite more as a social network than as merely a gaming platform. But it is in the development of this insight that I think the piece gets a bit off track.
Correctly observing Fortnite's social dimension, Senior goes on to claim, citing game developer Owen Williams, that Fortnite is "an actual place." "It's like going to church, or the mall," Williams wrote, "except there's an entire universe to mess around in together." Further on, Senior writes that playing Fortnite is "the equivalent of dropping in on a cocktail party."
This, it seems to me, is not quite right. I'm all for calming and allaying overwrought fears and anxieties about technology, but not by misconstruing the nature of the technology in question. I wouldn't go so far as to say that there is nothing to the analogy, only that too much is made of it here. Moreover, the differences might be more important than the ostensible similarities. Senior, it seems, recognizes this to some degree. Speaking of her son, she writes, "All that socializing via headset has whetted his appetite for embodied interaction." Parenthetically, she wonders if he is an outlier in this.
The key word, of course, is embodied. The experience of place is an irreducibly embodied experience. Let us quickly acknowledge that playing Fortnite is not a disembodied experience. The body is very much involved, and when players gather in the same room to play together they do so as bodies in close proximity. So if it is not a simple dichotomy of embodied and disembodied experiences, what are we left with?
We're left needing to pay close attention to how playing Fortnite is a differently embodied experience than gathering at the mall or attending a church service or dropping in at a cocktail party, and then determining the significance of that difference. Fortnite is not a place in the sense that Senior suggests, but it does shape the user's experience of whatever place they happen to be in. Relatedly, it is also worth exploring whether all forms of socialization are equal. Okay, no, that's not worth exploring because the answer is obvious: of course they're not. What is worth exploring is how one mode of socializing—mediated by gaming platforms, for example—differs from another and whether those differences are ultimately consequential.
The question of whether or not such differences are consequential, and the further question of whether or not they are morally consequential will naturally admit of competing answers. This is why we should not be so quick to dismiss past fears as "ridiculous" in hindsight. They may appear ridiculous only because we no longer share the same moral community within which the fears made sense. We may no longer value the goods that were then displaced, nor might we aspire to the virtues that may have been undermined. And that's not necessarily a matter of progress on our part.
When parents and other concerned adults—recall Hillary Clinton's mid-2000s criticism of Grand Theft Auto—worry about games, they are usually worried about violent or sexualized content. More specifically, they tend to be worried about the possibility of a causal link between violent games and violent behavior. Alternatively, they may worry about whether their children are spending too much time playing video games. Senior talks a bit about this latter worry and about how Fortnite's design may induce compulsive play.
Confession: I spent too much time playing video games in the early 90s, and at least one of those, Mortal Kombat, was then the subject of much hand-wringing about graphic displays of violence. I don't think my parents ever feared that I would become violent as a result of playing video games, but I don't believe they were altogether pleased either. In truth, today I'm not altogether pleased with my choices either. I was never tempted by violence, but I can't quite claim that playing hours of Mortal Kombat, most of those socializing with friends, was good.
As I think about these questions today, with regard to both gaming and other forms of technology, I'm chiefly struck by the limited scope of our moral reasoning. As with so many discussions about the moral or ethical status of technology, we are focused on questions of harm. Indeed, we might even say that we are focused on questions of measurable harm. In this way, debates of this sort already play out on ground circumscribed by a certain technocratic vision. On the one hand, focusing on measurable instances of harm takes for granted a particular set of assumptions about what exactly counts as morally consequential. What of harms that may not be subject to quantification? Can something harm me without also hurting me? On the other hand, it tends to ignore or else severely limit any consideration of the good, broadly speaking. In other words, to say that something is not measurably harmful is not to say that it is good. Or, put otherwise again, what does not harm me may not make me virtuous either. If the language of virtue sounds quaint or puritanical, that, too, is a symptom of the moral myopia I'm trying to understand. (I don't know that I've made my assumptions on these matters explicit in writing anywhere, but much of what I do write about technology tends to draw on the tradition of virtue ethics. The idea is simple: we are always becoming who we will be, and that becoming is correlated to our practices and the technologies or material realities that sustain them.)
The technocratic vision I mentioned also includes a political element. Liberal democratic (small l, small d) political culture has, generally speaking, also bracketed questions of the good, and it has done so by design, choosing instead to maximize freedom. (But freedom, properly understood, is only the condition of the moral life, not its end.) The idea, of course, is that in a pluralistic society it may be impossible to reconcile competing visions of the good. So, particular and substantive moral accounts of the good do not get much traction in the public sphere, and such arguments do not pass the bar of ostensibly public reason. Naturally, then, such public debates, and consequently our own habits of moral reasoning, get locked into matters of measurable—which is to say supposedly objective, scientifically verifiable—harm.
So, I'd suggest re-framing the question. "Will playing Fortnite cause my child (or me) quantifiable harm?" is not the only question we should be asking. Given what is usually meant by that question, the answer is almost always "no." But, in addition to that question, we might also ask how playing Fortnite figures into our pursuit of the good life, individually and as members of distinct moral communities. Answers will vary, of course, and they will most likely be conveyed by narratives rather than statistics. I suspect they will also be less sensational and more nuanced than the answers the first question tends to get. Moreover, the latter question and the answers that follow will not yield simple and programmatic action points; rather, they will call for the exercise of practical wisdom.
To be clear, I'm not writing all of this because I care a great deal about Fortnite. The point is that these are questions we can and should ask of most technologies and, also, of the broader pattern of technology that characterizes our societies so far as these are discernible. These kinds of questions are usually not far from my mind, but I've been thinking more about them of late as I read Albert Borgmann's reflections on technology and democracy in Technology and the Character of Contemporary Life (1984). There Borgmann argues that "the substance of the good life must be taken into consideration if radical political reform is to become a live option." I won't get into much detail here, perhaps more on the blog later, but Borgmann distinguishes between a formally just society, a substantively just society, and a good society. "I want to argue," he writes, "that just as the constitutional [formally just] definition of society remains incomplete and corruptible without a statement of substantive justice, so the just society remains incomplete and is easily dispirited without a fairly explicit and definite vision of the good life."
According to Borgmann, liberal democracy both needs and fears technology: "It needs technology because the latter promises to furnish the neutral opportunities necessary to establish a just society and to leave the question of the good life open. It fears technology because technology may in fact deliver more than it has promised, namely, a definite vision of the good society and, more important yet, one which is 'good' in a dubious sense." The point I think Borgmann is getting at here is that under the guise of neutrality or instrumentality, the particular shape of modern technology actually smuggles into our experience, tacitly at the level of practice and habit, a definite vision of the good, one which may finally undermine democracy. I don't know, seems like he's onto something important.
News and Resources
• Frank Pasquale on rebranding repression as rational nudging: "There are no limits to the regimentation a scoring system might produce. Marketed as 'affective computing,' these methods could easily compute optimal affect, prescribing expressions and thoughts to match them."
• An interactive look at "how China turned a city into a prison" through the deployment of ubiquitous surveillance technology.
• Christopher Mims takes a look at "the secret trust scores that companies use to judge us all," otherwise known as the West's version of a social credit system.
• Bad Virality, Rob Horning's latest column for Real Life: "Extremism isn't simply extracted from the human heart of darkness by neutral machine learning processes capable of uncovering our 'true desires.' The desire for divisive content is generated by the environment in which it thrives. It inspires using attention metrics as a kind of proxy weapon in wars among rival factions, even as it seems to constitute a rabbit hole that testifies to the individual's diligence, daring, or savvy. Platforms don't want to build one big happy community. They want to make many smaller communities who all hate each other."
• Going for Gold: "The discovery of Newton's alchemical manuscripts—containing no fewer than a million words, some of the pages mutilated by the acids used during his quest of the philosopher's stone—led to a flurry of scholarly activity." More on Newton and his intense alchemical studies.
• MIT's Rosalind Picard talks about a smartwatch she developed that detects potentially deadly seizures before they happen (video).
• April Glaser with this week's case of "why we can't have nice things": "Apps and databases made for identifying and mapping native plants and birds have had to rebuild their infrastructure in recent years to obfuscate endangered species. It's the only way to protect them from poachers who are savvy enough to take advantage of citizen science open data projects and nature forums where enthusiasts share photos and locations of plants and animals with fellow nature lovers."
• On humans as "moral crumple zones," a very useful formulation: "Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a robotic system may become simply a component—accidentally or intentionally—that is intended to bear the brunt of the moral or legal penalties when the overall system fails." Via Robin Sloan.
• Like most academic titles, this one is not exactly affordable. Nonetheless, looks interesting: Enchanting Robots: Intimacy, Magic, and Technology.
Joseph Weizenbaum sent Lewis Mumford a copy of an article he had written. Mumford liked the conclusion. Courtesy of Zachary Loeb.
Here are a few lines from Iris Murdoch's The Sovereignty of the Good. I commend the whole of this slim volume to you.
"Moral language which relates to a reality infinitely more complex and various than that of science is often unavoidably idiosyncratic and inaccessible. Words are the most subtle symbols which we possess and our human fabric depends on them. The living and radical nature of language is something which we forget at our peril."
"Goodness and beauty are not to be contrasted, but are largely part of the same structure. Plato, who tells us that beauty is the only spiritual thing which we love immediately by nature, treats the beautiful as an introductory section of the good. So that aesthetic situations are not so much analogies of morals as cases of morals."
"It is in the capacity to love, that is to see, that the liberation of the soul from fantasy consists. The freedom which is a proper human goal is that freedom from fantasy, that is the realism of compassion. What I have called fantasy, the proliferation of blinding self-centered aims and images, is itself a powerful system of energy, and most of what is often called 'will' or 'willing' belongs to this system. What counteracts the system is attention to reality inspired by, consisting of love .... Freedom is not strictly the exercise of the will, but rather the experience of accurate vision which, when this becomes appropriate, occasions action. It is what lies behind and in between actions and prompts them that is important, and it is this area which should be purified. By the time the moment of choice has arrived the quality of attention has probably determined the nature of the act."
"It is frequently difficult in philosophy to tell whether one is saying something reasonably public and objective, or whether one is merely erecting a barrier, special to one's own temperament, against one's own personal fears."
"Humility is a rare virtue and an unfashionable one and one which is often hard to discern. Only rarely does one meet somebody in whom it positively shines, in whom one apprehends with amazement the absence of the anxious avaricious tentacles of the self."
Recent post on the blog:
Technology and the Inadequacy of Values Talk
I think this installment was heavier on my words and lighter on links. I'm still calibrating the best balance in this regard, especially as I publish weekly, but I think that will probably become the norm. Thanks to those of you who have written with feedback. It's been helpful and encouraging.
If you reply to this email, I'll get it in my Inbox. Always good to hear from readers.
If you know someone who might enjoy reading, pass along a link.
If you want to support this work, you may do so here or here.