“Concerning social memory in particular, we may note that images of the past commonly legitimate a present social order. It is an implicit rule that participants in any social order must presuppose a shared memory. To the extent that their memories of a society’s past diverge, to that extent its members can share neither experiences nor assumptions.”
— Paul Connerton, How Societies Remember
“Grim milestone.” I lost track weeks ago of how often I had encountered the phrase. Here in the U.S. we’ve passed one more such milestone: 100,000 COVID-19 deaths.
It is striking to me, just now, how little we’ve grieved the loss. I don’t mean that families and friends of the dead have not grieved, rather that as a nation we seem to have done very little of it. Of late, there have been calls for some expression of national mourning, but it’s not clear if anything of the sort will materialize beyond the lowering of flags to half-staff for a few days. Tragically, like almost every other social reality, these deaths have become just another front in the culture wars—a strike, I should acknowledge, against my earlier suggestion that the virus may resist the operations of hyperreality.
Curiously, we were approaching 100,000 deaths over the Memorial Day weekend, a national holiday given over to remembering our war dead. Of course, it’s not altogether clear to me that the day retains its ostensibly solemn nature for the majority of Americans. The market, after all, tolerates no liturgical calendars, religious or civic, that do not ultimately initiate us into the rites of consumption. So our lack of public mourning should not, perhaps, come as much of a surprise.
It is true that the COVID dead did not enlist to fight in some noble cause on our behalf, nor were they the victims of a galvanizing enemy attack. Thus their deaths lack the character of military service or sudden catastrophe that ordinarily calls forth public grieving. They died as private and often isolated individuals—fathers and mothers, sons and daughters, brothers and sisters, friends and lovers—to be privately mourned, according to our ordinary practice. Except, of course, that under present circumstances, even these ordinary practices have been denied to the dead and their loved ones.
Yet, because so many have died in such a brief time, the tragedy takes on an undoubtedly collective and public character. It demands acknowledgement and a reckoning, not simply a tallying. As I write this, however, it begins to feel almost as if we’re prepared to move on. We were shocked on the day the toll first reached 100, and later 1,000, but we somehow acclimated to anywhere from 1,500 to more than 2,000 deaths a day for weeks on end. For some, it is clearly politically expedient to do so; for others, a certain fatigue appears to have set in. Moreover, the temporal imperatives of digital media are unresponsive to the specific quality of events.
Deeper still, I suspect that we’ve lost touch with a common repository of rites, rituals, gestures, language, and ceremonies through which we might acknowledge, commemorate, and perhaps even sublimate the death and suffering of our fellow citizens. Surprisingly, a society fixated on the possibilities of technical automation is nonetheless indifferent if not overtly hostile to what we might think of as the fruits of cultural automation, that is, a stock of deeply ingrained, reflexive scripts that help us navigate instances of profound loss or joy, relieving us of the paralyzing burdens of spontaneity. But it’s not exactly that a choice was made to cast these aside. Culture in this broad sense arises organically and begins disintegrating long before it is obvious to most except the keenest observers. Which is why it appears to happen, as Hemingway said of financial ruin, gradually, then suddenly.
Certain communities may retain such customary rites, of course, but the nation does not. Perhaps it never did, except to the exclusion of sizable minorities. To put this another way, we have no civil religion, only mannerist manifestations of such—rites merely simulated rather than inhabited, shot through with self-awareness, divisive rather than unifying, subject to charges of hypocrisy and bad faith. (This is not a brief for a civil religion, merely description.) Further, I suspect that the share of individuals covered by communities that retain binding and living forms of this sort is itself shrinking.
It is not only a matter of acknowledging and reckoning with grief, though this is important enough. It is also a question of meaning and judgment and how a society hangs together (or doesn’t).
Consider the language that does populate our public discourse. It is the “language” of quantification: numbers, statistics, models, data, economic indicators, and their attendant values—speed, efficiency, optimization. This language, this way of knowing the world, has its place, but ideally in the service of values and goods that cannot be accounted for numerically. It can be a valuable aid to thought and judgment, but it cannot substitute for either. Unfortunately, the language of quantification is often invoked to settle questions that are ultimately moral and political.
We do so, in part, because quantification carries the veneer of objectivity and thus the possibility of commanding assent from all “reasonable” people. In the absence of broadly shared moral frameworks, healthy political institutions, a common experience of the world, and adequate levels of social trust, we resort to the language of quantification as if it alone could settle our disputes and resolve our deepest quandaries.
It can’t. But in the face of perpetual outrage cycles, nihilism, injustice, violence, apathy, and division, some instinctively turn to data theater, the performance of data analysis as if it settled matters and absolved individuals of the burden of judging, acting, and taking responsibility for their words and deeds.
But we must see these dynamics for what they are: desperate attempts to shore up a public that no longer exists as such. Media theorists, and Søren Kierkegaard before them, have long told us that “the public” is an effect of the media that generate it. As I’ve tried to argue at length elsewhere, we cannot expect business as usual in the aftermath of a profound transformation of our media ecosystem. (To be clear, for the sake of readers who may be coming to the language of media ecology for the first time: when I speak of media here I am not talking merely about CBS, MSNBC, or the New York Times. I’m talking about the social consequences of orality, writing, print, electronic communication, and now digital technologies.)
Whatever you conjure in your imagination when you hear about “the public” or “the public sphere,” you are invariably imagining some unit that can be thought only because it is sustained by a particular media ecosystem. The public as we imagined it in the post-war era of mass electronic media no longer exists, or it exists only in zombie form. It wasn’t a fiction, exactly, but it was a construct not a given. Its consequences are still with us, of course. Its assumptions linger and its values and norms still hold sway with many, but its power is waning and its death throes are reverberating throughout society. You can lament this development or celebrate it. What we must not do is pretend that it is not happening.
If you would like to observe one of the most obvious manifestations of this dynamic, you need only consider Twitter’s recent decision to add a “fact-check” (it is not even that) to one of the president’s tweets, as if this were somehow an adequate response to the problem they were ostensibly addressing. Now, better yet, try to imagine a solution that might generate consensus, one that would not accelerate the powerful disintegrating forces at work in our society. If you can think of one, you’re wiser and more imaginative than I am, which, I readily grant, is not saying much.
It is not that digital media act unilaterally as a causal force, generating a new society ex nihilo. It is, as McLuhan noted long ago, that new media generate cultural retrievals, obsolescences, enhancements, and reversals. They reconfigure society by working on pre-existing materials. And the consequences of this reconfiguration can be profound, felt at the level of both the individual psyche and the social fabric.
For example, consider the effect of digital media on memory. If collective memory is a crucial element of a cohesive, well-functioning society—if, as Ivan Illich has observed, what we call different cultures are merely the manifestations of different means of remembering—then what are the consequences of the radical re-ordering of how we remember occasioned by digital media?
And what of a common experience of some abiding present? Media have always refracted the greater world to our senses, whether that was a medieval map of the cosmos, modern era travelogues of an explorer’s journey, television’s presentation of a distant war, or the experience of an election cycle on a digital media platform. What shape can a wider common world take when it is experienced through the lens of our algorithmically constituted, cyborg timelines?
What, too, of the heightened self-consciousness that digital media occasions, both at the personal and social level? If new cultural forms necessarily take root under the cover of a benign darkness, below the level of consciousness, how are they to emerge under the withering light of awareness that drenches everything in irony? “Everything that lives, not vegetative life alone,” Hannah Arendt observed, “emerges from darkness and, however strong its natural tendency to thrust itself into the light, it nevertheless needs the security of darkness to grow at all.” Elsewhere, she observed, “A life spent entirely in public, in the presence of others, becomes, as we would say, shallow. While it retains its visibility, it loses the quality of rising into sight from some darker ground which must remain hidden if it is not to lose its depth in a very real, non-subjective sense.”
To be clear, what is new is obviously not the divisions or the strife or the conflict. What is new, and here I might be mistaken in my estimation, is the absence of a cultural infrastructure, ordinarily taken for granted, by which the divisions, the strife, and the conflict might be adjudicated, resolved, or even spoken about intelligibly—the absence, that is, of a sufficiently broad common world that can gather enough of us around it. Then again, perhaps it is not right to say that this is new. It may be better simply to say that we are in a period of cultural upheaval, and have been for some time, of a kind that has recurred throughout human history. What does seem somehow novel, though, is that the agents of disintegration appear poised to go on eroding the foundations of any new, more stable cultural order. Of course, I’m happy to admit that it may always have seemed so to those living through such times.
But perhaps that’s best left for another newsletter. I’ve delayed sending this one long enough, and, if you’ve made it this far, kept you long enough. I should add, too, that none of this should be read as the counsel of despair. Let us each look around us, right there in front of us … there are things to be done.
News and Resources
On “doomscrolling” and its antecedents: “Giving a lecture in Exeter on 19 November 1914, the minister G. M. Newcombe related an anecdote about a friend of his who spent half his day reading war news in The Times, finishing only when the Exeter evening paper arrived in the house. ‘Naturally,’ he said, everyone was ‘interested in the great crisis, but excessive newspaper reading had a tendency to throw some people off their balance.’”
Academic article on “The (in)credibility of algorithmic models to non-experts”: “Although model-professionals often work in close collaboration and over time develop practices to scrutinize a model, they often remain black boxes to all but a few specialists. This suggests that meaningful oversight of machine learning algorithms that transcend our cognitive capacity presents a formidable challenge for those not working with a model on a daily basis. It follows that transparency of algorithms to the general public seems problematic at best given that they may not have the expertise, the time, or inclination to engage with a model.”
On the rise of online “cults.” The digital city is stranger than many of us realize but still answers to rather primal longings: “Ms. Ong’s fans said that joining Step Chickens has helped them feel less isolated in the midst of widespread stay-at-home orders. ‘I think a lot of people want to be a part of something,’ said Sam Schmir, 20.”
Swiss documentary explores the history of chairs: “Chair Times: A History of Seating – From 1800 to Today.”
A bit different from the usual fare, but an interesting read on “kid culture,” akin to what I’ve taken to calling the professionalization of childhood: “The best moments for us have been when there is no adult culture or kid culture, no fantasy world conjured by tired, bored grownups struggling to provide perpetual creative stimulation, and no pretend universe in which we are not really parents, but are instead hip single people sipping $14 kombucha mules on a Saturday night at this new bar. The best moments are when we can all do our thing, be it sidewalk chalk or discussing the downfall of US society or dancing in a tutu or reading poetry. The moments when we are part of a larger community, not subjugated to the singular, quasi-sacred project of the kid, nor trying to flee it, just two 37-year-olds and a five-year-old making our way through the world.”
Drew Austin on quarantine as “the future big tech wanted us to want”: “This was already the perceptual logic of the internet: a nonspatial, atemporal universe in which everything feels always already available for instrumental use, whether we want it to be or not. Before the lockdown, this could manifest as power over information. Now without much physical experience to contextualize it, this feels both overwhelming and insufficient, failing to adequately organize experience or meet our social needs on its own.”
Is gardening an art form? If it is, it’s the kind of art I like, bedded in the material, nearly domestic, subject to happenstance and weather: “Most of the winter had been very bleak. The smile was an aberration. The days were short and grey and riddled with bad news. I developed a habit of spending Sundays with seed catalogues and lists of old roses, plotting floral fireworks that wouldn’t go off for months. Such consolations are nothing new. In her diary of 1939, Virginia Woolf records hearing Hitler on the radio. Her husband Leonard was in the garden he’d painstakingly constructed at Monk’s House, their damp green cottage in Rodmell, East Sussex. ‘I shan’t come in,’ he shouted. ‘I’m planting iris, and they will be flowering long after he is dead.’”
— Ivan Illich quoting Merleau-Ponty in a short talk on “Computer Literacy and the Cybernetic Dream”:
[…] a danger Maurice Merleau-Ponty clearly foresaw almost thirty years ago. He then said - and I quote - that “cyberneticism has become an ideology. In this ideology human creations are derived from natural information processes, which in turn have been conceived on the model of man-as-a-computer.” In this mind-state, science dreams up and “constructs man and history on the basis of a few abstract indices” and for those who engage in this dreaming “man in reality becomes that manipulandum which he takes himself to be.”
— The opening stanzas of Auden’s “The Shield of Achilles”:
She looked over his shoulder
For vines and olive trees,
Marble well-governed cities
And ships upon untamed seas,
But there on the shining metal
His hands had put instead
An artificial wilderness
And a sky like lead.
A plain without a feature, bare and brown,
No blade of grass, no sign of neighborhood,
Nothing to eat and nowhere to sit down,
Yet, congregated on its blankness, stood
An unintelligible multitude,
A million eyes, a million boots in line,
Without expression, waiting for a sign.
Out of the air a voice without a face
Proved by statistics that some cause was just
In tones as dry and level as the place:
No one was cheered and nothing was discussed;
Column by column in a cloud of dust
They marched away enduring a belief
Whose logic brought them, somewhere else, to grief.
I will shortly be trying a new experiment with the newsletter: an Ivan Illich reading group. I’m working out the details, but the main feature will be a synchronous discussion thread on Substack. I’ll be choosing a combination of shorter books and/or essays, which we’ll work our way through over the summer. Tools for Conviviality and Deschooling Society will almost certainly be on the list. This experiment was born out of my realization that Illich would be a useful guide as we navigate institutional failures and try to imagine how things might be done differently. I am, however, keeping this a paid-subscribers-only affair, for two main reasons: it limits the size of the group to a more reasonable scale for something like this, and my sense is that the closed setting may allow for more open conversation. We’ll be starting up in the next three weeks or so. Of course, feel free to join up, even if only for the short while that we run the reading group.