Welcome to this latest installment of the Convivial Society. I’ve had a series of interconnected thoughts kicking around over the past few days, so I’ve resorted to a format I deployed once before and put these down in a numbered list of reflections. I’ll trust you’ll note the themes that run through these and how they variously intersect, but I’ll specifically mention the following: place, commerce, boundaries, and the commons. As always, if you find the newsletter helpful, consider yourself duly encouraged to share it with others. As was the case once before, I’m not able to provide the audio version with this installment, but those of you who prefer listening can expect it in your inbox within a few days.
1. We never go back. This is not to say that elements of the past can never reassert themselves or re-appear in interesting ways, but we never go back to a past state of affairs because you cannot undo what has happened since. Even when elements of the past are retrieved or patterns echo, they will have been changed by their passage to the present. I say this, in part, because when new technologies appear, it is tempting to cast them in light of older technologies or in relation to older social states. Some of this reflects the understandable tendency to make sense of what is novel by reference to what is familiar, hence the fact that we still speak of web “pages.” But it applies, too, particularly with new media, to the idea that we are thrust into an older form of culture by a new technology. To note one recent example, when Clubhouse, an audio-only social platform, was a big deal earlier this year, it was not uncommon to come across someone claiming that it marked the return of oral culture, culture characterized by the spoken word rather than writing. (There were, to be sure, more and less sophisticated versions of this claim.) But this was always impossible. Better to say, I think, that Clubhouse or Discord might retrieve certain aspects of orality, but there can be no return to orality because you cannot undo the effects of literacy. This is why Walter Ong spoke of “secondary orality” when he wanted to analyze how radio and television re-shaped a literate society. When I’ve used the orality/literacy frame to make some sense of social media, I’ve spoken of digital media “scrambling” earlier dynamics as a way to account for the aspects of both continuity and discontinuity. In any case, there’s no going back.1
2. A couple of years ago, I had to make an early morning run to the grocery store for diapers. I noticed that a beer I liked was on sale, so I picked up a six-pack and brought it along with the diapers to the cashier. A fine combination, I'm sure they must've thought. But I wasn't able to buy the beer. It was too early. Blue laws, sometimes called Sunday laws, are a vestigial organ in the legal body. They were initially intended to restrict commercial activities, particularly on Sundays, in order to encourage church attendance. Alternatively, they restricted the purchase of certain goods, alcohol and tobacco notable among them, to certain times of day. Apparently buying beer at 7:30 in the morning suggested a certain impiety on my part. In any case, some of you reading will be old enough to remember that it was also common for stores to close early or altogether on Sundays, not by law necessarily but by custom. These older laws and customs reflected an implicit understanding that commerce has its place and its limits. Tracing the genealogy of this practice would eventually bring you back to the Jewish Sabbath, a weekly day of rest from one's labors.
3. In his classic work on the Sabbath, the great Jewish rabbi and philosopher of the last century, Abraham Heschel, wrote, “There is a realm of time where the goal is not to have but to be, not to own but to give, not to control but to share, not to subdue but to be in accord. Life goes wrong when the control of space, the acquisition of things of space, becomes our sole concern.”
“He who wants to enter the holiness of the day,” he observed, “must first lay down the profanity of clattering commerce, of being yoked to toil. He must go away from the screech of dissonant days, from the nervousness and fury of acquisitiveness and the betrayal in embezzling his own life. He must say farewell to manual work and learn to understand that the world has already been created and will survive without the help of man. Six days a week we wrestle with the world, wringing profit from the earth; on the Sabbath we especially care for the seed of eternity planted in the soul.”
4. The most violent episode in John Bunyan's 17th century allegory of a spiritual journey, Pilgrim's Progress, occurs at Vanity Fair, modeled after the old medieval and Renaissance fairs, which were chiefly traveling centers of trade and commerce. While these fairs came and went, Vanity Fair was distinguished in Bunyan's allegory by the fact that it was in operation all year round. While passing through the fair, the main character and his companion are beaten and berated. The friend is eventually martyred. Their crime? Their refusal to buy something.
5. Mark Zuckerberg has been talking up the metaverse of late.2 “Our overarching goal across all of these initiatives,” he recently told employees, “is to help bring the metaverse to life.” With Drew Austin, I’m not entirely sure if this is an “attention-grabbing troll or a genuine strategic priority.” Whatever the case, the term was coined by author Neal Stephenson for his dystopian 1992 novel Snow Crash, where, as Brian Merchant noted, “it serves as entertainment and an economic underbelly to a poor, desperate nation that is literally governed by corporate franchises.” Not exactly the most encouraging source material. In the more cheery promotional narratives, the metaverse will unite various disparate elements of our digital lives into a seamless shared reality accessed through VR goggles or AR apps. According to NVIDIA’s vice president of simulation technology Rev Lebaredian, “Ultimately we’re talking about creating another reality, another world, that’s as rich as the real world.”
6. “As rich as the real world” almost comes off as a magnanimous concession given that the so-called “real world” is sometimes characterized as a tedious and impoverished realm compared to the wonders of the Digital City. In a recent installment, I cited Marc Andreessen’s claim that a preference for non-digitally mediated “reality” was an expression of “reality privilege.” In his view, “the vast majority of humanity, lacks Reality Privilege — their online world is, or will be, immeasurably richer and more fulfilling than most of the physical and social environment around them in the quote-unquote real world.” Andreessen knows that some will reasonably say we should then get busy making sure that we improve the “real world” experience for everyone. But time’s up for reality, Andreessen argues: “Reality has had 5,000 years to get good, and is clearly still woefully lacking for most people.” “We should build -- and we are building --” he adds, “online worlds that make life and work and love wonderful for everyone, no matter what level of reality deprivation they find themselves in.”
I readily admit that I have no idea where the 5,000 year timespan comes from. That said, might we call this the intellectual and perhaps even moral case for the metaverse? The possibly trollish use of the word privilege does a lot of work here, of course. It trades on the moral capital of opposition to inequality and injustice. But then there’s a subtle equivocation or reframing that happens, too. The problem is recast as one of ontological deficiency rather than economic or political failure. It suggests that what is broken is reality itself rather than what we have made of it. Thus the solution is building a technologically mediated simulation that can improve on this broken reality rather than the work of building a more just society. But again, Neal Stephenson got there first, and more honestly.
This is not to discount the fact that digital media has, in fact, been a boon to many, helping, for example, to alleviate the loneliness of those who might remain isolated and alone without it, or supplying opportunities for many who would’ve languished otherwise. But one can acknowledge and celebrate such things without disparaging “reality” or implying that a good life for most people will depend on their immersion in ever more elaborate digital simulations.
But is there not some truth to the claim that reality pales in comparison to the digitally mediated worlds on offer? My most straightforward answer is, of course, no. But viewed from a certain angle, perhaps. As an example, consider the case of someone who has only lived where light pollution obscures all but a few of the brightest stars. Under these conditions they may have good reason to believe that there’s not much to the night sky. Of course, the problem is not that reality is impoverished, rather it’s that they can no longer see it. I’d suggest that this pattern recurs throughout our experience. For a host of reasons, mostly having to do with the way we have structured the human-built world and the habits of perception fostered by modern media, it is hard to see what is right before us all the time. As Ivan Illich once put it, “Existence in a society that has become a system finds the senses useless precisely because of the very instruments designed for their extension.”
7. “[O]ur human and earthly limits, properly understood,” Wendell Berry has argued, “are not confinements but rather inducements to formal elaboration and elegance, to fullness of relationship and meaning. Perhaps our most serious cultural loss in recent centuries is the knowledge that some things, though limited, are inexhaustible. For example, an ecosystem, even that of a working forest or farm, so long as it remains ecologically intact, is inexhaustible. A small place, as I know from my own experience, can provide opportunities of work and learning, and a fund of beauty, solace, and pleasure — in addition to its difficulties — that cannot be exhausted in a lifetime or in generations.”
8. Drew Austin described the metaverse this way: “a more regimented simulacrum of public space where a wider range of interactions are easier to monetize—a virtual environment in which we’ll finally have digital walls where we can hang our NFTs, and where we can rub elbows with Marvel’s embodied IP.” He quotes Wendy Liu, who, considering a short definition of metaverse, quipped, “virtual reality with unskippable ads.” Rob Horning’s analysis appears in Austin’s short essay, too: “Facebook would also like to secure the ability to prevent people from any right to absence … The metaverse is fundamentally a place you will be forced to be.”
9. One way of telling the story of modernity would be to describe how commerce colonized more and more of our world and our experience by overcoming the technical and cultural limits that stood in its way. Aspects of the world now appear to us framed by the implicit challenge: Commercialize this. This is hardly a novel observation, I grant. But it is worth noting how digital technology has shaped and been shaped by this dynamic.
The old blue laws, and the practice of Sabbath that came before them, are just two examples of limits placed on commerce (and labor) that have lost their cultural authority. It’s worth noting that their cultural authority was wrapped up with technical obstacles, which is to say that it was easier to limit commerce when geography and inconvenience did a good bit of the work for you. There is now no time during which it is not possible to engage in commercial activity, and I am hard pressed to think of instances where it is discouraged by the force of custom or principle. Neither, of course, is there a spatial barrier to consumption; we don’t have to be anywhere in particular to engage in commercial activity. Vanity Fair is in our pockets and it is always open for business.
Additionally, even if we are not directly engaged in consumption or labor at any given moment, our experience is often structured so as to promote commerce. It is saturated with ads, however effective they may or may not be, but, more importantly, our experience is increasingly framed as standing reserve for the market. So, for example, as collecting, storing, and analyzing data became easier, data-gathering devices proliferated in our homes to extract this data with a view to generating more commerce.
The point is not that commerce is bad. I buy things. You buy things. We all need to buy things. I’m glad to pay someone for their labor. I’m glad to be paid for mine. The point, and certainly not an original one, is that we should be wary of allowing the logic of the market to colonize all facets of our experience.
10. Digital technologies have also blurred the boundaries that kept work confined to certain times and places. The same kinds of tools that allow us to engage in commerce anywhere and at any time also make it possible for some of us to work anywhere and at any time. In some cases this has been a boon, creating new opportunities and better outcomes. In others, the experience may be less benign. Often this turns on the question of agency and control. Who exactly sets the terms of this new freedom to work anywhere and at any time?
It is true that the sharp line between work and home is a relatively recent development. For most of human history, where we worked and where we lived were by and large one and the same. So, in historical perspective, the neat separation between work and home that characterized modern, industrialized societies during the past century (although barely that) may ultimately appear as an aberration. It may seem, then, that digital technologies have retrieved an older form of life.
This is an example of a pattern worth noting. Whereas modernity proceeded by differentiating, fragmenting, and specializing on the model of the machine, the digital age is marked by connection and entanglement. McLuhan opened Understanding Media with observations along these lines. “The restructuring of human work and association was shaped by the technique of fragmentation that is the essence of machine technology,” McLuhan argued. So in the modern, industrial world dominated by print, itself a mechanization of the word and a proto-industrial technology, seemingly neat distinctions and separations were the order of the day.3 Private life was sequestered from public spaces, work was clearly distinguished from home, reason and emotions were distinct, as were mind and body, nature and the human, fact and value, etc. First under the aegis of electronic and then digital media, these sharp lines became harder if not impossible to sustain.
But you never go back. What has happened cannot be undone. Digital media does not make whole what had been broken apart. It’s rather more like having the pieces thrown into a pile together. Work from home is not a return to agrarian modes of relatively autonomous subsistence. For most people, it is a job and a boss that are being introduced into the rhythms of home life, in which children, as has been widely recognized, are not meaningfully integrated but rather appear chiefly as logistical problems to be solved. What will be needed, in my view, is a new way of thinking about work altogether, not merely a migration of old jobs into new settings. And it may be that we get there, and that digital technologies will play a key role in making it happen. But the metaverse as it is presently being packaged is, from this vantage point, a tool that is already obsolete, centered as it is on virtual simulations of traditional office work.
11. It seems that the conglomeration of devices, apps, platforms, and networks now being repackaged as the metaverse simply pushes us along the path toward commercialization and datafication, the drive to render our experience quantifiable and subject to computational analysis. Life conducted within the metaverse is already reduced to data. If we were running up against the limits of profitably data-mining human experience in the so-called “real world,” then translating even more of our experience into a realm of virtual simulacra would open up a new frontier. Alternatively, if you’ve run out of physical goods to sell and physical spaces in which to place ads, then a new persistent virtual realm solves those problems. Either that or purchase ads in what may become the equivalent of billboards in space whose messages will be streamed via YouTube.
12. A note from Anne Helen Peterson’s latest newsletter: “In Vermont, there’s a statewide law prohibiting roadside bulletin boards — the culmination of a movement that goes back to the 1930s. When I first moved there, I didn’t even notice. But after a few weeks, I realized how quiet my mind had been on the long drives from the school where I worked into town. My attention was where I wanted it to be: on the road, on the landscape, in the music I was listening to, in my own thoughts.
It shouldn’t be too much to ask for spaces in one’s life that can remain sacrosanct, where we’re not subject to surveillance, where we’re not targeted for sales, where what we make doesn’t have to be immediately commodified and what we do can remain resistant to measure …”
13. From Matthew Crawford in 2015: “Attention is a resource; a person has only so much of it.4 And yet we’ve auctioned off more and more of our public space to private commercial interests, with their constant demands on us to look at the products on display or simply absorb some bit of corporate messaging. Lately, our self-appointed disrupters have opened up a new frontier of capitalism, complete with its own frontier ethic: to boldly dig up and monetize every bit of private head space by appropriating our collective attention. In the process, we’ve sacrificed silence — the condition of not being addressed. And just as clean air makes it possible to breathe, silence makes it possible to think.
What if we saw attention in the same way that we saw air or water, as a valuable resource that we hold in common? Perhaps, if we could envision an ‘attentional commons,’ then we could figure out how to protect it.”
14. Drew Austin concluded his newsletter by noting Ivan Illich’s argument for a commons of silence. I’ve cited that same essay by Illich before, but allow me to do so again here. “Just as the commons of space are vulnerable and can be destroyed by the motorization of traffic,” Illich argued, “so the commons of speech are vulnerable, and can easily be destroyed by the encroachment of modern means of communication.”
“As enclosure by the lords increased national productivity by denying the individual peasant to keep a few sheep,” Illich continued, “so the encroachment of the loudspeaker has destroyed that silence which so far had given each man and woman his or her proper and equal voice. Unless you have access to a loudspeaker, you now are silenced.”
Let me add something else that Illich wrote about the commons in another context:
“A commons is not a public space. A commons is a space which is established by custom. It cannot be regulated by law. The law would never be able to give sufficient details to regulate a commons. A typical tree on the commons of a village has by custom very different uses for different people. The widows may take the dry branches for burning. The children may collect the twigs, and the pastor gets the flowers when it flowers, and the nuts from it are assigned to the village poor, and the shadow may be for the shepherds who come through, except on Sundays, when the Council is held in the shadow of the tree.
The concept of the commons is not that of a resource; a commons comes from a totally different way of being in the world where it is not production which counts, but bodily, physical use according to rules that are established by custom …”
Illich spent the latter part of his life trying to resuscitate concepts like the commons, the vernacular, and subsistence. In each case, he was trying to preserve ways of being in the world that were not dominated by the logic of the market.
As Austin observed, “a Facebook or Microsoft metaverse would be the opposite of a commons: a theater for intellectual property in which sensory experience is a fully commoditized resource, every lacuna a content opportunity, and every moment of silence a reason for someone else to speak louder.”
Or, as McLuhan put it nearly 60 years ago, “Once we have surrendered our senses and nervous systems to the private manipulation of those who would try to benefit from taking a lease on our eyes and ears and nerves, we don't really have any rights left.”
15. In The Human Condition, Hannah Arendt explained how common sense had once been understood not as banal notions that were commonly held, but as the work of all of our senses working in tandem to perceive a world held in common with others. “Only the experience of sharing a common human world with others who look at it from different perspectives,” she wrote, “can enable us to see reality in the round and to develop a shared common sense.”
She also warned that “a noticeable decrease in common sense in any given community and a noticeable increase in superstition and gullibility are therefore almost infallible signs of alienation from the world,” and, thus, the seedbed of totalitarianism.
So, to put this another way, the metaverse would do for common sense, as Arendt understands it, what enclosure did to the commons. Having our perception of the world increasingly mediated by proprietary technologies that immerse us in ever more sophisticated realms of digital simulacra is a way of surrendering the experience of a shared reality with others.
16. It was recently suggested to me, in a discussion about embodiment and perception, that the phrase “come back to your senses” seemed rather loaded with significance. As with the idea of “common sense,” we’ve taken coming back to your senses to mean something vaguely intellectual. But what if we took it literally? What if staying sane meant doing a better job of anchoring our experience to our senses?
Or, as Illich put it in lines I’ve cited here on more than one occasion, “Therefore, it appears to me that we cannot neglect the disciplined recovery, an asceticism, of a sensual praxis in a society of technogenic mirages. This reclaiming of the senses, this promptitude to obey experience […] seems to me to be the fundamental condition for renouncing that technique which sets up a definitive obstacle to friendship.”
1. Which is not to say that understanding past techno-social configurations on their own terms and tracing historical trajectories cannot be helpful. Such an exercise can illuminate the significance of new technologies by casting them against a different ground so that their distinctive figure might be more apparent.
2. While metaverse talk broke into the discourse over the last two months or so, it can be found in connection with Facebook and Oculus from at least 2016.
3. It is also true, as Bruno Latour argued, that while modernity told itself a story about how it was characterized by what it managed to separate (faith/reason, human/nature, politics/religion), the truth was that this story veiled all manner of hybridizations.
4. Maybe thinking of attention as a resource is part of the problem.