Welcome to the Convivial Society, a newsletter that is ostensibly about technology and culture but more like my effort to make some sense of the world taking shape around us. For many of you, this will be the first installment to hit your inbox—welcome aboard. And my thanks to those of you who share the newsletter with others and speak well of it. If you are new to the Convivial Society, please feel free to read this orientation to new readers that I posted a few months ago.
0. Attention discourse is my term for the proliferation of articles, essays, books, and op-eds about attention and distraction in the age of digital media. I don't mean the label pejoratively. I've made my own contributions to the genre, in this newsletter and elsewhere, and as recently as May of last year.1 In fact, I tend to think that attention discourse circles around immensely important issues we should all think about more deliberately. So, here, then, is yet another entry for the attention files, presented as a numbered list of loosely related observations for your consideration, a form in which I like to occasionally indulge and which I hope you find suggestive and generative.
1. I take Nick Carr's 2008 piece in The Atlantic, "Is Google Making Us Stupid?", to be the ur-text of this most recent wave of attention discourse. If that's fair, then attention and distraction have been the subject of intermittent public debate for nearly fifteen years, but this sustained focus appears to have yielded little by way of improving our situation. I say "the most recent wave" because attention discourse has a history that pre-dates the digital age. The first wave of attention discourse can be dated back to the mid-nineteenth century, as historian Jonathan Crary has argued at length, especially in his 1999 book, Suspensions of Perception: Attention, Spectacle, and Modern Culture. "For it is in the late nineteenth century," Crary observed,
within the human sciences and particularly the nascent field of scientific psychology, that the problem of attention becomes a fundamental issue. It was a problem whose centrality was directly related to the emergence of a social, urban, psychic, and industrial field increasingly saturated with sensory input. Inattention, especially within the context of new forms of large-scale industrialized production, began to be treated as a danger and a serious problem, even though it was often the very modernized arrangements of labor that produced inattention. It is possible to see one crucial aspect of modernity as an ongoing crisis of attentiveness, in which the changing configurations of capitalism continually push attention and distraction to new limits and thresholds, with an endless sequence of new products, sources of stimulation, and streams of information, and then respond with new methods of managing and regulating perception […] But at the same time, attention, as a historical problem, is not reducible to the strategies of social discipline. As I shall argue, the articulation of a subject in terms of attentive capacities simultaneously disclosed a subject incapable of conforming to such disciplinary imperatives.”
Many of the lineaments of contemporary attention discourse are already evident in Crary's description of its 19th-century antecedents.2
2. One reaction to learning that modern-day attention discourse has longstanding antecedents would be to dismiss contemporary criticisms of the digital attention economy. The logic of such dismissals is not unlike that of the tale of Chicken Little. Someone is always proclaiming that the sky is falling, but the sky never falls. This is, in fact, a recurring trope in the wider public debate about technology. The seeming absurdity of some 19th-century pundit decrying the allegedly demoralizing consequences of the novel is somehow enough to ward off modern-day critiques of emerging technologies. Interestingly, however, it's often the case that the antecedents don't take us back indefinitely into the human past. Rather, they often have a curiously consistent point of origin: somewhere in the mid- to late-nineteenth century. It's almost as if some radical techno-economic re-ordering of society had occurred, generating for the first time a techno-social environment which was, in some respects at least, inhospitable to the embodied human person. That the consequences linger and remain largely unresolved, or that new and intensified iterations of the older disruptions yield similar expressions of distress, should not be surprising.
3. Simone Weil, writing in Oppression and Liberty (published posthumously in 1955):
"Never has the individual been so completely delivered up to a blind collectivity, and never have men been less capable, not only of subordinating their actions to their thoughts, but even of thinking. Such terms as oppressors and oppressed, the idea of classes—all that sort of thing is near to losing all meaning, so obvious are the impotence and distress of all men in the face of the social machine, which has become a machine for breaking hearts and crushing spirits, a machine for manufacturing irresponsibility, stupidity, corruption, slackness and, above all, dizziness. The reason for this painful state of affairs is perfectly clear. We are living in a world in which nothing is made to man's measure; there exists a monstrous discrepancy between man's body, man's mind and the things which at the present time constitute the elements of human existence; everything is in disequilibrium […] This disequilibrium is essentially a matter of quantity. Quantity is changed into quality, as Hegel said, and in particular a mere difference in quantity is sufficient to change what is human into what is inhuman. From the abstract point of view quantities are immaterial, since you can arbitrarily change the unit of measurement; but from the concrete point of view certain units of measurement are given and have hitherto remained invariable, such as the human body, human life, the year, the day, the average quickness of human thought. Present-day life is not organized on the scale of all these things; it has been transported into an altogether different order of magnitude, as though man were trying to raise it to the level of the forces outside of nature while neglecting to take his own nature into account."
4. Nicholas Carr began his 2008 article with a bit of self-disclosure, which I suspect now sounds pretty familiar to most of us if it didn’t already then. Here’s what he reported:
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
At the time, it certainly resonated with me. What may be most worth noting about it today is that Carr, and those who are roughly his contemporaries in age, had lived both before and after the rise of the commercial internet and thus had an experiential point of contrast with emerging digital culture.
5. I thought of this paragraph recently while I was reading the transcript of Sean Illing's interview with Johann Hari about his new book, Stolen Focus: Why You Can't Pay Attention—And How to Think Deeply Again. Not long after reading the text of Illing's interview, I also read the transcript of his conversation with Ezra Klein, which you can read or listen to here. I'm taking these two conversations as an occasion to reflect again on attention, for its own sake but also as an indicator of larger patterns in our techno-social milieu. I'll dip into both conversations to frame my own discussion, and, as you'll see, my interest isn't to criticize Hari's argument but rather to pose some questions and take it as a point of departure.
Hari, it turns out, is, like me, in his early 40s. So we, too, lived a substantial chunk of time in the pre-commercial internet era. And, like Carr, Hari opens his conversation with Illing by reporting on his own experience:
I noticed that with each year that passed, it felt like my own attention was getting worse. It felt like things that require a deep focus, like reading a book, or watching long films, were getting more and more like running up and down an escalator. I could do them, but they were getting harder and harder. And I felt like I could see this happening to most of the people I knew.
But, as the title of his book suggests, Hari believes that this was not just something that happened to him but something that was done to him. "We need to understand that our attention did not collapse," he tells Illing, "our attention has been stolen from us by these very big forces. And that requires us to think very differently about our attention problems."
Like many others before him, Hari argues that these “big forces” are the tech companies, who have designed their technologies with a view to capturing as much of our attention as possible. In his view, we live in a technological environment that is inhospitable to the cultivation of attentiveness. And, to be sure, I think this is basically right, as far as it goes. This is not a wholly novel development, as we noted at the outset, even if its scope and scale have expanded and intensified.
6. There's another dimension to this that's worth considering because it is often obscured by the way we tend to imagine attention and distraction as solitary or asocial phenomena. What we meet at the other end of our digital devices is not just a bit of information or an entertaining video clip or a popular game. Our devices do not only mediate information and entertainment; they also mediate relationships.
As Alan Jacobs put it, writing in "Habits of Mind in an Age of Distraction":
“[W]e are not addicted to any of our machines. Those are just contraptions made up of silicon chips, plastic, metal, glass. None of those, even when combined into complex and sometimes beautiful devices, are things that human beings can become addicted to […] there is a relationship between distraction and addiction, but we are not addicted to devices […] we are addicted to one another, to the affirmation of our value—our very being—that comes from other human beings. We are addicted to being validated by our peers.”
This is part of what lends the whole business a tragic aspect. The problem of distraction can just as well be framed as a problem of loneliness. Sometimes we turn thoughtlessly to our devices for mere distraction, something to help us pass the time or break up the monotony of the day, although the heightened frequency with which we do so may certainly suggest compulsive behavior. Perhaps it is the case in such moments that we do not want to be alone with our thoughts. But perhaps just as often we simply don't want to be alone.
We desire to be seen and acknowledged. To exercise meaningful degrees of agency and judgment. In short, to belong and to matter. Social media trades on these desires, exploits them, deforms them, and never truly satisfies them, which explains a good deal of the madness.
7. In her own thoughtful and moving reflections on the ethical dimensions of attention, Jasmine Wang cited the following observations from poet David Whyte:
“[T]he ultimate touchstone of friendship is not improvement, neither of the other nor of the self. The ultimate touchstone is witness, the privilege of having been seen by someone, and the equal privilege of being granted the sight of the essence of another, to have walked with them, and to have believed in them, and sometimes, just to have accompanied them, for however brief a span, on a journey impossible to accomplish alone.”
8. Perhaps the first modern theorist of distraction, the 17th-century polymath Blaise Pascal had a few things to say about diversions in his posthumously published Pensées:
“What people want is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.”
“Nothing could be more wretched than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.”
“The only thing that consoles us for our miseries is diversion. And yet it is the greatest of our miseries. For it is that above all which prevents us thinking about ourselves and leads us to destruction. But for that we should be bored, and boredom would drive us to seek some more solid means of escape, but diversion passes our time and brings us imperceptibly to our death.”
9. Pascal reminds us of something we ought not to forget, which is that there may be technology-independent reasons for why we crave distractions. Weil had a characteristically religious and even mystical take on this. “There is something in our soul,” she wrote, “that loathes true attention much more violently than flesh loathes fatigue. That something is much closer to evil than flesh is. That is why, every time we truly give our attention, we destroy some evil in ourselves.”
We should not, in other words, imagine that the ability to focus intently or to give one's sustained attention to some matter was the ordinary state of affairs before the arrival of digital technologies, or even of television before them. But this does not mean that new technologies are of no consequence. Quite the contrary. It is one thing to have a proclivity; it is another to have that proclivity and inhabit a material culture designed to exploit it, in a manner that is contrary to your self-interest and well-being.
10. Human beings have, of course, always lived in information-rich environments. Step into the woods, and you're surrounded by information and stimuli. But the nature of the information matters. Modern technological environments present us with an abundance of symbolically encoded information, which is often designed with a view to hijacking or soliciting our attention. Which is to say that our media environments aggressively beckon us in a way that an oak tree does not. The difference might be worth contemplating.
Natural, which is to say non-human, environments can suddenly demand our attention. At one point, Klein and Hari discuss a sudden thunderclap, which is one example of how this can happen. And I can remember once hearing the distinctive sound of a rattlesnake while hiking on a trail. In cases like these, the environment calls us decidedly to attention. It seems, though, that, ordinarily, non-human environments present themselves to us in a less demanding manner. They may beckon us, but they do not badger us or overwhelm our faculties in a manner that generates an experience of exhaustion or fatigue.
In a human-built environment rich with symbolically encoded information—a city block, for example, or a suburban strip mall—our attention is solicited in a more forceful manner. And the relevant technologies do not have to be very sophisticated to demand our attention in this way. Literate people are compelled to read texts when they appear before them. If you know how to read and an arrangement of letters appears before you, you can hardly help but read them if you notice them (and, of course, they can be designed so as to lure or assault your attention). By contrast, naturally encoded information, such as might be available to us when we attend to how a clump of trees has grown on a hillside or the shape a stream has cut in the landscape, does not necessarily impress itself upon us as significant in the literal sense of the word, as having meaning or indicating something to us. From this perspective, attention is bound up with forms of literacy.3 I cannot be hailed by signs I cannot recognize as such, as meaning something to me. So then, we might say that our attention is more readily elicited by that which presents itself as being somehow "for me," by that which, as Thomas de Zengotita has put it, flatters me by seeming to center the world on me.
If I may press into this distinction a bit further, the question of purpose or intent seems to matter a great deal, too. When I hike in the woods, there's a relative parity between my capacity to direct my attention, on the one hand, and the capacity of the world around me to suddenly demand it of me, on the other. I am better able to direct my attention as I desire, and to direct it in accord with my purpose. I will seek out what I need to know based on what I have set out to do. If I know how to read the signs well, I will seek those features of the landscape that can help me navigate to my destination, for example. But in media-rich human-built environments, my capacity to direct my attention in keeping with my purposes is often at odds with features of the environment that want to command my attention in keeping with purposes that are not my own. It is the difference between feeling challenged to rise to an occasion that ultimately yields an experience of competence and satisfaction, and feeling assaulted by an environment explicitly designed to thwart and exploit me.
11. Thomas de Zengotita, writing in Mediated: How the Media Shapes Your World and the Way You Live in It (2005):
“Say your car breaks down in the middle of nowhere—the middle of Saskatchewan, say. You have no radio, no cell phone, nothing to read, no gear to fiddle with. You just have to wait. Pretty soon you notice how everything around you just happens to be there. And it just happens to be there in this very precise but unfamiliar way […] Nothing here was designed to affect you. It isn’t arranged so that you can experience it, you didn’t plan to experience it, there isn’t any screen, there isn’t any display, there isn’t any entrance, no brochure, nothing special to look at, no dramatic scenery or wildlife, no tour guide, no campsites, no benches, no paths, no viewing platforms with natural-historical information posted under slanted Plexiglas lectern things—whatever is there is just there, and so are you […] So that’s a baseline for comparison. What it teaches us is this: in a mediated world, the opposite of real isn’t phony or illusional or fiction—it’s optional […] We are most free of mediation, we are most real, when we are at the disposal of accident and necessity. That’s when we are not being addressed. That’s when we go without the flattery intrinsic to representation.”
12. It's interesting to me that de Zengotita's baseline scenario would not play out quite the same way in a pre-modern cultural setting. He is presuming that nature is mute, meaningless, and literally insignificant. But—anthropologists, please correct me—this view would be at odds with most if not all traditional cultures. In the scenario de Zengotita describes, premodern people would not necessarily find themselves either alone or unaddressed, and I think this indirectly tells us something interesting about attention.
Attention discourse tends to treat attention chiefly as the power to focus mentally on a text or task, which is to say on what human beings do and what they make. Attention in this mode is directed toward what we intend to do. We might say that it is attention in the form of actively searching rather than receiving, and this makes sense if we don’t have an account of how attention as a form of openness might be rewarded by our experience in the world. Perhaps the point is that there’s a tight correlation between what I conceive of as meaningful and what I construe as a potential object of my attention. If, as Arendt for example has argued, in the modern world we only find meaning in what we make, then we will neglect forms of attention that presuppose the meaningfulness of the non-human world.
13. Robert Zaretsky on “Simone Weil’s Radical Conception of Attention”:
Weil argues that this activity has little to do with the sort of effort most of us make when we think we are paying attention. Rather than the contracting of our muscles, attention involves the canceling of our desires; by turning toward another, we turn away from our blinding and bulimic self. The suspension of our thought, Weil declares, leaves us “detached, empty, and ready to be penetrated by the object.” To attend means not to seek, but to wait; not to concentrate, but instead to dilate our minds. We do not gain insights, Weil claims, by going in search of them, but instead by waiting for them: “In every school exercise there is a special way of waiting upon truth, setting our hearts upon it, yet not allowing ourselves to go out in search of it… There is a way of waiting, when we are writing, for the right word to come of itself at the end of our pen, while we merely reject all inadequate words.” This is a supremely difficult stance to grasp. As Weil notes, “the capacity to give one’s attention to a sufferer is a very rare and difficult thing; it is almost a miracle; it is a miracle. Nearly all those who think they have this capacity do not possess it.”
14. As I see it, there is a critical question that tends to get lost in the current wave of attention discourse: What is attention for? Attention is taken up as a capacity that is being diminished by our technological environment, with the emphasis falling on digitally induced states of distraction. But what are we distracted from? If our attention were more robust or better ordered, to what would we give it? Pascal had an answer, and Weil did, too, it seems to me. I'm not so sure that we do, and I wonder whether that leaves us more susceptible to the attention economy. Often the problem seems to get framed as little more than the inability to read long, challenging texts. I enjoy reading long, challenging texts, and, like Carr and Hari, I do find that this has become more challenging. But I don't think reading long, challenging texts is essential to human flourishing, nor is it the most important end toward which our attention might be ordered.
We have, it seems, an opportunity to think a bit more deeply not only about the challenges our techno-social milieu presents to our capacity to attend to the world, challenges I suspect many of us feel keenly, but also about the good toward which our attention ought to be directed. What deserves our attention? What are the goods for the sake of which we ought to cultivate our capacity for attention?
On this score, attention discourse often strikes me as an instance of a larger pattern that characterizes modern society: a focus on means rather than ends. I’d say it also illustrates the fact that it is far easier to identify the failures and disorders of contemporary society than it is to identify the goods that we ought to be pursuing. In “Tradition and the Modern Age,” Hannah Arendt spoke of the “ominous silence that still answers us whenever we dare to ask, not ‘What are we fighting against’ but ‘What are we fighting for?’”
As I've suggested before, maybe the problem is not that our attention is a scarce resource in a society that excels in generating compelling distractions, but rather that we have a hard time knowing what to give our attention to at any given moment. That said, I would not want to discount the degree to which, for example, economic precarity also robs people of autonomy on this front. And I also appreciated Hari's discussion of how our attention is drained not only by the variegated media spectacle that envelops us throughout our waking hours, but also by other conditions, such as sleeplessness, that diminish the health of our bodies taken as a whole.
15. Hari seems convinced that the heart of the problem is the business model. It is the business model of the internet, driven by ad revenue, that pushes companies to design their digital tools for compulsive engagement. This is, I think, true enough. The business model has certainly exacerbated the problem. But I'm far less sanguine than Hari appears to be about whether changing the business model will adequately address the problem, much less solve it. When asked by Sean Illing about what would happen if internet companies moved to a different business model, Hari's responses were not altogether inspiring. He imagines that under alternative models, such as subscription-based services, companies would be incentivized to offer better products: "Facebook and other social media companies have to ask, 'What does Sean want?' Oh, Sean wants to be able to pay attention. Let's design our app not to maximally hack and invade his attention and ruin it, but to help him heal his attention." In my view, this overestimates the power of benevolent design and underestimates the internal forces that lead us to seek out distraction. Something must, at the end of the day, be asked of us, too.
16. Subtle shifts in language can sometimes have surprising consequences. The language of attention seems particularly loaded with economic and value-oriented metaphors, such as when we speak of paying attention or imagine our attention as a scarce resource we must either waste or hoard. However, to my ears, the related language of attending to the world does not carry these same connotations. Attention and attending are etymologically related to the Latin word attendere, which suggested, among other things, the idea of "stretching toward" something. I like this way of thinking about attention, not as a possession in limited supply, theoretically quantifiable, and ready to be exploited, but rather as a capacity to actively engage the world—to stretch ourselves toward it, to reach for it, to care for it, indeed, to tend it.
Hari and other critics of the attention economy are right to be concerned, and they are right about how our technological environment tends to have a corrosive effect on our attention. Right now, I’m inclined to put it this way: our dominant technologies excel at exploiting our attention while simultaneously eroding our capacity to attend to the world.
Klein and Illing, while both sympathetic to Hari’s concerns, expressed a certain skepticism about his proposals. That’s understandable. In this case, as in so many others, I don’t believe that policy tweaks, regulations, shifting economic models, or newer technologies built on the same assumptions will solve the most fundamental challenges posed by our technological milieu. Such measures may have their role to play, no doubt. But I would characterize these measures as grand but ultimately inadequate gestures that appeal to us exactly to the degree that they appear to require very little of us while promising to deliver swift, technical solutions. For my part, I think more modest and seemingly inadequate measures, like tending more carefully to our language and cultivating ways of speaking that bind us more closely to the world, will, in the admittedly long, very long run, prove more useful and more enduring.
Postscript: In his opening comments, Klein makes the following observation: “And the strangest thing to me, in retrospect, about the education I received growing up — the educations most of us receive — is how little attention they give to attention.”
Around 2014 or so, I began to think that one of my most important roles as a teacher was to help students think more deliberately about how they cultivated their attention. I was helped in thinking along these lines by a 2013 essay by Jennifer Roberts, “The Power of Patience.” In it, Roberts wrote the following:
During the past few years, I have begun to feel that I need to take a more active role in shaping the temporal experiences of the students in my courses; that in the process of designing a syllabus I need not only to select readings, choose topics, and organize the sequence of material, but also to engineer, in a conscientious and explicit way, the pace and tempo of the learning experiences. When will students work quickly? When slowly? When will they be expected to offer spontaneous responses, and when will they be expected to spend time in deeper contemplation?
I want to focus today on the slow end of this tempo spectrum, on creating opportunities for students to engage in deceleration, patience, and immersive attention. I would argue that these are the kind of practices that now most need to be actively engineered by faculty, because they simply are no longer available “in nature,” as it were. Every external pressure, social and technological, is pushing students in the other direction, toward immediacy, rapidity, and spontaneity—and against this other kind of opportunity. I want to give them the permission and the structures to slow down.
1. Some earlier posts on attention from me: "Spectrum of Attention" (2015), "Attention and the Moral Life" (2016), "Your Attention is Not a Resource" (2021).
2. Incidentally, one interesting 1883 paper cited by Crary, which gives you a flavor of the intellectual climate, has the title "Reaction Time and Attention in the Hypnotic State."
3. My use of the word literacy in this conventional way is itself interesting. Literacy, properly speaking, applies to the ability to decode the meaning of human writing. A literate person is a person who is able to read. But we use the term more broadly and analogically. To speak of media literacy, for example, is to suggest an ability to "read" all manner of media, whether written or not. Likewise, my use implies an ability to "read" the landscape. It is striking to note how deeply literacy impressed itself upon the minds of literate populations, so that all forms of meaning-seeking are understood by analogy to reading. It recalls Ivan Illich's claim in In the Vineyard of the Text that "The book has now ceased to be the root-metaphor of the age; the screen has taken its place. The alphabetic text has become but one of many modes of encoding something, now called 'the message.'"