At the heart of the chaos and fragmentation that seem to characterize modern society lies a notion of the self resting upon deep, often unnoticed, philosophical assumptions. These assumptions shape not only how we think of ourselves as individuals in relation to others but also how society as a whole thinks of itself, how it frames its moral discourse, and how it decides who does and who does not truly count. To justify these claims, it is helpful to revisit a point made by the German philosopher G. W. F. Hegel.
Hegel begins his famous section in the Phenomenology of Spirit on lordship and bondage with the following statement:
Self-consciousness is in and for itself, when, and by the fact that, it is in and for itself for another self-consciousness; that is, it is only as something recognized.
The point Hegel is making is important: selfhood is a dialogue, even a dialectic, between self-consciousnesses. I may intuitively think of myself as defining who I am but in fact my identity, or sense of selfhood, is the result of my interaction with my environment, specifically with other self-consciousnesses. This process Hegel characterizes as “recognition.” This is not recognition in the simple, commonsense manner in which a friend might call to me across the street as she recognizes my face. Rather, it is a more significant sense whereby I am ascribed legitimacy and value by another and, therefore, in relation to that other. A good illustration of this is provided by the creation of Eve in Genesis 2. Upon seeing her, Adam declares that she is “bone of my bone and flesh of my flesh.” He has clearly recognized her as different from all other creatures, possessing an affinity with himself that he shares with no other. We might say that Adam truly comes to know himself at that point precisely because he knows (recognizes) Eve.
Hegel’s notion of recognition is important not simply because it exposes the falsity of our intuitive sense that each of us is sovereign over our own selfhood. It is also important because it highlights the fact that our sense of selfhood stands at the nexus of freedom and belonging. The desire to be free—indeed, the intuitive feeling that I am, or at least should be, free—is a fundamental part of what it means to be human. Unlike other animals, we are intentional beings. The beaver builds a dam instinctively; humans build dams intentionally. There is indeed some truth to the idea that for us existence precedes essence. I could have chosen numerous careers, but I chose to be a teacher. I could have remained single, but I chose to marry. And yet freedom is not all there is to being human. We also want to belong, to be recognized. Everything from the language I speak to the way I dress is a means by which I am located in, and belong to, a wider society. We might say that the vocabulary, grammar, and syntax of recognition are not set by me but by the world into which I am born and by which I need to be recognized. And this arguably involves a sacrifice, or modification, of my freedom with reference to social rules and conventions in order that I might belong to (be recognized by) that society.
Starting in November 2017, on the heels of the devastation of Hurricane Irma, photographer Anna Barry-Jester and I set out to conduct a long-term lyric investigation, in verse and photography, into architecture, urban landscapes, and global warming in the city of Miami, Florida. We hoped to record the ways in which rising water and extreme weather continue to alter the built environment and human geography of the city.
When it comes to the economic repercussions of storm surges, flooding, and sea-level rise, 2020 modeling shows that Miami is the most vulnerable major coastal city in the world, with $400 billion in assets at risk by 2040. According to the 2018 US National Climate Assessment, global average sea levels will likely rise another one to four feet by the end of this century, which would put large areas of the city under water. Miami-Dade County itself relies on projections that place sea levels approximately two feet higher by 2059, with further rise expected beyond that. The consequences are already apparent: frequent flooding in coastal and inland areas, saltwater intruding into the drinking-water supply, and increasingly unusable roads and septic systems. And although its nickname is the Magic City, Miami can’t make the water disappear.
What’s more, a flickering flame in the cave may have conjured impressions of motion like a strobe light in a dark club. In low light, human vision degrades, and that can lead to the perception of movement even when all is still, says Susana Martinez-Conde, the director of the Laboratory of Visual Neuroscience at the Barrow Neurological Institute in Phoenix, Ariz. The trick may occur at two levels: first when the eye processes a dimly lit scene, and second when the brain makes sense of that limited, flickering information.
Physiologically, our eyes undergo a switch when we slip into darkness. In bright light, eyes primarily rely on the color-sensitive cells in our retinas called cones, but in low light the cones don’t have enough photons to work with and cells that sense black and white gradients, called rods, take over. That’s why in low light, colors fade, shadows become harder to distinguish from actual objects, and the soft boundaries between things disappear. Images straight ahead of us look out of focus, as if they were seen in our peripheral vision. The end result for early humans who viewed cave paintings by firelight might have been that a deer with multiple heads, for example, resembled a single, animated beast. A few rather sophisticated artistic techniques enhance that impression. One is found beyond the Hall of Bulls, where the cave narrows into a long passage called the Nave.
Modernity’s monsters, as evoked by Stoker and Eliot, express quintessentially modern horrors. Is there no end to the fluidity of my self? Is there no eternity to my body beyond more of the same, its feeding and consuming? Is there any interior core to my person, or am I a walking stuffed effigy? These terrors can find no relief from scientific treatises on blood or even from the catharsis of a horror movie.
One of the horrors of the zombie apocalypse is its inevitability, given how easily zombies conquer. One bite, and a person is reduced to the basest instincts of mindless and destructive feeding: small wonder that many zombie works end on ambiguous or outright despairing notes, as the protagonist joins the undead. This new post-modern apocalyptic vision recasts the resurrection from the dead as a nightmarish scenario in which we become the living dead. What hope can there be?
Yet T. S. Eliot provides a glimpse of hope at the end of “The Waste Land.” In a few lines near the very end of the poem, London Bridge, that site of the zombie hordes of the Unreal City, returns in an ambiguous way. Eliot evokes a destruction that is perhaps a renewal providing relief: “London Bridge is falling down falling down falling down / Poi s’ascose nel foco che gli affina” (426–27). The second line, Eliot’s notes tell us, is from Dante’s Purgatorio: “Then he hid himself in the fire that refines him.”
Let me be mean for a second. What’s the real difference between conspiracists and a popularized, that is a teachable version of social critique inspired by a too quick reading of, let’s say, a sociologist as eminent as Pierre Bourdieu (to be polite I will stick with the French field commanders)? In both cases, you have to learn to become suspicious of everything people say because of course we all know that they live in the thralls of a complete illusio of their real motives. Then, after disbelief has struck and an explanation is requested for what is really going on, in both cases again it is the same appeal to powerful agents hidden in the dark acting always consistently, continuously, relentlessly. Of course, we in the academy like to use more elevated causes—society, discourse, knowledge-slash-power, fields of forces, empires, capitalism—while conspiracists like to portray a miserable bunch of greedy people with dark intents, but I find something troublingly similar in the structure of the explanation, in the first movement of disbelief and, then, in the wheeling of causal explanations coming out of the deep dark below. What if explanations resorting automatically to power, society, discourse had outlived their usefulness and deteriorated to the point of now feeding the most gullible sort of critique? Maybe I am taking conspiracy theories too seriously, but it worries me to detect, in those mad mixtures of knee-jerk disbelief, punctilious demands for proofs, and free use of powerful explanation from the social neverland many of the weapons of social critique. Of course conspiracy theories are an absurd deformation of our own arguments, but, like weapons smuggled through a fuzzy border to the wrong party, these are our weapons nonetheless. In spite of all the deformations, it is easy to recognize, still burnt in the steel, our trademark: Made in Criticalland. Do you see why I am worried? Threats might have changed so much that we might still be directing all our arsenal east or west while the enemy has now moved to a very different place. After all, masses of atomic missiles are transformed into a huge pile of junk once the question becomes how to defend against militants armed with box cutters or dirty bombs. Why would it not be the same with our critical arsenal, with the neutron bombs of deconstruction, with the missiles of discourse analysis? Or maybe it is that critique has been miniaturized like computers have. I have always fancied that what took great effort, occupied huge rooms, cost a lot of sweat and money, for people like Nietzsche and Benjamin, can be had for nothing, much like the supercomputers of the 1950s, which used to fill large halls and expend a vast amount of electricity and heat, but now are accessible for a dime and no bigger than a fingernail. As the recent advertisement of a Hollywood film proclaimed, “Everything is suspect . . . Everyone is for sale . . . And nothing is what it seems.”
Once an esoteric theoretical stance, the basic premises of the “school of suspicion” have now become commonplace, shared as they are across a range of ideological groupings and subcultures. In his 2004 essay “Why Has Critique Run Out of Steam?,” the French sociologist of science Bruno Latour asked what the popularization of suspicion means for the dominant modern intellectual project of social critique that arose out of the work of figures like Marx, Nietzsche, and Freud. As Latour puts it, social critique had been dedicated to combating “ideological arguments posturing as matters of fact.” However, he wonders whether the greater problem is now “an excessive distrust of good matters of fact,” a suspicion that all claims conceal “bad ideological biases.” His central example is climate science, whose typically right-wing critics allege that its supposed objectivity conceals particular interests — an argument not unlike ones made by left-leaning academic social critics.
Latour accordingly asks if there is a “real difference between conspiracists and a popularized … version of social critique.” After all,
in both cases, you have to learn to become suspicious of everything people say because of course we all know that they live in the thralls of a complete illusio of their real motives. Then, after disbelief has struck, and an explanation is requested for what is really going on, in both cases again it is the same appeal to powerful agents hidden in the dark acting always consistently, continuously, relentlessly. Of course, we in the academy like to use more elevated causes — society, discourse, knowledge-slash-power, fields of forces, empires, capitalism — while conspiracists like to portray a miserable bunch of greedy people with dark intents, but I find something troublingly similar in the structure of the explanation.
All of this suggests that Latour’s title question, asking whether social critique has run out of steam, is partially misleading. Critique, in his account, has become vernacularized, and in the process its operations have not run out of steam but rather accelerated beyond the academy’s control. As he notes, for any major news story, “the smoke of the event has not yet finished settling before dozens of conspiracy theories begin revising the official account.”
He compares the work of the lone intellectual iconoclasts of the past — think of Marx, Nietzsche, and Freud — to the 1950s supercomputers operated by technical experts. Today, by contrast, average people can perform the operations of radical critique as easily as they can use the miniaturized computers in their pockets. Latour thus wonders whether his concern is just a “patrician spite for the popularization of critique.” But the condescension can also run in the other direction: As he remarks, “my neighbor in the little Bourbonnais village where I live looks down on me as someone hopelessly naïve” for accepting the U.S. government’s account of the 9/11 attacks.
On the other hand, Latour was right to see that traditional critique was running out of steam in academic circles — in part just because it had been vernacularized, as the rise of the “red pill” metaphor suggests. For instance, an influential new approach that emerged in the early 2000s under the name “postcritique” pushed back against the instinctively suspicious sensibility that had long dominated many academic fields. Perhaps, like a luxury product that loses its allure from an abundance of cheap imitations, the intellectual capital to be accrued by demystifying and debunking and unmasking declined once it was embraced by the masses.
This article investigates waymaking, the use of language to dedicate space to the traffic of animals, goods, fuel, waste, and people. It argues that the rhetorical creation of traversable clearances anticipates and services the formation of infrastructure. Through a close reading of Daniel Defoe’s A Journal of the Plague Year (1722), I show how literary critics can analyze the words that create the emptiness that allows conduits to happen and claim this emptiness as an analytical object in itself. By tracing modern conceptions of infrastructure as assemblage, occasion, and patterning to the fray of early modern waymaking, I claim that criticism can supplement social-scientific research by casting as poetic event and autopoietic phenomenon the human practice of reserving space for utilities.
In the late 1940s and the early 1950s, Burnham was viewed as a maverick member of the trans-Atlantic anti-communist left. To use the contemporary term, he was “canceled” by the left-leaning intellectual establishment when he defended Sen. Joseph McCarthy’s demagogic hearings about real and imaginary Soviet subversion in the United States. He joined the young William F. Buckley Jr. in founding National Review in 1955, where he remained an editor until his death in 1987, a few years after being awarded the Presidential Medal of Freedom by Ronald Reagan.
One might expect that the Cold War liberals and anti-communist socialists who became known as “neoconservatives” in the 1970s and 1980s would have embraced Burnham’s proposed revision of Marxism by managerial theory. After all, his left-to-right journey had blazed the trail that many of them had taken. Ironically, however, like good orthodox Marxists, the neoconservatives rejected the idea of the managerial revolution as a transition from one kind of society to another. For both Daniel Bell, a self-described democratic socialist, and Irving Kristol, co-founders of The Public Interest, small owner-operated businesses and globe-straddling multinational firms alike could be described without qualification as “capitalist” (a term for which Kristol often used “bourgeois” as a synonym).
Bell, who described himself as a socialist in economics, a liberal in politics, and a conservative in culture, was influenced by the economist Joseph Schumpeter, who argued in Capitalism, Socialism and Democracy (1942) that the cultural hedonism unleashed by consumerism would dig capitalism’s own grave and lead to a socialist society. For his part, Kristol rejected the idea of a broad managerial elite and instead appropriated the term “the new class,” coined by Đilas, and later associated with the work of the sociologist Alvin Gouldner, to refer to academics, journalists, and nonprofit activists whose left-wing “adversary culture” made them hostile to “bourgeois” capitalism and traditional “bourgeois” values alike. (Gouldner objected to this use of the term “new class.”)
In January 1978, the 72-year-old Burnham joined the debate with a National Review essay titled “What New Class?,” a review of Alfred Chandler’s study of the rise of managerial capitalism, The Visible Hand. Burnham mocked the neocons for arguing that the “new class” of “intellectuals, verbalists, media types” could be compared in power and influence to “the managers of ITT, GM, or IBM, or the administrator-managers of the great governmental bureaus and agencies.”
Implausible though it was, the “new class” theory allowed neoconservative apparatchiks and other members of what has been called Conservatism Inc. to seek grants from rich libertarian donors and pro-business foundations by promising to prevent the allegedly imminent overthrow of capitalism by left-wing professors and nonprofit activists belonging to “the adversary culture.” Needless to say, Burnham’s view that corporate managers and bank executives are secure at the apex of the potentially oppressive managerial elite was not useful to conservatives in fundraising campaigns.
What is worse, the meaning of performative in contemporary parlance, while not very precise, is almost exactly the opposite of the word’s original meaning. When journalists refer to former president Trump as a “performative” figure, or accuse celebrity activists of “performative wokeness,” all they are saying, in that absurdly pretentious way, is that “it’s all for show.” Something “performative” is a mere performance, an act of theatricality, a tableau of artifice behind which there is nothing, or at least nothing substantial or authentic. Dare one point out that there are better ways—more vivid, precise, effective, and cogent ways—to express the same idea, without resorting to the wooden abstraction of this Latinate word?
But more to the point, and in defense of performative, it is a technical academic word that was invented to serve a particular purpose. The British philosopher J.L. Austin (1911–60) was an influential exponent of the view that our use of language must in some instances be understood as a form of action, and not merely as a system of signifiers that record and order the structure of reality. His most famous work, How to Do Things with Words (1955), is the locus classicus for the understanding of what he called a “performative utterance,” and he would go on to label such utterances “speech acts,” uses of language that are not describing something—indeed, are not even susceptible of being judged true or false, real or artificial—but doing something.