"Ideology grows stronger for our belief in a lie: that information has an additive property whereby at some point it becomes knowledge. This simply isn’t true. Outside the contextual frameworks that give information a place in life and a relationship to other information, it is quite literally meaningless. Would more state-issued facts about the Soviet economy in 1980, or more pages of talking points from an industry lobby, get you closer to the truth simply for not being untrue? Does knowing more trivia help someone build a better car or advance particle physics or write a more touching ballad? If we judge the “informed” as those who possess more information—more disembodied or decontextualized bits of trivia that are “true” in the sense of not being demonstrably false—we may find we have created a vacuous category (“conversance”) and that we need invented contexts, like the proliferating “news quizzes,” to put these incoherent facts to use. “Think You’re Smarter Than a Slate Senior Editor? Find Out With This Week’s News Quiz,” Slate suggests. “Did you stay up to date this week?” the New York Times’ news quiz more gently wonders. It’s only one step further to propose the news business itself and the practice of journalism as the proper object of the news connoisseur’s attention and interest. Asking such people’s opinions in polls, then, may do less to draw out “informed” commentary than to hold up a mirror to the culture’s own confusion.
“Truth” and “fact” in isolation do nothing to combat ideology and error. It merely benefits the news industry to pretend they do. I understand why people object to false equivalencies between MSNBC and Fox News, but to focus on veracity blinds us to the deeper effect of opinion and punditry per se. The pertinent question concerns the terms of the implicit contract between audience and commentator. If commentators serve the sensibility of their audiences—which the necessity of attracting and retaining viewers (or listeners or readers) in a competitive media environment ensures—it hardly matters that they traffic in fact or avoid untruth since the overall message people receive is: Your worldview is substantially right, and here are the arguments to insulate and fortify it. The purpose is to justify ideological frameworks as a way of dealing with uncertainty and to reinforce the complex social agreements on which these consensuses are built. When Fox News anchor Shepard Smith debunked what conspiracy theorists had dubbed the Hillary Clinton “uranium scandal” in 2017, his audience did not thank him for elucidating the truth, but suggested he belonged on CNN or MSNBC and that, for exposing a false story, he was anti-Trump. In other words, he had violated the terms of their contract, which was not to provide fact or best judgment but corroboration. Truth was welcome, but only truth that confirmed one view.
Thus while ideology and entertainment may seem at odds—entertainment is reputedly fun and lighthearted, where ideology is deadly serious—they are in fact flip sides of the same coin. Entertainment means to transfix, to keep you in place: watching, tuned in. It cannot ask you to endure discomfort, and the comfort it offers is often an uncomplicated intimacy, even a vicarious identification, with a celebrity—in the case of news, with the commentator or host. Because this person’s primary concern is your comfort—which is to say your attention and approval—a subtle con exists at the heart of the exchange. This person does not know who you are or, in any but the most superficial sense, care about you. But the illusion of a relationship is nonetheless paramount. It goes one step further, since part of the illusion, in the face of political confusion and distress, is that the news celebrity’s competence and clarity are your own. Her power is briefly yours, and while you inhabit the aura of her expertise you are safe from your own ignorance and the frustration of life among other people. The most fervent devotees of a cult or demagogue are those who mistake courtship for love and the power of a leader for their own. But when you step outside the aegis of a leader’s power, the aura of a pundit’s companionship, you realize, suddenly, that you are alone and unprepared. You were misled into thinking you were getting help when you were giving worship. Ideology takes root in this disappointment because the alternative is more painful: accepting that you’ve been conned."
Foucault’s rejection of the concept of justice led him to espouse several other views that were out of step with the left-wing consensus. For example, Foucault was skeptical of the notion of a “universal desire—much less a ‘right’—to any social good, including health care.” Daniel Zamora has argued that Foucault abandoned collective revolution entirely, focusing instead on “a wide array of micro-powers that operated at the level of sexual relations, schooling, family structures, expertise, science,” and everyday nuances in “social and cultural organization.” Foucault reduced revolution to a question of individual lifestyles—a trend we can see reflected today in the preoccupation with microaggressions, the need to oppose patriarchy in the family structure, and the idea that using nonbinary pronouns is a revolutionary action. Foucault transformed revolution into a lifestyle of inclusion.
With his emphasis on lifestyle and individual choices, Foucault laid the foundation for the culture of the wokescold, which replaced the Chomskyite vision of a collective, class-driven left, focused on structural change. Foucault’s postmodern critique was not intrinsically married to any particular system of human organization. Foucault opposed Marxism for most of his life and his thought is consistent with neoliberal capitalism in striking ways.
Foucault’s revolutionary critique was largely concerned with social issues, particularly sexuality and mental health. However, there is nothing inherently revolutionary about a libertine social politics. The rejection of traditional childrearing practices, for example, is a boon to corporations, who would prefer not to have to give employees time off for parenting.
Foucault is insistent that the bourgeois mode of existence is wrong, and he is in favor of revolutionary thought, yet what happens when the systems of knowledge and power in a society reject the bourgeois mode of living? When the heteronormative patriarchy that has traditionally excluded all other identities is itself excluded, what is a postmodern thinker to do? Where should he focus his critique of systems of knowledge and power?
But in 1990, the NIMH suddenly and radically switched course, embarking on what it tellingly named the ‘Decade of the Brain’. Ever since, the NIMH has increasingly narrowed its focus almost exclusively to brain biology – leaving out everything else that makes us human, both in sickness and in health. Having largely lost interest in the plight of real people, the NIMH could now more accurately be renamed the ‘National Institute of Brain Research’.
This misplaced reductionism arose from the availability of spectacular research tools (eg, the Human Genome Project, functional magnetic resonance imaging, molecular biology and machine learning) combined with the naive belief that brain biology could eventually explain all aspects of mental functioning. The results have been a grand intellectual adventure, but a colossal clinical flop. We have acquired a fantastic window into gene and brain functioning, but little to help clinical practice.
What the millennial aesthetic sells, it sells through the promise of novelty. This is true even when the product on offer is not appreciably novel: cat food, Dutch ovens, and generic drugs are repackaged, redesigned, as if millennial buyers required a version all their own. Jessica Walsh, a graphic designer and founder of the creative agency &Walsh, dates the style to the last five years and sees its expiration date approaching already. “Everyone wants to look like the Casper, Warby Parker, or Aways of the world,” she explains, which has made branding increasingly interchangeable. “People are tired of the sameness and already craving something new.”
When the time comes — when smooth pastels start to feel a little tacky, when brown starts looking good again — what will be saved? As in any era, most of our belongings will be lost, but fewer than ever seem worth trying to preserve. In her article “Why Does This One Couch From West Elm Suck So Much?,” author Anna Hezel asks employees in a West Elm store how long that “Peggy” couch was, ideally, supposed to last. One to three years, they inform her.
Last year, the interior-design start-up Homepolish collapsed; last month, Casper made its disappointing IPO; last week, Outdoor Voices CEO Tyler Haney stepped down amid reports that her company, based on tastefully colored leggings, was losing cash. Design created an astonishing amount of value in the last ten years, and increasingly that value looks ephemeral. I remember visiting WeWork corporate offices in early 2016 and telling a friend that the space already felt period — larded and spackled with efforts to look designed ca. 2016, which now sounds like a very long time ago. Of course, I can also look around my apartment and see what threatens to wilt: boob poster, pink blanket, plants. We have lived through a moment in which design came to seem like something besides what it was, like a business model or a virtue or a consolation prize. The sense of safety promised in its soft, clean forms begins to look less optimistic than naïve.
The center of this conviviality will be our (online) Workshops, where the Edenic-minded will be able to engage with the Utopians (or rather, in our term, Eutopians) in order to think well about a new society already emerging.
And what do we talk about in these (Zoom-based) Workshops? Here are some ideas:
Certain basic things that important novelists do, Houellebecq does not. Great novels usually concern the relationships, institutions, and ideals out of which the “bourgeois” social order is knit together—marriages, schools, jobs, piety, patriotism. But in our time, relationships fail to take root. Institutions fall apart. The visible social order seems not to be the real one. Many novelists limit their vision to those narrow precincts where the world still makes sense (or can be made to make sense) in the way it did to Balzac or Flaubert. Often these are contexts in which a set of rules has been bureaucratically imposed, or grandfathered in: a SEAL team in bestselling fiction, a university literature department in more arty work. Houellebecq is up to something different. He places his characters in front of specific, vivid, contemporary challenges, often humiliating and often mediated by technology: Internet pornography, genetic research, terrorism, prescription drug addiction. This technological mediation can make his characters seem isolated, and yet it is an isolation with which any contemporary can at least empathize. The Outsider is Everyman. Houellebecq’s reputation as a visionary rests on his depiction of what we have instead of the old bourgeois social order.
Phil Christman, at it again:
Some bad movies, for example, reveal through sheer lack of self-awareness the incoherencies and solecisms of the culture that produces them. These sorts of movies fascinate me in the way a too-honest idiot does, after he’s had three or four drinks. Red Dawn (1984) is notoriously enjoyable in this way. More recently, Ava DuVernay’s adaptation of A Wrinkle in Time exemplifies this sort of badness. (DuVernay has done excellent work before and after this film, particularly in her studies of the prison-industrial complex, 13th and When They See Us, so I attribute the rich and extravagant lousiness of A Wrinkle in Time to its screenwriters, and to its rumored short production schedule.) Like many memorably bad movies, A Wrinkle in Time is full of moments that, if done in a self-aware spirit, would constitute unanswerable satire. Passing this test are the scenes in which Oprah Winfrey, as magical Mrs. Which, stands two stories tall, shimmering like a hologram and smiling benignantly upon heroic young Meg Murry. What a brilliantly sly commentary this almost is on Oprah’s odd place in American life, how she patronizes us from her billionaire height while remaining trapped in the thankless, dehumanizing role of white femininity’s wise, bodiless, never-quite-real cosmic black friend. (More purely absurd is the moment when Reese Witherspoon transforms into a flying lettuce.)
Where the film does its greatest service, however—and where it provides a glimpse into American culture that is so dark, so total, that its badness lingers in the mind like greatness—is in the way it foregrounds the unconscious nihilism of the American worship of self-esteem. At one point, Mrs. Which says to Meg, “Do you realize how many events, choices, that had to occur since the birth of the universe leading up to the making of you? Just exactly the way you are.” She could say the same thing to Charles Manson, or to Henry Kissinger, or to a leaf blower, and be equally correct. (Manson no doubt would enthusiastically agree: What an unlikely path the universe took on its way to producing his uniquely authoritative self.) Soon after, Mrs. Which introduces Meg to a faun played by Zach Galifianakis. His job is to help Meg—who I must stress is a child—rescue her father—who I must stress is an astrophysicist, trapped at the other end of the universe, by an all-but-omnipotent evil intelligence. Galifianakis tells her that she can already do it: She’s simply choosing not to.
If you don't already know Chris Morgan's blog, Black Ribbon Award, you should check it out:
The true character of our lives comes from the circumstances of our death. A good life, we hope, better entails a good death, well cared-for and relatively light on suffering but which in any case ends with a formal burial in a place where loved ones can and will remember you. This is not always the case, as The Ring’s antagonist Samara Morgan (no relation) can tell you, having been unsuccessfully murdered by her mother and left to starve at the bottom of a well. The film from her perspective is about the transference and perpetuation of pain. From her victim’s perspective it is more complex. Being in Samara’s control for any amount of time is not ideal, yet underneath the control, the fear, and the pain she wants you to feel is a kind of mercy. Though you are on your own about the cure, Samara is remarkably straightforward about her process and intentions. You have x-amount of time before I do y, because of q-reasons that I have esoterically given. This is more than Samara got, this is more than most people get.
The Ring goes farther than most horror films in depicting the myth of the meaningful death. For all the trouble Samara puts her victims through, their lives are still made instrumental as part of a larger plan. The victims, moreover, are given options as to that instrumentality. They can aid in its spread and survive or they can put a stop to it and die. The choice is easier to consider in the abstract, and much harder with the similar options real life sometimes gives us. But this mode, with its countdown and the possibility (fleeting though it would be in the digital era) of moral victory, is still better than the more possible outcomes of reality. I think about this when I consider all the dystopian options my future has to offer, stemming as much from my own poor judgment as the uncontrollable downward drift of the times in which I am stuck. The best-case scenario being a quiet fade-out in some dingy corner of an institution for human odds n’ ends, hopefully discovered in a reasonable amount of time, followed by a group cremation and a trip to the nearest Staten Island landfill.
That is an unusual line to take given that horror is often accused of gratuitous dispensation of bodies. True enough, there are no martyrs in horror, but everyone stuck in a horrific world, for good or bad or for whatever, plays their role and does not go unappreciated in one way or another.