During the COVID-19 outbreak, Jacobson v. Massachusetts became the fountainhead for pandemic jurisprudence. Courts relied on this 1905 precedent to resolve disputes over religious freedom, abortion, gun rights, voting rights, the right to travel, and much else. But Justice John Marshall Harlan’s decision was very narrow. It upheld the state’s power to impose a nominal fine on an unvaccinated person. No more, no less. Yet judges now follow a variant of Jacobson that is far removed from the Lochner-era decision. And the Supreme Court is largely to blame for these errors. Over the course of a century, four prominent Justices established the irrepressible myth of Jacobson v. Massachusetts. This myth has four levels.
The first level was layered in Buck v. Bell (1927). Justice Holmes recast Jacobson’s limited holding to support forcible intrusions on bodily autonomy. The Cambridge law did not involve forcible vaccination, but Holmes still used the case to uphold a compulsory sterilization regime. The second level was layered in 1963. In Sherbert v. Verner, Justice Brennan transformed Jacobson, a substantive due process case, into a free exercise case. And he suggested that the usual First Amendment jurisprudence would not apply during public health crises. The third level was layered in 1973. In Roe v. Wade, Justice Blackmun incorporated Jacobson into the Court’s modern substantive due process framework. Roe also inadvertently extended Jacobson yet further: during a health crisis, the state gains additional powers to restrict abortions. The fourth layer is of recent vintage. In South Bay United Pentecostal Church v. Newsom, Chief Justice Roberts’s “superprecedent” concurrence suggested that Jacobson-level deference was warranted for all pandemic-related constitutional challenges. This final layer of the myth, however, would be buried six months later in Roman Catholic Diocese of Brooklyn v. Cuomo. The per curiam decision followed traditional First Amendment doctrine and did not rely on Jacobson. But Jacobson stands ready to open an escape hatch from the Constitution during the next crisis. The Supreme Court should restore Jacobson to its original meaning, and permanently seal that escape hatch.
While the concept of the dignified constitution came out of a specific historical context, the twentieth century proved that Bagehot’s insights travel well. The British monarch’s “reserve powers” to dissolve Parliament and choose a prime minister are exercised mechanically on the advice of the government of the day, though in theory they remain a last check on political ambitions that might endanger constitutional arrangements. Longer-serving monarchs have an informal ability to “advise” and “warn” prime ministers out of the public eye.3 Other countries like Spain and Japan have relied on monarchy to symbolize continuity and bridge deep divides. The Spanish monarchy was refounded after a divisive civil war and four decades of Francisco Franco’s dictatorship.4 The king, for example, helped suppress a right-wing coup attempt in 1981 by donning military uniform to urge the troops to return to the barracks and respect the democratic constitutional settlement. Japan’s emperors were mere figureheads when the shoguns held “efficient” power, and after 1945 were designated a symbol of the nation. Deference to them persisted amid rapid postwar social change.5 In republics, too, the style of office reflects particular traditions: from the self-effacing German presidency to the majestic pomp surrounding the French president in his head of state role.
Many contemporary Westerners view law as did Roscoe Pound, who famously called it a tool of “social engineering”: something the community uses both to reflect itself and to change itself to achieve certain results. Both the wider Western legal tradition and Confucius’s notion of li help us see that one cannot simply coerce social change by commanding substantive ends in positive law. Rather, human law can facilitate social change by rewarding, punishing, or even simply valuing certain actions, thereby communicating the inherent value of those actions. Law does not so much dictate values as habituate them by encouraging their practice.
More importantly, Western legal theory and Confucianism encourage us to ask not how to use the law to create a better society, but what society’s current laws are already communicating and how they might need to change. Do they serve the higher standard by which human law should be judged, whether one calls it the law of nature or right principles? For the final goal of both law and li is less without than within: that we order ourselves according to the higher order on which every human society and person depends.
What all these solutions are circling around but never quite name is the irreducible non-identity at the heart of the human person. Our identity-instability points to this metaphysical aporia, namely, that we are, as Erich Przywara puts it, marked by “the illimitable openness of the movement of becoming.” It is, in other words, “a creaturely principle” that we are non-identical. We are, and yet we change. Further, we are, and we appear, and these two aspects of ourselves are not simply identical. Non-identity fractures the person. These aporias are one reason why post-moderns chose simply to embrace this non-identity in a literal way, à la Foucault: we do not have a face.
For all that, however, we are also marked by identity. This is the value in the primary, logical meaning of identity. The metaphysical bases for this identity are multiple. First, human beings are substances, that is, individual beings that exist in their own right. Substances “stand under” (as the term literally states) the features that mark our lives, namely, the qualities, relations, and locations that can come and go (these are also called metaphysical “accidents”). I may undergo dementia and no longer remember my family and friends, but I would still be the same human being who once remembered and now does not.
Second, as persons, human beings are a particular kind of substance: we have a rational and embodied nature. This nature does not change as I change; I am still the kind of thing that I was as a girl. Through all my non-identity, that is, through all my changes, I am still as human as ever and the same person as ever.
Third, individual human persons have souls, which are intimately related to our changing bodies. A human soul, as the form of a living body, organizes its material flux (Locke’s “constantly fleeting Particles of Matter”) around a unifying and governing center. My body, despite all its change, is still my body, because my soul ensures its continuity.
For all that, human beings do not find this metaphysical identity, which gives us a perduring structure underlying change, to be enough. As both Balthasar and John Paul II point out, this continuity is necessary but still not sufficient to answer the question “Who am I?” How can we account for each person’s irreducible uniqueness, which sets me apart from all other human persons?
At the heart of the chaos and fragmentation that seem to characterize modern society lies a notion of the self resting upon deep, often unnoticed, philosophical assumptions. These assumptions shape not only how we think of ourselves as individuals in relationship to others but also how society as a whole thinks of itself, how it frames its moral discourse, and how it decides who does and who does not truly count. To justify these claims, it is helpful to revisit a point made by the German philosopher G. W. F. Hegel.
Hegel begins his famous section in the Phenomenology of Spirit on lordship and bondage with the following statement:
Self-consciousness is in and for itself, when, and by the fact that, it is in and for itself for another self-consciousness; that is, it is only as something recognized.
The point Hegel is making is important: selfhood is a dialogue, even a dialectic, between self-consciousnesses. I may intuitively think of myself as defining who I am but in fact my identity, or sense of selfhood, is the result of my interaction with my environment, specifically with other self-consciousnesses. This process Hegel characterizes as “recognition.” This is not recognition in the simple, commonsense manner in which a friend might call to me across the street as she recognizes my face. Rather, it is a more significant sense whereby I am ascribed legitimacy and value by another and, therefore, in relation to that other. A good illustration of this is provided by the creation of Eve in Genesis 2. Upon seeing her, Adam declares that she is “bone of my bone and flesh of my flesh.” He has clearly recognized her as different from all other creatures, possessing an affinity with himself that he shares with no other. We might say that Adam truly comes to know himself at that point precisely because he knows (recognizes) Eve.
Hegel’s notion of recognition is important not simply because it exposes the falsity of our intuitive sense that each of us is sovereign over our own selfhood. It is also important because it highlights the fact that our sense of selfhood stands at the nexus of freedom and belonging. The desire to be free—indeed, the intuitive feeling that I am, or at least should be, free—is a fundamental part of what it means to be human. Unlike other animals, we are intentional beings. The beaver builds a dam instinctively; humans build dams intentionally. There is indeed some truth to the idea that for us existence precedes essence. I could have chosen numerous careers, but I chose to be a teacher. I could have remained single, but I chose to marry. And yet freedom is not all there is to being human. We also want to belong, to be recognized. Everything from the language I speak to the way I dress is a means by which I am located in, and belong to, a wider society. We might say that the vocabulary, grammar, and syntax of recognition are not set by me but by the world into which I am born and by which I need to be recognized. And this arguably involves a sacrifice, or modification, of my freedom with reference to social rules and conventions in order that I might belong to (be recognized by) that society.
Starting in November 2017, on the heels of the devastation of Hurricane Irma, photographer Anna Barry-Jester and I set out to conduct a long-term lyric investigation, in verse and photography, into architecture, urban landscapes, and global warming in the city of Miami, Florida. We hoped to record the ways in which rising water and extreme weather continue to alter the built environment and human geography of the city.
When it comes to the economic repercussions of storm surges, flooding, and sea-level rise, 2020 modeling shows that Miami is the most vulnerable major coastal city in the world, with $400 billion in assets at risk by 2040. According to the 2018 US National Climate Assessment, global average sea levels will likely rise another one to four feet by the end of this century, which would put large areas of the city under water. Miami-Dade County itself relies on projections that place sea levels approximately two feet higher by 2059 and rising beyond that. The consequences are already apparent: frequent flooding in coastal and inland areas, saltwater intruding into the drinking-water supply, and increasingly unusable roads and septic systems. And although its nickname is the Magic City, Miami can’t make the water disappear.
What’s more, a flickering flame in the cave may have conjured impressions of motion, like a strobe light in a dark club. In low light, human vision degrades, and that can lead to the perception of movement even when all is still, says Susana Martinez-Conde, the director of the Laboratory of Visual Neuroscience at the Barrow Neurological Institute in Phoenix, Ariz. The trick may occur at two levels: one when the eye processes a dimly lit scene, and another when the brain makes sense of that limited, flickering information.
Physiologically, our eyes undergo a switch when we slip into darkness. In bright light, the eyes rely primarily on color-sensitive cells in our retinas called cones, but in low light the cones don’t have enough photons to work with, and cells that sense black-and-white gradients, called rods, take over. That’s why in low light colors fade, shadows become harder to distinguish from actual objects, and the soft boundaries between things disappear. Images straight ahead of us look out of focus, as if seen in our peripheral vision. The end result for early humans who viewed cave paintings by firelight might have been that a deer with multiple heads, for example, resembled a single, animated beast. A few rather sophisticated artistic techniques enhance that impression. One is found beyond the Hall of Bulls, where the cave narrows into a long passage called the Nave.