Getting Down to "Essentials"

The Supreme Court handed down its decision in Roman Catholic Diocese of Brooklyn v. Cuomo (consolidated with Agudath Israel v. Cuomo) on the eve of Thanksgiving, a traditional time for government bodies to announce actions that they would rather no one attended to. Obviously, that was not the Court’s expectation or intent. If I may add to the cacophony of praise, condemnation and analysis, here are the thoughts that came to my mind after reading the Justices’ opinions, of which there are six: an unsigned per curiam majority opinion, two concurring opinions (by Justices Gorsuch and Kavanaugh), a wishy-washy dissent by Chief Justice Roberts, and more vigorous ones by Justices Breyer and Sotomayor. It’s not obvious why all six were necessary. This may be an instance of the old lament, “I wrote a long letter, because I didn’t have time to write a short one.”

1. The nub of the disagreement between the majority and the full-throated dissenters is their different views of the role of religion in life. To the majority, the ability to practice one’s faith is by definition “essential” – not an absolute that overrides all other considerations, but something to be limited only under exigent circumstances. “At a minimum, [the First] Amendment prohibits government officials from treating religious exercises worse than comparable secular activities, unless they are pursuing a compelling interest and using the least restrictive means available.” (Gorsuch, J., concurring)

The dissenters (other than the Chief Justice, who conceded that “it may well be that such restrictions violate the Free Exercise Clause” but opined that a decision could be put off till another day) took a more relaxed view: that “state officials seeking to control the spread of COVID-19 . . . may restrict attendance at houses of worship so long as comparable secular institutions face restrictions that are at least equally as strict”. In this instance, “comparable secular institutions” included “lectures, concerts, movie showings, spectator sports, and theatrical performances”. (Sotomayor, J., dissenting) Missing is the concept that what goes on inside “houses of worship” is of any greater significance, or entitled to greater legal protection, than a screening of the latest Star Wars sequel.

Just how little significance the dissenters attach to religious practice is shown by their casual conclusion that the restrictions they would uphold – no more than ten or 25 people present at services, depending on Governor Cuomo’s classification of the edifice’s neighborhood as “red” or “orange” – are justified by “the conditions medical experts tell us facilitate the spread of COVID-19: large groups of people gathering, speaking, and singing in close proximity indoors for extended periods of time”. (Sotomayor)

Assuming for the sake of argument that assemblages of eleven or 26 people in venues that can hold hundreds constitute “large groups of people gathering . . . in close proximity”, perhaps New York’s restrictions are rational for “non-essential” secular institutions. The position of Justices Breyer, Sotomayor and Kagan – that no more justification is needed before imposing them on churches and synagogues – assumes that worship is also “non-essential” and that restrictions on gatherings for worship are subject to only the lightest degree of judicial scrutiny: that they not be irrational.

The dissenters’ argument encapsulates the rapid descent of the Free Exercise Clause in the progressive hierarchy of rights. Less than a decade ago, in Hosanna-Tabor Evangelical Lutheran Church and School v. Equal Employment Opportunity Commission, a unanimous Supreme Court (on which all of the current dissenters sat) held that a parochial school could, without regard to the Americans with Disabilities Act, discharge a teacher whom it classified as a minister. The Brooklyn Diocese and Agudath Israel presented at least as strong a case: Governor Cuomo’s decrees effectively prohibit organized worship by most Christians and Jews. Indeed, Orthodox synagogues in red zones cannot admit any women or any children under age 13 to their services, because the ten adult males needed to constitute a minyan take up all the available slots. Yet at least three Justices are untroubled. Religious bodies may be able to choose their leaders without state interference, but the state can keep their followers away for reasons that meet the least demanding Constitutional standard.

2. The exchange of views between Justice Gorsuch and Chief Justice Roberts opens a window on the untoward side effects of working remotely. The details will interest only lawyers (and won’t interest most lawyers very much). What is noteworthy is the waspish tone, which seems to evidence a bit more than ordinary lawyerly disagreement. Justice Gorsuch criticizes at length a single citation in one of the Chief Justice’s past opinions, arguing (quite correctly, I believe) that it was inapposite. The Chief Justice complains in response that the citation was merely offered in support of an uncontroversial proposition. (Neither mentions the other two citations in the criticized opinion, both of which are even further off point.) One can’t help but think that these passages would either have gone unwritten or have been phrased differently if the Justices had conferred in person rather than by video chat. Zoom is a poor substitute for human contact.

3. One of the majority’s arguments against the Cuomo Decrees was that, while religious gatherings were capped at ten or 25 attendees, “essential” businesses in red zones, and even “non-essential” businesses in orange zones, “may decide for themselves how many persons to admit”.

Aside from its importance to the legal issues in the case, that distinction raises a policy question. The aim of locking down economic activity is to inhibit the spread of coronavirus through human contact. But how can that strategy be effective if, as the Court observes, “the list of ‘essential’ businesses includes things such as acupuncture facilities, camp grounds, garages, as well as many whose services are not limited to those that can be regarded as essential, such as all plants manufacturing chemicals and microelectronics and all transportation facilities”?

One of the few COVID-19 facts about which everyone concurs is that the novel coronavirus is highly contagious and can spread rapidly from a handful of individuals to millions. That is exactly what happened early this year. If every American lived alone in a pod for two weeks, perhaps the virus would die out, but it’s hard to see how it won’t survive and (from its own point of view) prosper if given the opportunity to spread among “essential” workers and then into the “non-essential” population. Are we not, at tremendous cost, bailing water with sieves?


The Futile Quest for the Non-Binary Pronoun

Let us imagine that, back around the time of Beowulf, the speakers of ancestral English had been woke enough to develop an extra, nonbinary pronoun to go with “he”, “she” and “it”, to be employed when the referent’s gender was unknown, unimportant or ambiguous. “Everybody has thoz own opinion”, an Anglo-Saxon warrior might have said (in translation) before deplatforming via decapitation the tho foolish enough to disagree with his own.

Would that linguistic happenstance have spared us the phenomenon that Joseph Epstein wittily ponders in his review of a book titled What’s Your Pronoun? Beyond He and She, published by the appropriately and sanctimoniously named “Liveright Press”?

No, it wouldn’t have. How do I know that? Because our language did have a closely parallel development, and the wokerati wound up hating it just as much as they do the generic “he”.

“Man” and “male” historically weren’t the exact synonyms that they have become (or, to be precise, that bien pensant opinion declares they must become). “Man” was available – and was used – to refer to members of the human race abstracted from their sex. “Male” referred only to men with certain biological features. (This was in olden times, when there were two sexes rather than an infinity of genders.)

It’s obvious why “man” retained an alternative meaning of “male”. In the social conditions that existed until quite recently, the most prominent men, the ones most often talked and written about, were males. If women had been socially dominant, “man” would have assimilated with “female”. If the sexes had filled all roles equally, “man” would have had overtones of neither.

The fate that overtook “man” and “male” would have been equally inevitable for “he, his, him” and our hypothetical “tho, thoz, thom”. Where “tho” was uttered, the “tho” in question would most often have been male. Over time, “tho” would have become the twin of “he”, leaving our enlightened age with the same “problem” of binary pronouns, for which the same ugly and awkward “solutions” would be put forward.

The singular “they”, “he or she”, the made-up pronouns (how odd that no one espouses the obvious portmanteau “she/he/it”, run together into one syllable) and so forth are worse than ugly and awkward. The insistence that everyone choose a pronoun set and that everyone else rigorously obey the chooser’s dictate is a symptom of the petty tyrannical mindset that is so endemic to our time. Fie on it, I say. Laissez faire, laissez parler!


On the Road to Electoral Perdition

National Review Online today offers its readers an admiring article on “Why California Republicans Stopped Complaining about Ballot Harvesting and Embraced the Process”. The quote that summarizes the “why” comes from a California member of the Republican National Committee:

The issue of ballot harvesting is we don’t like it. We don’t agree with it. However, it’d be political malpractice not to do it where the other side is doing it, and the other side has done it effectively.

For those who may not know, “ballot harvesting” is the practice – illegal in most states but authorized by law in California – of having party operatives collect mail-in ballots and deliver them to the polls. The opportunities for bribery, intimidation and fraud are obvious.

In 2018 California Democrats picked up seven Republican-held Congressional seats thanks to harvested ballots. In 2020, as the NRO article glowingly relates, Republicans struck back with their own harvesting operation, won back several of the seats lost in 2018 and prevented any further Democratic pickups.

The state’s Democratic attorney-general naturally sued (without success) to shut down the GOP effort, declaring that it is “illegal to tamper with a citizen’s vote”. Ballot harvesting has thus become a bipartisan device. It will no doubt grow and mutate in the future, and election outcomes will come to depend as much on the skills of the vote reapers as on the qualities of the candidates.

The leading advocates of relaxing election security and chucking the secret ballot are currently Democrats and progressives, but there’s nothing inherently ideological about cheating. South of our border, both the “right wing” Porfiriato and the “left wing” Partido Revolucionario Institucional relied on fraud to keep control of the government for decades on end.

The California Republicans’ harvesting has so far taken the relatively innocuous form of placing collection boxes in churches and other sites where Republicans are more likely to be found than Democrats. Still, once the principle is accepted that failing to imitate the other side’s dubious expedients is “political malpractice”, escalation is inevitable: isn’t it also malpractice not to push further ahead? Once partisan enthusiasm is no longer constrained, how wide is the distance between collecting ballots and coercing voters into casting them? And, after that, pulling voters out of the cemeteries?

The United States already has the least secure election process of any major democracy. We are careening down the road to elections without guard rails, to which the adjective “democratic” will be applicable only with a heavy infusion of cynicism or irony.


Should We Care What “History” Will Say?

The “Verdict of History”, which so many regard as the ultimate judgment on events, is rendered by always ill-informed, and often tainted, jurors. The verdict on the strange year 2020 (has any year ever been so inaptly numbered?) will be no different.

The “instant histories” doubtless started going to press the instant that the media declared that the Biden/Harris ticket had won, if not before. Everybody expects those to be no more than first drafts of history. Most will be lucky to reach the level of zeroth drafts. Yet they will be free from two faults of later accounts: The writers live in the “temporal country” that they describe (“The past is a foreign country; they do things differently there.”), and they don’t know the shape of controversies to come.

After not very long, historians become foreigners, and what they write about the now-alien past inevitably has an eye toward the use to which history can be put by their contemporaries. They may have access to documents and other evidence that became available only after the fact, but vital information – in particular, a clear understanding of the assumptions that everyone at the time accepted – will have vanished.

In 1961, when World War II was a recent event, the eminent historian A. J. P. Taylor wrote a book on its origins. In it, he described the war as an accident brought about by inept diplomacy, characterized Adolf Hitler as a traditional German statesman and praised appeasement as the best hope Europe had for peace.

Taylor’s work was an especially plain instance of adjusting the past to the needs of the present. He was a committed socialist (a communist in his youth) and Labour Party stalwart, writing at a time when one of the greatest issues on the Left was opposition to the American hard line, embraced by Democrats and Republicans, against the Soviet Union. “Munich” was the anti-communists’ bête noire. Taylor’s aim was to show that Munich wasn’t so bad.

On the other side of the political spectrum, consider the origins of World War I. In the immediate aftermath of World War II, when reconciliation with West Germany was a high priority, diplomatic history saw the Great War as stemming from the breakdown of an unstable alliance system, an outcome that no one desired. The clearest message was that nationalism and the nursing of ancient enmities were prime obstacles to peace.

Then, as the Cold War intensified, the analysis shifted. Imperial Germany, historians averred (particularly those who supported anti-Soviet policies), had sought war in order to dominate Europe. Only a firm stance by the other Great Powers could have restrained her.

The Cold War is now a receding memory, and one can’t help but notice that the thesis of “German guilt” has lost traction. Responsibility for the Twentieth Century’s defining catastrophe is being spread around again, and confusion has replaced malevolence as the central villain.

So it will be with Wuhan flu, the Great Lockdown, Donald Trump, Joe Biden and the rest of this year’s turmoil. The jury is out, it will be out forever, and what it decides won’t be worth bothering about.


Is the Secret Ballot Dead?

Could one American in a thousand formulate a coherent argument against the secret ballot? Everyone agrees that it is an essential safeguard against intimidation and bribery. Ward heelers can offer cash for votes. Thugs can threaten. Bosses can put workers in fear of their jobs. Husbands can domineer. That’s all useless if Jack can’t verify that Jill kept her promise to put her mark next to a particular name.

But in 2020 America, Jack can verify it. He can watch Jill as she fills out her ballot, or he can fill it out for her and drop it into the mailbox.

In the Roman Republic, which conducted elections by secret ballot throughout the century that preceded its demise, the richest and most influential men had little objection to the ballot – so long as secrecy was optional and therefore ineffective. We have come to much the same state of affairs.

Oddly, though, one hardly ever hears this point raised in arguments about mail-in voting. Defenders of the practice think it sufficient to insist that fraud, in the sense of phony or altered ballots, is rare. American politicians have, as we all know, consistently been figures of unblemished virtue who would no more steal a vote than do favors for a campaign contributor, so it’s plausible that they would shrink from tampering with votes cast by mail.

Yet even after we’ve ruled out the possibility of fraud, are we comfortable with jettisoning secrecy, with exposing every citizen to the risk that someone whom he can’t afford to, or simply doesn’t wish to, displease will demand to see his vote? That’s where we are today. True, we’ve barely stepped onto that road. The journey ahead is a long one, but the destination is unmistakable.


“Two Nations” in the Future

The idea of “two nations”, politically conjoined but psychologically separate and economically unequal, was old when Benjamin Disraeli used it as the subtitle of his novel Sybil in 1845. A hundred seventy-five years later, it’s a cliché on both the Right and the Left.

Left and Right agree that this is a deplorable state of affairs. History furnishes many instances of one “nation” extracting forced labor from the other: slavery, serfdom, nomenklatura rule in totalitarian regimes.

Interestingly, though, Great Britain wasn’t like that when Disraeli wrote. As Paul Johnson relates in his classic account, The Birth of the Modern, Britain had just set out on the road to unimaginably greater economic and social equality, in the first stage of which the pioneer modernizers became very rich while the rest of what was still a backward and impoverished society lagged behind. To the Luddites, that inequality could be due only to feudal-like exploitation. In reality, it was rendering feudalism obsolete.

It’s easy to see a parallel today, when figures like Bill Gates and Jeff Bezos have amassed their billions not by inheriting latifundia but by selling people products that make their lives more comfortable and convenient.

That leads to a persuasive analogy, which I am hardly the first to notice, between the 19th Century “birth of the modern” and the 21st Century stirrings of the post-modern: innovative technologies, entrepreneurs of sometimes unscrupulous character, fortunes dispersed in good living or good works, vigorous rocking of ideological cradles.

But there’s one conspicuous difference. The moguls of two centuries ago, like those of today, strove for political influence. They didn’t all share the same politics, but the great majority promoted policies that made it easier to make money through commerce and industry and resisted policies that “promised abundance for all/ By robbing selected Peter to pay for collective Paul”.

In the 21st Century, the moguls again aren’t unanimous, but most of them, including the most innovative, now back candidates and positions that are inimical to their economic interests. They are the Peters who will be taxed and regulated to enable multitudes of Pauls to live in modest, unchallenging, collective ease.

I’m not going to join the contest of explanations for this phenomenon. Jeff Bezos and I are so many degrees of separation apart that you would need a new trigonometry to count them. (But I was a first-week Amazon customer, so he’s partly my fault.) What I will offer is a quotation from a history book yet to be written. The author knows our era only from its records, so it’s natural for him to see intention behind accident. I quote:

The Cognitive Elite had no reason to fear socialism. Globalization meant that enterprise on their scale had become invulnerable to the vagaries of national governments. “But when they persecute you in this city, flee ye into another.” Material progress bounded over the speed bumps that governments placed in its path. Those who did the most to accelerate that progress, and who benefited most blatantly from it, were undeterred from associating with public figures who wanted to tear down all that the Elite had built, but couldn’t, and who otherwise agreed with the Elite’s rejection of archaic restraints on individual volition. The reactionaries who advocated laissez faire also pitched self-restraint, modesty, fidelity and other encumbrances to doing what one willed. The Elite were happy that there was no need to collude with them.

Will the society thus portrayed flourish or disintegrate? We will have to leave it to the Future to find out.


The Name of the Blog

If I am the first to coin “panicdemic” as the term for the world’s current distress, creativity is dead beyond resurrection. If I am not the first, I apologize for my unconscious theft.

Plagues and pandemics have been subjects of literature since ancient times. A multitude of writers have speculated about how they might devastate society in the future. Yet, so far as I am aware, no speculator, however bold, foresaw a chain of events like the one that has unfolded over the past ten months. Did anyone predict –

  • that the recipient of the fearsome name “pandemic” would be a disease that apparently kills around two percent of its victims?
  • that the all but universal reaction all over the world would be the shuttering of most businesses, schools, churches, parks and other places where people might gather in more than minuscule numbers?
  • that these measures would continue month after month while production fell, store shelves emptied and unemployment soared?
  • that this cessation of normal life would be popular, and that government officials who failed to implement it stringently would be stigmatized?
  • that the facts about the spread of the disease and its severity would be enshrouded in a fog of contradictory statistics, until anyone with any point of view could cite reams of authoritatively stated facts to support it?
  • and, finally, that the facts that individuals chose to believe would be 90 percent predictable from their political affiliation?

Whatever set of facts you believe now, how would you have reacted if, in November of last year, some science fiction writer had published a novel with all of the preceding among its premises and plot lines? He would have had to self-publish. No reputable house would have accepted such obviously incredible claptrap.

In retrospect, we can explain it all, whether we like the Great Shutdown or not. Over the last half century, the world has grown much safer than at any previous time in history, and nearly all nations have enjoyed unprecedented prosperity. Not quite all, but most, have simultaneously been spared major wars or domestic disturbances. The natural reaction to the shock of a new, virulent disease was “safety first”. That was certainly the preference of the people who have the most impact on public opinion: politicians, journalists, scientists, top business executives and the like. For them isolation is a minor burden, thanks to computers, cell phones and the Internet. Those same channels deluged the public with information in quantities too vast for mortals to understand and evaluate, at a time when social trust was at a low ebb. In need of a winnowing principle, many accepted the views of the public figures whom they liked, and those figures oftentimes formed their own views in reflexive reaction to those of their friends or enemies.

It was all so obvious. How could anyone have expected anything else?