Thursday, February 29, 2024

Opinion in Good [Hu]men

Back when I was pretty moist [just for you, Paul Morrison] behind the ears as the guy standing in front of the class and trying to teach, I was often judged to be too opinionated, according to the reviews that students filed at the end of a semester.  In my mind that’s not necessarily a bad thing.  I thought of what John Milton says in “Areopagitica,” that “opinion in good men [by which Milton meant humans] is but knowledge in the making.”  The assumption of goodness attributed to me is, needless to say, absolutely right, of course.

 

But then it occurred to me that despite Milton’s wisdom, the way of expressing opinion does, after all, condition how that opinion is received.  If you’re talking to someone who in your opinion is an idiot, it does absolutely no good to say, “you’re an idiot.”  If the person is really an idiot, there’s no remedy, ultimately—but wending your way around to convincing yourself that your opinion is a matter of fact will at least give some satisfaction.

 

So I worked my way around to a different way of expressing opinions in classes.  An example follows.

 

The usual reading of Robert Frost’s well-known poem, “The Road Not Taken,” is that Frost is praising the person who marches to a different drummer, who boldly goes where no human has gone before. And of course that’s an almost irresistible reading.  Probably everyone knows the poem, but just in case, here it is:

 

Two roads diverged in a yellow wood,

And sorry I could not travel both

And be one traveler, long I stood

And looked down one as far as I could

To where it bent in the undergrowth;

 

Then took the other, as just as fair,

And having perhaps the better claim,

Because it was grassy and wanted wear;

Though as for that the passing there

Had worn them really about the same,

 

And both that morning equally lay

In leaves no step had trodden black.

Oh, I kept the first for another day!

Yet knowing how way leads on to way,

I doubted if I should ever come back.

 

I shall be telling this with a sigh

Somewhere ages and ages hence:

Two roads diverged in a wood, and I—

I took the one less traveled by,

And that has made all the difference.

 

The standard reading leans very heavily on the final two lines.  And, in my opinion, that reading is just inadequate to the subtlety of the poem—a subtlety that Frost is often considered not to have, by the way, because his language seems so conversationally straightforward, colloquial, easy enough to understand. All of those characteristics of Frost’s language are real.  But subtle, I’m afraid, Frost really is.

 

So my old approach to presenting the poem in class was to say something like this:  “Yes, there’s a suggestion of the speaker’s uniqueness of choice in the poem—no doubt about that.  But leaving the poem as being all about the wonders of such uniqueness gets contradicted by almost everything in the poem, from the title, which focuses not on the unique choice but on the choice not made, to the limited vision of the speaker, who can therefore not judge anything about uniqueness or lack of uniqueness, to clear-eyed observations like ‘the passing there / Had worn them really about the same’ and ‘both that morning equally lay / In leaves no step had trodden black.’  So surely the idea that the speaker chose the unique path is just incorrect.”

 

Talk about putting people’s backs up!

 

I worked my way into an alternative approach, then.  I’d hear the standard response to the poem, and then ask, “How would you fold ‘the passing there / Had worn them really about the same’ into that reading?”  And similar questions for the other evidence—that’s what the terms of the poem give us, after all: evidence.

 

Students would wrangle with the questions, come to whatever conclusion they thought most appropriate.  Ignore the evidence, though, they couldn’t—I’d insist on that, for sure.  Ultimately I hoped that they would come to my conclusion, that the poem is fundamentally about the nostalgia for a past where the full spectrum of options in the future is wide open, and of course the wistfulness that comes when one is older because the choices one has made foreclose such openness, so one is more or less stuck in what one has chosen.  I know—“stuck” is pretty pejorative, and it’s often the case that one’s path is wonderful.  Nonetheless, the poem suggests, one is stuck in that path, wonderful though it may be.

 

But in fact, I came not to really care about students’ opinions coinciding with mine (even though, of course, I was right 🙂).  The real point was to get students to think through evidence, including the give and take in the class discussions as other people’s opinions were articulated, come to sustainable conclusions, and then write essays that supported those conclusions, using the evidence adduced to reach them.  If research sources came into play on a given assignment, then the evidence of those sources would come into play as well.  But the important thing, I concluded, was the process, not necessarily the “correctness” of the opinion expressed.

 

I’ve had an online presence, as they say, for many years.  I think my first encounter with the world of online discussion boards was back in the early ’90s—my memory being what it is, I can’t date it more narrowly.  In those many years of observing and participating in online discussions, I’ve experienced many many good things.  I’ve also seen many localized world wars, mostly on the basis of an opinion asserted as I used to assert the correct reading of the poem, or on the basis of misunderstood irony or sarcasm or humor or . . . fill in your favorite linguistic shtick.  Those have been painful to see and to participate in.  Emoticons don’t work all the time, and as many a literary scholar will say, irony and sarcasm are the most difficult verbal structures to convey.  As Robert Scholes says, to know that “the majestic gingkoes of Brooklyn” is an ironic phrase requires knowledge of what gingkoes look like.  Add the brevity that online communications almost demand, and the result is . . . localized world wars.

 

Maybe we need an online version of the UN?  But then the SCOTUS might conclude that such intercession is prohibited by the First Amendment.

Sunday, January 7, 2024

Tit for Tat

Politics recently has become a ferocious, often silly game of tit for tat.  You impeached our president, well, bless your heart, we’ll impeach yours.  You accused our guy of grifting, well, honey, how about your guy.

It’s silly, but profoundly dangerous.  Because I’m interested in what sometimes is grandly called the life of the mind, I’m really concerned about the tit for tat as it attacks thought, speech, ideas generally.

 

I come to the question from the point of view that free speech is the bedrock of any democracy, so my perspective may or may not be your cup of tea.  But as I see it, the desire to regulate speech, and via speech to regulate thought, leads inexorably to totalitarianism.  It’s no accident that the most dangerous element of the society in Orwell’s 1984 is the thought police and its key tool, the memory hole.

 

The memory hole makes the requirements of the immediate present the only criterion for what is true and what is false.  In doing so the memory hole also conditions the past so that it meets the requirements of present exigency, but its main function is to make the present the sole criterion of what is or is not acceptable.

 

With that in mind, I think about actual efforts to make the present, or at least the present as it’s filtered through some ideological sieve, the basis for considering the past.  In Florida, for instance, thanks to the ideological filter of the far right and what I see as its hysterical fear of the truth, schools are proscribed from teaching the truth about slavery as it was practiced in the US.  The notion promulgated by the state, that slaves were more or less content with their condition, is so very very far from the truth that the whole of the past has to be rewritten, the truth erased, in order to meet the requirements of the ideology that dominates the state government.

 

The process is not quite a memory hole, to be sure, but the next best thing.  The actual past is still available to the diligent student who wants to do the most cursory of research outside of the bounds of what the schools are allowed to teach.  The effect for the great majority of kids in the schools of Florida is nonetheless to falsify the past according to the requirements of the present dominant ideology.

 

The same is true on the other side of the ideological coin.  It was not that long ago that the left wing in the US decided that The Adventures of Huckleberry Finn, when it wasn’t simply banned in schools, had to be censored so that every instance of the n-word was replaced by the word “slave.”  There is an edition of the novel, edited by Alan Gribben and published in 2011, that does exactly that.  Prof. Gribben’s intention was good.  He was concerned that an unredacted version of the novel would simply be banned so that kids would not have ready access to it.  And that would have been a shame, of course, because Huckleberry Finn is as essential to understanding the US as is the Constitution.

 

But substituting the one word for the other simply falsifies the past.  It is just a matter of fact that Americans have constantly used the n-word.  Whites use it to diminish and subordinate Black people; Black people use it to reappropriate the word and assert their own position in the world.  Substituting “slave” for the word is to falsify that fact.

 

It also substitutes a legal, indeed a Constitutional condition for a social reality.  As David Sloane points out in “The N-Word in Adventures of Huckleberry Finn Reconsidered,” the substitution makes it impossible to hear the point or understand the significance of “the climactic assertion by the recovering Tom that Jim ‘ain’t no slave; he’s as free as any cretur that walks this earth!’”  I think that Tom here squints across the legal and the social realities of being Black in America before the Civil War.  But surely the legal status of a slave does not equal the social import of the n-word.

 

The left wing’s motive in erasing painful terms is to protect the sensibilities of a present-day reader, a mirror image of the right wing’s motive in banning books that trace the painful past.  For me, censoring Huckleberry Finn is about the same as banning Toni Morrison’s The Bluest Eye.  Both actions do what the memory hole does.  They both seek to impose a present valuation on the past—in Morrison’s case a relatively recent past at that.  The result is to destroy the reality of the past.

 

I suspect that the banning of books by right wing folks takes its initial impetus from the censoring of books by left wing folks.  It’s the right wing tat for the left wing tit.  But as far as I’m concerned, the two wings lift the same bird.

Thursday, January 4, 2024

Free Speech

 I understand the argument that free speech is an essential part of any democracy. I'm 100% in favor of free speech. Can't live without it, really. At the same time, I'm also very much aware that free speech provokes—or rather "encourages," a more neutral term—response, which is equally free in its expression.

Such responses are sometimes very sharp, but they are always framed by social expectations, which is to say whatever has become standard social practice. Once upon a time it might have been hilarious to make "cripple" jokes, and for some people it might still be ok to laugh at them. But the consensus seems to be that such jokes are reprehensible, so that when dearleader mocked the reporter Serge Kovaleski, who suffers from arthrogryposis, most people, dearleader's minions aside, were at least disgusted by the then presidential candidate's behavior.

There was pushback. To be sure, dearleader and his epigones shrugged off the pushback. But there has not been a recurrence of such "joking," so perhaps the pushback had an effect.

That's the way that free speech works. You say X, which provokes me to say Y and elicits Z from Joe Blow, and X' from . . . . In the interaction of statements arises the very soul of democracy. As Milton says in "Areopagitica," "Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good men is but knowledge in the making." Milton puts a great deal of weight on the optimistic idea of "desire to learn" and of the beneficence of "good men," an optimism limited to Protestants since he explicitly excluded Catholics from the orbit of free speech.

We might find such an exclusion ridiculous—although certainly it was still a crucial aspect of American culture when JFK ran for office. But given the religious wars of Europe in the 17th century, excluding Catholics in a Protestant country was as expected as excluding Protestants in a Catholic country. So too Milton's use of "men" to mean "humans," since "men" in 1644 meant all sexes. Given the standard social practice of the 17th century, no one found the term offensive, as people might well do in the first third of the 21st century.

Sometimes the standard social practices of a period get formalized in "speech codes," as they're called. Milton's exclusion of Catholics might be a "speech code" of sorts. In 17th century England, the exclusion of Catholics also had the force of law. But in the 21st century, speech codes have no legal force, so that Milton's use of "men" to denote all human beings, offensively patriarchal though it might be to a great many people nowadays, is still in use by a great many other people, and there is no law to proscribe such usage.

The same thing goes for a great many other offensive terms and practices. No one with the slightest sense of decency would use the "n-word" to denote people who are not of European descent. But people still use that word, and although there is much much pushback against such usage, there is no law that prohibits it. Instead, as one of the benefits of free speech, the usage of the term by some people defines the character of those people. If under the license of free speech you use the n-word, and so tell me who you are, I will believe that you are that person. And I will speak against you.

The issue becomes more complicated when institutions are mixed into the equation. Dave Chappelle got a great deal of pushback for his "jokes" about gay and transsexual people, for instance. That pushback from people is part of the normal interaction of free speech with other free speech. But then institutions got involved and, in Chappelle's own terms, he got "canceled," caught in the web of "cancel culture."

I realize that comics are particularly at risk in the give and take of free speech. Freud argues that the best jokes, in the sense of the most laughter-provoking, are what he calls "tendentious": attack jokes. Aristophanes attacks male arrogance in "Lysistrata," philosophical arrogance in "The Clouds," dramatists' self-importance in "The Frogs," and so on. They are hilarious plays. The objects of attack no doubt responded to Aristophanes. Still, I imagine that Socrates would have preferred "The Clouds" to the glass of hemlock that the state of Athens required him to drink.

My sense is that "cancel culture" responds to the pressure of standard social practices, fair or unfair as those practices may be. Chappelle says that corporate entities are more responsive to the pressures of the LGBTQ+ community than they are to the experience of Black people, so that Black entertainers who mock the LGBTQ+ community are "canceled" while Black citizens get shot. There may be some truth to the influence of the LGBTQ+ community, but nonetheless I think Chappelle is mixing oranges with apples. Killing people requires a legal response, whereas making jokes evokes the pushback of free speech in response to free speech. Breonna Taylor and George Floyd and Daunte Wright, and so many many others are martyrs. Chappelle is not.

"Cancel culture" is what happens in the give and take of free speech when it's a corporate entity that participates in the exchange of speech. Al Franken's "molestation" photograph led to his resigning from the US Senate, for instance. Sarah Silverman's blackface episode led to her disappearing from screens. Is it legitimate for corporate entities to "speak" as individuals might? The SCOTUS asserts that corporate entities have free speech rights, so legal practice makes it perfectly acceptable for corporations to respond to free speech in whatever way they wish. To my mind it would make better sense for the corporation to speak and argue rather than to silence and "cancel." But that's not the way corporations seem to want to respond.

We do not have an Athenian jury condemning comics or philosophers to death. "Cancel culture," unfortunate as I think it to be, is an effect of free speech countering free speech. To be sure, we do have efforts to substitute state power for free speech. All you have to do is consider the laws promulgated by the Florida state government. Unlike Chappelle, whose "cancellation" followed from his "joking" about the LGBTQ+ community, a teacher in some public school in Tampa or Jacksonville or Pensacola can end up in jail were they to say "gay." That is not quite a glass of hemlock, but certainly it is cancellation with a vengeance. When state power intervenes, free speech ends.

Sex and Gender

The question about sexual difference seems always to come as an absolutist either/or: are there more than two sexes, the inquisitors ask, and assume that the poor object of the query will mumble some nonsense, or say something that the inquisitor will smirk into nonsense.  I'm thinking Piers Morgan here, but there are others.


Let me agree that on the whole, generally, usually, there are just two sexes.  Some beings have organs that produce eggs.  They are female.  Some beings have organs that produce sperm.  They are male.  Sometimes a particular being has ambiguous genetics and so the organs are less than obviously sperm or egg producers.  There are also true hermaphroditic beings, snails and earthworms for example, with organs that produce both eggs and sperm.  And there are creatures that change their sex, like Nemo—I mean, clownfish—and parrotfish.  True hermaphrodites as well as protandrous or protogynous critters are pretty rare, however; the sex changers are male when their organs produce sperm and female when they produce eggs.


In short, on the whole, generally, usually, as Milton says, male and female are "the two great sexes that animate the world."  Animate there is quite literal in Milton’s use, I think—the two great sexes bring anima, spirit, life to the world.


Of course Milton did not know about the millions of species of asexual bacteria and viruses, lichen and some fungi and ferns, which bring life into the world without egg or sperm, so without the two great sexes.  But that’s a different story.


For the grand inquisitor who wants to assert the either/or of sexual difference, however, the problem is that in humans, at any rate—and no doubt in other creatures as well, although I’m not ethologist enough to assert as much—in humans, sexual difference is only one aspect, one register, of sexuality, and so of sexual identity.  The biology of sexual difference intersects immediately with a whole range of other issues—social, psychological, familial, legal, religious . . . .  An infinite number of registers in which sexuality is elaborated.


Those complexities of sexuality are not a matter of sex, or rather only very partially a matter of sex.  The whole complex of issues that invest sexuality all together produces what we call gender.  Gender is to sex as a symphony is to an octave.


I suspect we’re all aware of that difference in gender as opposed to sex.  Even the most dualistic of inquisitors will refer to a “feminine” man or to a “masculine” woman.  In those phrases the adjective in quotation marks refers to gender, and the substantive modified by the adjective refers to sex.


A quick stroll through Freud’s Civilization and Its Discontents, which I invoke quite deliberately with a sort of malice aforethought, yields the following gender characteristics.  Feminine is passive; masculine is active.  Feminine focuses on family; masculine seeks out friendships.  Feminine is therefore of the household; masculine is of the society.  Feminine comforts and supports; masculine commands.  In the fruity phrase of Satan’s erroneous vision when he first sees Adam and Eve in Milton’s Paradise Lost, “He for God only, she for God in him.”


Such gender stereotypes may well attract the perhaps satanic and certainly Freudian imagination of MAGATs, since clearly they limn out a social universe of difference that defines for them when America was great.


But what Freud does is as absolutist an either/or dualism as is the either/or of sexual difference.  Freud in effect maps sex onto gender in a one-to-one relationship.  That dualism is absurdly reductive.  It does not allow, for instance, for a woman who is ferociously dominant in the workplace but gently comforting with her children, or for a man who gladly serves as boss in the workplace but is submissive in the bedroom.  Gender gets expressed in so complicated and nuanced a series of behaviors, attitudes, and mindsets that the Freudian reduction of gender is as false as Satan's dichotomy of sex in Paradise Lost.


Via Freud's absurdity I mean to suggest the absurdity of most current considerations of gender.  Or should I say the disappearance of gender from current discourse.  Recently, it seems, gender gets absorbed into the dualism of sexual difference.  When the grand inquisitor asks about how many sexes there are, bullying the poor object of the inquisition, he does not want to hear about the myriad possibilities of gender identities.  He does not want to hear anything remotely like the complex reality of human existence.  He wants to hear only about the duality of male and female.  He certainly doesn't want to hear that just as the modern world has made it possible for humans to fly, so too the modern world has made it possible for a fully mature male to become like Nemo and turn into a female, or for a fully mature female to become like a parrotfish and turn into a male.  Yes, to be sure, Mr. Inquisitor, we're not at the point where the new male or the new female has the organs to produce eggs or sperm.  Maybe we'll never get to that point.  But then, though we humans now fly, we do not flap our wings to do so.


And definitely the grand inquisitor does not want to complicate matters even further with questions of sexual preference.  A transgendered woman loving a cisgendered woman?  A cisgendered man loving a transgendered man?  No thanks, says the inquisitor.  Gimme man or woman, either/or, and gimme man loving woman.  Gimme a world where Freud's dualism is the name of the game, and any variation is nothing other than perversion.


Such limited imaginations, such falsification of reality, such assertion of socializations as absolute truth would be laughable were it not so dangerous.

Monday, October 17, 2022

Generations

Boomers are guilty of inventing “generational difference” as a valid way of dividing people from people.  Well, maybe Boomers were not the first inventors because there are signs of such nonsense in the 18th century habit of denoting “the last age” as so different from the wonders of the Enlightenment that the two “ages” simply had nothing in common.  Still, it’s the Boomers who invented the generational cry “don’t trust anyone over thirty”—or what would have been a meme had there been such a thing back in those antique days of yore.

 

But even back then generational differences were just so much nonsense.  I remember lots of people my age who were as aggressively pro-Vietnam War as “we” who were definitely against it; as many go-get’em protocapitalists among my peers as among the “over thirty” crowd; as many folks in my age group who cared nothing at all about the ecosystem as folks who were all for the first Earth Day and beyond.  By the same token, as I think back to the “Jews will not replace us” crowd at Charlottesville, I note that almost none of them seemed to be anywhere close to being Boomers—and the same for the crowd at the Jan. 6th event, for that matter—whereas the arguments against the Charlottesville rednecks and the Jan. 6th insurrectionists include many Boomers.

 

I have no basis for thinking as I do, but nonetheless I will assert that there are jerks and assholes as well as wise and compassionate people in every generation.

 

That’s not to say that generational differences are null and void as ways of analyzing how people behave.  Shared experiences produce shared perspectives and behaviors.  People who were faced with the challenges of the 1940s shared a commitment that people who grew up in the 1960s simply did not have.  That difference in experience no doubt produced differences in behavior.

 

At the same time, the scarcities of the Great Depression and WW II compared to the foison plenty of the 1960s did not necessarily produce predictable outcomes.  My stepfather, one of the generation raised in the 1930s during the Great Depression and gone to war in the 1940s, scrimped and saved throughout his entire life.  But then, I, raised during the great times of the 1960s and gone to college in the 1970s, scrimped and saved throughout my entire life as well.

 

Scarcity vs. plenty did not produce difference in behavior.

 

On the other hand, my stepfather was much into command and control, whereas I am absolutely not at all in that ballpark—even in that game, for that matter.  Is that difference a product of the difference between being raised in the 1930s as opposed to the 1960s?  Maybe.  Maybe not.

 

At any rate, I think that any generational truism, about any generation at all, must absolutely be undermined.  Are Boomers selfish?  Sure.  They are also quite generous.  Are Gen Xers grumpy?  Sure.  They are also quite sweet.  Are Millennials lazy?  Sure.  They are also quite driven by values.  And so on.  Similarly, some Boomers are grumpy and lazy, and some Gen Xers are selfish and lazy, and some Millennials are . . . .

 

The only thing that the blanket application of generational notions accomplishes is to divide—and as the very old generation of Romans used to say, divide et impera, divide and conquer.  In each historical period, for each generation, it’s always the same category of people who cherish such divisions.  You can see who they are at a glance.  Without exception, they chortle their way to the bank.

Thursday, September 22, 2022

Empire

 It’s undeniable that Britain had the largest empire of the last round of empires in the history of the world.  And it’s also undeniable that the cost of empire was borne by the subordinated peoples, wherever those people might find themselves.  For me it’s also self-evident that the idea of empire and of monarchy are irrelevant in the modern world, not just because empire produces misery for the subordinated and because monarchy produces ridiculous ideas of genealogical superiority, but also because both empire and monarchy make it hard, if not impossible for equality to flourish.

At the same time, it seems to me that the past can’t be judged on the basis of the present.  That’s the latest academic fashion, presentism, that sees the whole of the past from the perspective of what is currently accepted as true and undeniable.  When I was a young whippersnapper, the academic fashion was exactly the opposite.  We set about estranging the past, as we called it, in recognition that things back then were just absolutely different from things nowadays.  That was not intended to excuse horrible behavior in the past, but rather to recognize that the horrible behavior was the product of complicated social interactions.

 

Some of my favorite passages in Ta-Nehisi Coates’s Between the World and Me are what he says about Queen Nzinga, whom he initially admires because of the power she exercised in response to the insults directed towards her by Dutch traders.  Later he reconsiders the admiration when he recognizes that she exercises her power by making one of her retainers into a human chair so that she, “heir to everything she’d ever seen, could sit.”

 

I love those references in two different ways.  First because Coates finds a source of pride in his past.  That seems to me pretty important for everyone.  Second, though, I love the references because they also point to the idea that the past is different from the present, and that it’s also important, maybe crucial, to understand the difference.

 

So was Britain—indeed, all of Europe from 1492 to the present, which continues to express European hegemony—horrible in exercising imperial control over the whole of the world?  Yes indeed.  Can, or rather should we judge that horror from the perspective of the present?

 

Yes and no, I think.

 

The yes depends on recognizing that folks in the past are as ethically engaged as we are.  In Notes on the State of Virginia, for instance, Thomas Jefferson lays out pretty clearly just how evil slavery is, for the slaves as well as for the slave owners.  He knows that the practice of slavery from which he benefits is horrible.  But he continues to own his slaves and to rape his slave mistress and then enslave his own children.  In terms of his own ethical framework, he’s positively evil.

 

The no depends on recognizing that folks in the past contend with relations among each other that dictate behavior.  I don’t know enough about Queen Nzinga to say that her own ethical framework condemns her behavior.  But how else is Nzinga going to demonstrate her power to the Dutch?  How else is Jefferson going to demonstrate his social standing to the other plantation owners of Virginia?  Again, I don’t think that recognizing those differences from the past means that I excuse Nzinga’s objectification of her subjects or Jefferson’s commodification of human beings.  But Nzinga’s royal standing makes it possible for her to reject the Dutch insolence, and Jefferson’s patrician privilege makes it possible for him to write the Declaration of Independence.

 

So too the European imperial framework, I think, albeit more complexly and more fraught with a mixture of evil and “good.”

Sunday, August 28, 2022

Free Speech

I have an embarrassing confession to make.  On weekday nights at 11 I turn on MeTV and watch the bits and pieces of The Carol Burnett Show that the channel broadcasts before it turns to the serious business of Perry Mason.  Watching these old shows is informative and educational in a whole bunch of different directions.  Mason’s more or less in-house detective, Paul Drake, smokes and smokes and smokes again, with nary a trigger warning to the equally smoky audience of the early 1960s.  The cars that the people on the show drive are the boats that we Boomers grew up with.  Mileage?  Maybe ten miles a gallon?  I honestly can’t imagine navigating the streets of Reading, PA, in those automobiles.

 

But on a recent evening, the Burnett Show handed me a real winner.  The skit took place in a take-off on Benihana restaurants, here called Beni Haha.  As the skit began, we saw a Japanese chef doing the kinds of culinary sleights of hand for which Benihana is famous—the flashing knives, the sudden flames, the portions masterfully landed on the customers’ dishes.

 

The careful modern observer may have done a double take when noting that although the chef himself was East Asian, the restaurant manager gleefully watching the performance was . . . Harvey Korman.  Korman was a very funny man, a great actor, whose Hedley Lamarr in Blazing Saddles raises him to the heights of the cinema firmament.

 

But Harvey Korman was not Asian.

 

And then the shtick of the piece fell into place.  The other chef, it turns out, was too sick to work, so Korman-as-manager had to scramble around to get someone to replace him.  Enter the substitute, then, Tim Conway.  Janitor for the restaurant, Conway convinces Korman that he has all the skills necessary to be a Beni Haha chef.

 

Conway was playing his usual bumbling, cack-handed self, so the idea that he was competent to twirl knives, etc. was part of the humor of the scene.  The contrast to the Asian cook we had just seen was obviously going to be hilarious.

 

But again from the perspective of the careful modern observer there were other problems with the scenario.  First, Conway is as Asian as Korman.  To be sure, he did put on a slit-eyed guise intended to suggest an East Asian background.  That was obviously intended as part of the humor of the scene.  Second, Korman and Conway communicated in something that I guess was meant to pass as Japanese.  The conversation was also part of the humor of the scene, since the “language” that the two deployed was of course very, very badly faked Japanese—gobbledygook, in other words.  Third, as Conway began his failing efforts to imitate a Beni Hana chef, he and Korman yelled back and forth at each other, Korman evidently questioning Conway’s abilities while Conway evidently reassured Korman that, despite the disaster taking place at the table, he could manage the task.

 

I say “evidently” about the conversation between Conway and Korman because they were still speaking gobbledygook.  And I say about the whole shtick that the various moments were part of the humor because, without exception, the audience—a real live audience, as was the standard for The Carol Burnett Show—the audience roared with laughter, first at Korman’s passing as Asian, then at Conway’s slit-eyed bumbling, then at the very badly faked Japanese conversation and then at the mutual harangue between Conway and Korman.

 

Like Paul Drake’s smoking and the boat-sized automobiles of Perry Mason, the skit was a revelation—for those of us old enough to have seen the shows when they first aired, a recollection—of just how different the past was from the present.  I cannot imagine any show in 2022 presenting two European Americans as Asians without commentary in the twitterverse that would have excoriated the actors.  I cannot imagine any show in 2022 giving us a dialogue, over the course of the ten or so minutes of the skit, in which the speakers exchanged fake Japanese with each other in order to provoke gales of laughter from the audience.  Again, the twitterverse would have exploded in condemnation.  The result, no doubt, would have been the shutting down of the whole of The Carol Burnett Show and the exclusion of Korman, Conway, and Burnett herself from any further employment in the world of acting—to my mind, an unfortunate result, because silencing should not be the goal of free speech.

 

Dearleader’s minions would call such a response “cancel culture” and demand that the censorship stop.  I would call the excoriation and explosion in the twitterverse something else:  good manners.  Good manners dictate that you do your best not to offend other people.  Good manners might lead some to attempt to silence the actors, but in and of themselves good manners are not cancel culture.  I associate good manners with what Thomas Jefferson in the Declaration of Independence calls “a decent respect to the opinions of mankind.”  It is not censorship to ask that bad manners be considered offensive.  Rather, it is simply the case that free speech provokes free speech in response—or, in a more contentious kind of way, that free speech has and absolutely should have consequences.  Removing the offenders from public life, “cancelling” them as happens all the time nowadays, is a step too far, as I see it:  but strenuously arguing against what is by its very nature a breach of good manners is absolutely essential.

 

Why not simply silence offensive speech?  John Milton says in “Areopagitica” that the goal of free speech is truth.  The progress towards that goal, he says, amounts to “the wars of Truth.”  In an ideal form, the point and counterpoint of debate in free speech moves humanity along towards truth:  “Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good men is but knowledge in the making.” Silencing the other side precludes such “wars.”

 

As is so often the case, the modifying initial clause of Milton’s sentence bears a great deal of attention.  Silencing, for instance, does not reflect a desire to learn.  But then neither does the erasure of the present in order to return to a past that no longer exists.  Dearleader’s minions want the present moment to be a carbon copy of the past, so that jokes at the expense of Asian people, for instance, are fine and dandy because they provoke laughter in the majority—the majority in the Euro-American universe, at any rate, since in point of fact Asians constitute the majority of the planet’s human inhabitants.  But regardless of the wishes of dearleader and his minions, the present is not the past.  It’s simply not possible to make America great “again” by returning to the status quo of the 1960s and before.  America is just not the same nation as it was back then.

 

Just think about what restaurants were available back then as opposed to now.  I graduated from high school in 1970, and there was indeed a Beni Hana restaurant in Harrisburg, where a few of us went to dine before the senior prom.  There was one.  There were a couple of Chinese and Italian restaurants—I’m not counting pizza parlors—but no Mexican or Korean or Jamaican or Thai or Ethiopian or Vietnamese or Indian or . . . .  To say the least, it was a very very limited smorgasbord of cuisines that we ate from.  That kind of limitation may not seem important, but it really is.  It points to the equally limited variety in the human population of the city.  And as was Harrisburg, so was the nation in the 1960s and before.  Indeed it was quite possible then to speak about “minorities” because the majority was an obvious, clearly marked set of people.

 

There were exceptions to that rule, to be sure.  New York City, Miami, Houston, Los Angeles, Chicago were as multifaceted then as now.  But the nation as a whole did not consider such variety as in any way, shape, or form normative.  The evidence?  Consider the way in which the large African American community was treated back then, restricted to specific neighborhoods and excluded from the mainline world of getting and spending, from board rooms to green rooms, that constitutes the heart of our social existences.  The same kind of limitations existed for other groups, where such groups were numerous enough to constitute a noticeable percentage of the population.  Harrisburg did not have a Chinatown—but Los Angeles did, as did New York and Chicago.  Miami had its Cuban ghetto.  Houston had its Mexican neighborhoods.

 

Boston strikes me as the outstanding instance of such limitations, across the board—the North End for the Italians, South Boston for the Irish, Roxbury for the African Americans, the Back Bay for the WASPs and passing Catholics.  No wonder there was a busing crisis in Boston!

 

To make America great “again” would mean returning to that status quo.  We would have to reject and obscure the ethnic and racial diversity that has become standard in almost all of the country, even in rural America; return to the domination of social and economic life by a “majority” that is or will soon be an actual minority of the population; and impoverish our dining experiences.

 

That last item is not there for fun.  The variety of restaurants that are now available to us is a serious index of just how multifarious America has become.  It would not simply be that our appetites for baba ganoush would go unfulfilled, but that the lifeblood of a Middle Eastern world would at best be relegated to obscurity.  America would be “great” at the cost of losing what makes it America.

 

And what would we gain?  Well, we could securely laugh at Beni Haha skits.  What a tradeoff.