Thursday, February 29, 2024

Opinion in Good [Hu]men

Back when I was pretty moist [just for you, Paul Morrison] behind the ears as the guy standing in front of the class and trying to teach, I was often judged to be too opinionated, according to the reviews that students filed at the end of a semester.  In my mind that’s not necessarily a bad thing.  I thought of what John Milton says in “Areopagitica,” that “opinion in good men [by which Milton meant humans] is but knowledge in the making.”  The assumption of goodness attributed to me is, needless to say, absolutely right, of course.

 

But then it occurred to me that despite Milton’s wisdom, the way of expressing opinion does, after all, condition how that opinion is received.  If you’re talking to someone who in your opinion is an idiot, it does absolutely no good to say, “you’re an idiot.”  If the person is really an idiot, there’s no remedy, ultimately—but wending your way around to convincing yourself that your opinion is a matter of fact will at least give some satisfaction.

 

So I worked my way around to a different way of expressing opinions in classes.  An example follows.

 

The usual reading of Robert Frost’s well-known poem, “The Road Not Taken,” is that Frost is praising the person who marches to a different drummer, who boldly goes where no human has gone before. And of course that’s an almost irresistible reading.  Probably everyone knows the poem, but just in case, here it is:

 

Two roads diverged in a yellow wood,

And sorry I could not travel both

And be one traveler, long I stood

And looked down one as far as I could

To where it bent in the undergrowth;

 

Then took the other, as just as fair,

And having perhaps the better claim,

Because it was grassy and wanted wear;

Though as for that the passing there

Had worn them really about the same,

 

And both that morning equally lay

In leaves no step had trodden black.

Oh, I kept the first for another day!

Yet knowing how way leads on to way,

I doubted if I should ever come back.

 

I shall be telling this with a sigh

Somewhere ages and ages hence:

Two roads diverged in a wood, and I—

I took the one less traveled by,

And that has made all the difference.

 

The standard reading leans very heavily on the final two lines.  And, in my opinion, that reading is just inadequate to the subtlety of the poem—a subtlety that Frost is often considered not to have, by the way, because his language seems so conversationally straightforward, colloquial, easy enough to understand.  All of those characterizations of Frost’s language are accurate.  But subtle, I’m afraid, Frost really is.

 

So my old approach to presenting the poem in class was to say something like this:  “Yes, there’s a suggestion of the speaker’s uniqueness of choice in the poem—no doubt about that.  But reading the poem as being all about the wonders of such uniqueness gets contradicted by almost everything in the poem, from the title, which focuses not on the unique choice but on the choice not made, to the limited vision of the speaker, who therefore cannot judge anything about uniqueness or lack of uniqueness, to clear-eyed observations like ‘the passing there / Had worn them really about the same’ and ‘both that morning equally lay / In leaves no step had trodden black.’  So surely the idea that the speaker chose the unique path is just incorrect.”

 

Talk about putting people’s backs up!

 

I worked my way into an alternative approach, then.  I’d hear the standard response to the poem, and then ask, “How would you fold ‘the passing there / Had worn them really about the same’ into that reading?”  And similar questions for the other evidence—that’s what the terms of the poem give us, after all: evidence.

 

Students would wrangle with the questions, come to whatever conclusion they thought most appropriate.  Ignore the evidence, though, they couldn’t—I’d insist on that, for sure.  Ultimately I hoped that they would come to my conclusion, that the poem is fundamentally about nostalgia for a past in which the full spectrum of future options is still wide open, and of course about the wistfulness that comes when one is older, because the choices one has made foreclose such openness, so that one is more or less stuck in what one has chosen.  I know—“stuck” is pretty pejorative, and it’s often the case that one’s path is wonderful.  Nonetheless, the poem suggests, one is stuck on that path, wonderful though it may be.

 

But in fact, I came not to really care about students’ opinions coinciding with mine (even though, of course, I was right ☺).  The real point was to get students to think through evidence, including the give and take in the class discussions as other people’s opinions were articulated, come to sustainable conclusions, and then write essays that supported those conclusions, using the evidence adduced to reach them.  If research sources figured in a given assignment, then the evidence of those sources would come into play as well.  But the important thing, I concluded, was the process, not necessarily the “correctness” of the opinion expressed.

 

I’ve had an online presence, as they say, for many years.  I think my first encounter with the world of online discussion boards was back in the early ‘90s—my memory being what it is, I can’t date it more narrowly.  In those many years of observing and participating in online discussions, I’ve experienced many many good things.  I’ve also seen many localized world wars, mostly on the basis of an opinion asserted as I used to assert the correct reading of the poem, or on the basis of misunderstood irony or sarcasm or humor or . . . fill in your favorite linguistic shtick.  Those have been painful to see and to participate in.  Emoticons don’t work all the time, and as many a literary scholar will say, irony and sarcasm are the most difficult verbal structures to convey.  As Robert Scholes says, to know that “the majestic gingkoes of Brooklyn” is an ironic phrase requires knowledge of what gingkoes look like.  Add the brevity that online communications almost demand, and the result is . . . localized world wars.

 

Maybe we need an online version of the UN?  But then the SCOTUS might conclude that such intercession is prohibited by the First Amendment.

Sunday, January 7, 2024

Tit for Tat

 Politics recently has become a ferocious, often silly game of tit for tat.  You impeached our president, well bless your heart, we’ll impeach yours.  You accused our guy of grifting, well honey how about your guy.

It’s silly, but profoundly dangerous.  Because I’m interested in what sometimes is grandly called the life of the mind, I’m really concerned about the tit for tat as it attacks thought, speech, ideas generally.

 

I come to the question from the point of view that free speech is the bedrock of any democracy, so my perspective may or may not be your cup of tea.  But as I see it, the desire to regulate speech, and via speech to regulate thought, leads inexorably to totalitarianism.  It’s no accident that the most dangerous element of the society in Orwell’s 1984 is the thought police and its key tool, the memory hole.

 

The memory hole makes the requirements of the immediate present the only criterion for what is true and what is false.  In doing so the memory hole also conditions the past so that it meets the requirements of present exigency, but its main function is to make the present the sole criterion of what is or is not acceptable.

 

With that in mind, I think about actual efforts to make the present, or at least the present as it’s filtered through some ideological sieve, the basis for considering the past.  In Florida, for instance, thanks to the ideological filter of the far right and what I see as its hysterical fear of the truth, schools are prohibited from teaching the truth about slavery as it was practiced in the US.  The notion promulgated by the state, that slaves were more or less content with their condition, is so very very far from the truth that the whole of the past has to be rewritten, the truth erased, in order to meet the requirements of the ideology that dominates the state government.

 

The process is not quite a memory hole, to be sure, but the next best thing.  The actual past is still available to any student willing to do even the most cursory research outside the bounds of what the schools are allowed to teach.  The effect for the great majority of kids in the schools of Florida is nonetheless to falsify the past according to the requirements of the present dominant ideology.

 

The same is true on the other side of the ideological coin.  It was not that long ago that the left wing in the US decided that The Adventures of Huckleberry Finn, when it wasn’t simply banned in schools, had to be censored so that every instance of the n-word was replaced by the word “slave.”  There is an edition of the novel, edited by Alan Gribben and published in 2011, that does exactly that.  Prof. Gribben’s intention was good.  He was concerned that an unredacted version of the novel would simply be banned so that kids would not have ready access to it.  And that would have been a shame, of course, because Huckleberry Finn is as essential to understanding the US as is the Constitution.

 

But substituting the one word for the other simply falsifies the past.  It is just a matter of fact that Americans have constantly used the n-word.  Whites use it to diminish and subordinate Black people; Black people use it to reappropriate the word and assert their own position in the world.  Substituting “slave” for the word falsifies that fact.

 

It also substitutes a legal, indeed a Constitutional condition for a social reality.  As David Sloane points out in “The N-Word in Adventures of Huckleberry Finn Reconsidered,” the substitution makes it impossible to hear the point or understand the significance of “the climactic assertion by the recovering Tom that Jim ‘ain’t no slave; he’s as free as any cretur that walks this earth!’”  I think that Tom here squints across the legal and the social realities of being Black in America before the Civil War.  But surely the legal status of a slave does not equal the social import of the n-word.

 

The left wing’s motive in erasing painful terms is to protect the sensibilities of a present-day reader, a mirror image of the right wing’s motive in banning books that trace the painful past.  For me, censoring Huckleberry Finn is about the same as banning Toni Morrison’s The Bluest Eye.  Both actions do what the memory hole does.  They both seek to impose a present valuation on the past—in Morrison’s case a relatively recent past at that.  The result is to destroy the reality of the past.

 

I suspect that the banning of books by right wing folks takes its initial impetus from the censoring of books by left wing folks.  It’s the right wing tat for the left wing tit.  But as far as I’m concerned, the two wings lift the same bird.

Thursday, January 4, 2024

A Foreign Country

William T. Vollmann's review of This Is Not Miami, by Fernanda Melchor—in the New York Times Book Review for 14 May 2023—juxtaposes two phrases that seem to me to require more than the mere pass-through implied by the lone conjunctive statement that links them:  "And never mind."  First Vollmann correctly says that Hernán Cortés is a "conqueror-torturer."  Then, on the other side of the "never mind," equally correctly, he says that the people whom Cortés conquered-tortured were "human-sacrificing Aztec overlords" (13).

What I think needs unpacking in those two phrases is the alienness of the past, which seems to be impossible for contemporary folks in the US to consider, accept, and contend with.  From the left side of the political spectrum, the past is always the subject of outraged moral judgments that make the behaviors of the past so reprehensible that they are perceived to be entirely alien to the present—so beyond the pale that the only possible response to the past is pure opprobrium.  From the right side of the political spectrum, the past is an anodyne story of greatness, when men were men and women weren't, and America was great.  The inevitable conclusion to that perspective seems much like Buck Turgidson's response to Soviet ambassador Alexi de Sadesky's description of the Doomsday Machine:  "Gee I wish we had us one of those."

Maybe because I come to the question of the relation of past to present from the perspective of the left, it seems to me that the right side of the equation is simply nonsense.  Just one day of living in the past as it really was would make even the most rabid of the MAGA crowd think again.  Smallpox, anyone?  How about the complete subordination of women to men, so that a woman could not own property under her own authority, or have her own bank account, or . . . fill in the blank for whatever right you value the most.  Want to return to the day when a Black man even glancing at a white woman led to a lynching?  Or to a time when Native Americans who objected to their own exploitation were slaughtered and then their bodies dumped in an oil barrel?  If you see those behaviors as fine and dandy, then you're welcome to join the neo-nazis.  But please stay out of the present.

From the perspective of the left, it's important to recognize the validity of L. P. Hartley's comment in The Go-Between, that "The past is a foreign country; they do things differently there."  That truth just cannot be dismissed by the cursory "And never mind" of Vollmann's review.  Seeing things from the left, as I said, I know only too well that it's difficult to curb the impulse to judge the past by terms that are current in the present.  But the past, in its different epochs and its different cultural milieus, has its own evaluative criteria.


Foucault argues that history proceeds by discontinuities rather than by narratives of the same.  And those are the grounds on which I see the connection between Cortés and the Aztec overlords.  It is a necessary and uncontested aspect of Aztec ways of being, not only that humans should be sacrificed to the gods, without which practice the whole universe would be endangered, but that the struggles of the gods should be reenacted in the game of ulama, a "ball" game in which the heads of the losers became the "ball" itself.


No doubt we in the present recoil at that practice, just as we recoil at the practices of the Spanish conquistadores, whose actions against the natives of the so-called New World are disgusting.  And yet, just as the practices of the Aztecs were necessary for maintaining the universe, so too the practices of the conquistadores were essential to the expectations of their culture.  I am not at all persuaded by the idea that Europeans came to the New World in order to spread the truth of Christianity and save the souls of the poor benighted "natives."  Pelf and wealth seem to me the clear underlying motives, as is the case with Cortés, certainly, whose exploits converted him from mere soldier to the Marqués del Valle.  But the way that the Spaniards approached the natives simply reflected the way that the Reconquista of the Iberian Peninsula took place, in the same way that the treatment of Native Americans by the English settlers of North America reflected the way that the "Plantation" of Ireland took place.

The underlying point is pretty simple:  people do what people are supposed to do, and what they are supposed to do is given by the complex intersection of what Foucault calls discursive practices that form the grounds from which a culture springs.  Our discursive practices arise from moments of disruption, again according to Foucault, which make the past the "foreign country" that Hartley writes about.  If Cortés or Montezuma had behaved as we do in 2023, they would have been housed in an insane asylum—of which, of course, there were none in 1520.  Instead, they would have been treated by their respective priesthoods for possession.  Conversely, if St. Joan of Arc were with us today, she would be in an insane asylum rather than in the panoply of Catholic saints.

Properly evaluating the past becomes a serious problem, then, because it seems inevitably to lead to an ethical relativism that smacks of weak-kneed liberalism, in the older, mid-twentieth-century sense of liberalism as something that makes it impossible to make any judgments at all.

Perhaps that is true.

But I like to think of it differently.  A solid consideration of the past requires very careful study of the past and its difference from the present.  I think of Ta-Nehisi Coates's problematic response to Queen Nzinga in Between the World and Me as a case in point.  His initial response to her power is to admire her because, when the Dutch ambassador with whom she was negotiating tried to humiliate her, she showed "her power by ordering one of her advisers to all fours to make a human chair of her body" (45).  In the ruthless power games of the 17th century, Nzinga outperformed the Dutch man.  She was for Coates, then, "a weapon" to wield against the equally ruthless power games of twentieth-century American race relations.

In his first response to the Queen, Coates was a bit like the MAGA crowd, seeing in the practices of the past a mirror to what the present ought to be.

And then Coates thought again—or rather, he took a class in which his professor, Linda Heywood, reframed Queen Nzinga's behavior.  When Heywood "told the story of Nzinga," says Coates, "she told it without any fantastic gloss, and it hit me hard as a sucker punch."  What hits him is that, in the reality of the present, he would not be Nzinga's avatar but rather would be like her adviser, "broken down into a chair so that a queen, heir to everything she'd ever seen, could sit" (54).  That idea runs so deeply counter to the practices of the modern world that it produces a response equivalent to my response to Cortés or Montezuma.  Revulsion.

I think, however, that neither the initial nor the secondary response is a full one.  Yes, Cortés and Montezuma and Nzinga behave in reprehensible ways.  But they behave in ways that are essential and necessary.  If Cortés is going to perform as he is expected to perform; if Montezuma is going to act as his religion requires him to act; if Nzinga is going to overwhelm her antagonist, then all three must, absolutely must, behave as they do.  My judgment that such behavior is simply unacceptable in the modern world is, ultimately, irrelevant to the historical moment of 1520 or 1640.  I am not being a relativist, in other words, but a contextualist, fully aware of the evil of the past but also fully aware that such a judgment is couched in terms of the present.

Free Speech

 I understand the argument that free speech is an essential part of any democracy. I'm 100% in favor of free speech. Can't live without it, really. At the same time, I'm also very much aware that free speech provokes—or rather "encourages," a more neutral term—response, which is equally free in its expression.

Such responses are sometimes very sharp, but they are always framed by social expectations, which is to say whatever has become standard social practice. Once upon a time it might have been hilarious to make "cripple" jokes, and for some people it might still be ok to laugh at them. But the consensus seems to be that such jokes are reprehensible, so that when dearleader mocked the reporter, Serge Kovaleski, who suffers from arthrogryposis, most people, dearleader's minions aside, were at least disgusted by the then presidential candidate's behavior.

There was pushback. To be sure, dearleader and his epigones shrugged off the pushback. But there has not been a recurrence of such "joking," so perhaps the pushback had an effect.

That's the way that free speech works. You say X, which provokes me to say Y and elicits Z from Joe Blow, and X' from . . . . In the interaction of statements arises the very soul of democracy. As Milton says in "Areopagitica," "Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good men is but knowledge in the making." Milton puts a great deal of weight on the optimistic idea of "desire to learn" and of the beneficence of "good men," an optimism limited to Protestants since he explicitly excluded Catholics from the orbit of free speech.

We might find such an exclusion ridiculous—although certainly it was still a crucial aspect of American culture when JFK ran for office. But given the religious wars of Europe in the 17th century, excluding Catholics in a Protestant country was as expected as excluding Protestants in a Catholic country. So too Milton's use of "men" to mean "humans," since "men" in 1644 meant all sexes. Given the standard social practice of the 17th century, no one found the term offensive, as people might well do in the first third of the 21st century.

Sometimes the standard social practices of a period get formalized in "speech codes," as they're called. Milton's exclusion of Catholics might be a "speech code" of sorts. In 17th century England, the exclusion of Catholics also had the force of law. But in the 21st century, speech codes have no legal force, so that Milton's use of "men" to denote all human beings, offensively patriarchal though it might be to a great many people nowadays, is still in use by a great many other people, and there is no law to proscribe such usage.

The same thing goes for a great many other offensive terms and practices. No one with the slightest sense of decency would use the "n-word" to denote people who are not of European descent. But people still use that word, and although there is much much pushback against such usage, there is no law that prohibits it. Instead, as one of the benefits of free speech, the usage of the term by some people defines the character of those people. If under the license of free speech you use the n-word, and so tell me who you are, I will believe that you are that person. And I will speak against you.

The issue becomes more complicated when institutions are mixed into the equation. Dave Chappelle got a great deal of pushback for his "jokes" about gay and transgender people, for instance. That pushback from people is part of the normal interaction of free speech with other free speech. But then institutions got involved and, in Chappelle's own terms, he got "canceled," caught in the web of "cancel culture."

I realize that comics are particularly at risk in the give and take of free speech. Freud argues that the best jokes, in the sense of the most laughter-provoking ones, are what he calls "tendentious," attack jokes. Aristophanes attacks male arrogance in "Lysistrata," philosophical arrogance in "The Clouds," dramatists' self-importance in "The Frogs," and so on. They are hilarious plays. The objects of attack no doubt responded to Aristophanes. Still, I imagine that Socrates would have preferred "The Clouds" to the glass of hemlock that the state of Athens required him to drink.

My sense is that "cancel culture" responds to the pressure of standard social practices, fair or unfair as those practices may be. Chappelle says that corporate entities are more responsive to the pressures of the LGBTQ+ community than they are to the experience of Black people, so that Black entertainers who mock the LGBTQ+ community are "canceled" while Black citizens get shot. There may be some truth to the influence of the LGBTQ+ community, but nonetheless I think Chappelle is mixing oranges with apples. Killing people requires a legal response, whereas making jokes evokes the pushback of free speech in response to free speech. Breonna Taylor and George Floyd and Daunte Wright, and so many many others are martyrs. Chappelle is not.

"Cancel culture" is what happens in the give and take of free speech when it's a corporate entity that participates in the exchange of speech. Al Franken's "molestation" photograph led to his resigning from the US Senate, for instance. Sarah Silverman's black-face episode led to her disappearing from screens. Is it legitimate for corporate entities to "speak" as individuals might? The SCOTUS asserts that corporate entities have free speech rights, so legal practice makes it perfectly acceptable for corporations to respond to free speech in whatever way they wish. To my mind it would make better sense for the corporation to speak and argue rather than to silence and "cancel." But that's not the way corporations seem to want to respond.

We do not have an Athenian jury condemning comics or philosophers to death. "Cancel culture," unfortunate as I think it to be, is an effect of free speech countering free speech. To be sure, we do have efforts to substitute state power for free speech. All you have to do is consider the laws promulgated by the Florida state government. Unlike Chappelle, whose "cancellation" followed from his "joking" about the LGBTQ+ community, a teacher in some public school in Tampa or Jacksonville or Pensacola could end up in jail were they to say "gay." That is not quite a glass of hemlock, but certainly it is cancellation with a vengeance. When state power intervenes, free speech ends.

Sex and Gender

The question about sexual difference seems always to come as an absolutist either/or: are there more than two sexes, the inquisitors ask, and assume that the poor object of the query will mumble some nonsense, or say something that the inquisitor will smirk into nonsense.  I'm thinking of Piers Morgan here, but there are others.


Let me agree that on the whole, generally, usually, there are just two sexes.  Some beings have organs that produce eggs.  They are female.  Some beings have organs that produce sperm.  They are male.  Sometimes a particular being has ambiguous genetics and so the organs are less than obviously sperm or egg producers.  There are also true hermaphroditic beings, snails and earthworms for example, with organs that produce both eggs and sperm.  And there are creatures that change their sex, like Nemo—I mean, clownfish—and parrotfish.  True hermaphrodites as well as protandrous or protogynous critters are pretty rare, however; the sex changers are male when their organs produce sperm and female when they produce eggs.


In short, on the whole, generally, usually, as Milton says, male and female are "the two great sexes that animate the world."  Animate there is quite literal in Milton’s use, I think—the two great sexes bring anima, spirit, life to the world.


Of course Milton did not know about the millions of species of asexual bacteria and viruses, lichen and some fungi and ferns, which bring life into the world without egg or sperm, so without the two great sexes.  But that’s a different story.


For the grand inquisitor who wants to assert the either/or of sexual difference, however, the problem is that in humans, at any rate—and no doubt in other creatures as well, although I’m not ethologist enough to assert as much—in humans, sexual difference is only one aspect, one register, of sexuality, and so of sexual identity.  The biology of sexual difference intersects immediately with a whole range of other issues—social, psychological, familial, legal, religious . . . .  An infinite number of registers in which sexuality is elaborated.


Those complexities of sexuality are not a matter of sex, or rather only very partially a matter of sex.  The whole complex of issues that invest sexuality, taken all together, produces what we call gender.  Gender is to sex as a symphony is to an octave.


I suspect we’re all aware of that difference between gender and sex.  Even the most dualistic of inquisitors will refer to a “feminine” man or to a “masculine” woman.  In those phrases the adjective in quotation marks refers to gender, and the substantive modified by the adjective refers to sex.


A quick stroll through Freud’s Civilization and Its Discontents, which I invoke quite deliberately with a sort of malice aforethought, yields the following gender characteristics.  Feminine is passive; masculine is active.  Feminine focuses on family; masculine seeks out friendships.  Feminine is therefore of the household; masculine is of the society.  Feminine comforts and supports; masculine commands.  In the fruity phrase of Satan’s erroneous vision when he first sees Adam and Eve in Milton’s Paradise Lost, “He for God only, she for God in him.”


Such gender stereotypes may well attract the perhaps satanic and certainly Freudian imagination of MAGATs, since clearly they limn out a social universe of difference that defines for them when America was great.


But what Freud offers is as absolutist an either/or dualism as the either/or of sexual difference itself.  Freud in effect maps sex onto gender in a one-to-one relationship.  That dualism is absurdly reductive.  It does not allow, for instance, for a woman who is ferociously dominant in the workplace but gently comforting with her children, or for a man who gladly serves as boss in the workplace but is submissive in the bedroom.  Gender gets expressed in so complicated and nuanced a series of behaviors, attitudes, and mindsets that the Freudian reduction of gender is as false as Satan's dichotomy of sex in Paradise Lost.


Via Freud's absurdity I mean to suggest the absurdity of most current considerations of gender.  Or should I say the disappearance of gender from current discourse.  Recently, it seems, gender gets absorbed into the dualism of sexual difference.  When the grand inquisitor asks about how many sexes there are, bullying the poor object of the inquisition, he does not want to hear about the myriad possibilities of gender identities.  He does not want to hear anything remotely like the complex reality of human existence.  He wants to hear only about the duality of male and female.  He certainly doesn't want to hear that just as the modern world has made it possible for humans to fly, so too the modern world has made it possible for a fully mature male to become like Nemo and turn into a female, or for a fully mature female to become like a parrotfish and turn into a male.  Yes, to be sure, Mr. Inquisitor, we're not at the point where the new male or the new female has the organs to produce eggs or sperm.  Maybe we'll never get to that point.  But then, though we humans now fly, we do not flap our wings to do so.


And definitely the grand inquisitor does not want to complicate matters even further with questions of sexual preference.  A transgendered woman loving a cisgendered woman?  A cisgendered man loving a transgendered man?  No thanks, says the inquisitor.  Gimme man or woman, either/or, and gimme man loving woman.  Gimme a world where Freud's dualism is the name of the game, and any variation is nothing other than perversion.


Such limited imaginations, such falsification of reality, such assertion of socializations as absolute truth would be laughable were it not so dangerous.

Monday, October 17, 2022

Generations

Boomers are guilty of inventing “generational difference” as a valid way of dividing people from people.  Well, maybe Boomers were not the first inventors, because there are signs of such nonsense in the 18th-century habit of denoting “the last age” as so different from the wonders of the Enlightenment that the two “ages” simply had nothing in common.  Still, it’s the Boomers who invented the generational cry “don’t trust anyone over thirty”—or what would have been a meme had there been such a thing back in those antique days of yore.

 

But even back then generational differences were just so much nonsense.  I remember lots of people my age who were as aggressively pro-Vietnam War as “we” who were definitely against it; as many go-get’em protocapitalists among my peers as among the “over thirty” crowd; as many folks in my age group who cared nothing at all about the ecosystem as folks who were all for the first Earth Day and beyond.  By the same token, as I think back to the “Jews will not replace us” crowd at Charlottesville, I note that almost none of them seemed to be anywhere close to being Boomers—and the same for the crowd at the Jan. 6th event, for that matter—whereas those arguing against the Charlottesville rednecks and the Jan. 6th insurrectionists include many Boomers.

 

I have no basis for thinking as I do, but nonetheless I will assert that there are jerks and assholes as well as wise and compassionate people in every generation.

 

That’s not to say that generational differences are null and void as ways of analyzing how people behave.  Shared experiences produce shared perspectives and behaviors.  People who were faced with the challenges of the 1940s shared a commitment that people who grew up in the 1960s simply did not have.  That difference in experience no doubt produced differences in behavior.

 

At the same time, the scarcities of the Great Depression and WW II compared to the foison plenty of the 1960s did not necessarily produce predictable outcomes.  My stepfather, one of the generation raised in the 1930s during the Great Depression and gone to war in the 1940s, scrimped and saved throughout his entire life.  But then, I, raised during the great times of the 1960s and gone to college in the 1970s, scrimped and saved throughout my entire life as well.

 

Scarcity vs. plenty did not produce difference in behavior.

 

On the other hand, my stepfather was much into command and control, whereas I am absolutely not at all in that ballpark—even in that game, for that matter.  Is that difference a product of the difference between being raised in the 1930s as opposed to the 1960s?  Maybe.  Maybe not.

 

At any rate, I think that any generational truism, about any generation at all, must absolutely be undermined.  Are Boomers selfish?  Sure.  They are also quite generous.  Are Gen Xers grumpy?  Sure.  They are also quite sweet.  Are Millennials lazy?  Sure.  They are also quite driven by values.  And so on.  Similarly, some Boomers are grumpy and lazy, and some Gen Xers are selfish and lazy, and some Millennials are . . . .

 

The only thing that the blanket application of generational notions accomplishes is to divide—and as the very old generation of Romans used to say, divide et impera, divide and conquer.  In each historical period, for each generation, it’s always the same category of people who cherish such divisions.  You can see who they are at a glance.  Without exception, they chortle their way to the bank.

Thursday, September 22, 2022

Empire

It’s undeniable that Britain had the largest empire of the last round of empires in the history of the world.  And it’s also undeniable that the cost of empire was borne by the subordinated peoples, wherever those people might find themselves.  For me it’s also self-evident that the ideas of empire and of monarchy are irrelevant in the modern world, not just because empire produces misery for the subordinated and because monarchy produces ridiculous ideas of genealogical superiority, but also because both empire and monarchy make it hard, if not impossible, for equality to flourish.

At the same time, it seems to me that the past can’t be judged on the basis of the present.  Judging it that way is the latest academic fashion, presentism, which sees the whole of the past from the perspective of what is currently accepted as true and undeniable.  When I was a young whippersnapper, the academic fashion was exactly the opposite.  We set about estranging the past, as we called it, in recognition that things back then were just absolutely different from things nowadays.  That was not intended to excuse horrible behavior in the past, but rather to recognize that the horrible behavior was the product of complicated social interactions.

 

Some of my favorite passages in Ta-Nehisi Coates’s Between the World and Me are what he says about Queen Nzinga, whom he initially admires because of the power she exercised in response to the insults directed towards her by Dutch traders.  Later he reconsiders the admiration when he recognizes that she exercises her power by making one of her retainers into a human chair so that she, “heir to everything she’d ever seen, could sit.”

 

I love those references in two different ways.  First because Coates finds a source of pride in his past.  That seems to me pretty important for everyone.  Second, though, I love the references because they also point to the idea that the past is different from the present, and that it’s also important, maybe crucial, to understand the difference.

 

So was Britain—indeed, all of Europe from 1492 to the present, which continues to express European hegemony—horrible in exercising imperial control over the whole of the world?  Yes indeed.  Can we, or rather should we, judge that horror from the perspective of the present?

 

Yes and no, I think.

 

The yes depends on recognizing that folks in the past are as ethically engaged as we are.  In Notes on the State of Virginia, for instance, Thomas Jefferson lays out pretty clearly just how evil slavery is, for the slaves as well as for the slave owners.  He knows that the practice of slavery from which he benefits is horrible.  But he continues to own his slaves and to rape his slave mistress and then enslave his own children.  In terms of his own ethical framework, he’s positively evil.

 

The no depends on recognizing that folks in the past contend with relations among each other that dictate behavior.  I don’t know enough about Queen Nzinga to say that her own ethical framework condemns her behavior.  But how else is Nzinga going to demonstrate her power to the Dutch?  How else is Jefferson going to demonstrate his social standing to the other plantation owners of Virginia?  Again, I don’t think that recognizing those differences from the past means that I excuse Nzinga’s objectification of her subjects or Jefferson’s commodification of human beings.  But Nzinga’s royal standing makes it possible for her to reject the Dutch insolence, and Jefferson’s patrician privilege makes it possible for him to write the Declaration of Independence.

 

So too the European imperial framework, I think, albeit more complexly and more fraught with a mixture of evil and “good.”