Friday, May 27, 2016

First They Came for the Politically Correct ...

Now, this is interesting; an exchange between Atlantic blogger Conor Friedersdorf and an anonymous 22-year-old Donald Trump supporter from San Francisco.  But this bit from Friedersdorf's correspondent seems to say a lot:
In my first job, I mentioned that I enjoyed Hulk Hogan to a colleague who also liked the WWE. I was not aware at the time, but Hogan had recently made news for his use of some racial or homophobic slur. I was met with a horrified stare. By simply saying I liked his showmanship, I was lumped into saying I too was racist or homophobic.

I feel like I have to hide my beliefs.
I've noticed that PC Criers, whether they're Republican or Democrat or something else, are very fragile snowflakes. Notice that it's not clear he got anything more than a "horrified stare" (maybe I'm wrong, but that's all he mentions), yet he knows he was "lumped into saying [he] too was racist or homophobic."  So even a "horrified stare" (also his interpretation) makes him "feel like I have to hide my beliefs."  Wow. Even if his interpretation was correct, I've always gotten such lumpings from the Right.  (Oh, you don't think we should invade Iraq? I bet you'd like to see Osama Bin Laden as President!)  Now, think of the good old days (which are still with us in most of the United States) when you could lose your job for being gay, and you couldn't even get certain jobs or go to certain schools if you weren't white. Think of the physical attacks that targeted civil-rights demonstrators, or the school kids who had to be guarded by armed soldiers to enter a previously whites-only public school. And even then they weren't safe.  But this guy is too tender to cope with political disagreement. He talks as though he shouldn't encounter any expression of disagreement at all.

What's bitterly funny is that this twerp believes that Trump is more tolerant than the people he calls "PC," and he likes the idea that if Trump became President he'd be safer from PC (which is white-person politically-correct jargon for disagreement). He's talking about a guy who encourages vigilante violence at his rallies, remember. "... I think Trump would likely do what he can to protect free speech" is merely delusional, much like the fantasies many Democrats had about Obama in 2008. I think this whiner might find that he won't be as safe as he likes to think in a Trump America.  He has an Asian-American girlfriend, for example, and Asian-Americans have been targets of racist violence by nativist thugs in the recent past; they could be again.  I think things would likely get worse, but he doesn't mind as long as someone else is the target: "Admittedly," he writes, "I do not focus on the human cost either."  I'd rather not find out. Things are bad enough already.

This doesn't mean I'm not hostile to the authoritarian tendencies among many liberals and "progressives," including, alas, the gay movement. I've often written and spoken out about them. The typical "Oh how can you say such awful things!?" liberal reaction to offensive statements deserves scorn, and I give it. We need more and better debate and discussion, and people need to inform themselves, and we all need to speak out more without trying to punish those we disagree with, even when we think they're badly wrong.

Tuesday, May 24, 2016

The Bengal Famine Commemorative Tea Towel

https://www.keepcalm-generator.com/
Keep Calm and Click On It, you know you want to.  But maybe not; I suppose and hope this meme has passed its sell-by date by now.

I'm reading The Ministry of Nostalgia by Owen Hatherley, published earlier this year by Verso, and so far it's very good.  (An excerpt here; you can order the book directly from the publisher, including the e-book at about half what it would cost from Amazon.)  Among other things, I've learned where the original Keep Calm poster came from (the British propaganda ministry in 1939, though it was never used officially, and only surfaced at the end of the twentieth century).  Hatherley also discusses "hauntology," an ambivalent alt-music subgenre mixing nostalgia with anxiety, which sounds interesting.  And to give credit where it's due, I stole the title of this post from page 72.

But what I want to write about today is a thought that occurred to me soon after I began reading The Ministry of Nostalgia.  There is a lot of nostalgia (or "amnesia turned around," as the poet Adrienne Rich called it) for the World War II period, in the US no less than in the UK.  It occurred to me how odd this is, when the nostalgia is expressed (as it often is) by people who see government as a danger to freedom.  The war years were a time of Big Government on joy juice: enforced conformism on a scale that you only see in that kind of war, with censorship, rationing of food and other goods, wage and price controls, limitations on union activity, and a general disregard for individual freedom in the name of a greater collectivist good. Winston Churchill had to cope with looting by civilians and by officials during the Blitz.  Of course it was also a time of resistance, with black markets in consumer goods, complaints (however muffled and cautious) about restrictions on travel, and profiteering by business and individuals when they could get away with it.

Which reminds me that the genuine threat of external danger didn't entirely stop people or organizations from looking for someone they could bully and attack at home.  For example, Manning Marable wrote in his biography of Malcolm X (Viking Press, 2011),
In response to blacks' modest gains in employment [in the 1930s and 40s], thousands of white workers participated in "hate strikes" during the war years, especially in skilled positions. In July 1943, for example, white racists briefly paralyzed part of Baltimore's Bethlehem Shipyards. In August the following year, white streetcar drivers in Philadelphia, outraged at the assignment of eight black motormen, staged a six-day strike. In response, Roosevelt dispatched five thousand troops and issued an executive order placing the streetcar company under army control [56].
To say nothing of the Detroit race riots, also in 1943, when white thugs ran wild for three days, killing 25 African-Americans and doing millions of dollars' worth of damage. (Seventeen of the blacks were killed by police; none of the nine whites who died were killed by police.)  Time magazine's retrospective account is interesting:
As a matter of fact, the Axis propaganda machine predictably jumped all over the news of America’s 1943 race riots, citing them as evidence of a corrupt, weak and fatally divided culture. (A few years later, of course, that corrupt, weak, fatally divided culture emerged from the war victorious and more powerful than any other single nation on the planet.)
That victorious, powerful culture remained fiercely racist and imperialist, both in policy and in popular attitudes.  But Time had to wave the flag after criticizing the rabble in Detroit (and in Texas, Alabama, and California) for giving aid and comfort to the enemy; they wouldn't want to go too far.

My point here is not so much the unreliability of nostalgia, which isn't exactly news, but that people who indulge in it are longing for a time they'd have hated to live in, and to some extent they're aware of it.  The culprits in England include the supposedly anti-government Thatcherites (though Thatcher was all for repression on her terms).  The Bush administration, like the Johnson administration planning its war in Vietnam, knew full well that it couldn't prosecute the War on Terror by asking Americans to make economic sacrifices -- indeed, Dubya urged Americans to do their part by shopping.  As Hatherley shows, the austerity of the war years is very different from the austerity we are being asked to accept today.

More to my point, Hatherley spends a lot of time on austerity nostalgia among the Left, though it's not uncontested there.  Hatherley quotes Perry Anderson's critique of the leftist historian E. P. Thompson:
Anderson claimed that when looking at the working class of the present, Thompson could see only that of the past: 'the divorce between his intimacy and concord with the late 18th and early 19th centuries, and his distance and lack of touch with the second half of the 20th century, is baffling.  It is a divorce that is evidently rooted deep in the sensibility'.  The composition of the working class -- who they are, what they do for a living, where the political fault lines might lie -- is increasingly ignored, in favor of a vague, windy, imprecise notion of 'The People' -- and 'who the "common people" are is never said.  They exist only as figments in this moralistic rhetoric.  The fact that the majority of the population in England in this period voted consistently for Conservative governments is brushed aside' [49-50].
We have the same problem in the US, where the phrase "We the People," used as if it were a single word all across the political spectrum, always makes me wary.  The cult of Woody Guthrie is, I think, an example of leftish austerity nostalgia here, epitomized by Bruce Springsteen in my generation.  (In England, the singer Billy Bragg recorded a couple of albums of Guthrie's lyrics with tunes Bragg supplied.)  Hatherley mentions Bragg as an "early adopter of austerity nostalgia in the 1980s," who in his memoir The Progressive Patriot "did not flirt with racism in the way that many of these writers have done; the 'patriotism' that he refers to was that of tolerance and multiculturalism" (37).  Compare the attempts of many American liberals, progressives and others to reclaim patriotism for the Good Guys "without [as Hatherley says of their British counterparts] partaking in any of the 'sad passions' that actually makes much of the right's politics so powerful -- resentment, hatred, bitterness" (38).

So, halfway through The Ministry of Nostalgia, I'm enjoying it and very pleased.  It's polemical and very quotable but still properly sensitive to ambiguity.  Someone should do a similar book about the same syndrome here in the US. 

Monday, May 23, 2016

Neo-pro, Neo-con

Daniel Larison recommended this article that attacks a claim made by a pundit for a respectable media outlet "that neoconservatives have been part of a broad foreign policy consensus dating back to the ’50s."

You know, I'm not entirely sure what neoconservatives are.  The pundit, Eliot Cohen, makes some risible statements, for example that "the two-generation-old American foreign policy consensus ... held that American interests were ineluctably intertwined with American values, and that when possible, each should reinforce the other, as when the promotion of liberty and human rights helped to weaken the Soviet Union."  Oh, yes, we all know how "American values" promoted liberty and human rights around the world, and continue to do so.  But Paul Pillar, Cohen's critic, has his own blind spots.
Dwight Eisenhower's presidency was one of foreign policy restraint. Ike didn't dive into Southeast Asia when the French were losing, he didn't attempt rollback in Eastern Europe, and he came down hard on the British, French, and Israelis during their Suez escapade. Richard Nixon's foreign policy was characterized by realism, balance of power, and extraction from a major war rather than starting one. Ronald Reagan, despite the image of standing up to the Evil Empire, didn't try to wage Cold War forever like some in his administration did. He saw the value of negotiation with adversaries, and when faced with high costs from overseas military deployments (think Lebanon in 1983-84), his response was retrenchment rather than doubling down. George H.W. Bush had one of the most successful foreign policies of all, thanks to not trying to accomplish too much with overseas military expeditions, and to his administration being broad-thinking and forward-looking victors of the Cold War.
I suppose most of these statements could be explicated in ways that would make them less absurd than they are at first glance, but that's because Pillar is overlooking, deliberately or through ignorance, facts that would complicate them, and perhaps undermine his argument.

Take his account of Eisenhower, who continued Truman's policy of massive support for the French war in Indochina, analogous to Obama's support for the current Saudi blitzkrieg in Yemen. True, when the French gave up Eisenhower didn't "dive in," if that's supposed to mean a direct invasion by US forces.  Instead Eisenhower undermined the political settlement that followed, by bringing in and supporting a viciously repressive client, which soon led to resistance by the Vietnamese and ultimately (less than a decade later) a direct US invasion by Eisenhower's successor.  Eisenhower also used covert action to overthrow governments that he considered insufficiently cooperative with US "interests."  Guatemala and Iran were what he considered successful interventions, both involving the installation of singularly brutal dictatorships that the US supported for decades; Indonesia was such a failure that his administration did their best to ensure it would be forgotten, with considerable success.  It's currently fashionable to whitewash Ike, but his main success was minimizing US losses, while maximizing casualties in the countries he chose as targets.

As for Nixon, his "extraction from a major war" didn't happen.  He extended and escalated the war in Vietnam while starting a new one in Cambodia, again with minimal US losses and maximal losses among Cambodians.  I suppose Pillar has in mind Nixon's "Vietnamization" program, which was supposed to turn the work of waging the war over to South Vietnamese forces, but the US remained involved in Vietnam throughout Nixon's tenure, and only got out under his appointed successor Gerald Ford.

Reagan, like Eisenhower, preferred "covert" (meaning, not publicized in the US but well-known elsewhere in the world) and proxy activity, but his first impulse was different.  (Why were US troops in Lebanon to begin with, for example?)  Bush the Elder invaded Panama and Iraq, again with minimal US casualties but maximal Panamanian and Iraqi losses.  His disinclination "to accomplish too much with overseas military expeditions" presumably refers to Bush's initial promise to support the Iraqi uprising against Saddam Hussein immediately after the Gulf War, and his subsequent inaction when that uprising was put down with harshness typical of US clients defending their turf. (It would not have required a military expedition to support the uprising, by the way; but letting Iraqi rebels use "captured Iraqi equipment" against Saddam wasn't acceptable to Bush.)  Bush's supposed aversion to overseas military expeditions is also belied by the unseemly haste with which he reacted with military force to the Iraqi invasion of Kuwait in the first place.

So Pillar's critique seems to overlook important contrary evidence and considerations about the post-WWII US foreign policy consensus.  Whoever the neoconservatives are, US policy has mostly involved state terror, violence, direct military intervention when possible and covert intervention by repressive American proxies when discretion required it.  Whatever the neocons wrought, it seems to have differed from the consensus mainly in degree, not in kind.

Saturday, May 21, 2016

This Way to the Regress

I just bought a digital copy of Conversations about Psychology and Sexual Orientation (NYU Press, 1999), by Janis S. Bohan, Glenda M. Russell, and several other contributors.  I've read it before, and found it useful enough (if only to criticize) that getting the ebook felt worthwhile once the price dropped to a reasonable level.

This time I went first to Leonore Tiefer's contribution, "Don't Look for Perfects: A Commentary on Clinical Work and Social Constructionism."  Tiefer is a psychologist, a sex therapist, and a professor of psychiatry, and author of Sex Is Not a Natural Act (2nd edn, Westview Press, 2004), which I liked.  (It's about time to reread it, I guess.)  I was disappointed by the opening, under the header "Sexual Orientation: Oppression or Identity?"  (Don't you love false antitheses?)
Writing this commentary raises a great irony for me.  As a deep social constructionist, I see sexual orientation as an idea that emerged near the end of the nineteenth century as part of the new profession of psychiatry's effort to busy itself segmenting the behavioral and intrapsychic world into neat little boxes of normal and abnormal.  In my mind, the categories of heterosexual and homosexual cannot be separated from their historical origins -- everything else is rationalization and a more or less disguised fulfillment of that original psychiatric phase.

Fast-forward to 1998.  I am writing this commentary as a clinician, that is, a person who must and does think in terms of normal and abnormal (or else be a total hypocrite) in her or his work.  People consult me and listen to me because they have confidence that I can offer insight and advice based on some understanding of normal and abnormal.  The social changes of the past third of a century have erased the normal/abnormal dichotomy from sophisticated discussions of "sexual orientation."  Now, the term is merely descriptive -- whom does one love and desire, a person of the same sex or a person of the other sex (or both)?  The reality of the categories is taken for granted, and the big controversies are about etiology (which some might argue is a sign that the normal/abnormal dichotomy has not really been erased from sophisticated clinical discussions!) [77].
Tiefer does attempt in the succeeding pages to think of ways people might deal with their problems without the normal/abnormal dichotomy or a belief in an individual's "true sexual nature," ways which turn out to be fairly simple, intuitive and non-paradoxical.  They involve careful listening (a client-centered approach that is hardly new) and critical thinking.  I approve of these ideas wholeheartedly, but they don't have a lot to do with social constructionism.

Being a social constructionist, deep or shallow, doesn't by itself commit you to any particular historical narrative or to any specific construction of a category.  Tiefer's opening reminds me of a biologist I debated in the 90s, who said that his training caused him to seek biological explanations for every aspect of human behavior.  I responded that his training was, on his account, unscientific if science is supposed to seek knowledge without preconceptions; ruling out non-biological explanations in advance of the evidence is as invalid as ruling out biological ones.  It seems to me that Tiefer is taking the latter tack here, motivated by the same kind of binary she's trying to reject, and by an essentialist notion of "psychiatry." 

I think her account is also inaccurate historically, for the same reason.  "Sexual orientation" as a concept didn't originate in the nineteenth century, so Tiefer appears to be assuming that the concept has a nature that persists through changes of theory and terminology.  "Homosexuality" has always been an incoherent concept, and the advent of the term "sexual orientation" hasn't made it any more coherent.  As I've pointed out before, "sexual orientation" is formally defined as the direction of one's sexual desires, but in use it refers to gender inversion, and overlaps confusingly with "gender identity" and other incoherent ideas.

Nor do categories and classifications necessarily involve any assumptions about "the reality of the categories," though it's true that clinicians, like most people, tend to forget this.  Consider a hypothetical categorical division, "people shorter than six feet tall" and "people six feet or taller."  The differences between the classes are "real" for some understanding of "real," but they aren't absolute. This classification might be useful for some purpose or other, and it would be perfectly valid to use it -- until the clinician or researcher began thinking of the two groups as mutually exclusive and different from each other by nature.  In practice that doesn't seem to take very long.  The same is true of commonly used dichotomies like "masculine/feminine," "Catholic/Protestant," "theist/atheist," or "bisexual/monosexual."  Alfred Kinsey tried to use "homosexual" and "heterosexual" in this neutral way, and encountered fierce resistance not only from clinicians attached to the essentialist use but from later sex researchers who saw themselves as working in his tradition but moving "beyond Kinsey."

"Normal/abnormal" is a prime example of this confusion.  If "normal" simply means something like "what most people do" and "abnormal" means "what most people don't do," there's nothing invidious or harmful in the binary.  You can be part of a minority, even a small one, and that's just fine.  But most people, "sophisticated" or not, have trouble sticking with that construction.  Sooner or later, they figure that if you're not jumping off the bridge with everyone else, you're a loser, a freak, uncool, sick.  If there weren't something wrong with you, you'd be hanging with or at least admitting the superiority of the cool kids.  It's worth remembering of course that the normal-as-normative is often a trait of the few.  The cool kids are always a minority, often a small one.  People are ambivalent about this, and their attitude is driven by factors other than numbers.

Anyone who wants to assume "the reality of the categories" needs to remember that in the real world, categories tend to be porous, often with very wide variation among the members.  Kinsey, who can be classified as an "Aristotelian," cut his professional teeth by studying gall wasps.  This brought him face to face with the problems of classification, and he brought that approach to human sexuality, intending to map the variation in sexual behavior rather than locate essences in it.  This essence-seeking approach, which can be classified as "Platonic," dominated the sciences in his day as it still does.  Kinsey's critics argued that he should have looked for essences of human sexuality (the "normal") rather than being distracted by range and variety, which they saw as surface distractions from the Real.  (See Peter Hegarty's Gentlemen's Disagreement: Alfred Kinsey, Lewis Terman, and the Sexual Politics of Smart Men [Chicago, 2013] for an intelligent discussion of aspects of this controversy.)  But from what I know of him, I don't think Kinsey himself, any more than Aristotle, was a social constructionist.  Recognizing and studying variety is perfectly compatible with believing that the variety is a feature of the "real" world.

Consider a case Tiefer mentions later, "a patient who recently came to me when he was increasingly preoccupied with sexual fantasies about (as well as a budding sexual relationship with) his secretary." Tiefer notes that "sexual identity was not an issue for him," though there is such an identity available for such a person, namely "adulterer."  "[But] the idea of 'true sexual nature' was..." (81).  As with the transsexual patients Tiefer discusses, it's possible for a clinician to be useful to a patient without invoking "normal" or "abnormal," which really aren't useful categories most of the time anyway.  If your patient is, say, hearing voices that urge her to kill herself, normality and abnormality are beside the point.  But ditto for a benign or neutral situation, like a person who doesn't conform to official gender norms.  It was surely "abnormal" in various senses for men to grow their hair long in 1960s America (and remember that "long" was highly subjective when a military buzz cut was the standard), but there was nothing wrong with it, nor did it really say anything about their "gender identities."

There probably is no correct answer to a question like "Should I have an affair with my secretary?" or "Should I leave my husband?" or "Should I seek sex/gender reassignment surgery?"  ("Sex reassignment" and "gender reassignment," by the way, express different conceptual understandings, though the procedure involved is the same.)   Patients asking these questions may be looking for an authority figure to make their choices for them.  According to some constructions of authority, that may be what such figures are for, but that shows just how uneasy many people are about the idea of choice.  They want to believe that outcomes are predictable -- if you do this, you'll be happy; if you do that, you'll be unhappy -- and that someone (God, a priest, a doctor) knows which one to seek.  But there are no guarantees, and this, again, has nothing to do with social constructionism in itself, even if you want to believe, as many of its proponents clearly do despite their disavowals, that social construction gives a true picture of reality.

Wednesday, May 18, 2016

Make the World Go Away

One of my Facebook friends from high school posted today:
I'm thinking of a break from Facebook. I can't take the political shit that's going down. Bathrooms, Donald Trump bashing, people getting beat up. Can't take the stupidity I see on here. And the hate. How did we get so unforgiving to other people and their way of life. What happened to peace and love?
Well, I certainly sympathize; I've had the same idea myself.  But two things stop me. One is that many of the people who complain about negativity and hate on Facebook or elsewhere are complaining because they posted something hateful, dishonest, and negative on Facebook, and someone called them on it. (One good example, but there are many others, is the person who told me she'd like to shoot Mexicans for target practice.  She was very upset when I called her racist: I was being mean and political, she said, and she was tired of all this political stuff.) Such people never take responsibility for what they themselves say.  Responsibility is for other people -- they can just exult in their Trump-like, Reagan-like talent for blurting out whatever pops into their heads, because they're telling it like it is, and they don't care who they offend.  But you'd better not offend them, because that's being negative, and they're very special snowflakes, too good for this world.

It also seems to me that one reason the world is in the trouble it's in is that so many people react to the problems they see by running away from them, hiding their heads in the sand, retreating into "mindfulness" and other egocentric practices. Apparently they believe that if they don't see it, it isn't happening, and if they ignore it, it'll go away by itself.  I decided as far back as high school that I wasn't going to do that. And it's not easy. But do you want to know why Trump, for example, is so successful? It's because people didn't want to engage with the people around them.

And it also seems to me that this is a First World luxury. I noticed this in high school too, when I heard white people saying that they were tired of hearing about black people's problems.  Not as tired as black people were!  (And are.)  Maybe my friend can make the world go away, but poor people can't. Black people can't stop police shootings of unarmed kids by giving up Facebook. I can't escape antigay bigotry by giving up Facebook. Women who need abortions can't get them by giving up Facebook. Children killed by Obama's drones can't come back to life by giving up Facebook.

But, as I say, I sympathize. Everyone has to set their own limits and do what they can.  My limits include not remaining silent when confronted by bigotry and injustice, which of course is considered negative and hateful by many.  I can live with that.  Or, to put it another way:


Except that I don't accept these Western either/or binaries: you have to do both.

Tuesday, May 17, 2016

Already It Was Impossible to Say Which Was Which

The Intercept reports that Donald Trump appears to be changing his stance on entitlements.  Up till now Trump has insisted that he would protect Social Security, Medicare, and Medicaid against attempts to cut benefits, promising to "focus on economic growth so that we’d get 'so rich you don’t have to do that.'"

But now he's backing down, or at least reconsidering.
Trump policy adviser and co-chairman Sam Clovis said last week that the real estate mogul would look at changes to all federal programs, “including entitlement programs like Social Security and Medicare,” as part of a deficit reduction effort.

Clovis made the comments at the 2016 Fiscal Summit of the Pete Peterson Foundation, an organization whose founder has spent almost half a billion dollars to hype the U.S. debt and persuade people that the Medicare and Social Security programs are unsustainable. Trump also met privately last week with House Speaker Paul Ryan, R-Wis., an outspoken Medicare privatization advocate.
All pretty predictable, no?  My first reaction was that this sounded familiar.  Didn't Barack Obama follow a similar trajectory?  Yes, he did.

When he was campaigning for his first term as President, he told an AARP convention on September 6, 2008 (via), "But his [McCain’s] campaign has gone even further, suggesting that the best answer to the growing pressures on Social Security might be to cut cost-of-living adjustments or raise the retirement age. I will not do either."

Of course, he did both.  In 2010 he appointed a commission to make recommendations on cutting the national deficit, packing it with deficit hawks (including Paul Ryan).  Despite the way Obama had rigged it, the Simpson-Bowles commission was unable to agree on conclusions, so Alan Simpson and Erskine Bowles wrote up their own recommendations, which President Obama and most of the media treated as if they represented the commission as a whole.  These recommendations included phasing in a raise in the "retirement age" (meaning the age at which a retiree can receive full benefits) to 69 and changing the index for cost-of-living adjustments so as to lessen those adjustments, resulting in a benefits cut of 3 percent, according to the economist Dean Baker.  Obama also announced his willingness to use cuts in Social Security and Medicare to bargain with the Republican Congress on the debt ceiling.  The Business Roundtable, a gang of corporate CEOs, would prefer raising the "full retirement age" to 70, and of course the usual Republican suspects in Congress were already on board for that.

So there's concern that "A Trump presidency would threaten programs like Social Security."  It probably would, but so would any other Republican candidate's presidency.  So, most likely, would a Clinton presidency.  But an Obama presidency already has.

Sunday, May 15, 2016

All Critics Are Equal, But Some Critics Are More Equal Than Others

I hope I can wiggle out of having to read this book, and even more I hope I can wiggle out of having to buy it.  I suppose I can just request that the library get a copy...

What I'm talking about is something called This Thing We Call Literature, by one Arthur Krystal, published earlier this year by Oxford University Press.   It got a laudatory review from Micah Mattix at The American Conservative, and you know, slogans like "In Defense of Great Books" are a red flag for me.  So for the moment, I'm just reviewing the review.  If Mattix misrepresented Krystal, I beg pardon, but the position sketched out here is one I've encountered before.

According to Mattix,
... Krystal’s main concern is not to chop individual writers down to size and extol others—though he does do some cutting—as much as it is to defend the value of hierarchical thinking with respect to literature. “The prevailing mood,” he writes, “regards hierarchies with suspicion: Who’s to say who is worth reading and who isn’t?” While a willingness to include “formerly disenfranchised artists and writers” in the canon is a good thing, “the fact that writers are all entitled to a fair hearing doesn’t mean that they are equal.”
I wonder if Krystal actually backs up the insinuation there, that "the prevailing mood" is that all writers are equal.  It sounds like he's buying into the widespread if not prevailing notion that equality means sameness.  The quotations indicate a fondness on his part for the gaseous cliché and le mot injuste, but then you don't have to be a good writer yourself to appreciate good writing by others.

Maybe I got it wrong, but I always had the impression that the call for reconsidering "formerly disenfranchised [not really the right word, Arthur] artists and writers" was motivated not by the belief that they were all equally good, but by the very reasonable belief that some good writers had been ignored because they had chosen to have the wrong skin color or genitals.  Granted, in the heat of the politicking, some people probably overstated the virtues of the artists they championed, but it's not as if that didn't happen with the usual white male suspects as well.  And the racism and sexism of the literary establishment was never really a secret; in the good old days, white men could and did openly dismiss work by women and Negroes and homosexuals and the Irish, just because they were women and homosexuals and Negroes and Irish.

I admit, I "regard hierarchies with suspicion," and I think that "Who's to say who is worth reading and who isn't?" is a fair question.  From Mattix's review I am not sure Arthur Krystal is one to say it.
Unfortunately, Krystal doesn’t help his case—which I think is almost entirely right, by the way—by often failing to demonstrate in detail how canonical writers are actually better than the minor writers who were forgotten ... But in these first two essays, he mostly lists writers and critics or turns to ex cathedra pronouncements—“War and Peace is objectively greater than The War of Words”—which are rarely very satisfying, however correct they may be.

When he does get more specific, things can get a little thorny. Is it true, for example, that great novels “rely more on accuracy of characterization than on the events that their characters react to”? I suppose it depends on what “rely more” and “accuracy” mean. Without further explanation, questions abound: Is Raskolnikov in Crime and Punishment accurate? Does the novel rely more on characterization than events? How about in classical drama? The Iliad and The Odyssey? Pamela and Jane Eyre?

... Krystal notes in passing that some fiction that goes by the name “literary” isn’t, but he doesn’t explain why, give examples, or offer even a brief analysis of the failures of literary fiction.
I feel about the claim that some works just are great rather the way I feel about claims about objective reality: yes, I can go along with that, but how do you tell what's really great, and what's really real?  In the case of art, we're talking about value judgments rather than measurements, and those judgments must function within traditions.  They also change over time.  The canon, which both Mattix and Krystal admit has "sociological roots," changes.  Krystal says that those roots don't mean the aesthetic judgments are invalid, but how can you tell?  He invokes "a credible, if not monolithic, consensus among informed readers," which is plausible, except that informed readers disagree with each other and change their minds over time.  Numerous authors who used to be canonical aren't anymore.

Mattix's questions might well be supplemented by others about different artistic traditions, such as Chinese poetry.  Much if not most poetry doesn't translate well, and I still regret not having stuck with Russian long enough to read Pushkin in the original.  Mattix quotes Krystal's lament that today's poetry lacks "music."  Music is generally the first thing you lose in translation.  Which is true of prose too, if less so.

An interviewer asked Gore Vidal about his claim that, "when it comes to matters of prose and of fiction at this time and in this place, I am authority."  Whence, she inquired, comes this authority?
It is earned, mostly, but it is also a matter of temperament. The critic must know more than either writer or academic. He must also value experience and have a truth-telling nature. I think I have that. In their youth most people worry whether or not other people will like them. Not me. I had the choice of going under or surviving, and I survived by understanding (after the iron -- if not the silver -- had entered my soul) that it is I who am keeping score. What matters is what I think, not what others think of me; and I am willing to say what I think. That is the critical temperament. Edmund Wilson had it, but almost no one else now does, except for a few elderly Englishmen.
I still agree with Vidal, but I'd add that he wasn't the only authority.  Others would disagree with him about the value of different writers, and I disagreed with him about some of those he recommended -- Italo Calvino, for example.  But still, the job of a critic is to persuade, by explaining why and how he or she thinks this particular work functions and succeeds (or fails).

One thing that seems increasingly important to me is that a critic, who reads widely and deeply, may forget that most people don't.  As Edmund White has said, "A canon is for people who don't like to read, people who want to know the bare minimum of titles they must consume in order to be considered polished, well rounded, civilized. Any real reader seeks the names of more and more books, not fewer and fewer."  And, I'd add, a real reader in this sense isn't all that concerned with greatness, though he or she won't be undiscriminating or non-judgmental either.  But most people aren't, and never will be, "real readers" in this sense.  It's up to teachers to try to guide as many of their students as they can toward a sense of what reading, or Literature with a capital L, has to offer.  (I wonder, though, how many teachers really know that themselves?)  If greatness is a thing, then it should be detectable by any reader, not declared ex cathedra.

It sounds as if Arthur Krystal lacks the capacity to do the critic's job, at least in Vidal's league.  Maybe I'll read some of his other work.  This Thing We Call Literature is only a little over a hundred pages long.