The “literary novel,” which is to say the middlebrow novel, is written by and for a specific class, and often represents it to itself in just the way it sees itself; but the project of the realist novel is supposed to be the representation of society as it is. Suppose one took this class-cultural narcissism to be a matter of bad taste — in the old-fashioned sense in which “good taste” could encompass a bearing toward the world as well as a merely aesthetic preference. Would one not, then, be viscerally disgusted to read a novel describing its characters’ bourgeois lives as boastfully as if it were their parents’ Christmas letter?

The month before graduation, she dons a killer interview suit—sleek, gray, professional, inexorable as a NoCal earthquake. She interviews with eight campus reps and gets three offers. She takes a job as a casting process supervisor for a molding outfit in Portland, because it offers the most chance to travel. They send her to Korea. She falls in love with the country. In four months, she learns more Korean than she knows Chinese.
Her sisters, too, wander across the map. Carmen winds up at Yale, studying economics. Amelia gets a job nursing wounded wildlife in a discovery center in Colorado.

This (from Richard Powers’s The Overstory) inspired in me a moment of genuine revulsion; it should have done the same for many others, but I’m willing to bet it didn’t.

(Yes, of course it could be free indirect discourse intended as subjective. It just isn’t.)

“Is it time to worry that literary novels will be among the next casualties of Trump Derangement Syndrome?” asks Dwight Garner, in reviewing Jonathan Lethem’s disastrous new The Feral Detective. One can only assume both the question and the future tense are rhetorical; the answer is obviously “yes, and it’s already happening.” That two writers of such previously evident social-satiric talent as Lethem and Gary Shteyngart have both published, in recent months, by far the worst novels of their respective careers is obviously no coincidence, particularly given the conceptually feeble yet rhetorically emphatic gestures of “Trump Era” topicality in both of their godawful books. Apart from the sad fact itself there may only be a bit more to say; but it’s interesting to observe what form this catastrophe of the liberal imagination has taken. Neither novel has very much to say about politics itself, and each is careful to maintain a sort of equanimity about the silly out-of-touchness of urbanites and liberals. And both are road novels with New Yorker protagonists — each of whom you might describe as an “elite” of a different sort, Lethem’s an impeccably liberal journalist, a cultural elite, where Shteyngart’s is a rich hedge-funder — whom we follow into a place that is heavily underlined as the Real America, a land of remote backcountry highways populated by mysterious, threatening (and sexually alluring!) pickup-truck drivers. This, it seems, is how the “literary novel,” that creature of middlebrow good taste, has run itself aground in Trump’s America: by embarrassedly, half-heartedly exoticizing the land of bad taste, big hair, bad coffee, and big trucks in contrast to which it exists — a land of fantasy, of course, and one which its writers don’t seem to have visited very recently.

I stumbled on a striking family resemblance between two readings of universalizing rhetoric. Seamus Perry on Auden:

Such cartoonish ethical imbecility comes not from malice but from not really being a grown-up, from a basic frivolity that finds irresistible anything that’s extravagant, impressive, stylish: ‘All poets adore explosions, thunderstorms, conflagrations, ruins, scenes of spectacular carnage,’ Auden would later write. ‘The poetic imagination is not at all a desirable quality in a chief of state.’ All poets? At such moments you realise that what seem designed to be universal claims are actually instructions to self, emerging out of a very individual history, and that they probably don’t make that much sense applied to anyone else.

Adam Phillips on Freud (from the introduction to Becoming Freud):

And the “we” Freud was referring to was possibly, he thought, not the fin de siècle Viennese middle class that he knew, but the entire human race. Freud, in other words, in the way of the great nineteenth century European intellectuals, was also a great generalizer. Freud, in actuality, met a very small group of people in his life, but universalising a point—one of Freud’s most interesting papers, for example, he entitled “On the Universal Tendency to Debasement in the Sphere of Love” (1912)—was a way of rhetorically enforcing it.

Universalizing is not, these days, accorded this kind of interpretive charity. It is no longer understood as a commonplace form of mild hyperbole; particularly strained or overextended universalizing, unapologetic universalizing without the proper caveats extended in respectful genuflection to particularity, is more commonly damned with the zeal of an SAT-taker picking the choice that says “some” over the one that says “all.” It’s one of the biggest translation problems many readers face in trying to understand the “social theory” of the past — loosely, the way the society of, say, the prewar or pre-1970s past talked about itself. There’s obviously a family resemblance beyond their historical era between Freud and Auden as well — universalizers of a very particular kind — but the broader social background is the more important interpretive missing piece. We (“we”!) no longer live in a rhetorical world in which universality is an aim important enough to aim, however tendentiously, at.

On the passing of “neoliberalism” outside the realm of political economy.

After listening to enough people of a particular kind talk — for the sake of economy I’ll uncharitably call the kind I have in mind “confusedly leftist humanities grad student” — you realize that it’s surprisingly possible to have insight into the subjective experience of life under late capitalism without having any kind of real understanding of political-economic critique, of what an old-fashioned Marxist might just call politics full stop, the kind that’s about understanding the distribution of power in the world, not about chronicling its psychological effects. There seems to be a whole generation of the overeducated who are familiar and articulate users of a set of critical (in the broadly aesthetic sense) tools for closely examining individual experience and its traces — doing “deep” close-readings of all the ephemera of online life and finding in them the traces of the schizoaffect imposed upon us all by the neoliberal media environment, and so on — but have no understanding of mass politics whatever, instead understanding the political world by doing the same thing to “leaders” and the public faces of movements, as individuals. The person whose political critique stops at “neoliberalism” in this sense is a kind of post-liberal or a “postmodernist” in the diagnostic rather than the theoretical sense; the age has created a “theory” which detours people from mass politics even as it appears as a form of critique, a critical game that you can play endlessly without ever getting anyplace new.

Much of the confusion around the word “neoliberalism” arises from its ambiguity with respect to this question: in one discourse (a), that of political economy, it refers to a given political situation and a strictly political-ideological agenda, but in another discourse (b) it refers to all the subjective effects of this system. It would seem that the inhabitants of (b) ought to find another word, or just learn to say “the effects of neoliberalism on subjective life” — indeed this might serve to clarify their own confusion as well as everyone else’s.

A possible taxonomy of journalistic fields and institutions:

  1. Those in which a Stephen Glass or Jayson Blair-style fraud would be discovered immediately
  2. Those in which a Stephen Glass or Jayson Blair-style fraud would be discovered eventually, and corrected
  3. Those in which no one could tell a Stephen Glass or Jayson Blair-style fraud apart from the usual course of things

Everyone wishes, and some just pretend, that the mainstream American corporate media’s coverage of, say, national politics, were (1); it’s probably mostly (2). The business-and-technology media may reside instead on the boundary between (2) and (3). If a salacious headline depends on enough complicated technical stuff, it never has to be retracted.

(Previously on the tech press, in a similar spirit)

A persistently strange feature of the present is that a significant fraction of self-identified liberals (the moralistic fraction) have little if any investment in some of the fundamental principles of liberalism — pluralism, freedom of thought, the separation of private belief from public life, et cetera — and indeed are deeply uncomfortable with open public debate, the central value of their supposed belief system. Their response to “you believe X and feel you have good reasons to do so; but others believe Y, and also feel their own reasons for it are good” is more often anger than argument; they prefer condemnation to explanation. Frequently the one pointing this out will be damned as a Y-believer himself; barring this he will be accused of “equating” the two beliefs, “saying they are the same” — in a less childish register, the idea seems to be that merely making the comparison “X and Y are both beliefs” somehow implies, impermissibly, that they are equally good beliefs. (Foot-stomping about all comparisons, all arguments involving any kind of analogy — “how dare you say these two cases are the same” — frequently emerges from the same mindset. The moralist learns each specific case by rote, or rather by feeling.)

The perennially useful, leaden-titled ed-psych work Forms of Intellectual and Ethical Development in the College Years casts this, correctly, as an infantile or underdeveloped response to the central problem of intellectual adulthood — what it calls “relativism.” Being a grownup, in this charmingly dated liberal-rationalistic term for a still-vital problem, is learning to live rationally in a world of conflicting views; framed in this way the “liberal” Internet moralist’s outrage at most forms of (even quietly rational) dissent is understandable as a form of denial, in the fullest psychological sense.

(An agnotological hypothesis.) As recursion or inclusion — the nesting of one thought, or clause, within another, more complex thought or sentence — is commonly contended to be the most important building-block of reason, discomprehension of it is one of the central features of stupidity. The stupid response, the unintentional misreading, of a complex thought often centers on a discomprehended nesting — a mention of an idea mistaken for a use of it, a comment on an argument mistaken for an endorsement of it. The stupid mentally erase quotation marks, and mentally collapse syntactic bracketing, responding to words as they experience them in order, rather than in the logical relations expressed by the syntax of complex sentences.

Consider as examples the following forms of exchange between one person capable of reason and one incapable of it — which I think will be all too commonly recognized in everyone’s experience of the Internet.

A: What would you say to someone who said X?
B: I knew it! You believe X! You’re a bad person: only bad people believe X.

A: You said X, but X has the bad consequence Y, which I’m sure you agree we don’t want.
B: Why do you want Y to happen? Y is bad.

In these and many other similar situations it seems almost as though, for the stupid, quotation marks do not exist; mention is use; to name a belief at all is to subscribe to it (or at least to name it without performing an elaborate ritual of condemnation akin to spitting three times and throwing salt over your shoulder, the way a medieval peasant might after naming the Devil). Argument is, for them, much like a kind of incantation: to speak a view out loud is to summon it, to invite it to presence in the mind, and therefore indistinguishable from advocating it.

“The point is not Harvard, but which Harvard” (C. Wright Mills).

I spent most of the week of the Kavanaugh confirmation hearings in the desert with barely any cellular signal but eventually, unfortunately, returned and caught up with the bulk of this episode of the obsessive self-harm that is the world of news-and-commentary consumption. The only really salient point to emerge from the ugly mess as usual seems to be the one made on Chapo, about the truly disturbing hollow-eyed amorality that this episode once again reveals in the (bipartisan!) American ruling class.

But this isn’t a matter merely of some small inner circle of rulers and their lieutenants, a “power elite” of a few hundred or a few thousand; the authoritarian rot is deeper. If as perfect an exemplar of the American educational “meritocracy” as “Tiger Mom” Amy Chua is now known to have groomed female law students including her own daughter for sexualized humiliation under Kavanaugh, what this hazing reveals to us is just the new, liberal-meritocratic face of the authoritarian personality in America — which draws on the same sadism and submission, in its ritualized hazings and covert celebration of impunity for the powerful, as ever, while wrapping these in a thin pretense of meritocratic ambition and educated impartiality. The striving climber, seeking only a personal way up in the hierarchy, and the authoritarian sadist, taking pleasure in enforcing it on the weak, are not so distinct as they might once have seemed. There may only have been the one Harvard after all.

In the course of trying to remember the title of a memoir by a “disappeared” survivor of a torture camp that I’d read long ago, I ended up looking at its reviews on Goodreads. A solid 75% of them were people who’d been assigned to read it in class complaining that the book was depressing, sad, no fun. One concluded, in what its author must’ve thought was generosity, “I’m still glad I read it, because now I can speak knowledgeably on the subject.”

This may be the only thing the Internet is truly good for: an unexpected encounter with blithe depravity — and a useful reminder how many of the seemingly grown people around us are moral infants.

(Home from the Hill + Tokyo Twilight) There’s strikingly little agreement about what exactly “acting” is and how you can tell when someone is doing it; but at minimum it seems clear that there’s often a complementary relation between acting and casting. If range — the ability to portray dissimilar roles — is an attribute or token of the skill of acting, its only clear opposite is typecasting, the portrayal of similar roles again and again. But of course many beloved actors had, and have, relatively little range; and indeed the old studio star system relied heavily on this, allowing each new appearance of an actor to underline with less effort for an already-prepared audience the character traits they were known to possess, the ones that came, in a sense, for free, with casting that actor for the part: the type. If Robert Mitchum was, in the kindest possible phrase for this total absence of range, “a natural,” this is another way of saying that not much of what he did was acting, in the sense that he could ever convince an audience of the interior traits of some other person, some non-Mitchum, whom he was portraying. Instead, he was almost always on screen to portray the very traits you’d cast Mitchum to portray — doing that thing he did, when placed in front of a camera, or indeed conveying the collection of character traits that “Robert Mitchum” came quickly to mean: effortless, insouciant masculinity, etc. This is the sense in which “get me someone who looks like Jake Gyllenhaal” is more than a joke about what Hollywood fame means: to be truly famous is to become a type, and to aspire to fame is to attempt to fill one.
This is why a proposition like “George Hamilton could’ve been Tony Perkins,” or “Chishu Ryu was the Japanese Jimmy Stewart,” has an easily comprehensible meaning: it translates to an analogy about types — about the range and extension of character traits that a movie could obtain, for free, by putting a face to a role (in the latter and more interesting case, those of the sympathetic, dutiful, put-upon Everyman).

But there’s a different sense in which the “acting” that’s the opposite of underacting or non-acting isn’t range, nor is it histrionic overacting (the kind of heavily underlined AC-ting! or ostentatious impressionism that I think of as Streeping). Instead it’s something more like rapid face-making, or the ability to convey multiple emotions with tiny variations of expression; or in actor-ese it’s the kind of thing meant by the cliché acting is reacting. The thing I want “acting” to mean, for me its true meaning, is the kind of subtlety and emotive depth conveyed by Setsuko Hara in a single second, in showing us how she (her character) feels upon hearing a line — and then showing us how she doesn’t want her father to see that, and then showing us how she has concealed it, all with about three facial muscles in the blink of an eye, or of a camera shutter. This is something that few actors can even be seen to do, and even of those capable, far fewer know how not to overdo.

If my life were being described by an aspiring novelist desperately seeking a telling detail (and it’s an interesting thing, how hard it is to figure out what your own telling details are) then I imagine that I could be placed pretty precisely, both in geography and in social history, by the fact that it took me a long time to learn not to confuse Jane Jarvis with Jane Jacobs — an urban organist and an organic urbanist.

“Postliteracy” may or may not be a good rubric for a broad cultural-historical analysis,* but it sure as hell names a real phenomenon in the world of communication technology. Emoji are like a regression from language to pictogram; more and more of the time, we now “write” a signature on a screen with our fingers. Lately when I go through the pantomime of a contract like this I feel like an illiterate 18th- or 19th-century sailor or soldier being told to “make his mark.” Point-of-sale systems make Queequegs of us all.

* This is probably the same “post-” involved in words like “postmodernity,” in the sense that it makes more sense as a periodization if you substitute “late” — or even “de-” or “decadent” — instead. The “post-” of historical regress, or anti-teleology, or just change that doesn’t seem to be for the better.

I was thinking about the complicated semiotics of the sexual signals in ads for men’s underwear — the way the ads communicate that the product “is gay”* or, on the other hand, distance themselves from it. And I came across the following sentence about the way customers want the ads’ models to look:

“They want models who are somewhat aspirational, and they want to look like the guy in the pictures, but every model can’t be blond, hairless and perfect.”

This is a typical piece of ad-speak in some ways but also an interestingly revealing one. Like much of the language-like noise that emerges from the mouths of advertisers it is, denotatively, almost empty of meaning — “aspirational” is just another word for what people “want,” so the first clause is a simple tautology; the second clause is undecidably ambiguous about the very matter under discussion (does this mean the models should look like the purchasers or like the purchasers’ fantasies of themselves?); only the third seems to make a prescription, and even there it only does so in the form of a pseudo-universal “can’t” that’s strictly just a falsehood (of course they can, perfect hairless blond models being readily available). If you can recover a meaning here, most of it is latent and connotational, a matter of reconstructing what the speaker meant to say or what he was hinting at obliquely. It’s only based on its position in the article that you can tell this sentence is meant to say that the ads should have more normal-looking, approachable models in them.

But it’s interesting even so how all the wanting seems to work. The business of advertising is the creation of the needs and desires that drive consumption, but the speaker — clearly a habitué of displacement — has displaced even the responsibility for this creation onto an imagined need in the consumer. He won’t say outright, and probably can’t even consciously think, that the aspiration to look like an “aspirational” model is itself inculcated by advertising; instead he pretends it answers to an existing desire. He doesn’t want to think about where “aspiration” ends and mere fantasy begins, and indeed how we all tell the difference when we’re looking at all those decapitated racks of abs.

* That is, more or less, what it is that purchasing the product will or won’t signal about the purchaser’s sexual identity — to others, or equally to himself. Like a lot of advertising the goal seems to be to associate the product with an identity, so that it’s consumed not for its own functional or even aesthetic attributes but rather as a component of a self bought off the rack. There’s more money in selves.

There’s a rule of threes for the rhetoric of future history. A science-fiction writer will conventionally establish the continuity of the imagined future with the nonfictional past by listing three (more rarely, five or seven) things. If the writer is unimaginative, playing it safe, seeking to establish credibility, two of the three will be real; if the writer is more daring, adventurous, or fantastical, two of the three will be imagined.

In a more extended version of the real-and-future-history copia you can see the other syntactic hallmark of the trope, which may even be truly unconscious: the real and the fictional cases tend not to occupy the same sentence. This typical example is from Clifford Simak’s imagination, in 1953 (Ring Around the Sun), of the next thirty or so years of the Cold War:

“The cold war still goes on,” said Mr. Flanders. “It’s been going on for almost thirty years. It warms up now and then, but it never does explode. Has it ever occurred to you, Mr. Vickers, that there have been a dozen times at least when there should have been real war, but somehow or other it has never come to be?”

“I hadn’t thought of it.”

“But it’s the truth. First there was the Berlin airlift trouble and the fighting in Greece. Either one of them could have set off a full scale war, but each of them was settled. Then there was Korea and that was settled, too. Then Iran threatened to blow up the world, but we got safely past it. Then there were the Manila incidents and the flareup in Alaska and the Indian crisis and half a dozen others. But all of them were settled, one way or another.”

The line between fact and fiction, of course, falls between Iran and Manila. It’s not coincidental that the fictional incidents appear in one continuous sentence rather than standing nearer to syntactic independence, as the non-fictional ones do.

Of course you could write a book (indeed many people have) on the ways class is distorted and deflected in the American vernacular. And of course most American talk of “class” is a bunch of bullshit about perceived cultural status rather than economic power, relation to the means of production, or ability to control the terms of one’s labor. But there’s one small feature of that discourse that seems underdiscussed even granting all of this: the way people talk about mobility while actually meaning position. You can see it in the “upwardly mobile” component of the original acronym behind “yuppie,” and equally in the recent (post-2008 recession) uptick in discussion of the “downwardly mobile” status of well-educated young people — none of this is actually about mobility, about people changing status over the course of their lives. Instead it’s about a mismatch between the perception of someone’s status and the no-longer-avoidable fact of their actual material level of wealth — based in the first case on the residual WASP ruling-class sense of the non-elite nature of professional labor, and in the second on the general presumption that higher education confers middle-class status automatically. The only “mobility” involved is in the head of the person speaking the words.

I’ve sometimes thought there might be a useful critical practice — something at least in the vein of a finger exercise for pianists, if not a bit more theoretically interesting than that — in attempting to make connections within an artistic jumble, or to find least common aesthetic denominators among dissimilar, haphazardly or randomly collocated texts. Here, then, are a few permutations of the movies I happened to watch over the last week.

(Summer 1993 + Hereditary) The experience of childhood is uniquely captured by muffled, barely intelligible offscreen dialogue. Sound design isn’t talked about enough in movie criticism as an analogue to focus and framing — and a filmmaker who knows they can be selective with what the audience is allowed to know and understand has a power almost akin to a parent’s over a child, which here can be made into the basis of the medium’s reimposition of a world of experience otherwise permanently lost to an adult audience.

(Hereditary + White Material) The child soldier is a far more terrifying presence than the Satanic mumblings of a black mass. In order to evoke a real horror you have to know what is horrifying in the real world, and avoid what’s merely a campy recreation of something that might have horrified a very gullible person a century ago. We still have demons and you can still do demonology by conjuring them on screen; they’re just not contained in pentagrams anymore. To draw pentagrams after Auschwitz — it may not be barbaric, exactly, but it’s awfully silly.

(White Material + Double Lover) It’s almost bizarre to observe how many of the “foreign films” that play only in “art-house” American theaters to tiny self-satisfied middlebrow audiences are, in fact, just what Hollywood ought to be doing, or would do if it were more competent. Claire Denis, who might be thought of as a Christopher Nolan with an actual brain and a sense of humanity, and François Ozon, who’s basically a younger De Palma, are not making some kind of forbiddingly cerebral high art that should be beyond a mass American audience. Is there anything that explains this beyond monolingualism?

American English is a weird language because so many of its norms and what you’d otherwise call its “rules” are incredibly weakly held. The puzzle of why Americans are so bad at spotting fake American accents from TV and film actors has only one convincing answer — that there are so many different real American accents that many Americans don’t find themselves able to rule out almost anything, figuring that any weird combination of vowels might just be some regional variation they don’t know, rather than a slip or a mistake. (“Everyone knows what an American accent sounds like, except the Americans,” says the purveyor of one of the worst ever recorded.) And if you watch the World Cup you’ll see that this flexibility, stretched even past the breaking point, applies not just to the phonetic features of accents but even to syntax — half the time American soccer commentators violate one of the clearest, most distinctive syntactic rules of American English, the use of singular verbs with collective nouns. Presumably through unconscious contamination from the predominance of British commentary on the sport, many monolingual American jocks and sportswriters seem to have learned to speak their own language wrong — and to write it.

A dangerously innocuous cousin of “Fuck you, got mine” is “Be patient, got mine.” Material security and life satisfaction make people complacent; in comfort the problems of other people can be recognized, sighed over, and their fund drives donated to, in perfect decorum.

The work of ideology is just as much to render unthinkable certain thoughts as to render others prominent; a durable false obviousness can be constructed only by negating other obvious ideas, rendering them falsely complicated, contaminating them with the association of confusion and doubt. Rather than the production of new ideas this is often the chief responsibility of the ideologue, whose ability to sow what the computer salesmen used to call FUD — fear, uncertainty, and doubt — is a key professional skill. And often enough FUD is most easily produced by someone genuinely experiencing it himself, whose understanding of the problem in question is honestly confused, whose false complexities and woolly-minded verbiage emerge from a true inability to see what’s going on. If the most callous ruler’s obvious expressions of contempt for the ruled are, to someone, “brilliantly” mysterious and endlessly open to interpretation, or if “left” and “right” really are, in all honesty, incomprehensible terms to them, if “politics” seems to them to be a matter of mysteriously shifting symbolic interpretation, individual expressions of rhetorical allegiance and personal identity totally divorced from the material distribution of power and resources — then that person will often be preferred, professionally advantaged, as a political commentator, as their ability to produce confusion serves the needs of power far better than those who can see things as they are and describe them in simple terms.

Comparative sports aesthetics actually is more than a little like comparative literature — although sports fans don’t have the same vocabulary (often enough, even to see that aesthetics is the thing they’re discussing). If you’re a lifelong fan of one sport one of the easiest things to notice and consider, as you try to enjoy another, is what’s the same about the experience and what isn’t.

As a lifelong baseball fan I’ve been sold on the charms of the World Cup, predictably enough, largely by watching with friends from other countries and asking for explanations; by now it’s stuck with me enough that I find myself waking up early to watch even some of the unexciting first-round matches, even though I couldn’t tell you what was happening in soccer at all during the other three years.

One of the things that I notice is soccer’s very different experience of time and pace. Not so much between the games, since that’s to be expected: a baseball fan is used to watching 162 games a year, and nothing in the middle of the long daily slog of the baseball season can be expected to have the excitement of a less-than-yearly elimination tournament where even a dedicated fan sees each national team play only a few times. But within them — each of these sports is dull, to those who dislike it, because the games take hours, while the few moments that end up deciding them take only seconds. In each case you’re watching — if you’re watching from this casual perspective, at least, rather than the detail-oriented knowledgeable fan’s — for a moment of grace, a single glorious feat of skill, or maybe two or three of them, spread out across a couple of hours. But in baseball your time and your attention are punctuated: there are spaces between pitches, and between at-bats, while soccer is continuous. You talk during a baseball game confident that you’re not missing anything, pausing in a planned way for the action; you talk during a World Cup match the way you talk while you’re driving, always with half an eye on the road, ready to stop mid-sentence if you suddenly have to pay attention. And you can’t count on seeing things again: because of those brief pauses baseball is well-suited to replays and slow-motion, but there’s nowhere to fit them in a soccer match. Because of this and because of its transience and importance the World Cup feels more like theater and less like film — more like something that happens once and then is simply over, all its flaws and all its little glories now only memories.

The phrase “vulgar materialism” has been, somewhat unaccountably, regaining circulation on the online left recently. The phrase has a certain position in some historical debates around Marxism (in which use “vulgar” is more or less synonymous with “mechanical” or “deterministic”), but I worry that in this case it may just as easily have been borrowed and repurposed from — or at least amplified by — the same words’ use, to mean something rather different, in the mystico-religious irrationalism of the alt-right; there’s too often very little gap between the autodidactic jargon-salad you get from the online “left” and “right” as people try on stances, ideas, and rhetorical postures to see what fits.*

The use of “vulgar” is of course meant to signal the speaker’s sophistication; I believe in a complicated materialism, they are saying, not a simple one. But such speakers, like so many overzealous sophisticates, lack the courage of their claimed conviction. To be a materialist is to be vulgar in at least two senses. It is to signal an allegiance to low things: to believe in the ultimate determinative power of the uncomplicated, unadorned material facts of life, and of unsophisticated, simple, poor people, over (or underlying) complex or intricate cultural epiphenomena and the sophisticated people who like to talk of them. And hence it is also to be common, in the etymological meaning of the word: a materialist is necessarily one who speaks in the vulgate, the common language of everyone, all of us having access to the material facts of ordinary, everyday human life, which by virtue of their ordinariness do not require esoteric terminology available only to initiates. There’s a buried oxymoron in “vulgar materialism,” a cognitive dissonance — common too to other kinds of alleged or avowed democrats, like bourgeois liberals** — which the term subdues only with palpable difficulty; one cannot avow materialism and claim at the same time to be a sophisticate. (At least in the political sense used in the Marxist and Left-Hegelian tradition; “materialism” is of course a complicated term denoting a lot of other philosophical positions as well.)

* That is, what “fits” the speaker libidinally and in rhetorical practice, rather than in the sense of the words’ fitting to the empirical world; most of what’s being worked out in these play-fights is what words you can use to feel good, or to make others feel bad, to be seen as “winning” by some audience however well- or ill-read.

** This, of course, is no coincidence.

(This is the 365th daily post on this blog. Having proven to myself that I could keep it up, I’ll be decreasing their frequency somewhat from now on. I’ve greatly appreciated all the supportive and interesting comments and very much hope they’ll continue.)

An interesting passage — too long to quote in its entirety — in Lefebvre’s later foreword to his Critique of Everyday Life deals with the unfriendly reception of the book in what he calls “dogmatic Marxist” quarters. From the samples Lefebvre provides and indeed from what one knows of its argument’s historical situation it’s not hard to imagine how garbled and jargon-laden, formulaic and stupid, this reception must have been; it’s not, in that way, particularly unfamiliar. But Lefebvre more than most is able to put his finger on the stupidity’s underlying psychology rather than politicize it: he doesn’t make the all-too-frequent mistake of taking the dogmatist’s stupidity as a principled intellectual objection or as reflective of a political difference. Instead, he quite properly caricatures it. (This is important: there are stances to which this and not fussy courteous seriousness is the correct response.)

Clearly I have a rationalist and Cartesian background. It is indeed conceivable that at one time or another these tendencies of mine may conflict with the materialist and dialectical thinking which, despite its differences with my background, is my starting point; and I know this. It is equally conceivable that I have resolved this conflict, or that I will resolve it, in a creative way. And what could be more natural than such a conflict, given that everything is contradictory and that we only move ahead in and through contradictions? But here comes the sectarian, dogmatic Marxist. Frowning, threatening, contemptuous, or indifferent, he smells something he disapproves of, something incompatible, something impossible: one cannot be both a Cartesian and a Marxist; Marxism is radically different from Cartesianism in respect of its theory, its content and its aims. Therefore he will define me as a Cartesian, fixing me for ever, labelling me, nailing me to the whipping post of Cartesianism. And he will be very pleased with himself: he will have served the cause of Marxism and the proletariat. And if I answer back, he will brand me with other labels, other epithets. In fact he will have transformed a problem, in other words a real conflict – and therefore a creative one in so far as I am able to resolve it – into an irresolvable, unproductive antinomy.

“Dogmatism” is Lefebvre’s label, and a useful one, for the penchant of the stupid for “labels” and “epithets” — for something I’ve bemoaned before: the kind of premature naming provided by -isms. He had in mind a type of person, and a way of thinking, you’d encounter then in Party meetings and among minor intellectuals, but it’s something you can still observe in a lot of other places, in the academy or online, among teachers and students anxious to demonstrate their authoritative comprehension of things they don’t quite understand, or just in talking to people about complicated histories and complicated ideas: the act of labeling often precedes — and often also short-circuits or terminates — any real understanding. There is no better metaphor than butterfly collecting for the fixity the lazy demand of capsule summaries of complicated, living ideas. (Or, as Twain may not have said, frog dissection.) You can attach a label to a thought, but only once you’ve killed it. And you’d have to be very anxious about your ability to grasp the living one in order to want to.

Every interview with Tariq Ali is worth listening to, but I found this one particularly salutary — for the brusque dismissal of the stupidity of the sectarian left’s refusal of social-democratic meliorism, given the unpromising present political conjuncture; and for the clarity on the Western left’s unpreparedness after 1989, its failure to understand what the collapse of the Soviet Union meant for the power-balance of left politics everywhere else.

Film’s relation to ideas* is of course never simply that of container to content. There are a lot of good films with bad ideas in them; bad ideas are socially familiar to us all, and there’s a long catalogue of deeply pleasurable realistic movies about people bullshitting, lovably and/or laughably, from Rohmer to Linklater. And we don’t usually think (I hope?) the point of those movies is just to adumbrate the ideas spoken in them. Nor is the problem limited to realism — “Il faut confronter les idées vagues avec des images claires” (“Vague ideas must be confronted with clear images”), the maxim that Godard’s Maoisant scamps tape to their wall in La Chinoise, suggests (perhaps vaguely itself) that a Brechtian didactic or political cinema, too, requires bad ideas to be included, for the purpose of requiring the audience to “confront” them. (Though the film’s lengthy track record of being taken far too seriously, or both too seriously and too lightly in the wrong ways, suggests caution for anyone who’d want to use its aesthetics as a model. If anything it’s a savagely accurate comic prophecy of the failures of the generation of 1968 — but one that far too many people still take just as romantically as they take those failures.)

But all the same it does seem possible (just as it would for any other form of fiction) for a movie to bullshit in its own right — to endorse more than it should, to present as deep or laudable bad ideas, muddled arguments, and so on. It doesn’t seem as if many of First Reformed’s critical plaudits come from viewers who found the film’s potted moral debates nonsensical even as they admired the suspense-film technique in which the bafflegab is quietly wrapped; rather, the movie somehow seduced its audience, it seems, into taking very seriously a bunch of truly silly non sequiturs and pseudoprofundities. This is probably both a matter of filmmaking technique, and thus worthy of praise on some level, and an expression of critical stupidity, and thus worth condemning on others.

* Whatever those are, yes.

The minds of the novelist and the debater intersect in what the psychologists call “theory of mind”: each populates a putatively social exchange — the life of a character in his world; the position of an argument against its opponent — with imaginary people, and each succeeds or fails on the strength of that imagination. In the ideal case both a psychological novelist and a liberal rationalist philosopher could live alone, in a solitary cell or a hermit’s retreat but amid an imagined crowd, having fully followed out the snippy coffee-cup maxim “if I want to know what you think, I’ll tell you what it is.”

Perhaps it’s merely due to the economy of wasted time, but there are many bad books, stupid ideas, etc. which it is incredibly entertaining to summarize, or to hear briefly summarized, or even to imagine. The existence or nonexistence of the disparaged thing isn’t even germane to this form of amusement: satire and reality are, after all, interchangeable when they’re done right and wrong, respectively. Still, one is irritated, rather than amused, at having read the existent book; is this due to the wasted time, or to the realization thus enforced of its existence?

The first part of Andrew O’Hagan’s article on the Grenfell Tower fire, recounting the fire as it happened and the lives of those who lived in the building, is tremendously affecting. Of the rest of it, perhaps the less said, the better; O’Hagan’s account of the politics around the fire seems unfair and confused to me, but Part I of the article by itself is still worth the painful reading.

“He who has ears to hear, let him hear.” Is it the most Emersonian line in the Bible or the most like a riddle, though itself about the solving of riddles? Yes, or perhaps there’s no difference between the two — but it does seem on one reading to indicate that the ability to hear inheres in the nature of the ears, and not at all in what it is they’re hearing; or, on the other, to indicate that hearing is a very tricky skill that people ought to train their ears into. The question is how, or whether, you can get yourself the ears if you want to (and perhaps also whether you should even bother wanting to).

The question of sentimentalism in art is never adequately theorized — even if by “theorized” you just mean discussed in a way that makes the real stakes of the judgement explicit. And the reason, I fear, is almost exclusively the kind of social tact and circumspection that subordinates aesthetic taste to the social-politeness sense of “good taste”: one doesn’t want to insult the person affected by the sentimentalism, to call them a rube or a soft touch for schmaltz — but one must, if one wants to keep making any distinction between art and schmaltz. The discrimination of honest sentiment, even of the simplest and starkest kind (which of course is any art’s prerogative), from manipulative button-pushing sentimentality, or mere slack self-indulgence, is no simple matter; but one cannot even begin to draw it without being willing to face up to how sizable an audience the latter has — or to the fact that they should know better! (A lot of middlebrow realistic fiction is basically sentimental pap wrapped in a justifying pretext of sophistication; but you can’t get anywhere reading it if you don’t see the pap for what it is, and call it that — and realize that it’s what the audience wants to consume, and the rest is packaging.)