"Thou shalt not extinguish thine anger, but shall master it, that thy conscience may not be blunted by adjustment to wrong causes."
-The Dutch Ten Commandments to Foil the Nazis


Adventures in Cultural Consumption:

Kitchen Confidential
Medium Raw, Anthony Bourdain

A few observations:

Kitchen Confidential was obviously a very popular book. It seems to be one, however, where the feature of the book that makes it good is quite different from the one identified as the reason for its success. People might like the foul-mouthed bad boy with entertaining stories and a no-nonsense attitude, but the book transcends food porn because Bourdain was, by his own admission, never more than a pretty good chef at a pretty good restaurant, and one for whom the future was bleak. Kitchen Confidential is essentially a tragic book, because its author had fallen into a trap of his own devising, and was never going to get out. He tells the stories he does because he has nothing to lose in telling them, and nothing to gain in being discreet.

He also stresses all the important points that, for understandable reasons, never make it into cookbooks or television cooking shows: that the most important things are reliability and repetition, and that virtually everything in the kitchen is pretty hard work if you want to do it well. It is manual labor, with all the difficulties that implies: you may build up the requisite skills to make things easier, but it shows in your body, mind, and spirit no matter how 'easy' it may now be.

In this sense, it's a helpful corrective to a certain kind of fetishizing of manual labor of which conservative political thought is sometimes guilty--I'm thinking of Shop Class as Soulcraft and a professor or two who have declared a preference for farm life--that hard physical work is better, and somehow purer. Unlike the authors of many of these projects, Bourdain is clear on where his skills fall on the spectrum of ability--the bottom end of the top end, let's say--as opposed to those who never give any clear sense of how successful they are at their chosen vocations. This allows Bourdain to see quite clearly the varying levels of skill, and judge them accordingly: he can understand how and why the very top chefs are good, even if he can't do what they do, and he knows enough to recognize what should be within reach for competent chefs and criticize them for failing to reach that level. This is the other big point: cooking may once have been a choice for the Bourdain of Kitchen Confidential, but it ceased to be one once his livelihood depended on it. The professor who wants a farm will always, in some sense, be playing at it, because the farm is optional. The person who has other options is engaging in a higher or lower form of authenticity tourism, and this is too little recognized.

And it's this notion--of 'authenticity tourism'--that makes Medium Raw compelling. As a taster or consumer of food, Bourdain is world-class and quite interesting; any time his cooking abilities come into play, he is quite clear-eyed about his limitations compared to the people he knows now. He has at least one really remarkable skill that comes out of this: the willingness to second-guess his judgments. Thus the essay on Alice Waters, which notes many of her battier pronouncements, but also those aspects of her influence that have been positive, and thus issues a thoroughly mixed verdict, but intentionally so: someone with a long career mixed with good and bad moments will be difficult to sum up adequately, and rather than writing an appreciation or a Slate-style takedown, Bourdain lets it all end up in the essay.


Harry Potter Blogging, However Improbable That May Seem

So, this proposed alternative ending is terrible for two entirely separate reasons:

1. It shows total ignorance of the mythic source material. Harry has to choose to die, or else the prominent Christian arc of the story fails (I suppose it's little noticed that his parents' gravestone has an apposite Biblical quotation on it that's in line with the metaphysics of dead people in the series). Harry has the comfort of those who have died surrounding him, and his sacrifice is still terrifying, because he doesn't know if he will come back: this is why it's a sacrifice. He has to die. (This is also why he has a stable wife, kids, and friends at the end: dude has suffered enough. People seem to hate this for not being realistic enough in a story about magic.) To kill Voldemort and emerge unscathed is perhaps more 'badass,' but it makes him less recognizably human.

2. The ending is also needlessly cruel in what seems to be the modern style: people really want Don Draper to commit suicide at the end of Mad Men, people debated whether it'd be cooler for Walter White to die or take out a bunch of the bad guys, Boardwalk Empire only rises to any aesthetic heights when devising new ways to show people being killed. An ending can only be 'real' if it's tragic, if it kicks the main characters in the most ingenious of ways. I would think one need only point out that adult life isn't like that, but I'm not sure that would convince anyone.


Remaining Illegible in an Information Age

There's a longstanding joke about statistical and quantitative analysis: a man loses his keys one night, and is looking for them under a streetlight. When asked why he's looking only there and not elsewhere, he replies "because that's where the light is." So it goes: people interested in data have a vested interest in reducing the world to data and ignoring everything else. In political science, despite the increasing sophistication of statistical methods, there is very little that can be robustly modeled, and what can be modeled is usually of little value in predicting the future. The bluster is quite high, but it is rarely delivered on. 

The same seems to be true of concerns over the internet and privacy. People are all too happy to reduce individuals to their data, where data means "those things a person buys or publicly associates themselves with." There also seems to be a great deal of concern that those with the technological ability to refine the software that makes this possible are embracing this future without thinking too carefully about it. 

I think it's rather the other way around: our technological masters have convinced themselves that the small part of a person's life that is directly measurable constitutes the whole of it, and have therefore confined themselves to creating finer-grained data over an ever smaller portion of that life.

Three reasons for this:

1. The "like," which I take to be the gold standard for people voluntarily surrendering information about themselves, has only been around for a short time. There are limits to the information that can be gained from a like (anyone remember "can this pickle get more fans than Nickelback"?), and the other information sources do not inspire confidence in the quality of the algorithms used: Facebook keeps asking me when I got engaged to my wife, despite the fact that it was listed as an event on Facebook when it happened. 
And this is the quality of information that is made readily available. Most data is not in any usable form. I was an active participant in online R.E.M. fan groups in the mid-90s, under my own name, all of which is available on the internet (as are 12 years of blog archives) to anyone so enterprising as to search. There are large stretches of my life that are pretty useless for data-mining purposes, even though the data exists. Which segues into the larger point: tastes and interests change over time, which means even if you have the data and even if it's in a form that can be used, all I have to do is not like, say, How I Met Your Mother anymore, and that information is useless.

2. The recommendations data can generate are limited, and bad more frequently than they should be. Amazon has nearly 15 years of my purchase history, and it rarely recommends something I'm interested in. The relevant information is not available to them: I buy a Javier Cercas novel because Roberto Bolaño mentioned him in an essay and made the novel sound interesting; I read Roberto Bolaño because I was bored with my then-current options and my mother thought I might like it. I buy a novel but don't like it; I buy a movie because it's cheap and worth a shot. There's no one reason I consume the way I do, and the connections between one thing and another are often illegible to data collecting schemes. If someone possessed all of my information: credit card, browser history, Amazon buying, Netflix history, then they might possibly be able to find patterns and predict. But no one entity has all that information, and there are pretty solid reasons to assume the relevant parties are unlikely to consolidate.
Also, Netflix is always going to suggest House of Cards to me, even though I've never watched it. They have to; it's the essence of advertising one of their own shows, but it compromises the purity of the data, and thus its usefulness (see also buying your way into a Google search).

3. Technology and behavior are opaque. I briefly considered pursuing "professional internet writer" as a career, but backed off it when I took a serious look at the economic prospects. Since then, I am more involved on Twitter, less on blogs and Facebook. But here's the thing: even when I was more involved overall, most of my life was happening offline. Significant aspects of my life get no showing at all; the self that is exposed is intentionally chosen, and thus a persona, and thus not a complete reflection of who I am. And all this as one of the technologically linked-in: most people I know conduct even less of their life online, to say nothing of people whose jobs and lives don't permit this level of interaction: CBS is still the #1 network, even though I don't know anyone who watches it, and hit movies, music, etc, are all things far outside my interest. (To say nothing of the new generation of apps for which privacy and anonymity are the point: Snapchat et al.)

So it seems appropriate to allow the Zuckerbergs and the Bezoses of the world to continue their attempts to find their keys under their very bright streetlights, and not worry very much about it.


The Secret of Cheers, or How to Maintain High Quality for a Really Long Time

Beliefs about a current renaissance of TV quality notwithstanding, I have occasionally pointed out that network shows are at an extreme disadvantage compared to cable because their runs have to be longer: 22 or 24 episodes rather than 6 or 10 or 13. This is also true of the length of shows themselves: 25 minutes in the 80s for a 30-minute show, rather than 22 (or less; Archer runs just barely over 20) now. Given that, I find it remarkable that Cheers is easily and obviously a better sitcom than anything else that has aired since: more time at a higher level than anyone else. Now, like most viewers, I don't like the episodes all equally (I could ignore most Cliff episodes and not feel much loss), but the quality is always present.

The secret, as it turns out, is in the parenthetical above: there are a few episodes each season devoted to specific characters: 25 episodes might include...

2 Norm episodes (or 3)
2 Cliffs
2 Carlas
2 Coach/Woodys
2 Frasiers (after season 3)

...a couple that introduce random characters into the mix, and a couple that focus on aspects of Sam or Diane/Rebecca's personality that have nothing to do with the main story arc. So whatever that story arc is, it only has to drive 10-15 episodes in a season, sometimes fewer: you've just managed to create a smaller prestige show inside a larger network one. Further, the rules are different in the non-arcing episodes: Norm and Cliff have to end up pretty much where they start, and to the extent anyone else gets an arc, it sets up slowly over seasons; each episode is mostly an excuse for the writers to do a funny idea. It hardly matters whether it goes anywhere or not, since that's not the viewer's expectation.

Compare this to other shows, including those that are explicitly attempting to work within the Cheers paradigm, and it's clear how hard this is to do: Parks and Rec can hardly do a story without a significant Leslie component even if it's "about" someone else (this, I think, is why the Ron and Tammy segments are so well regarded), and New Girl flails around with how central Nick and Jess's relationship needs to be.



The Luminaries, Eleanor Catton

Tinker Tailor Soldier Spy, John le Carré
Generations of Winter, Vasily Aksyonov

I'm about a third of the way through The Luminaries, and it has one obvious problem: my final judgment of the book is going to depend entirely on how clever its central mystery is. This situation recurs with some frequency in genre writing. Generations of Winter was engagingly written--I quite enjoyed the section from the dog's perspective--but since it is intended to track onto a specific part of Russian history, and the novel announces its theme of "fortunes waxed and waned during the Soviet Union era" early on, there's no real suspense. Some people will be killed, others will not; to be engaging, at least some things must happen that the reader does not suspect, which means it is not surprising that the most ideologically correct character breaks down, etc. Escaping from one convention means escaping into another. It's hard to sustain momentum when the outcomes are known and the plots are conventional (The Magnificent Ambersons, but with a surgeon's family), but at least the writing is good.

Tinker Tailor Soldier Spy: what a terrible book, and significantly more difficult in its construction than it needed to be. Most of the action is relayed in dialogue, long after the events have taken place, with a heavy helping of jargon. The suspense relies on nothing more than not telling you who is responsible for as long as possible: it's the only mystery in the book, not much of one by halfway through, and quite obviously being intentionally withheld.

An aside: The Brothers Karamazov is a novel with a strong genre component. Its worth is dependent on the fact that it is also many other things: it hardly matters who killed the father, because it's the reactions of Dmitri, Ivan, and Alexei that matter, and these depend in no crucial way on knowing whodunnit. Crime and Punishment seems as good an example as any that there can be no mystery at all to who committed a crime and still have the story be gripping (or, to cite an example from last night's movie watching, RoboCop).

Which brings us back to The Luminaries. The style is assured, though I am never quite certain whether the narrator's voice is supposed to be "sounds like a 21st century person sounding 19th century-ish" or actually 19th century-ish. But there's nothing else other than the central mystery to engage the reader--just shy of 300 pages, we're still in flashback to events before the novel began, and I suspect we'll be there for a while still--and much of the tension or narrative momentum comes from the writer's decisions about how to introduce information, not from the information itself. This does not leave me confident in the overall direction of the novel, though I'd be happy to be wrong.

The thing that's funny about this:

"But surely some of the frustration with [Greta Gerwig's decision to do a CBS sitcom] is actually a projection of frustration at the medium she is (perhaps only temporarily) leaving behind, which is now so dominated by effects-driven—not to mention male-oriented—spectacles that there is no money for movies about people, emotions, and ideas—not to mention women."

...is that the examples of innovative TV work and of movie stars going to TV are all white men. And it makes me wonder whether the relentless hyping of Girls has at least as much to do with the fact that Lena Dunham must be present to balance out all that Great White Maleness, so that TV looks more inclusive and radical.



Two interventions in the latest round of the CCOA debate. Tangential to that line of argument, and so here, not there.

1. Phoebe writes, in the comments: "If the entire system were to change, and everyone just went to the nearest state school, rather than to a school tailored to needs or preferences of students like them, then maybe we could have a conversation about what College consists of, down to the specific texts."

I find it interesting that these arguments always take place at elite liberal arts colleges or Ivy League schools, and never (one MOOC revolt aside) at places like Virginia, UNC, or my alma mater, Michigan. Conservatives at Michigan would complain about the University-mandated Race and Ethnicity requirement, but mine was fulfilled learning about the treatment of the Irish by the British in the early 20th century, so it was kind of a hollow complaint. The reason is, largely, that once a university passes a certain size, it becomes logistically impossible--and kind of silly--to assume the purpose of a university education is unitary. Education then requires many departments with many professors, and so by definition a large number of successful paths. The intricacy of major requirements was usually tied to the size of the department, if only because it became difficult to ensure what classes might be available the larger (and thus more irregular) the department gets; the ability to specialize increases with departmental size. (A department with 50 professors each teaching (say) a 2-1 where half of them get to teach a course based on their research is likely to produce some questionable-seeming courses even if the curriculum as a whole is conventional.) If the scenario Phoebe envisions were to come to pass, I would expect it would make a conversation about what college consists of, down to the specific texts, impossible (this, of course, supports Phoebe's overall point).

And, goodness, class size: I took exactly five classes with fewer than 40 students: two sections of Spanish (at about 30), my two senior seminars in philosophy and political science, and one We Need a 400-level Course to Get Our Philosophy Degree. Seminar-style pedagogy is barely possible with the Chicago-mandated Core course size of 19: it is impossible above 40. As a teacher, your aims and outcomes need to be radically different, not least because the students have different expectations. There were 400 people in my Intro to Political Theory, 45-60 for my course on Dante, a similar number for my course on the Russian novel, near 100 for my British history in the 20th century courses, 45-60 in every art history course I took. There's less discussion, more lecturing, more emphasis on getting students to do synthetic work in papers and exams to demonstrate they have done some learning on their own, and professors tend to be thrilled to get students in office hours who just want to talk about their subjects. Come to think of it, the perfect conservative university education may be hiding at big state schools, even right now.

2. Having taught the Core-iest of social science Core courses, and having received my education from Michigan in what was a (voluntarily chosen!) classics-oriented manner, I am willing to venture the opinion that the second is superior to the first. The reason is simple: repetition. Approximately one year of exposure to the classics and then a conventional university education in another subject will not (necessarily) leave a student in a noticeably better place. Students tend to grasp Aristotle, or Aquinas, or Marx, only at the point the course moves on to another figure, and so what remains is rarely coherent and situated in the right context. The eventual result is, depending on the level of reference you prefer, the character in Balzac's Lost Illusions who likes to quote Ciceronian maxims to prove how educated he is, though he long ago forgot the context for any of them, or Father Guido Sarducci's Five-Minute University; the persistence of people who read Aristotle long ago and are positive they remember what he said is astounding. Reading Shakespeare, or Milton, or Aristotle, or Locke only once will rarely make much of an impact: reading it a number of times in different circumstances will. So apart from an attempt to mandate not just introductory course selection, but all course options, it seems like this will fail; and even if all course options were mandated, it'd still require the will of the individual student to make it work.



Art and the Artist

(In response to Phoebe specifically, but also in general)

I tweet the following semi-periodically, and it probably deserves a longer explanation:

"A periodic reminder of how much relevance an artist's personal life has to the aesthetic evaluation of their work: none."

There are two primary reasons behind this: it is difficult, if not impossible, to know enough about an artist's personal life to make a definitive moral judgment about who they are, and the decision to let ethical questions dictate aesthetic responses has a tendency to crowd out the time one might take to enjoy art.

In order to get this argument off the ground, I have to stipulate one--hopefully uncontroversial--element: all human beings engage in a mix of good and bad behavior, and almost everyone's bad side includes some actions, opinions, and mental states they would prefer not to be made public. Nick Hornby has a riff on this in High Fidelity, where the narrator, having just admitted to having done five terrible things to an ex, asks his readers to think of the five worst things they've done to someone they were dating, and then asks them how they feel about being judgmental. People tend to want to accentuate the good things they have done and ignore the bad, or simply questionable, things. Outside observers who are not neutral tend to focus on the good or the bad. Consequently, the information that's available to us about artists from secondhand sources tends to be either good or bad, depending on whether it's a friend or an enemy speaking. But the information these people have is limited, and only the artist will ever really know.

Which is, as it happens, how it works with most of the people we meet: we just accept as a fact that their personal mix of goodness or badness remains unknown, and so we judge on those elements of their personality that are visible to us.

For artists, whose personalities tend to be in larger view, intentionally or not, the judgments appear to be trickier. How do we feel about Roman Polanski, or, assuming the charges are true, Woody Allen? I liked Michael Chabon's The Amazing Adventures of Kavalier & Clay, and The Yiddish Policemen's Union, but I also read the book of essays where he gets very judgmental about his single mother's dating habits, and humblebrags that even though he wasn't cool, he definitely slept with at least one of her friends, and many other women he found desirable. What do I do with that? Jonathan Franzen appears to want to be the world's worst posthumous friend to David Foster Wallace, and seems to have some issues with anger and women. Dan Harmon appears to be obsessive and alienating to a number of people he works with, and the Hollywood grist mill provides more examples than I could even list here.

(I also note that the routine excess of popular musicians never generates the same ire, or certainly doesn't now. The last things I can think of are David Bowie's saying England was ready for a fascist leader, and Eric Clapton's effusive praise of Enoch Powell, both of which have obviously harmed their careers. (It's also, I believe, not entirely clear that Clapton does not still agree with Powell))

The issues multiply with past artists: what are we to do with Hitchcock's misogyny (especially as it is mostly absent from his earlier British films)? Should we lend credence to the rumors about Grace Kelly's active personal life? The grubby details of Barbara Stanwyck's rise to fame? Spencer Tracy's multi-decade affair combined with a refusal to divorce his wife? How should we react to the more sordid details of W.H. Auden's personal life, or T.S. Eliot's anti-Semitism and crypto-fascism? Ernest Hemingway leaving his wife and young child for his mistress? Scott Fitzgerald's drinking, and treatment of his wife? How about Jean-Jacques Rousseau sending his children to the orphanage? And what of the people whose morality we can discern nothing at all: Homer, for example?

At this point, there are two options: one is to define a standard of those shortcomings which are morally relevant to consider, both present-day standards and those which are appropriate for the time. Failure to do the latter would require excluding large swaths of culture for insupportable stereotyping by race and gender. The former is also revealing, in that there are a number of indiscretions which are not considered to be reasons to boycott an artist's work. Even so, one can define an acceptable standard, assess the information available, determine what in it is reliable and does not come from a source that might be compromised, and come to a conclusion about whether the artist's work can be supported. This is, not to put too fine a point on it, a lot of work, and fully engaging in this process would eat up more time than the average book or movie would be worth.

The other is to conclude, perhaps even reluctantly, that ethical questions are separate from aesthetic ones, and aesthetic reasons should guide our decisions about what to consume. Works of art can get away from their creators, after all: even the staunchest defender of the view that Mark Twain was a racist has to acknowledge that Jim is the human and moral core of Huckleberry Finn, and Woody Allen can say something important about the need for moral standards in Manhattan even if he's a lout. If Roman Polanski makes a good movie, it is good because it is in some important way human, or truthful, or beautiful, or excellent, even if its director is none of those things. The capacity to create something unexpectedly great is one of the formative reasons for reading and watching and listening. And, moreover, the capacity to recognize the meaning in the thing one has done wrong is the redemptive possibility of art, and not to be taken lightly; Fitzgerald's late-in-life stories are his failures realized, the work of someone who understands the depths of what he has done wrong. The emotional core of Annie Hall is those final scenes in which the character understands exactly how and why he was wrong, and creates something to fix it, which leads him to fix himself.

All that being said, I am fine with moral reasons working on the margin of aesthetic choices: vita brevis, ars longa, after all. I've seen a couple of the 'good' Polanski movies and did not much care for them, so my lack of interest and the additional moral reasons provide an incentive not to attempt a new film that garners good reviews. This is not incidental, since I consider periodically revisiting things I have disliked in the past to be one of my responsibilities as a consumer of culture. The moral judgments themselves I am also comfortable with: if Woody Allen did what is alleged, he deserves a mark against him.

(A theological aside: I am uncomfortable with this even as I acknowledge my attitude about it. I think I am bound to hope for his redemption and to believe in its possibility, so even while maintaining my moral judgment I think I must also accept that one day I might need to transcend it, and let it go. And accept the possibility that this might come about without my ever being aware of it.)

There are also relevant economic angles and authorship questions along the lines of "A Woody Allen film or one he co-wrote and directed and which a bunch of other people made critical aesthetic contributions to?" and "Scott Fitzgerald or Fitzgerald and Maxwell Perkins?", but this post is too long as it is. 
