The Secret of Cheers, or How to Maintain High Quality for a Really Long Time:
Beliefs about a current renaissance of TV quality notwithstanding, I have occasionally pointed out that network shows are at an extreme disadvantage compared to cable because their runs have to be longer: 22 or 24 episodes rather than 6 or 10 or 13. This is also true of the length of the shows themselves: 25 minutes in the 80s for a 30-minute show, rather than 22 (or less; Archer runs just barely over 20) now. Given that, I find it remarkable that Cheers is easily and obviously a better sitcom than anything else that has aired since: more time at a higher level than anyone else. Now, like most viewers, I don't like the episodes all equally (I could lose most Cliff episodes and not feel much loss), but the quality is always present.
The secret, as it turns out, is in the parenthetical above: there are a few episodes each season devoted to specific characters: 25 episodes might include...
2 Norm episodes (or 3)
2 Frasiers (after season 3)
...a couple that introduce random characters into the mix, and a couple that focus on aspects of Sam or Diane/Rebecca's personality that have nothing to do with the main story arc. So whatever that story arc is, it only has to drive 10-15 episodes in a season, sometimes fewer: you've just managed to create a smaller prestige show inside a larger network one. Further, the rules are different in the non-arcing episodes: Norm and Cliff have to end up pretty much where they start, and to the extent anyone else gets an arc, it sets up slowly over seasons; each episode is mostly an excuse for the writers to chase a funny idea. It hardly matters whether it goes anywhere or not, since that's not the viewer's expectation.
Compare this to other shows, including those explicitly attempting to work within the Cheers paradigm, and it's clear how hard this is to do: Parks and Rec can hardly do a story without a significant Leslie component even if it's "about" someone else (this, I think, is why the Ron and Tammy segments are so well regarded), and New Girl flails around over how central Nick and Jess' relationship needs to be.
The Luminaries, Eleanor Catton
The thing that's funny about this:
"But surely some of the frustration with [Greta Gerwig's decision to do a CBS sitcom] is actually a projection of frustration at the medium she is (perhaps only temporarily) leaving behind, which is now so dominated by effects-driven—not to mention male-oriented—spectacles that there is no money for movies about people, emotions, and ideas—not to mention women."
...is that the examples of innovative TV work and movie-stars-going-to-TV are all white men. And it makes me wonder whether the relentless hyping up of Girls has at least as much to do with the fact that Lena Dunham must be present to balance out all that Great White Maleness so that tv looks more inclusive and radical.
Two interventions in the latest round of the CCOA debate. Tangential to that line of argument, and so here, not there.
1. Phoebe writes, in the comments: "If the entire system were to change, and everyone just went to the nearest state school, rather than to a school tailored to needs or preferences of students like them, then maybe we could have a conversation about what College consists of, down to the specific texts."
I find it interesting that these arguments always take place at elite liberal arts colleges or Ivy League schools, and never (one MOOC revolt aside) at places like Virginia, UNC, or my alma mater, Michigan. Conservatives at Michigan would complain about the University-mandated Race and Ethnicity requirement, but mine was fulfilled learning about the treatment of the Irish by the British in the early 20th century, so it was kind of a hollow complaint. The reason is, largely, that once a university passes a certain size, it becomes logistically impossible--and kind of silly--to assume the purpose of a university education is unitary. Education then requires many departments with many professors, and so by definition a large number of successful paths. The intricacy of major requirements was usually tied to the size of the department, if only because it became difficult to ensure what classes might be available the larger (and thus more irregular) the department got; the ability to specialize increases with departmental size. (A department with 50 professors each teaching (say) a 2-1, where half of them get to teach a course based on their research, is likely to produce some questionable-seeming courses even if the curriculum as a whole is conventional.) If the scenario Phoebe envisions were to come to pass, I would expect it to make a conversation about what college consists of, down to the specific texts, impossible (this, of course, supports Phoebe's overall point).
And, goodness, class size: I took exactly five classes with fewer than 40 students: two sections of Spanish (at about 30), my two senior seminars in philosophy and political science, and one We Need a 400-level Course to Get Our Philosophy Degree. Seminar-style pedagogy is barely possible with the Chicago-mandated Core course size of 19: it is impossible above 40. As a teacher, your aims and outcomes need to be radically different, not least because the students have different expectations. There were 400 people in my Intro to Political Theory, 45-60 for my course on Dante, a similar number for my course on the Russian novel, near 100 for my British history in the 20th century courses, 45-60 in every art history course I took. There's less discussion, more lecturing, more emphasis on getting students to do synthetic work in papers and exams to demonstrate they have done some learning on their own, and professors tend to be thrilled to get students in office hours who just want to talk about their subjects. Come to think of it, the perfect conservative university education may be hiding at big state schools, even right now.
2. Having taught the Core-iest of social science Core courses, and having received my education from Michigan in what was a (voluntarily chosen!) classics-oriented manner, I am willing to venture the opinion that the second is superior to the first. The reason is simple: repetition. Approximately one year of exposure to the classics and then a conventional university education in another subject will not (necessarily) leave a student in a noticeably better place. Students tend to grasp Aristotle, or Aquinas, or Marx, only at the point the course moves on to another figure, and so what remains is rarely coherent and situated in the right context. The eventual result is, depending on the level of reference you prefer, the character in Balzac's Lost Illusions who likes to quote Ciceronian maxims to prove how educated he is, though he long ago forgot the context for any of them, or Father Guido Sarducci's Five-Minute University; the persistence of people who read Aristotle long ago and are positive they remember what he said is astounding. Reading Shakespeare, or Milton, or Aristotle, or Locke only once will rarely make much of an impact: reading it a number of times in different circumstances will. So apart from an attempt to mandate not just introductory course selection, but all course options, it seems like this will fail; and even if all course options were mandated, it'd still require the will of the individual student to make it work.
Art and the Artist
(In response to Phoebe specifically, but also in general)
I tweet the following semi-periodically, and it probably deserves a longer explanation:
This Is Not a Metaphor for Anything
25 years ago, let's say, you couldn't get good coffee in America: only Folgers, Maxwell House, or instant. Hipster college towns and European imports notwithstanding (International Coffees, anyone?), the average cup was bad; so bad, in fact, that people didn't really know what they were drinking was substandard.
Then came Starbucks, and the return of coffee culture (Cheers being replaced by Cafe Nervosa in Frasier and Central Perk in Friends would be representative), and it was possible to have good-to-great coffee almost anywhere.
In the early 00s, Starbucks switched over from barista-led espresso machines to automatic ones. The caramel macchiato was a runaway success.
Now: a divide between coffee snobs and everyone else. The rise in popularity of single-cup coffee makers, whose most notable feature is the willingness to pump increasing amounts of water through the same amount of grounds, and whose flavor profiles range from "bold" to "extra bold" because otherwise you can't taste the coffee,* concurrent with the rise in 'coffee' drinks as milk-and-sugar delivery vehicles. You can get a good cup of coffee, but you probably have to make it yourself, or seek out a niche coffee location.
Are we better off now than we were? By how much, exactly?
*The only exception I've found is the Starbucks k-cups, which taste exactly like Starbucks coffee. Whether this is a good thing is left as an exercise for the reader.
It seems fitting and appropriate that Norm Geras' last blog post was one in which he recommended books he'd loved. He was a reliably great presence on the internet, and an excellent recommender of worthwhile things (Art Blakey's "Moanin'" played at my wedding reception, an album I bought because Norm once recommended it).
What else to say, really, except that it was "The Contract of Mutual Indifference" that set the course for everything that interested me in grad school and since, and it remains engaging and provocative?
I am less inclined than most to be concerned about kids only watching what's on Netflix these days, despite a great amount of hand-wringing on the subject. The reason is simple: I used to be one of those people. Approximately ten years after I recognized this as a potential issue, I've seen 76 of AFI's top 100, and 53 of Sight & Sound's top 110. There are a few issues being conflated:
1. Access: part of the reason my film education was so poor prior to 2003 is that movie distribution was poor. The movies I had seen were for the usual reasons: a smattering of classics available at the local video store, a few VHS tapes my parents had bought or would rent from the library (the latter of which skewed towards Merchant-Ivory and Jane Austen adaptations for obvious reasons), and the requisite freshman-year friend who was taking a film class and invited me along to the screenings that taught me the value of foreign films (The 400 Blows, Red Sorghum) and to hate Robert Altman. It was hard to get hold of very many movies at all, much less high-quality copies, so even if I'd had the money and the inclination, my options would have been sorely limited.
There's precious little appreciation that the introduction and acceptance of DVDs in the early 2000s had the same democratizing effect as the introduction of CDs in the late 80s: everyone converted their catalogues to the new format, which meant cheap, high(ish)-quality editions of almost all movies. My interest in music was sustained because I could go into, say, Best Buy in 1996 and get five CDs for $25, because they were bands no one had ever heard of, and the store just wanted to get rid of the inventory. I bought Woody Allen movies and foreign films for the same reasons: I had no idea whether they were any good, but they were $10 each. This move is also what makes Netflix possible as a business.
It's worth circling back to this: if someone isn't interested in movies now, it's a question of will. But will can be changed, if it's not a question of berating people for their poor choices, worrying about the future, etc.
2. Canon: If there's a logic behind the AFI list, I've never been able to find it. "These are good movies that were made in America" is not going to ease anyone's transition into watching film. Nor Sight & Sound's "these are foreign films that usually have good cinematography." For there to be a "new Netflix canon" there must be an "old non-Netflix canon" it's replacing, except there isn't one. Film has lists, but no narrative: The Cabinet of Dr. Caligari, Gone With the Wind, Annie Hall and, oh, Up are all quite good films, but good in different ways and different genres for quite disparate reasons. A generation of filmmakers might be working in sync with what came before (the Cahiers du Cinema people drawing from noir and westerns), or in opposition to it (the 70s Auteurs versus the studio system); careers are significantly longer and credit is given for doing the unexpected (Tony Curtis in Sweet Smell of Success; Billy Wilder writing Stalag 17), so the fact that Fritz Lang directed it or Marlon Brando starred in it doesn't tell you much. College students looking for an exhaustive-if-limited selection of films they need to watch (something on the order of Pitchfork's best-of lists) have no real options.
3. Points of ingress: The most fortunate coincidence in my years of movie-watching was having Woody Allen be the first figure I was interested in: he constantly makes reference to other films and is not particularly shy about doing so. From him, I found Bergman and Fellini, and thus the New Wave and Neorealism, the 70s movies to which his best works were a contrast, and on from there. It takes a lot of time: I watched many westerns until I figured out I didn't particularly like them, and spent a lot of time figuring out I will never care for Kurosawa (Dersu Uzala, maybe). Along the way, there were surprises: Powell/Pressburger, say, or Buñuel, whom I didn't expect to like, but did.
4. Let me again state a general opinion that one of the problems with American culture is the idea that all one's tastes have to be set at age 18, if not sooner. I prefer different things at 31 than I did at 21, and would suggest that the process of aging itself has allowed me to like things I would have previously dismissed. That students arrive at college--or leave!--not having sampled widely from world cinema is not a problem. There's a whole life to make that happen.