It’s the month of lists, you’ve no doubt noticed. It’s an entertaining yet tedious time of year: we get to find records we might have slept on and argue over albums that are underrated or underhated. Maybe it’s the lack of galvanizing consensus albums this year, or the oft-repeated complaint that everyone’s favorites are too boring, but I’m detecting an odd, if fascinating, tone to the usual kvetching this time around. It’s not the same old sux/rulz dichotomy: rather, there seems to be a distinct lack of trust, a suspicion of the listmakers’ motives.
Over at Idolator, Dan Gibson asked readers—genuinely!—“Have you had any luck yet with the mysterious albums that seem to populate the middle of these lists? Is that Girl Talk disc everyone seems to list any good?” To wonder aloud whether an album frequently appearing on everyone’s list is actually any good says something about Gibson’s lack of faith in his music-consuming peers. It’s doubly amusing considering he is asking the vague entity that is “Idolator readers”—many of whom are probably the same ones putting Girl Talk on their lists. Is Gibson going to feel any more inclined to download Girl Talk based on some commenter’s recommendation? (Not likely, when the next commenter will say it’s overrated.)
I don’t think Gibson is alone. Every reader seems to assume that somewhere on any given top ten list, a gap must exist between the truly outstanding and the merely good. The question is, where is that gap? No one’s list includes that important detail, hence no one’s list is trustworthy. Add to that one’s personal knowledge of how few ’07 picks still get regular play around the house. Surely the same will happen with these ’08 releases. All the more suspicious, then: these lists must be designed to swindle you out of your time, money, and belief that indie rock can change your life. Right?
Too, we have no reference point: okay, you’ve got a top ten—how many albums did you actually hear? How can I possibly know how discerning you’re being? Shit, you’ve got a top fifty—are you just ranking every album you bought? How can I possibly know how discerning you're being?
It’s easy to mistrust the hobbyist bloggers—all of whom, surely, are too lazy, stupid, or sheeplike to stake out their own personal tastes, independent of the domineering tastes of Pitchfork’s critics or the Machiavellian commercial interests of Stereogum. I mean, it’s one thing for Vampire Weekend and Lil’ Wayne to show up on their lists—it’s in their editorial interests, right? For some reason? Cuz they’re pro?—but why would the hobbyists, for whom nothing is at stake, claim to like these albums? They’re clearly drinking the Kool-Aid and can’t be trusted. Right?
Then there are the Kool-Aid dispensers: Pitchfork, Paste, Q, et al. As this ILM thread demonstrates, such lists are to be dissected to determine each publication’s true motives, for surely it cannot be that they like all these albums. Probably they’re catering to their readers’ tastes—but, predictably, they must throw in a few albums from outside their stock genre to project the illusion of critical breadth. So there’s a “token” hip hop act, a “token” dance album, and a “token” pop album. You see through it, don’t you: these mags are puffing themselves up in hopes you don’t notice how critically corrupt they are. Right?
In a further effort to show how on-the-pulse their critics are—because even the publication knows its list is bullshit!—you’re made privy to their individual lists. See, staff-derived lists place undue weight on populist albums like that dastardly Fleet Foxes record that appeals to everyone but is no one’s favorite. (Right?) So it racks up a bunch of points in the staff polling and places high on the list, way ahead of that one actually outstanding album that not enough people spent time with, hence its inclusion on the low end of the top 100. But the individual critics’ lists rectify that! Except that all that amounts to is a bunch of jerks who, other than their inclusion of Fleet Foxes, are just trying to show off how obscure they can be. They’re not being honest—if it were that good, we all would have heard of it. Right?
You can’t win.
What does it mean, as an avid reader of indie music blogs, that you suspect the motives of every blogger you read? Can no one steer you right? Is every blog’s purview too limited/populist, too wide-ranging/obscure, too professionalized, too amateur? Must you depend on Largehearted Boy’s list of lists to surmise some nebulous overall endorsement of ten supposedly outstanding records? Maybe you do. Maybe that’s also why so many people view this year’s indie releases as boring. Reading so many reactions to the lists that have come out so far is like seeing a mass realization that the trainspotters we depend on aren’t flagging down the right cars. (Don’t look at me: when I get around to posting my list, you’ll see some of those boring releases, too.) How’s that Stephen Stills song go? If you can’t find the blog you trust, trust the blog you’re with?
I don't really know what it all means. I just sense some collective feeling that indie rock—or was it the critics and bloggers?—let us down this year. Or we let ourselves down, by not trying a little harder to find that album that really meant something—to me, to us, to music. It's probably out there: who found it? Pipe up!