Writer of things that go bump in the night

Tag: Superhero (Page 1 of 5)

Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Five years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer”

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, the Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only like-minded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.

Continue reading

“Superman IV” at 35:  How the “Worst Comic-Book Movie Ever” Epitomizes What We Refuse to Admit about Superhero Fiction

Superman IV:  The Quest for Peace, unanimously reviled for both its unconvincing visuals and cornball story, inadvertently accomplished the theretofore unrealized dream of scores of nefarious supervillains when it was released on this date in 1987:  It killed Superman.  (Or at least put the cinematic franchise into a two-decade dormancy.)

But a closer examination of the film suggests its objectively subpar storytelling might in fact be far more faithful to the spirit of the source material than today’s fanboy culture would care to concede.


Thirty-five years ago today, my mother took me to see Superman IV:  The Quest for Peace (1987).  Afterwards, we met up with my father at Doubleday’s, a neighborhood bar and grill that was the last stop on Broadway before you’d officially crossed the city line into Westchester County.  The restaurant had a hot-oil popcorn machine in the far corner, and when I went to refill our basket, I spied a man seated at the bar, nose in a copy of USA Today, the back panel of which boasted a full-page color advertisement for Superman IV.

When he caught me studying the ad, he asked, “Gonna go see the new Superman?”

“I just did.”

“Yeah?  How was it?”

“It was amazing,” I said, and I absolutely meant it.  Sensing my sincerity, the gentleman pulled the ad from the bundle of folded pages and handed it to me as a souvenir.  When I got home, I taped it up on my bedroom wall.

The theatrical one-sheet for “Superman IV” looks like a textbook “Action Comics” cover from the ’80s

Sidney J. Furie’s Superman IV:  The Quest for Peace is not amazing.  It is, in fact, commonly regarded as one of the worst comic-book movies ever made—if not the worst—in eternal competition for last place with Batman & Robin (1997) and Catwoman (2004).  It suffered from a notoriously troubled production:  After the diminishing returns of Superman III (1983) and spin-off Supergirl (1984), series producers Alexander and Ilya Salkind sold their controlling interests in the IP to the Cannon Group, the schlockmeister studio responsible for the American Ninja, Missing in Action, Breakin’, and Death Wish franchises—not exactly the optimal custodians of a series that had started out, against all expectation, so magnificently.

Richard Donner’s Superman:  The Movie (1978) was and remains the finest specimen of superhero cinema ever presented, at once ambitiously epic and emotionally relatable.  It pulls off the impossible in so many ways, first and foremost that it absolutely made us believe a man could fly, which had never been credibly accomplished before.  Credit for that goes not only to the VFX team, which won the Academy Award for its efforts, but to Christopher Reeve, who delivered the movie’s most timeless special effect:  endowing a spandex-clad demigod with profound dignity and genuine vulnerability.  Even the lesser Superman films—and we’ll talk more about those soon enough—are elevated by Reeve’s extraordinary performance, which occupies a lofty position, right alongside Bela Lugosi’s Dracula, in the pantheon of defining interpretations of folkloric icons.

What’s also so remarkable about Superman is how many different tonal aesthetics it assimilates.  The opening sequences on Krypton with Marlon Brando feel downright Kubrickian; Donner somehow channels the cosmic splendor of 2001:  A Space Odyssey (1968), only to then transition us to Smallville, as warm and fertile as Krypton was cold and barren, which evokes the same spirit of sock-hop Americana George Lucas conjured to such success in American Graffiti (1973).

The remainder of the movie shifts fluidly from His Girl Friday–style newsroom comedy (the scenes at the Daily Planet) to urban action thriller à la The French Connection (the seedy streets of 1970s Metropolis) to Roger Moore–era 007 outing (Lex Luthor’s sub–Grand Central lair, complete with comically inept henchmen) to Irwin Allen disaster film (the missile that opens up the San Andreas Fault in the third act and sets off a chain reaction of devastation along the West Coast).

Somehow it coheres into a movie that feels like the best of all worlds rather than a derivative Frankenstein’s monster.  Up until that time, superhero features and television, hampered by juvenile subject matter and typically subpar production values, seemed inherently, inexorably campy.  The notion that a superhero movie could rise to the level of myth, or at least credibly dramatic science fiction, was unthinkable.  Superman is the proof-of-concept paradigm on which our contemporary superhero–industrial complex is predicated.

Continue reading

There He Was… and in He Walked: Lessons on Mythic Storytelling from the Mariachi Trilogy

In belated observation of Día de los Muertos, here’s an appreciation for the idiosyncratic storytelling of Robert Rodriguez’s Mariachi trilogy, a neo-Western action series that emerged from the indie-cinema scene of the 1990s and can only be deemed, by current Hollywood standards, an anti-franchise.  The movies and the manner in which they were made have a lot to teach us about what it means to be creative—and how to best practice creativity.


Before the shared cinematic universe became the holy grail of Hollywood, the coup d’éclat for any aspiring franchise—and we can probably credit Star Wars for this—was the trilogy.

In contrast with serialized IPs (James Bond and Jason Voorhees, for instance), the trilogy came to be viewed, rightly or wrongly, as something “complete”—a story arc with a tidy three-act design—and, accordingly, many filmmakers have leaned into this assumption, exaggerating a given series’ creative development post factum with their All part of the grand plan! assurances.

This peculiar compulsion we’ve cultivated in recent decades—storytellers and audiences alike—to reverse-engineer a “unified whole” from a series of related narratives, each of which developed independently and organically, is antithetical to how creativity works, and even to what storytelling is about.

Nowhere is the fluidity of the creative process on greater, more glorious display than in the experimental trilogy—that is, when a low-budget indie attains such commercial success that it begets a studio-financed remake that simultaneously functions as a de facto sequel, only to then be followed by a creatively emboldened third film that completely breaks from the established formula in favor of presenting an ambitiously gonzo epic.  Trilogies in this mode—and, alas, it’s a pretty exclusive club—include Sam Raimi’s Evil Dead, George Miller’s Mad Max, and Robert Rodriguez’s El Mariachi.

Robert Rodriguez at the world premiere of “Alita: Battle Angel” on January 31, 2019 in London (Eamonn M. McCormack/Getty)

A film student at the University of Texas at Austin in the early nineties, Rodriguez self-financed El Mariachi with a few thousand dollars he’d earned as a medical lab rat; the project wasn’t meant to be much more than a modest trial run at directing a feature film that he’d hoped to perhaps sell to the then-burgeoning Spanish-language home-video market.  He reasoned that practical experience would be the best teacher, and if he could sell El Mariachi, it would give him the confidence and funds to produce yet more projects—increasingly ambitious and polished efforts—that would allow him to make a living doing what he loved.  He had no aspirations of power lunches at The Ivy or red-carpet premieres at Mann’s Chinese Theatre; he wanted only to pursue the art of cinematic storytelling—not necessarily Hollywood filmmaking, a different beast—to the fullest extent possible.

If you want to be a filmmaker and you can’t afford film school, know that you don’t really learn anything in film school anyway.  They can never teach you how to tell a story.  You don’t want to learn that from them anyway, or all you’ll do is tell stories like everyone else.  You learn to tell stories by telling stories.  And you want to discover your own way of doing things.

In school they also don’t teach you how to make a movie when you have no money and no crew.  They teach you how to make a big movie with a big crew so that when you graduate you can go to Hollywood and get a job pulling cables on someone else’s movie.

Robert Rodriguez, Rebel without a Crew, or, How a 23-Year-Old Filmmaker with $7,000 Became a Hollywood Player (New York:  Plume, 1996), xiii–xiv

They don’t teach a lot of things about Hollywood in film school, like how so many of the industry’s power brokers—from producers and studio execs to agents and managers—are altogether unqualified for their jobs.  These folks think they understand cinematic storytelling because they’ve watched movies their entire lives, but they’ve never seriously tried their hand at screenwriting or filmmaking.  Accordingly, the town’s power structure is designed to keep its screenwriters and filmmakers subordinate, to make sure the storytellers understand they take their creative marching orders from people who are themselves utterly mystified by the craft (not that they’d ever admit to that).

It’s the only field I know of in which the qualified authorities are entirely subservient to desk-jockey dilettanti, but I suppose that’s what happens when a subjective art form underpins a multibillion-dollar industry.  Regardless, that upside-down hierarchy comes from a place of deep insecurity on both ends of the totem pole, and is in no way conducive to creativity, hence the premium on tried-and-true brands over original stories, on blockbusters over groundbreakers.  As I discovered the hard way—more on that in a minute—Hollywood is arguably the last place any ambitiously imaginative storyteller ought to aspire to be.  Rodriguez seemed to understand that long before he ever set foot in L.A.:

Continue reading

Too Much Perspective: On Writing with Moral Imagination

Practicing morally imaginative storytelling means scrutinizing the values and messages encoded in the fiction we produce—but it does not mean passing a “purity test.”


In Marty Di Bergi’s 1984 rockumentary This Is Spinal Tap, the titular British heavy-metal band, faced with ebbing popularity and flagging album sales, embarks on a disaster-prone tour of North America in support of its latest release, the critically savaged Smell the Glove.  During a stopover at Graceland to pay their respects to the King of Rock and Roll at his gravesite, lead vocalist David St. Hubbins comments, “Well, this is thoroughly depressing.”

To which bandmate and childhood best friend Nigel Tufnel responds, “It really puts perspective on things, though, doesn’t it?”

“Too much.  There’s too much fucking perspective now.”

It’s a sentiment to which we can all relate, collectively endowed as we’ve become with a migrainous case of “2020 vision.”  At the start of the pandemic, long before we had any sense of what we were in for, let alone any perspective on it, I, like many essayists, felt the urge or need or even the responsibility to say something about it, despite knowing I had no useful or meaningful insight.  I netted out with an acknowledgment that the months to come would present a rare Digital Age opportunity for quiet introspection and reflection—one in which we might expand our moral imagination of what’s possible, to invoke the exquisite wisdom of my mentor Al Gore, and perhaps envision a world on the other side appreciably more just, equitable, and sustainable than the one we had before the global shutdown.

Did we ever.  Here in the United States, we are now wrestling with issues of economic inequality, structural racism, police brutality, environmental justice, and fair access to affordable housing and healthcare with an awareness and an urgency not seen in generations, and President Joe Biden—responding to the social movements of his times like FDR and LBJ before him—has proposed a host of progressive legislation that matches the visionary, transformative ambition of the New Deal and the Great Society.

Reuters via the New York Times

With heartening moral imagination (certainly more than this democratic eco-socialist expected from him), Biden is attempting to turn the page on the Randian, neoliberal narrative of the past forty years and write a new chapter in the American story—one founded on an ethos of sympathetic coexistence, not extractive exploitation.  With our continued grassroots support and, when necessary, pressure, he might even be the unlikely hero to pull it off, too—our Nixon in China.

As for me?  I spent most of the pandemic thinking about narrativity myself.  Doing nothing, after all, was a privilege of the privileged, with whom I am obliged to be counted.  So, I used the time in self-quarantine to think and to write about the stories we tell, and I arrived at the resolute conclusion that we—the storytellers—need to do a lot better.

Continue reading

In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 2

Editor’s note:  Owing to the length of “In the Multiverse of Madness,” I divided the essay into two posts.  If you haven’t already, I encourage you to read Part 1 first, and please feel welcome to offer feedback on that post, this one, or both in the comments section of Part 2 below.  Thank you.


Previously on “In the Multiverse of Madness,” we covered the three engagement strategies (and corresponding tactics) transmedia mega-franchises deploy to keep us consuming each new offering in real time:  by leveraging FOMO via “spoilers”; by encouraging “forensic fandom” with Easter eggs and puzzle-boxing; and by reversing “figure and ground.”  Now let’s talk about why 1970s-born adults have been particularly susceptible to these narrative gimmicks—and what to do about it.

X Marks the Spot

Mega-franchises are dependent on a very particular demographic to invest in their elaborate and expanding multiverse continuities:  one that has both a strong contextual foundation in the storied histories of the IPs—meaning, viewers who are intimately familiar with (and, ideally, passionately opinionated about) all the varied iterations of Batman and Spider-Man from the last thirty or so years—and the disposable income typical of middle age, which is why Gen X has been the corporate multimedia initiative’s most loyal fan base.  Fortunately for them, we’d been groomed for this assignment from the time we learned to turn on the television.

Very quickly (if it isn’t already too late for that):  From 1946 through 1983, the FCC enforced stringent regulations limiting the commercial advertisements that could be run during or incorporated into children’s programming.  However:

Ronald W. Reagan did not much care for any regulations that unduly hindered business, and the selling of products to an entire nation of children was a big business indeed.  When Reagan appointed Mark S. Fowler as commissioner of the FCC on May 18, 1981, children’s television would change dramatically.  Fowler championed market forces as the determinant of broadcasting content, and thus oversaw the abolition of every advertising regulation that had served as a guide for broadcasters.  In Fowler’s estimation, the question of whether children had the ability to discriminate between the ads and the entertainment was a moot point; the free market, and not organizations such as [Action for Children’s Television], would decide the matter.

Martin Goodman, “Dr. Toon:  When Reagan Met Optimus Prime,” Animation World Network, October 12, 2010

In the wake of Fowler’s appointment, a host of extremely popular animated series—beginning with He-Man and the Masters of the Universe but also notably including The Transformers, G.I. Joe:  A Real American Hero, and M.A.S.K. for the boys, and Care Bears, My Little Pony, and Jem for young girls—flooded the syndicated market with 65-episode seasons that aired daily.  All of these series had accompanying action figures, vehicles, and playsets—and many of them, in fact, were explicitly based on preexisting toylines; meaning, in a flagrant instance of figure-and-ground reversal, the manufacturers often dictated narrative content:

“These shows are not thought up by people trying to create characters or a story,” [Peggy Charren, president of Action for Children’s Television] explained, terming them “program-length advertisements.”  “They are created to sell things,” she said.  “Accessories in the toy line must be part of the program.  It reverses the traditional creative process.  The children are getting a manufacturer’s catalogue instead of real programming content.”

Glenn Collins, “Controversy about Toys, TV Violence,” New York Times, December 12, 1985

This was all happening at the same time Kenner was supplying an endless line of 3.75” action figures based on Star Wars, both the movies and cartoon spinoffs Droids and Ewoks.  Even Hanna-Barbera’s Super Friends, which predated Fowler’s tenure as FCC commissioner by nearly a decade, rebranded as The Super Powers Team, complete with its own line of toys (also courtesy of Kenner) and tie-in comics (published by DC), thereby creating a feedback loop in which each product in the franchise advertised for the other.  Meanwhile, feature films like Ghostbusters and even the wantonly violent, R-rated Rambo and RoboCop movies were reverse-engineered into kid-friendly cartoons, each with—no surprise here—their own action-figure lines.

I grew up on all that stuff and obsessed over the toys; you’d be hard-pressed to find a late-stage Xer who didn’t.  We devoured the cartoons, studied the comics, and envied classmates who were lucky enough to own the Voltron III Deluxe Lion Set or USS Flagg aircraft carrier.  To our young minds, there was no differentiating between enjoying the storyworlds of those series and collecting all the ancillary products in the franchise.  To watch those shows invariably meant to covet the toys.  At our most impressionable, we seventies-born members of Gen X learned to love being “hostage buyers.”  Such is the reason I was still purchasing those goddamn Batman comics on the downslope to middle age.

Continue reading

In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 1

Editor’s note:  By even the indefensibly prolix standards of this blog, the following essay—an analytical piece on Hollywood mega-franchises and how audiences wind up serving them more than they serve us—is a lengthy one.  Accordingly, “In the Multiverse of Madness” will be published in two separate parts, with the concluding installment following this one by a week.  I thank you in advance for your time and attention, neither of which I take for granted.


In last month’s post, I proposed that when a fan-favorite media franchise no longer serves us—when we come to recognize that some of the popular fictions we’ve cherished embody values we no longer endorse, and potentially even threaten to stand in the way of where we need to go—often the best thing we can do for ourselves is to let it go, purposely and permanently.

Letting go is not about “canceling” (someone like disgraced geek god Joss Whedon) or boycotting (the films of, say, Woody Allen); it’s not about taking action at all.  Instead, letting go is not doing something any longer—not renting out any more space in your life or in your head to the likes of Whedon or Allen, or even to the culturally defining popular narratives whose very ubiquity we take as a God-given absolute:  Star Wars, Star Trek, Harry Potter, DC and Marvel, to name but a sampling.

Despite the universal prevalence of those transmedia brands—not merely the plethora of movies and TV shows, but the licensed apparel and iPhone cases, the die-cast collectibles and plush toys—we can, if we choose, be done with any or all of those franchises as of… right now.  To learn to live without them entirely.  And happily.  Even lifelong, hardcore superfans can learn to let go of their preferred multimedia pastimes.

It’s both easier and harder than you may think.

Just imagine never caring about ANY of this ever again…

But wait!  What if you happen to genuinely enjoy Star Wars or Star Trek or DC or Marvel?  If you’re a fan, and some or all of those entertainment franchises add value to your life’s experience, by all means, disregard this post’s advice.  Though perhaps first consider this:

For most of Hollywood history, the movie business has needed a hostage buyer, a customer with little choice but to purchase the product.  First, this was the theatre chains, which the studios owned, or controlled, until 1948, when the Supreme Court forced the studios to sell them on antitrust grounds.  In the eighties and nineties, video stores partly filled the role.  But, increasingly, the hostage buyer is us.

Today, the major franchises are commercially invulnerable because they offer up proprietary universes that their legions of fans are desperate to reënter on almost any terms.  These reliable sources of profit are now Hollywood’s financial bedrock.

Stephen Metcalf, “How Superheroes Made Movie Stars Expendable,” New Yorker, May 21, 2018

Consider:  How many of us are unwitting “hostage buyers”—fans who continue to subscribe to certain multimedia franchises no longer out of pleasure, but lately out of habit?  Out of decades-long conditioning?  We may watch Star Wars, for instance, simply because we’ve always watched Star Wars, even if we can’t truly recall the last time we actually enjoyed it the way we did when we were ten years old—with pure and wondrous abandon.  Bad word-of-mouth will steer us clear of a one-off bomb like Blackhat or King Arthur:  Legend of the Sword or The Happytime Murders, but it’ll merely lower our expectations for Star Wars:  The Rise of Skywalker and X-Men:  Dark Phoenix and Terminator:  Dark Fate, not deter us from seeing those umpteenth sequels for ourselves.

When that happens—when we’re willing to spend our money, time, and attention (our three primary modes of currency) on a product we know in advance is shit—we’re no longer fans of those franchises so much as brand loyalists.  Habit buyers, if not outright hostage buyers.  And it can be hard to recognize that in ourselves—harder than we might realize.  I was still reading Batman comics into my thirties, who-knows-how-many years after I stopped enjoying them—long after a once-joyful pleasure became an interminably joyless obligation.  So, why was I still reading and collecting them?

Because I’d always read comics, from the time I was a kid; I’d buy them at the corner candy store in my Bronx neighborhood with loose change I’d rummaged from the couch cushions and reread each one a thousand times.  I’d share them with my grade-school gang, and vice versa.  I’d collected them for as long as I could remember, so it truly never occurred to me a day might come when they no longer added value to my life—when they’d outlived their onetime reliable purpose.  And for years after I reached that point of terminally diminished returns, I’d continue to spend money, to say nothing of time and attention, on a habit I wasn’t enjoying—that did nothing but clutter my home with more worthless shit that went straight into indefinite “storage” in the closet.  Why the hell did I do that?

Because I’d ceased to be a fan and had instead become an obedient brand loyalist—an institutionalized hostage buyer.  And, to be sure, corporate multimedia initiatives—which is to say those so-called “mega-franchises” from which there is always one more must-see/must-have sequel, prequel, sidequel, spinoff, TV series, tie-in comic, videogame, and branded “collectible” being produced—very much count on our continued, unchallenged fidelity to once-beloved concepts and characters…

… and they are doubling down on the billion-dollar bet they’ve placed on it:

Continue reading

Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops, it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancelation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…

Continue reading

The End: Lessons for Storytellers from the Trump Saga

The election of Joseph R. Biden Jr. earlier this month offered the very thing our movie franchises and television series have denied us for two decades:  catharsis.


For a writer, it turns out I may suffer from a staggering lack of imagination.

I will confess to anxiously entertaining all the apocalyptic post–Election Day scenarios contemplated by even our most sober pundits and analysts:  the disillusion-fueled outrage on the left should Trump eke out a narrow Electoral College win despite losing the popular vote to Biden; or, the armed militias activated by the president in the event of his loss.  As on the set of a Snake Plissken movie, store windows on Fifth Avenue and Rodeo Drive were boarded up; correspondingly, I barricaded my own front and balcony doors as I watched, sick to my stomach, an endless caravan of MAGA-bannered pickup trucks roar past my home in the liberal bastion of Los Angeles the weekend before Election Day.  I girded for the possibility (if not inevitability) of social breakdown, fully aware I would not be cast in the part of uber-competent dystopian hero—the Rick Grimes or Mad Max—in that story.

What I never imagined—not once, even fleetingly—was that upon receiving official word of a Biden/Harris victory, cities across the country, and the world over, would spontaneously erupt into large-scale celebration worthy of an MGM musical.  Ding-dong!  The witch is dead!  It was a perfectly conventional—and conventionally predictable—Hollywood ending, yet I never saw it coming.

The galaxy celebrates the death of Darth Vader

Despite all the warnings I’ve issued about the unconscious maleficent messaging in our commercial fiction—stories in which messianic saviors redeem our inept/corrupt public institutions (Star Wars and superhero sagas), armed men with badges act without even the smallest measure of accountability (action movies and police procedurals), and environmental destruction/societal collapse are not merely inevitable but preferable (Mad Max:  Fury Road, The Walking Dead), because apocalypse absolves us from our burdensome civic responsibilities—this election season has exposed my own susceptibility to pop-cultural conditioning.

It wasn’t merely a spirit of doomism I nursed throughout October; it was an unchallenged assumption that the interminable Trump narrative would simply do what all our stories now do:  hold us in a state of real-time presentism (“We’ll have to wait and see” and “I will keep you in suspense” are common refrains from the outgoing president) rather than arrive at a definitive conclusion.

The erosion of cathartic narrativity is a subject I’ve admittedly addressed a lot here on the blog since I first published “Journey’s End” over five years ago, but it’s essential to understanding how the Trump presidency came to be, and why we all felt such an atavistic sense of relief when it reached an end on November 7.

Around the turn of the millennium, storytellers mostly abandoned the Aristotelian narrative arc—with its rising tension, climax, and catharsis—in favor of “storyless” fiction that either pursues a satirical-deconstructionist agenda (Family Guy, Community) or emulates the kind of open-ended worldbuilding previously the exclusive province of tabletop RPGs and videogames (Game of Thrones, Westworld).

Continue reading

What Comes Next: Lessons on Democracy and Narrative from “Hamilton”

Less than three months out from arguably the most important presidential election in living memory, our democracy is in deep, deep shit.

Need we recap?  Commuting Roger Stone.  Gassing Lafayette Square.  Suppressing the vote.  Sabotaging the Postal Service.  Floating the postponement—and actively undermining the credibility—of the November election.  Sending federal agents to detain (read:  abduct) protestors in Portland.  And that’s just a topline best-of-Trump-2020 compilation.

This is America?

Let’s face it:  The spirit of nihilism that animates MAGA was never about making America great again so much as it was about burning the Republic to the ground.  That’s what Trump’s supporters really voted for in 2016, and it’s the one big (if never quite explicit) campaign promise he might actually deliver on:  reifying the very American carnage he once claimed exclusive qualification to redress.  To wit:  The nightly news plays like an apocalyptic bookend to the rousing founding-of-America story told in Hamilton.

Daveed Diggs, Okieriete Onaodowan, Anthony Ramos, and Lin-Manuel Miranda in “Hamilton”

While Lin-Manuel Miranda’s revolutionary masterpiece certainly challenges us to appreciate anew the value and purpose of democracy—a timely reminder if ever there was one—it somewhat less conspicuously does the same for an equally imperiled institution:  narrative itself.

Hamilton has been described by its creator as “a story about America then, told by America now” (Edward Delman, “How Lin-Manuel Miranda Shapes History,” The Atlantic, September 29, 2015).  But if the musical’s creative approach to its subject matter is unorthodox, its narrative structure is very much a conventional hero’s journey.  (For my Save the Cat! scholars, it’s a “Real-Life Superhero” tale, and not, as some “experts” would have you believe, Golden Fleece.)  The power in and of narrative is a central preoccupation of Hamilton; the show literally opens with a dramatic question posed to the audience:

How does a bastard, orphan, son of a whore and a
Scotsman, dropped in the middle of a forgotten
Spot in the Caribbean by providence, impoverished, in squalor,
Grow up to be a hero and a scholar?

Alexander Hamilton is a man who imagines—who writes—his way out of poverty, and, in turn, “rewrote the game,” by “Poppin’ a squat on conventional wisdom”—meaning, the institutionalized “divine right of kings” narrative.

Continue reading

The Lost Boys of the Bronx: A Tribute to Joel Schumacher

Batman Forever and The Lost Boys director Joel Schumacher died on Monday, June 22, at the age of eighty after a yearlong battle with cancer.  In an industry where branding is sacrosanct, his brand, as it were, was his steadfast refusal to be artistically pigeonholed:  Hit-and-miss though his track record may be, he was a rare breed of filmmaker who worked in virtually every genre, from comedy (D.C. Cab; Bad Company) to drama (Cousins; Dying Young) to sci-fi/horror (Flatliners; Blood Creek) to crime thriller (Falling Down; 8mm) to legal thriller (The Client; A Time to Kill) to musical (The Phantom of the Opera).  His filmography is as winding and unconventional as was his path to commercial success:

Schumacher was born in New York City in 1939 and studied design at Parsons and the Fashion Institute of Technology. . . .

When Schumacher eventually left fashion for Hollywood, he put his original trade to good use, designing costumes for various films throughout the Seventies. . . .  He also started writing screenplays during this time, including the hit 1976 comedy Car Wash and the 1978 adaptation of the musical The Wiz.

In 1981, Schumacher made his directorial debut with The Incredible Shrinking Woman, a sci-fi comedy twist on Richard Matheson’s 1959 novel, The Shrinking Man, starring Lily Tomlin.  Fitting the pattern that would define his career, the film was a financial success but a flop with critics. . . .

Schumacher’s true breakout came a few years later in 1985, when he wrote and directed St. Elmo’s Fire, the classic post-grad flick with the Brat Pack cast, including Rob Lowe, Demi Moore and Judd Nelson.  Two years later, he wrote and directed The Lost Boys, a film about a group of teen vampires that marked the first film to star both Corey Feldman and Corey Haim, effectively launching the heartthrob duo known as “the Coreys.”

Jon Blistein, “Joel Schumacher, Director of ‘Batman & Robin,’ ‘St. Elmo’s Fire,’ Dead at 80,” Rolling Stone, June 22, 2020

Though Schumacher did not write The Lost Boys (1987) as the Rolling Stone piece erroneously asserts (the screenplay is credited to Janice Fischer & James Jeremias and Jeffrey Boam), neither his creative imprint on the project nor the cultural impact of the movie itself can in any way be overstated.  Sure, teenage vampires may be a dime-a-dozen cottage industry now, from Buffy the Vampire Slayer to Twilight to The Vampire Diaries, but if you happened to grow up on any of those Millennial staples, it’s worth knowing that pubescent bloodsuckers had never really been done prior to The Lost Boys—no, that celebrated iteration of the vampire’s pop-cultural evolution is entirely owed to the pioneering vision of Joel Schumacher.

Late filmmaker Joel Schumacher; photo by Gabriella Meros/Shutterstock, 2003

When Richard Donner left the project to direct Lethal Weapon instead, the script Schumacher inherited was essentially “The Goonies… with vampires.”  By aging up the characters from preteens to hormonal adolescents, Schumacher saw a creative opportunity to do something scarier—and sexier.  A cult classic was thus born, and though The Lost Boys itself never became a franchise (save a pair of direct-to-video sequels two decades later, and the less said about them, the better), its fingerprints are all over the subgenre it begat.  We owe Schumacher a cultural debt for that.

Kiefer Sutherland’s David (second from left) leads a gang of teenage vampires in “The Lost Boys”

And I owe him a personal debt.  Over any other formative influence, The Lost Boys is directly and demonstrably responsible for my decision to study filmmaking in college and then to pursue a screenwriting career in Hollywood.  More than simply my professional trajectory, in point of fact, my very creative sensibilities were indelibly forged by that film:  The untold scripts and novels I’ve written over the past quarter century have almost exclusively been tales of the supernatural with a strong sense of both humor and setting—the very qualities The Lost Boys embodies so masterfully and memorably.  All of that can be traced to the summer of 1994.

Continue reading

© 2024 Sean P Carlin
