Writer of things that go bump in the night

Tag: comic book (Page 1 of 3)

Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer”

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, the Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.


“Superman IV” at 35:  How the “Worst Comic-Book Movie Ever” Epitomizes What We Refuse to Admit about Superhero Fiction

Superman IV:  The Quest for Peace, unanimously reviled for both its unconvincing visuals and cornball story, inadvertently accomplished the theretofore unrealized dream of scores of nefarious supervillains when it was released on this date in 1987:  It killed Superman.  (Or at least put the cinematic franchise into a two-decade dormancy.)

But a closer examination of the film suggests its objectively subpar storytelling might in fact be far more faithful to the spirit of the source material than today’s fanboy culture would care to concede.


Thirty-five years ago today, my mother took me to see Superman IV:  The Quest for Peace (1987).  Afterwards, we met up with my father at Doubleday’s, a neighborhood bar and grill that was the last stop on Broadway before you’d officially crossed the city line into Westchester County.  The restaurant had a hot-oil popcorn machine in the far corner, and when I went to refill our basket, I spied a man seated at the bar, nose in a copy of USA Today, the back panel of which boasted a full-page color advertisement for Superman IV.

When he caught me studying the ad, he asked, “Gonna go see the new Superman?”

“I just did.”

“Yeah?  How was it?”

“It was amazing,” I said, and I absolutely meant it.  Sensing my sincerity, the gentleman pulled the ad from the bundle of folded pages and handed it to me as a souvenir.  When I got home, I taped it up on my bedroom wall.

The theatrical one-sheet for “Superman IV” looks like a textbook “Action Comics” cover from the ’80s

Sidney J. Furie’s Superman IV:  The Quest for Peace is not amazing.  It is, in fact, commonly regarded as one of the worst comic-book movies ever made—if not the worst—in eternal competition for last place with Batman & Robin (1997) and Catwoman (2004).  It suffered from a notoriously troubled production:  After the diminishing returns of Superman III (1983) and spin-off Supergirl (1984), series producers Alexander and Ilya Salkind sold their controlling interests in the IP to the Cannon Group, the schlockmeister studio responsible for the American Ninja, Missing in Action, Breakin’, and Death Wish franchises—not exactly the optimal custodians of a series that had started out, against all expectation, so magnificently.

Richard Donner’s Superman:  The Movie (1978) was and remains the finest specimen of superhero cinema ever presented, at once ambitiously epic and emotionally relatable.  It pulls off the impossible in so many ways, first and foremost that it absolutely made us believe a man could fly, which had never been credibly accomplished before.  Credit for that goes not only to the VFX team, which won the Academy Award for its efforts, but to Christopher Reeve, who delivered the movie’s most timeless special effect:  endowing a spandex-clad demigod with profound dignity and genuine vulnerability.  Even the lesser Superman films—and we’ll talk more about those soon enough—are elevated by Reeve’s extraordinary performance, which occupies a lofty position, right alongside Bela Lugosi’s Dracula, in the pantheon of defining interpretations of folkloric icons.

What’s also so remarkable about Superman is how many different tonal aesthetics it assimilates.  The opening sequences on Krypton with Marlon Brando feel downright Kubrickian; Donner somehow channels the cosmic splendor of 2001:  A Space Odyssey (1968), only to then transition us to Smallville, as warm and fertile as Krypton was cold and barren, which evokes the same spirit of sock-hop Americana George Lucas conjured to such success in American Graffiti (1973).

The remainder of the movie shifts fluidly from His Girl Friday–style newsroom comedy (the scenes at the Daily Planet) to urban action thriller à la The French Connection (the seedy streets of 1970s Metropolis) to Roger Moore–era 007 outing (Lex Luthor’s sub–Grand Central lair, complete with comically inept henchmen) to Irwin Allen disaster film (the missile that opens up the San Andreas Fault in the third act and sets off a chain reaction of devastation along the West Coast).

Somehow it coheres into a movie that feels like the best of all worlds rather than a derivative Frankenstein’s monster.  Up until that time, superhero features and television, hampered by juvenile subject matter and typically subpar production values, seemed inherently, inexorably campy.  The notion that a superhero movie could rise to the level of myth, or at least credibly dramatic science fiction, was unthinkable.  Superman is the proof-of-concept paradigm on which our contemporary superhero–industrial complex is predicated.


In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 2

Editor’s note:  Owing to the length of “In the Multiverse of Madness,” I divided the essay into two posts.  If you haven’t already, I encourage you to read Part 1 first, and please feel welcome to offer feedback on that post, this one, or both in the comments section of Part 2 below.  Thank you.


Previously on “In the Multiverse of Madness,” we covered the three engagement strategies (and correlating tactics) transmedia mega-franchises deploy to keep us consuming each new offering in real time:  by leveraging FOMO via “spoilers”; by encouraging “forensic fandom” with Easter eggs and puzzle-boxing; and by reversing “figure and ground.”  Now let’s talk about why 1970s-born adults have been particularly susceptible to these narrative gimmicks—and what to do about it.

X Marks the Spot

Mega-franchises depend on a very particular demographic to invest in their elaborate and expanding multiverse continuities:  one with a strong contextual foundation in the storied histories of the IPs—meaning viewers who are intimately familiar with (and, ideally, passionately opinionated about) all the varied iterations of Batman and Spider-Man from the last thirty or so years—and with the disposable income that typically comes with middle age.  That is why Gen X has been the corporate multimedia initiative’s most loyal fan base.  Fortunately for them, we’d been groomed for this assignment from the time we learned to turn on the television.

Very quickly (if it isn’t already too late for that):  From 1946 through 1983, the FCC enforced stringent regulations limiting the commercial advertisements that could be run during or incorporated into children’s programming.  However:

Ronald W. Reagan did not much care for any regulations that unduly hindered business, and the selling of products to an entire nation of children was a big business indeed.  When Reagan appointed Mark S. Fowler as commissioner of the FCC on May 18, 1981, children’s television would change dramatically.  Fowler championed market forces as the determinant of broadcasting content, and thus oversaw the abolition of every advertising regulation that had served as a guide for broadcasters.  In Fowler’s estimation, the question of whether children had the ability to discriminate between the ads and the entertainment was a moot point; the free market, and not organizations such as [Action for Children’s Television], would decide the matter.

Martin Goodman, “Dr. Toon:  When Reagan Met Optimus Prime,” Animation World Network, October 12, 2010

In the wake of Fowler’s appointment, a host of extremely popular animated series—beginning with He-Man and the Masters of the Universe but also notably including The Transformers, G.I. Joe:  A Real American Hero, and M.A.S.K. for the boys, and Care Bears, My Little Pony, and Jem for young girls—flooded the syndicated market with 65-episode seasons that aired daily.  All of these series had accompanying action figures, vehicles, and playsets—and many of them, in fact, were explicitly based on preexisting toylines; meaning, in a flagrant instance of figure-and-ground reversal, the manufacturers often dictated narrative content:

“These shows are not thought up by people trying to create characters or a story,” [Peggy Charren, president of Action for Children’s Television] explained, terming them “program-length advertisements.”  “They are created to sell things,” she said.  “Accessories in the toy line must be part of the program.  It reverses the traditional creative process.  The children are getting a manufacturer’s catalogue instead of real programming content.”

Glenn Collins, “Controversy about Toys, TV Violence,” New York Times, December 12, 1985

This was all happening at the same time Kenner was supplying an endless line of 3.75” action figures based on Star Wars, both the movies and cartoon spinoffs Droids and Ewoks.  Even Hanna-Barbera’s Super Friends, which predated Fowler’s tenure as FCC commissioner by nearly a decade, rebranded as The Super Powers Team, complete with its own line of toys (also courtesy of Kenner) and tie-in comics (published by DC), thereby creating a feedback loop in which each product in the franchise advertised for the other.  Meanwhile, feature films like Ghostbusters and even the wantonly violent, R-rated Rambo and RoboCop movies were reverse-engineered into kid-friendly cartoons, each with—no surprise here—their own action-figure lines.

I grew up on all that stuff and obsessed over the toys; you’d be hard-pressed to find a late-stage Xer that didn’t.  We devoured the cartoons, studied the comics, and envied classmates who were lucky enough to own the Voltron III Deluxe Lion Set or USS Flagg aircraft carrier.  To our young minds, there was no differentiating between enjoying the storyworlds of those series and collecting all the ancillary products in the franchise.  To watch those shows invariably meant to covet the toys.  At our most impressionable, seventies-born members of Gen X learned to love being “hostage buyers.”  Such is the reason I was still purchasing those goddamn Batman comics on the downslope to middle age.


In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 1

Editor’s note:  By even the indefensibly prolix standards of this blog, the following essay—an analytical piece on Hollywood mega-franchises and how audiences wind up serving them more than they serve us—is a lengthy one.  Accordingly, “In the Multiverse of Madness” will be published in two separate parts, with the concluding installment following this one by a week.  I thank you in advance for your time and attention, neither of which I take for granted.


In last month’s post, I proffered that when a fan-favorite media franchise no longer serves us—when we come to recognize that some of the popular fictions we’ve cherished embody values we no longer endorse, and potentially even threaten to stand in the way of where we need to go—often the best thing we can do for ourselves is to let it go, purposely and permanently.

Letting go is not about “canceling” (someone like disgraced geek god Joss Whedon) or boycotting (the films of, say, Woody Allen); it’s not about taking action at all.  Instead, letting go is not doing something any longer—not renting out any more space in your life or in your head to the likes of Whedon or Allen, or even to the culturally defining popular narratives whose very ubiquity we take as a God-given absolute:  Star Wars, Star Trek, Harry Potter, DC and Marvel, to name but a sampling.

Despite the universal prevalence of those transmedia brands—not merely the plethora of movies and TV shows, but the licensed apparel and iPhone cases, the die-cast collectibles and plush toys—we can, if we choose, be done with any or all of those franchises as of… right now.  To learn to live without them entirely.  And happily.  Even lifelong, hardcore superfans can learn to let go of their preferred multimedia pastimes.

It’s both easier and harder than you may think.

Just imagine never caring about ANY of this ever again…

But wait!  What if you happen to genuinely enjoy Star Wars or Star Trek or DC or Marvel?  If you’re a fan, and some or all of those entertainment franchises add value to your life’s experience, by all means, disregard this post’s advice.  Though perhaps first consider this:

For most of Hollywood history, the movie business has needed a hostage buyer, a customer with little choice but to purchase the product.  First, this was the theatre chains, which the studios owned, or controlled, until 1948, when the Supreme Court forced the studios to sell them on antitrust grounds.  In the eighties and nineties, video stores partly filled the role.  But, increasingly, the hostage buyer is us.

Today, the major franchises are commercially invulnerable because they offer up proprietary universes that their legions of fans are desperate to reënter on almost any terms.  These reliable sources of profit are now Hollywood’s financial bedrock.

Stephen Metcalf, “How Superheroes Made Movie Stars Expendable,” New Yorker, May 21, 2018

Consider:  How many of us are unwitting “hostage buyers”—fans who continue to subscribe to certain multimedia franchises no longer out of pleasure, but lately out of habit?  Out of decades-long conditioning?  We may watch Star Wars, for instance, simply because we’ve always watched Star Wars, even if we can’t truly recall the last time we actually enjoyed it the way we did when we were ten years old—with pure and wondrous abandon.  Bad word-of-mouth will steer us clear of a one-off bomb like Blackhat or King Arthur:  Legend of the Sword or The Happytime Murders, but it’ll merely lower our expectations for Star Wars:  The Rise of Skywalker and X-Men:  Dark Phoenix and Terminator:  Dark Fate, not deter us from seeing those umpteenth sequels for ourselves.

When that happens—when we’re willing to spend our money, time, and attention (our three primary modes of currency) on a product we know in advance is shit—we’re no longer fans of those franchises so much as brand loyalists.  Habit buyers, if not outright hostage buyers.  And it can be hard to recognize that in ourselves—harder than we might realize.  I was still reading Batman comics into my thirties, who-knows-how-many years after I stopped enjoying them—long after a once-joyful pleasure became an interminably joyless obligation.  So, why was I still reading and collecting them?

Because I’d always read comics, from the time I was a kid; I’d buy them at the corner candy store in my Bronx neighborhood with loose change I’d rummaged from the couch cushions and reread each one a thousand times.  I’d share them with my grade-school gang, and vice versa.  I’d collected them for as long as I could remember, so it truly never occurred to me a day might come when they no longer added value to my life—when they’d outlived their onetime reliable purpose.  And for years after I reached that point of terminally diminished returns, I’d continue to spend money, to say nothing of time and attention, on a habit I wasn’t enjoying—that did nothing but clutter my home with more worthless shit that went straight into indefinite “storage” in the closet.  Why the hell did I do that?

Because I’d ceased to be a fan and had instead become an obedient brand loyalist—an institutionalized hostage buyer.  And, to be sure, corporate multimedia initiatives—which is to say those so-called “mega-franchises” from which there is always one more must-see/must-have sequel, prequel, sidequel, spinoff, TV series, tie-in comic, videogame, and branded “collectible” being produced—very much count on our continued, unchallenged fidelity to once-beloved concepts and characters…

… and they are doubling down on the billion-dollar bet they’ve placed on it:


Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops, it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancelation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…


The Lost Boys of the Bronx: A Tribute to Joel Schumacher

Batman Forever and The Lost Boys director Joel Schumacher died on Monday, June 22, at the age of eighty after a yearlong battle with cancer.  In an industry where branding is sacrosanct, his brand, as it were, was his steadfast refusal to be artistically pigeonholed:  Hit-and-miss though his track record may be, he was a rare breed of filmmaker who worked in virtually every genre, from comedy (D.C. Cab; Bad Company) to drama (Cousins; Dying Young) to sci-fi/horror (Flatliners; Blood Creek) to crime thriller (Falling Down; 8mm) to legal thriller (The Client; A Time to Kill) to musical (The Phantom of the Opera).  His filmography is as winding and unconventional as was his path to commercial success:

Schumacher was born in New York City in 1939 and studied design at Parsons and the Fashion Institute of Technology. . . .

When Schumacher eventually left fashion for Hollywood, he put his original trade to good use, designing costumes for various films throughout the Seventies. . . .  He also started writing screenplays during this time, including the hit 1976 comedy Car Wash and the 1978 adaptation of the musical The Wiz.

In 1981, Schumacher made his directorial debut with The Incredible Shrinking Woman, a sci-fi comedy twist on Richard Matheson’s 1959 novel, The Shrinking Man, starring Lily Tomlin.  Fitting the pattern that would define his career, the film was a financial success but a flop with critics. . . .

Schumacher’s true breakout came a few years later in 1985, when he wrote and directed St. Elmo’s Fire, the classic post-grad flick with the Brat Pack cast, including Rob Lowe, Demi Moore and Judd Nelson.  Two years later, he wrote and directed The Lost Boys, a film about a group of teen vampires that marked the first film to star both Corey Feldman and Corey Haim, effectively launching the heartthrob duo known as “the Coreys.”

Jon Blistein, “Joel Schumacher, Director of ‘Batman & Robin,’ ‘St. Elmo’s Fire,’ Dead at 80,” Rolling Stone, June 22, 2020

Though Schumacher did not write The Lost Boys (1987), as the Rolling Stone piece erroneously asserts (the screenplay is credited to Janice Fischer & James Jeremias and Jeffrey Boam), neither his creative imprint on the project nor the cultural impact of the movie itself can in any way be overstated.  Sure, teenage vampires may be a dime-a-dozen cottage industry now, from Buffy the Vampire Slayer to Twilight to The Vampire Diaries, but if you happened to grow up on any of those Millennial staples, it’s worth knowing that pubescent bloodsuckers had never really been done prior to The Lost Boys—no, that celebrated iteration of the vampire’s pop-cultural evolution is entirely owed to the pioneering vision of Joel Schumacher.

Late filmmaker Joel Schumacher; photo by Gabriella Meros/Shutterstock, 2003

When Richard Donner left the project to direct Lethal Weapon instead, the script Schumacher inherited was essentially “The Goonies… with vampires.”  By aging up the characters from preteens to hormonal adolescents, Schumacher saw a creative opportunity to do something scarier—and sexier.  A cult classic was thusly born, and though The Lost Boys itself never became a franchise (save a pair of direct-to-video sequels two decades later, and the less said about them, the better), its fingerprints are all over the subgenre it begat.  We owe Schumacher a cultural debt for that.

Kiefer Sutherland’s David (second from left) leads a gang of teenage vampires in “The Lost Boys”

And I owe him a personal debt.  Over any other formative influence, The Lost Boys is directly and demonstrably responsible for my decision to study filmmaking in college and then to pursue a screenwriting career in Hollywood.  More than simply my professional trajectory, in point of fact, my very creative sensibilities were indelibly forged by that film:  The untold scripts and novels I’ve written over the past quarter century have almost exclusively been tales of the supernatural with a strong sense of both humor and setting—the very qualities The Lost Boys embodies so masterfully and memorably.  All of that can be traced to the summer of 1994.


The Nostalgist’s Guide to the Multiverse—and How We All Might Find Our Way Back Home

Gee, for someone who’s spent the past few years lecturing others on the hazards of living on Memory Lane—by way of curated collections of memorabilia, or the unconscionable expropriation of superheroes from children, or whatever your nostalgic opiate—I quite recently became starkly aware of my own crippling sentimental yearning for obsolete pleasures.  But I’ve also identified the precise agent of disorientation that’s led many of us down this dead-end path… and, with it, a way out.  First, some backstory.

I’ve had occasion this autumn to enjoy ample time back on the East Coast, both a season and region I can never get enough of.  I spent a weekend in Rehoboth Beach, Delaware, with a group of high-school friends, many of whom I hadn’t seen in a quarter century.  I visited my beautiful sister in Washington, D.C., where we took in a Nats game so I could get a firsthand look at the team my Dodgers were set to trounce in the playoffs.  I attended my closest cousin’s wedding (Bo to my Luke), and served as best man at my oldest friend’s—both in New Jersey.  I marched in Greta Thunberg’s #ClimateStrike rally at Battery Park, and took meetings with representatives from the Bronx and Manhattan borough presidents’ offices about bringing both districts into the County Climate Coalition.

(I also got chased out of Penn Station by a mutant rat, so it was about as complete a New York adventure as I could’ve hoped for.)

Wonderful and often productive as those experiences were, though—the subway run-in with Splinter from Teenage Mutant Ninja Turtles notwithstanding—my favorite moments were the ones where nothing so noteworthy occurred.  The pints at my favorite pubs.  The old faces I stopped to chat with “on the Avenue,” as we say back home.  The solitary strolls through the park amidst the holy silence of snowfall.

Brust Park in the Bronx, New York, on December 2, 2019 (photo credit: Sean P. Carlin)

More than any of that, though—the ballgames, the gatherings formal and informal, the walks down the street or into the woods—I did what I always do, regardless of site or circumstance:  entertained quixotic fantasies about moving back.

This has become, over the past half-decade, a personal pathological affliction, as my long-suffering friends and family can lamentably attest.  I mean, I left New York for Los Angeles eighteen years ago.  Eighteen years!  That’s years—not months.  Christ, Carlin, at what point does the former cease to feel like home in favor of the latter?

I can’t say what prompted my recent epiphany, but for the first time in all my exhausting, exhaustive ruminating on the matter, this simple, self-evident truth occurred to me:  I’ve never really left New York.


Oh, Snap! The Nostalgia-Industrial Complex — ’90s Edition

Et tu, Millennials?  The old nostalgia-industrial complex got its hooks into you, too, huh?  I must ask:  Have you not witnessed in firsthand horror what pining for the good old days has done to Generation X…?

To recap:  We Xers have thus far spent the twenty-first century reliving all our childhood favorites—Star Wars, Super Friends, Karate Kid, Ghostbusters, Lethal Weapon, Halloween, Bill & Ted, Tron, Transformers, Terminator, Top Gun—a pathological exercise in self-infantilization that has catastrophically retarded both the culture and a generation of middle-aged adults who are at this point more passionately invested in Skywalkers and superheroes than are the juvenile audiences for whom those characters were originally intended.

Always keen to recognize—and replicate—a winning formula, Hollywood has recently seized on a new permutation of forward-thinking backward-gazing:  Sell nineties-era nostalgia to the generation that came of age in that decade!  Over the past few years, we got a pair of Jurassic Park remakes-masquerading-as-sequels that didn’t inspire a single word of enthusiasm (certainly not a second viewing), but nonetheless earned over a billion dollars apiece, while our last conventional movie star, Dwayne Johnson, used his considerable clout (or more aptly muscle?) to resurrect both Jumanji and Baywatch.  As for this year?  Hope you’re excited for warmed-over helpings of The Lion King, Men in Black, Toy Story, Aladdin, and yet more Jumanji.  And while we’re at it, let’s welcome back slacker duo Jay and Silent Bob, because surely their grunge-era stoner humor still holds up in middle age—

Our sentiments exactly, fellas…

—as well as Will Smith and Martin Lawrence, back from buddy-cop purgatory for more Bad Boys badassery!  You know damn well whatcha gonna do when they come for you:  Buy a ticket!

For an indeterminate, but clearly not immeasurable, swath of moviegoers, there is no marketing campaign more alluring than one that taps into foggy childhood memories. . . .

. . . The great nostalgia-industrial complex will [continue] steamrollering us against our better judgment into multiplexes, hoping for a simulacrum of the first high we felt watching great characters years ago.

Tom Philip, “Summer ’19 Brought To You By Nostalgia-Bait Movies,” Opinion, New York Times, July 4, 2019

Not just multiplexes.  (And how are those even still a thing?)  On the small screen, VH1 revived game-changing nineties slasher franchise Scream this summer (how, for that matter, is VH1 still a thing?), and new iterations of decade-defining teen melodramas 90210 and Party of Five are on the way.  Dope.


Tim Burton’s “Batman” at 30—and the Cultural Legacy of the Summer of 1989

In order to appreciate the state of commercial adolescence to which Generation X has been disproportionately consigned, one needs to consider Tim Burton’s Batman in its sociocultural context:  how it inadvertently provided a blueprint to reconceptualize superheroes from innocent entertainment meant to inspire the imagination of children to hyperviolent wish-fulfillment fantasies for commercially infantilized adults.


The weekly theatrical debut of a new franchise tentpole, voraciously bulling aside the $200 million–budgeted blockbuster released a mere seven days prior, is par for the course nowadays, but back in 1989—thirty summers ago per the calendar, though seemingly as recently as yesterday by the nebulous barometer of memory—we’d never before experienced anything like that.

That was the year that gave us new entries in such ongoing adventures as Indiana Jones, Star Trek, Ghostbusters, The Karate Kid, Lethal Weapon, James Bond, and Back to the Future, lowbrow comedies Police Academy, Fletch, and Vacation, as well as slasher staples Friday the 13th, A Nightmare on Elm Street, and Halloween—to say nothing of launching all-new franchises with Bill & Ted’s Excellent Adventure, Major League, Pet Sematary, Honey, I Shrunk the Kids, Weekend at Bernie’s, and Look Who’s Talking.  To anyone who’d grown up in the nascent home-video era—that period in which all the aforementioned series (save 007) were born and could thusly be re-watched and obsessed-over ad infinitum—1989 was the Christmas of summer-movie seasons.

Michael Keaton in Tim Burton’s “Batman” (1989)

But none of those films, huge as many of them were, dominated the cultural spotlight that year as pervasively as Tim Burton’s Batman, released on this date in 1989.

Out of the Shadows

I can hear my thirteen-year-old nephew now:  “One superhero movie?  Wow—how’d you handle the excitement?”

Yeah, I know.  But it was exciting.  I was thirteen myself in 1989, spending most of my free time with my grade-school gang at the neighborhood comic shop down on Broadway, steeped in a subculture that hadn’t yet attained popular acceptance.  Richard Donner’s Superman (1978) had been the only previous attempt at a reverent comic-book adaptation, and, creatively and financially successful though it was, most of that goodwill had been squandered in the intervening decade by a succession of increasingly subpar sequels (through no fault of the marvelous Christopher Reeve, who makes even the worst of them watchable).

Christopher Reeve and Margot Kidder in “Superman: The Movie”

As for Batman:  It’s crucial to remember, and easy enough now to overlook, that in the late eighties, the prevailing public perception of the character was not Frank Miller’s Dark Knight, but rather Adam West’s “Bright Knight” from the self-consciously campy acid-trip of a TV series that had aired twenty years earlier.  In the wake of that show’s cancelation, a concerted effort was made by the character’s creative custodians at DC Comics—first Dennis O’Neil and Neal Adams, then Steve Englehart and Marshall Rogers, and most effectively Miller with his aptly titled The Dark Knight Returns—to reestablish Batman as the “nocturnal avenger” he was originally conceived to be.

“Dark Knight Triumphant” (July 1986); art by Frank Miller and Lynn Varley

But if you weren’t following the comics—and, in those days, few over thirteen years old were—the predominant impression the name “Batman” conjured wasn’t the ferocious Miller rendering above so much as this:


Maybe It’s Time: Here’s to Making 2019 the First Official Year of the 21st Century

“Maybe it’s time to let the old ways die.”  How ironically apropos that in a world led by a reality-show president, where facts are subjective and everything from our energy sources to our economic policies to our pop culture is an antiquated vestige of a previous century, a lyric by a fictitious rock star from a remake of a remake of a remake of a movie from 1937 should emerge as the perfect, hopeful mantra of an impending (if belated) new millennial era.  I propose officially adopting it as such; it might make what comes next a little easier to accept for those of us still clinging nostalgically to the 1950s (Baby boomers) and the 1980s (Gen X).

If you belong to one of those analog generations—I’m an Xer myself—and you’ve ever had the frustrating experience of working with a Millennial, you know their nonlinear minds interpret the world in an entirely different manner than those that came before them.  The first wave arrived in the workforce a decade ago, expecting a seat at the table before they’d earned one, demanding their voices be heard before their opinions were informed by practical experience.  Their operating philosophy seemed to be:  Yeah, but just because we’ve always done it that way doesn’t mean we shouldn’t try it… this way.  In their view, the arduous, incremental, straight-line path of our institutionalized practices and protocols didn’t square with their hyperlinked grasp of our new Digital Age reality.  Thusly, conventional (read:  linear) thinking was to be openly challenged, not obediently emulated.

Like many of my fellow Xers that came up the hard way—those of us that knew our place, paid our dues (there’s that pesky sense of linearity again), never assumed we had all the answers—I’ve often found that worldview bewildering at best, infuriating at worst.  And the sense of entitlement so endemic to Millennials is only compounded by their corresponding characteristic of impatience:

“They’ve grown up in a world of instant gratification.  You want to buy something—you go on Amazon, it arrives the next day.  You want to watch a movie?  Log on and watch a movie—you don’t check movie times.  You want to watch a TV show?  Binge!  You don’t even have to wait week to week to week.  Right?  I know people who skip seasons just so they can binge at the end of the season.  Right?  Instant gratification.”

Simon Sinek, “Simon Sinek,” Inside Quest with Tom Bilyeu, August 7, 2016

Now, to a middle-aged generation still trying (without success) to take the seat at the head of the table from the unyielding occupancy of the Boomers, the Millennials’ impulse—their self-ordained imperative—to grab the wheel and make “meaningful impact” is their most vexing attribute.

And—Christ help me for saying this—it just might change everything for the better.


