
Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer”

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.

Continue reading

“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when you’re only as many years old.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heartwrenchingly credible performance as a young woman scared shitless—and this despite having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once so scary, clever, entertaining, influential, and of its moment the way Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiopathic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (born 1970) not only devoted pages upon pages of their screenplays to amusingly philosophical conversations about contemporary pop culture; their characters also existed in their own respective shared universes, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic crosspollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted enrollment at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescence slackers from Clerks and nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.

Continue reading

In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 2

Editor’s note:  Owed to the length of “In the Multiverse of Madness,” I divided the essay into two posts.  If you haven’t already, I encourage you to read Part 1 first, and please feel welcome to offer feedback on that post, this one, or both in the comments section of Part 2 below.  Thank you.


Previously on “In the Multiverse of Madness,” we covered the three engagement strategies (and correlating tactics) transmedia mega-franchises deploy to keep us consuming each new offering in real time:  by leveraging FOMO via “spoilers”; by encouraging “forensic fandom” with Easter eggs and puzzle-boxing; and by reversing “figure and ground.”  Now let’s talk about why 1970s-born adults have been particularly susceptible to these narrative gimmicks—and what to do about it.

X Marks the Spot

Mega-franchises are dependent on a very particular demographic to invest in their elaborate and expanding multiverse continuities: one with a strong contextual foundation in the storied histories of the IPs—meaning viewers who are intimately familiar with (and, ideally, passionately opinionated about) all the varied iterations of Batman and Spider-Man from the last thirty or so years—and one equipped with the disposable income that typically comes with middle age.  Hence Gen X has been the corporate multimedia initiative’s most loyal fan base.  Fortunately for them, we’d been groomed for this assignment from the time we learned to turn on the television.

Very quickly (if it isn’t already too late for that):  From 1946 through 1983, the FCC enforced stringent regulations limiting the commercial advertisements that could be run during or incorporated into children’s programming.  However:

Ronald W. Reagan did not much care for any regulations that unduly hindered business, and the selling of products to an entire nation of children was a big business indeed.  When Reagan appointed Mark S. Fowler as commissioner of the FCC on May 18, 1981, children’s television would change dramatically.  Fowler championed market forces as the determinant of broadcasting content, and thus oversaw the abolition of every advertising regulation that had served as a guide for broadcasters.  In Fowler’s estimation, the question of whether children had the ability to discriminate between the ads and the entertainment was a moot point; the free market, and not organizations such as [Action for Children’s Television] would decide the matter.

Martin Goodman, “Dr. Toon:  When Reagan Met Optimus Prime,” Animation World Network, October 12, 2010

In the wake of Fowler’s appointment, a host of extremely popular animated series—beginning with He-Man and the Masters of the Universe but also notably including The Transformers, G.I. Joe:  A Real American Hero, and M.A.S.K. for the boys, and Care Bears, My Little Pony, and Jem for young girls—flooded the syndicated market with 65-episode seasons that aired daily.  All of these series had accompanying action figures, vehicles, and playsets—and many of them, in fact, were explicitly based on preexisting toylines; meaning, in a flagrant instance of figure-and-ground reversal, the manufacturers often dictated narrative content:

“These shows are not thought up by people trying to create characters or a story,” [Peggy Charren, president of Action for Children’s Television] explained, terming them “program-length advertisements.”  “They are created to sell things,” she said.  “Accessories in the toy line must be part of the program.  It reverses the traditional creative process.  The children are getting a manufacturer’s catalogue instead of real programming content.”

Glenn Collins, “Controversy about Toys, TV Violence,” New York Times, December 12, 1985

This was all happening at the same time Kenner was supplying an endless line of 3.75” action figures based on Star Wars, both the movies and cartoon spinoffs Droids and Ewoks.  Even Hanna-Barbera’s Super Friends, which predated Fowler’s tenure as FCC commissioner by nearly a decade, rebranded as The Super Powers Team, complete with its own line of toys (also courtesy of Kenner) and tie-in comics (published by DC), thereby creating a feedback loop in which each product in the franchise advertised for the other.  Meanwhile, feature films like Ghostbusters and even the wantonly violent, R-rated Rambo and RoboCop movies were reverse-engineered into kid-friendly cartoons, each with—no surprise here—their own action-figure lines.

I grew up on all that stuff and obsessed over the toys; you’d be hard-pressed to find a late-stage Xer that didn’t.  We devoured the cartoons, studied the comics, and envied classmates who were lucky enough to own the Voltron III Deluxe Lion Set or USS Flagg aircraft carrier.  To our young minds, there was no differentiating between enjoying the storyworlds of those series and collecting all the ancillary products in the franchise.  To watch those shows invariably meant to covet the toys.  At our most impressionable, seventies-born members of Gen X learned to love being “hostage buyers.”  Such is the reason I was still purchasing those goddamn Batman comics on the downslope to middle age.

Continue reading

In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 1

Editor’s note:  By even the indefensibly prolix standards of this blog, the following essay—an analytical piece on Hollywood mega-franchises and how audiences wind up serving them more than they serve us—is a lengthy one.  Accordingly, “In the Multiverse of Madness” will be published in two separate parts, with the concluding installment following this one by a week.  I thank you in advance for your time and attention, neither of which I take for granted.


In last month’s post, I proffered that when a fan-favorite media franchise no longer serves us—when we come to recognize that some of the popular fictions we’ve cherished embody values we no longer endorse, and potentially even threaten to stand in the way of where we need to go—often the best thing we can do for ourselves is to let it go, purposely and permanently.

Letting go is not about “canceling” (someone like disgraced geek god Joss Whedon) or boycotting (the films of, say, Woody Allen); it’s not about taking action at all.  Instead, letting go is not doing something any longer—not renting out any more space in your life or in your head to the likes of Whedon or Allen, or even to the culturally defining popular narratives whose very ubiquity we take as a God-given absolute:  Star Wars, Star Trek, Harry Potter, DC and Marvel, to name but a sampling.

Despite the universal prevalence of those transmedia brands—not merely the plethora of movies and TV shows, but the licensed apparel and iPhone cases, the die-cast collectibles and plush toys—we can, if we choose, be done with any or all of those franchises as of… right now.  To learn to live without them entirely.  And happily.  Even lifelong, hardcore superfans can learn to let go of their preferred multimedia pastimes.

It’s both easier and harder than you may think.

Just imagine never caring about ANY of this ever again…

But wait!  What if you happen to genuinely enjoy Star Wars or Star Trek or DC or Marvel?  If you’re a fan, and some or all of those entertainment franchises add value to your life’s experience, by all means, disregard this post’s advice.  Though perhaps first consider this:

For most of Hollywood history, the movie business has needed a hostage buyer, a customer with little choice but to purchase the product.  First, this was the theatre chains, which the studios owned, or controlled, until 1948, when the Supreme Court forced the studios to sell them on antitrust grounds.  In the eighties and nineties, video stores partly filled the role.  But, increasingly, the hostage buyer is us.

Today, the major franchises are commercially invulnerable because they offer up proprietary universes that their legions of fans are desperate to reënter on almost any terms.  These reliable sources of profit are now Hollywood’s financial bedrock.

Stephen Metcalf, “How Superheroes Made Movie Stars Expendable,” New Yorker, May 21, 2018

Consider:  How many of us are unwitting “hostage buyers”—fans who continue to subscribe to certain multimedia franchises no longer out of pleasure, but lately out of habit?  Out of decades-long conditioning?  We may watch Star Wars, for instance, simply because we’ve always watched Star Wars, even if we can’t truly recall the last time we actually enjoyed it the way we did when we were ten years old—with pure and wondrous abandon.  Bad word-of-mouth will steer us clear of a one-off bomb like Blackhat or King Arthur:  Legend of the Sword or The Happytime Murders, but it’ll merely lower our expectations for Star Wars:  The Rise of Skywalker and X-Men:  Dark Phoenix and Terminator:  Dark Fate, not deter us from seeing those umpteenth sequels for ourselves.

When that happens—when we’re willing to spend our money, time, and attention (our three primary modes of currency) on a product we know in advance is shit—we’re no longer fans of those franchises so much as brand loyalists.  Habit buyers, if not outright hostage buyers.  And it can be hard to recognize that in ourselves—harder than we might realize.  I was still reading Batman comics into my thirties, who-knows-how-many years after I stopped enjoying them—long after a once-joyful pleasure became an interminably joyless obligation.  So, why was I still reading and collecting them?

Because I’d always read comics, from the time I was a kid; I’d buy them at the corner candy store in my Bronx neighborhood with loose change I’d rummaged from the couch cushions and reread each one a thousand times.  I’d share them with my grade-school gang, and vice versa.  I’d collected them for as long as I could remember, so it truly never occurred to me a day might come when they no longer added value to my life—when they’d outlived their onetime reliable purpose.  And for years after I reached that point of terminally diminished returns, I’d continue to spend money, to say nothing of time and attention, on a habit I wasn’t enjoying—that did nothing but clutter my home with more worthless shit that went straight into indefinite “storage” in the closet.  Why the hell did I do that?

Because I’d ceased to be a fan and had instead become an obedient brand loyalist—an institutionalized hostage buyer.  And, to be sure, corporate multimedia initiatives—which is to say those so-called “mega-franchises” from which there is always one more must-see/must-have sequel, prequel, sidequel, spinoff, TV series, tie-in comic, videogame, and branded “collectible” being produced—very much count on our continued, unchallenged fidelity to once-beloved concepts and characters…

… and they are doubling down on the billion-dollar bet they’ve placed on it:

Continue reading

Misery Sans Company: On the Opportunities and Epiphanies of Self-Isolation

March?  Please!  I’ve been in self-isolation since January.

No, I was not clairvoyantly alerted to the impending coronavirus pandemic; only our dear leader can claim that pansophic distinction.  Rather, my wife started a new job at the beginning of the year, necessitating a commute, thereby leaving me carless.  (Voluntarily carless, I should stipulate:  I refuse to be a two-vehicle household; as it is, this congenital city kid, certified tree-hugger, and avowed minimalist owns one car under protest.)

My obstinance, however, comes at a cost:  I don’t live within convenient walking distance of anything save a Chevron station (the irony of which is only so amusing), so while the missus is at work, I’m effectively immobilized.  I got nowhere to go… save the home office opposite my bedroom.  Thusly, I made a conscious decision at the start of the year to embrace my newfound confinement as a creative opportunity—to spend the entirety of winter devoted all but exclusively to breaking the back of my new novel.  I kept my socializing and climate activism to a minimum during this period, submitting to the kind of regimented hourly schedule I haven’t known since my college days.

Johnny Depp in creative self-isolation in “Secret Window” (2004), from Stephen King’s novella

Before long, my period of artistic self-isolation was yielding measurable results, and I’d been looking forward to emerging from social exile.  The week I’d earmarked for my “coming-out party”?  You guessed it:  The Ides of March.

I instead spent St. Paddy’s week mostly reeling, knocked sideways—as I imagine many were—by the speed and scale at which this crisis ballooned.  But in the days that followed, I resolved to compartmentalize—to get back to work.  I still had my codified daily routine, after all, which required a few adjustments and allowances under the new circumstances, and I had a project completely outlined and ready to “go to pages.”  So, that’s what I turned to.

And in short order, I’d produced the first two chapters, which, for me, are always the hardest to write, because I have no narrative momentum to work with as I do in later scenes.  You open a blank Scrivener document, and—BOOM!—all your careful planning and plotting, your meticulously considered character arcs and cerebral theme work?  It ain’t worth shit at that ex nihilo instant.  You may’ve built the world, but how do you get into it?  Writing that first sentence, that first paragraph, that first scene, that first chapter is like feeling your way around in the dark.  (Fittingly, my first chapter is literally about three guys finding their way through a forest path in the pitch black of night.)

“Going to pages” turned out to be just the intellectual occupation I needed to quell my anxiety, to give me a reprieve from our present reality.  And now that I’ve got story momentum, slipping into the world of my fiction every morning is as easy as flicking on the television.  For the three or four hours a day I withdraw to my personal paracosm, I’m not thinking about anything other than those characters and their problems.  As such, I’ve thus far sat out this crisis in my study, trafficking in my daydreams to pass the time; I’m not treating patients, or bagging groceries, or delivering packages, or working the supply chain, or performing any of the vital services upholding our fragile social order.  Instead, I’m playing make-believe.

Self-isolation didn’t serve Stephen King’s Jack Torrance particularly well in “The Shining”

It wasn’t long ago—Christmas, in fact—I’d issued an earnest, hopeful plea that in the year to come we might all forsake our comforting fictions, our private parallel dimensions, in favor of consciously reconnecting with our shared nonfictional universe.  And now here many of us find ourselves, banished from the streets, from the company of others, confined by ex officio decree to our own hermetic bubbles—as of this writing, 97% of the world is under stay-at-home orders—with nowhere to retreat but our escapist fantasies.  I’ve been reliant upon them, too—even grateful for them.

And that got me thinking about Stephen King’s Misery.  As masterful as Rob Reiner’s movie adaptation (working from a screenplay by William Goldman) is, and as faithful in plotting to King’s book, the theme—the entire point of the narrative—gets completely lost in translation.  This is a story about addiction, as only King could tell it:  It’s about how drugs (in this case, prescription-grade painkillers) help us cope with misery, but it’s also about how art can be an addictive—and redemptive—coping mechanism; how it can turn misery into a kind of beauty, especially for the artist himself.

Continue reading

The Nostalgist’s Guide to the Multiverse—and How We All Might Find Our Way Back Home

Gee, for someone who’s spent the past few years lecturing others on the hazards of living on Memory Lane—by way of curated collections of memorabilia, or the unconscionable expropriation of superheroes from children, or whatever your nostalgic opiate—I quite recently became starkly aware of my own crippling sentimental yearning for obsolete pleasures.  But I’ve also identified the precise agent of disorientation that’s led many of us down this dead-end path… and, with it, a way out.  First, some backstory.

I’ve had occasion this autumn to enjoy ample time back on the East Coast, both a season and region I can never get enough of.  I spent a weekend in Rehoboth Beach, Delaware, with a group of high-school friends, many of whom I hadn’t seen in a quarter century.  I visited my beautiful sister in Washington, D.C., where we took in a Nats game so I could get a firsthand look at the team my Dodgers were set to trounce in the playoffs.  I attended my closest cousin’s wedding (Bo to my Luke), and served as best man at my oldest friend’s—both in New Jersey.  I marched in Greta Thunberg’s #ClimateStrike rally at Battery Park, and took meetings with representatives from the Bronx and Manhattan borough presidents’ offices about bringing both districts into the County Climate Coalition.

(I also got chased out of Penn Station by a mutant rat, so it was about as complete a New York adventure as I could’ve hoped for.)

Wonderful and often productive as those experiences were, though—the subway run-in with Splinter from Teenage Mutant Ninja Turtles notwithstanding—my favorite moments were the ones where nothing so noteworthy occurred.  The pints at my favorite pubs.  The old faces I stopped to chat with “on the Avenue,” as we say back home.  The solitary strolls through the park amidst the holy silence of snowfall.

Brust Park in the Bronx, New York, on December 2, 2019 (photo credit: Sean P. Carlin)

More than any of that, though—the ballgames, the gatherings formal and informal, the walks down the street or into the woods—I did what I always do, regardless of site or circumstance:  entertained quixotic fantasies about moving back.

This has become, over the past half-decade, a personal pathological affliction, as my long-suffering friends and family can lamentably attest.  I mean, I left New York for Los Angeles eighteen years ago.  Eighteen years!  That’s years—not months.  Christ, Carlin, at what point does the former cease to feel like home in favor of the latter?

I can’t say what prompted my recent epiphany, but for the first time in all my exhausting, exhaustive ruminating on the matter, this simple, self-evident truth occurred to me:  I’ve never really left New York.

Continue reading
