Writer of things that go bump in the night

Tag: nostalgia (Page 1 of 2)

Highway to Hell:  Car Culture and Hollywood’s Hero-Worship of the Automobile

With road-trip season upon us once again, here’s an examination of how American car culture has been romanticized by the entertainment industry; how automobiles, far from being enablers of freedom and individuality, are in fact “turbo-boosted engines of inequality”; and how Hollywood can help remedy an ecocultural crisis it’s played no small role in propagating.


In any given episode, the action reliably starts the same way:  a wide shot of the Batcave, Batmobile turning on its rotating platform to face the cavemouth, camera panning left as the Dynamic Duo descend the Batpoles.  Satin capes billowing, Batman and Robin hop into their modified 1955 Lincoln Futura, buckle up—decades before it was legally required, incidentally—and the engine whines to life as they run through their pre-launch checklist:

ROBIN:  Atomic batteries to power.  Turbines to speed.

BATMAN:  Roger.  Ready to move out.

A blast of flame from the car’s rear thruster—whoosh!—and off they race to save the day.

By the time the 1980s had rolled around, when I was first watching Batman (1966–1968) in syndicated reruns, every TV and movie hero worth his salt got around the city in a conspicuously slick set of wheels.  Muscle cars proved popular with working-class ’70s sleuths Jim Rockford (Pontiac Firebird) and Starsky and Hutch (Ford Gran Torino).  The neon-chic aesthetic of the Reagan era, however, called for something a bit sportier, like the Ferrari, the prestige ride of choice for Honolulu-based gumshoe Thomas Magnum (Magnum, P.I.) and buddy cops Crockett and Tubbs (Miami Vice).  The ’80s were nothing if not ostentatiously aspirational.

Even when cars were patently comical, they came off as cool despite themselves:  the Bluesmobile, the 1974 Dodge Monaco used in The Blues Brothers (1980); the Ectomobile, the 1959 Cadillac Miller-Meteor Sentinel in Ghostbusters (1984); the Wolfmobile, a refurbished bread truck that Michael J. Fox and his pal use for “urban surfing” in Teen Wolf (1985).

The DMC DeLorean time machine from Back to the Future is clearly meant to be absurd, designed in the same kitchen-sink spirit as the Wagon Queen Family Truckster from National Lampoon’s Vacation (1983), but what nine-year-old boy in 1985 didn’t want to be Michael J. Fox, sliding across the stainless-steel hood and yanking the gull-wing door shut behind him?  And like the characters themselves, the DeLorean evolved with each movie, going from nuclear-powered sports car (Part I) to cold-fusion flyer (Part II) to steampunk-retrofitted railcar (Part III).  “Maverick” Mitchell’s need for speed didn’t hold a candle to Marty McFly’s, whose very existence depended on the DeLorean’s capacity to reach 88 miles per hour.

Vehicles that carried teams of heroes offered their own vicarious pleasure.  Case in point:  the 1983 GMC Vandura, with its red stripe and rooftop spoiler, that served as the A-Team’s transpo and unofficial HQ—a place where they could bicker comically one minute then emerge through the sunroof the next to spray indiscriminate gunfire from their AK-47s.  The van even had a little “sibling”:  the Chevrolet Corvette (C4) that Faceman would occasionally drive, marked with the same diagonal stripe.  Did it make sense for wanted fugitives to cruise L.A. in such a distinct set of wheels?  Not really.  But it was cool as hell, so.

The Mystery Machine was the only recurring location, as it were, on Scooby-Doo, Where Are You! (1969), and the van’s groovy paint scheme provided contrast with the series’ gloomy visuals.  Speaking of animated adventures, when once-ascetic Vietnam vet John Rambo made the intuitive leap from R-rated action movies to an after-school cartoon series (1986), he was furnished with Defender, a 6×6 assault jeep.  Not to be outdone, the most popular military-themed animated franchise of the ’80s, G.I. Joe:  A Real American Hero (1983–1986), featured over 250 discrete vehicles, and the characters who drove them were, for the most part, an afterthought:

With the debut of the 3 ¾” figures in 1982, Hasbro also offered a range of vehicles and playsets for use with them.  In actual fact, the 3 ¾” line was conceived as a way to primarily sell vehicles—the figures were only there to fill them out!

‘3 ¾” Vehicles,’ YoJoe!

But who needs drivers when the vehicles themselves are the characters?  The protagonists of The Transformers (1984–1987) were known as the Autobots, a race of ancient, sentient robots from a distant planet that conveniently shapeshifted into 1980s-specific cars like the Porsche 924 and Lamborghini Countach, among scores of others.  (The premise was so deliriously toyetic, it never occurred to us to question the logic of it.)  Offering the best of both G.I. Joe and The Transformers, the paramilitary task force of M.A.S.K. (1985–1986), whose base of operations was a mountainside gas station (what might be described as Blofeld’s volcano lair meets the Boar’s Nest), drove armored vehicles that transformed into… entirely different vehicles.

Many movies and shows not only featured cars as prominent narrative elements, but literally took place on the road:  Vacation.  Mad Max (1979).  Smokey and the Bandit (1977).  CHiPs (1977–1983).  Sometimes the car was so important it had a proper name:  General Lee from The Dukes of Hazzard (1979–1985).  Christ, sometimes it was the goddamn series costar:  KITT on Knight Rider (1982–1986).  Shit on David Hasselhoff’s acting ability all you want, but the man carried a hit TV show delivering the lion’s share of his dialogue to a dashboard.  Get fucked, Olivier.

1980s hero-car culture at a glance

As a rule, productions keep multiple replicas of key picture cars on hand, often for different purposes:  the vehicle utilized for dialogue scenes isn’t the one rigged for stunts, for instance.  It’s notable that the most detailed production model—the one featured in medium shots and closeups, in which the actors perform their scenes—is known as the “hero car.”  And why not?  Over the past half century, Hollywood has unquestionably programmed all of us to recognize the heroism of the automobile.


Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer”

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.


A History of the Blog (So Far)—and a Programming Update

Since launching this blog eight years ago, I have maintained a consistent publishing schedule of one new post per month.  However, given the ways in which this ongoing project has evolved, that level of output is no longer sustainable.  Here’s a brief chronicle of the blog’s creative progression—and a statement on what comes next.


From the time I signed with my first literary manager in 1998 through the ignominious end of my career in Hollywood in 2014, I was exclusively focused on one form of creative expression:  screenwriting.

Though ultimately unproduced, my scripts nonetheless earned praise from producers and development execs for their uncommon visual suggestiveness and sharp sense of pace, which I controlled through deliberate syntactic arrangement of the very thing audiences never get to appreciate in the finished film:  the stage description.

Screenwriters, if you’re unaware, are not by and large particularly skillful wordsmiths.  And, to be fair, it’s not required of them.  Plot structure, characterization, and dialogue are what the screenwriter is there to provide for a motion picture.  Why waste time and creative energy on pretty prose in a blueprint, which is all a screenplay really is?

A rarefied handful of pro screenwriters, Shane Black and James Cameron among them, paint immersive pictures with their words, imparting how the world of the story feels rather than merely reporting, in sequence, what happens.  Such is the dynamic mode of screenwriting for which I strove.

Most screenplays—and I’m talking about scripts to produced films, written by Hollywood’s A-list scribes—aren’t much more than utilitarian laundry lists of things we’ll see and hear onscreen, conveyed without any visceral impression of style or tempo, and are, accordingly, nigh unreadable.  The director, after all, is going to make the movie he sees in his head; the script is just a means to get all the above- and below-the-line talent quite literally on the same page.

Excerpted from “Indiana Jones and the Kingdom of the Crystal Skull” by David Koepp.  Mind-numbing, no?

I actually like words, however.  I like how they sound, and the infinite combinations of meaning that can be made from them.  Truth is, I never should’ve aspired to be a screenwriter.  It was the wrong medium for my talents and interests.  “Author” and “essayist” were always a better fit for my writerly sensibilities.  It took the implosion of my career to finally embrace that.

So, when I started this blog at the encouragement of my wife—one of her many good ideas—I didn’t know quite what to write about except screenwriting.  Accordingly, my first two dozen posts are almost entirely devoted to matters of narrative craft, from my customized Storytelling 101 curriculum to the violation of the Double Hocus Pocus principle in Ghostbusters II to character deconstructions of Jack Bauer and John Rambo and a comparative analysis of the Jack Nicholson and Heath Ledger interpretations of the Joker.

One year into this blogging project, all my notions about narrativity were challenged—perhaps even shattered—by a book I’d read called Present Shock:  When Everything Happens Now (2013) by Douglas Rushkoff, which argued that Joseph Campbell’s “heroic journey,” the dramatic schema that has served as the structural basis for nearly every story in the Western literary canon, had collapsed around the turn of the millennium, as evidenced by the fanatical popularity of “storyless” fiction like Lost, The X-Files, The Sopranos, CSI:  Crime Scene Investigation, The Walking Dead, and Game of Thrones.

Rushkoff’s premise inspired a yearslong scholarly investigation on my part, which began in earnest with a post called “Journey’s End:  Rushkoff and the Collapse of Narrative,” and turned the blog in a new, more complex direction.  This intellectual project would never be the same.


EXT. LOS ANGELES – ONE YEAR LATER

I thought I’d said everything I had to say about Los Angeles last winter.  Should’ve known Hollywood would demand a sequel.


Even at the height of its considerable cultural influence, I never much cared for Sex and the City—for a very simple reason:  I didn’t in any way recognize the New York it depicted.

To someone who’d grown up there, Sex seemed like a postfeminist fantasy of the city as a bastion of neoliberal materialism, conjured by someone who’d never actually been to New York or knew so much as the first thing about it.  It certainly didn’t reflect the experience of any working-class New Yorkers I knew.

(It would seem the more things change, the more they stay the same:  The recent SATC revival series, And Just Like That…, is reported to be full of unintentionally cringe-inducing scenes of the gals apparently interacting with Black women for the first time in their lives.  Sounds on-brand.)

But this isn’t a retroactive reappraisal of a 1990s pop-cultural pacesetter—those have been exhaustively conducted elsewhere of late—merely an acknowledgment that the series made an undeniable impression on the generation of (largely) female Millennials who adored it, legions of whom relocated to New York in early adulthood to have the full Sex and the City experience and who, in turn, in many ways remade the city in Carrie Bradshaw’s image, for better or worse.

I can’t say as I blame those folks, really.  That they were sold a load of shit isn’t their fault.  Here in New York, we were just as susceptible to Hollywood’s greener-grass illusions of elsewhere.  When I was a student in the 1990s, the Los Angeles of Beverly Hills, 90210 (1990–2000) and Baywatch (1989–2001), of Buffy the Vampire Slayer (1992) and Clueless (1995), seemed like a fun-in-the-sun teenage paradise in stark contrast with the socially restrictive experience of my all-boys high school in the Bronx, where the only things that ever passed for excitement were the spontaneous gang beatings at the bus stop on Fordham Road.

The high-school experience depicted on “Beverly Hills, 90210” is one I think we can all relate to

The sunny schoolyards and neon-lit nighttime streets of L.A. carried the promise of good times, the kind that seemed altogether out of reach for me and my friends.  The appeal of what California had to offer was so intoxicating, in fact, my two best pals and I spent an entire summer in the mid-’90s trying to make the streets of the Bronx look like Santa Cruz—a place none of us had ever been—for an amateur sequel to The Lost Boys, the ’80s cult classic about a coven of adolescent vampires who’ve (wisely) opted to spend eternity on the boardwalk.  That notion unquestionably took hold of my impressionable imagination—it made me want to be a part of that culture, and tell those kinds of stories.

Accordingly, it’s fair to say it wasn’t merely the movie business that brought me to Los Angeles in my early twenties as an aspiring screenwriter, but arguably the romantic impressions of California itself imprinted upon my psyche by all those movies and TV series on which I came of age.  Yet for the two decades I lived there, the city I’d always imagined L.A. to be—a place full of golden possibilities, as low-key as New York was high-strung—wasn’t the one I experienced.  Not really.  Not until last month, anyway.


Book Review:  “Blood, Sweat & Chrome” by Kyle Buchanan

Kyle Buchanan’s Blood, Sweat & Chrome, published by William Morrow in February, chronicles the not-to-be-believed making of George Miller’s Mad Max:  Fury Road (2015) from conception to release through interviews with its cast and crew, and celebrates the inspiring creative imagination of the filmmakers, who defied the odds to create a contemporary classic—a movie as singularly visceral as it is stunningly visual.

But much like the nonstop action in the movie itself, the adulation expressed in the book never pauses to interrogate Miller and company’s moral imagination.  Let’s fix that, shall we?


I abhor nostalgia, particularly for the 1980s and ’90s, but I’ve recently found myself revisiting many of the films and television shows of the latter decade, the period during which I first knew I wanted to be a cinematic storyteller, when earnest star-driven Oscar dramas like Forrest Gump (1994) coexisted with, and even prospered alongside, paradigm-shifting indies à la Pulp Fiction (also ’94).  Those days are gone and never coming back—the institution formerly known as Hollywood is now the superhero–industrial complex—but I’ve wondered whether some of those works, so immensely popular and influential then, have stood the test of time.

Yet my informal experiment has been about much more than seeing if some old favorites still hold up (and, by and large, they do); it’s about understanding why they worked in the first place—and what storytelling lessons might be learned from an era in which movies existed for their own sake, as complete narratives unto themselves rather than ephemeral extensions of some billion-dollar, corporately superintended brand.

In an entertainment landscape awash in so much content, most of it deceptively devoid of coherence or meaning—a transmedia morass I’ve come to call the Multiverse of Madness—the secret to studying narrativity isn’t to watch more but rather less.  To consume fewer movies and TV shows, but to watch them more selectively and mindfully.  Pick a few classics and scrutinize them until you know them backwards and forwards.

In college, I spent an entire semester analyzing Citizen Kane (1941), from reading multiple drafts of its screenplay to watching it all the way through with the volume turned down just to appreciate its unconventional cinematography.  That’s how you learn how stories work:  Study one or two movies/novels per year… but study the shit out of them.  Watch less, but do it far more attentively.

Tom Hardy as Max Rockatansky in “Mad Max: Fury Road,” the subject of “Blood, Sweat & Chrome”

That is, admittedly, a counterintuitive mindset in our Digital Age of automatic and accelerating behaviors, whereby post-credit scenes preemptively gin up anticipation for the next movie (often through homework assignments) before we’ve had a chance to digest the current one, and the autoplay feature of most streaming services encourages and enables mindless TV binge-watching.

But the quarantine, unwelcome though it may have been, did offer a pause button of sorts, and we are only now beginning to see some of the ways in which folks exploited the rare opportunity to slow down, to go deep, that it offered.  One such project to emerge from that period of thoughtful reflection is entertainment journalist Kyle Buchanan’s recently published nonfiction book Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road”:

In April 2020, as the pandemic swept the planet and the movie-release calendar fell apart, I began writing an oral history of Mad Max:  Fury Road for the New York Times.  Without any new titles to cover, why not dive deeply into a modern classic on the verge of its fifth anniversary?

Every rewatch over those five years had confirmed to me that Fury Road is one of the all-time cinematic greats, an action movie with so much going on thematically that there’d be no shortage of things to talk about.  I had also heard incredible rumors about the film’s wild making, the sort of stories that you can only tell on the record once the dust has long settled.

Kyle Buchanan, Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road” (New York:  William Morrow, 2022), 337

A movie two decades in the making, Fury Road was the belated follow-up to writer/director George Miller’s dystopian action-film trilogy Mad Max (1979, 1981, 1985), which starred a then-unknown Mel Gibson as a wanderer in the wasteland—the Road Warrior.  The project began its long journey to the screen as a proposed television series in 1995, when Miller won back the rights to the franchise from Warner Bros. as part of a settlement of a breach-of-contract suit he’d filed over having been fired from Contact (1997).

Miller was eventually inspired to do another feature instead—“What if there was a Mad Max movie that was one long chase,” he pondered, “and the MacGuffin was human?” (ibid., 31)—but the ensuing production was plagued with one near-terminal roadblock after another.  The behind-the-scenes story told in Blood, Sweat & Chrome is as thrilling, in its own way, as that of Mad Max:  Fury Road itself.


You Can’t Go Home Again:  Hopeful Reflections on Returning to New York after 20 Years Away

Following up on the personal story that began last month in “A Hollywood Ending:  Hopeful Reflections on a Failed Screenwriting Career,” here’s my take on whether we can ever truly go home again.


When I left my apartment of two decades in Los Angeles last spring, I knew it was the last time I’d ever see the place.  I’d never really experienced that particular manner of finality before—walking away from a longtime home with full knowledge I would never again cast eyes upon it—because when I moved to L.A. from the Bronx in 2001, it was implicit I’d have ample occasion to return.  My mother was here, after all, so it was still “Carlin homebase,” so to speak.

And, to be sure, I loved coming back for Christmas, and other sporadic occasions, to reconnect with the old hometown.  It was and remains the only place in the world where I can strut down the avenue like Tony Manero on 86th Street in Bensonhurst, master of all I survey yet somehow, simultaneously and incongruously, just another townie.  I love that sensation—of belonging to a place so completely and so comfortably.  When I walk down the streets of New York, I am home.  And if that’s the standard for what home feels like, nothing else has ever come close—not even L.A. after all that time.

After my screenwriting career abruptly ended in 2014, I spent the next several years nursing a quixotic fantasy in which I made my escape from L.A. both on a moment’s notice and without a backwards glance.  Sleep tight, ya morons!  Only trouble is, that’s like imagining yourself racing heroically into a burning building to rescue someone trapped inside:  It’s an easy scenario to envision when it’s purely hypothetical, unlikely to ever be put to the test.

But over the winter of 2021, from the point at which my wife and I initiated the purchase of our new apartment in the Bronx through the day we left California for good, I had a lot of time to say the long goodbye to L.A.—to come to terms with the idea that I actually was leaving.  And throughout that six-month period, I couldn’t get Sean Penn’s elegiac soliloquy from State of Grace out of my head.

Gary Oldman, Robin Wright, and Sean Penn in “State of Grace” (1990)

State of Grace is an obscure crime thriller from 1990 about the Irish-American street gangs that once ruled Hell’s Kitchen, New York.  (The director, Phil Joanou, has made the entire film available on Vimeo free of charge and in high definition.)  In it, Penn plays a character named Terry Noonan who grew up in the Kitchen and spent his youth running with the Westies, but who absconded, suddenly and unceremoniously, around age twenty.  He told neither his best friend, Jackie (Gary Oldman), nor his girlfriend, Kathleen (Robin Wright); he just disappeared like a thief in the night, his whereabouts unknown.

The story opens with Terry returning to the Kitchen after a decade-long absence, picking up where he left off with Jackie and Kathleen and the Westies.  This being a mob movie, I don’t think it’s much of a spoiler to say it ends tragically for just about every character, Terry included.  “I thought some things,” Terry wistfully confesses to Kathleen in a scene preceding the movie’s blood-soaked climax.  “That I could come back.”  He goes on to explain his reasons for coming home, and how he assumed everything would be when he got there, once he’d reintegrated himself in the old neighborhood.  He’d pictured it all so perfectly…

But it was only an idea.  Had nothin’ to do with the truth, it’s just… a fuckin’ idea, like… you believe in angels, or the saints, or that there’s such a thing as a state of grace.  And you believe it.  But it’s got nothin’ to do with reality.  It’s just an idea.  I mean, you got your ideas and you got reality.  They’re all… they’re all fucked up.

From State of Grace, written by Dennis McIntyre (with uncredited contributions from David Rabe)

Now, I don’t imagine it’ll surprise you to learn I was not involved with the criminal underworld when I lived in New York, nor did I slip away unannounced in the middle of the night without providing a forwarding address.  Nonetheless, Terry’s lamentation played on a loop in my mind’s ear throughout that winter:

I thought some things… that I could come back.

State of Grace is about a guy who learns the hard way you can’t simply come home after all that time away and expect to just pick up where you left off; it’s a cautionary tale about what we expect versus how things actually are.  Faced with the prospect of finally going home for good, I wondered:  Is that even possible?  Or was Thomas Wolfe right?  Had I been carrying around a romantic notion of a happy homecoming that had nothing to do with reality?


“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when you’re only as many years old yourself.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heartwrenchingly credible performance as a young woman scared shitless—and this despite having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once so scary, clever, entertaining, influential, and of its moment the way Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiosyncratic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (born 1970) not only devoted pages upon pages of their screenplays to amusingly philosophical conversations about contemporary pop culture, but also placed the characters across their various movies in their own respective shared universes, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic crosspollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted membership at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescence slackers from Clerks and nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.


The Ted Lasso Way: An Appreciation

The Emmy-nominated comedy series Ted Lasso doesn’t merely repudiate the knee-jerk cynicism of our culture—it’s the vaccine for the self-reinforcing cynicism of our pop culture.  In a feat of inspiring commercial and moral imagination, Jason Sudeikis has given us a new kind of hero—in an old type of story.


As a boy coming of age in the eighties and early nineties, I had no shortage of Hollywood role models.  The movies offered smartass supercops John McClane and Martin Riggs, vengeful super-soldiers John Matrix and John Rambo, and scorched-earth survivalists Snake Plissken and Mad Max, to cite a select sampling.  Sure, each action-hero archetype differed somewhat in temperament—supercops liked to crack wise as they cracked skulls, whereas the soldiers and survivalists tended to be men of few words and infinite munitions—but they were, one and all, violent badasses of the first order:  gun-totin’, go-it-alone individualists who refused to play by society’s restrictive, namby-pamby rules.

Yippee ki-yay.

The small screen supplied no shortage of hero detectives in this mode, either—Sonny Crockett, Thomas Magnum, Rick Hunter, Dennis Booker—but owing to the content restrictions of broadcast television, they mostly just palm-slammed a magazine into the butt of a chrome Beretta and flashed a charismatic GQ grin in lieu of the clever-kill-and-quick-one-liner m.o. of their cinematic counterparts.  (The A-Team sure as hell expended a lot of ammo, but their aim was so good, or possibly so terrible, the copious machine-gun fire never actually made contact with human flesh.)  The opening-credits sequences—MTV-style neon-noir music videos set to power-chord-driven instrumentals—made each show’s gleaming cityscape look like a rebel gumshoe’s paradise of gunfights, hot babes, fast cars, and big explosions.

It might even be argued our TV heroes exerted appreciably greater influence on us than the movie-franchise sleuths that would often go years between sequels, because we invited the former into our home week after week, even day after day (in syndication).  And to be sure:  We looked to those guys as exemplars of how to carry ourselves.  How to dress.  How to be cool.  How to talk to the opposite sex.  How to casually disregard any and all institutional regulations that stood in the way of a given momentary impulse.  How to see ourselves as the solitary hero of a cultural narrative in which authority was inherently suspect and therefore should be proudly, garishly, and reflexively challenged at every opportunity.  The world was our playground, after all—with everyone else merely a supporting actor in the “great-man” epic of our own personal hero’s journey.

Oh, how I wish, in retrospect, we’d had a heroic role model like Jason Sudeikis’ Ted Lasso instead.

THE LAST BOY SCOUT

The premise of Ted Lasso, which recently commenced its second season, is that a can-do college-football coach from Kansas (Sudeikis) is inexplicably hired to manage an English Premier League team, despite that kind of football being an entirely different sport.  Ted, we learn, has been set up to fail by the embittered ex-wife of the club’s former owner (Hannah Waddingham), who, in a plot twist that owes no minor creative debt to David S. Ward’s baseball-comedy classic Major League—which the show tacitly acknowledges when Ted uncharacteristically invokes a key line of profane dialogue from the movie verbatim—inherited the team in a divorce and is now surreptitiously revenge-plotting its implosion.

Jason Sudeikis as Ted Lasso

But, boy oh boy, has Waddingham’s Rebecca Welton—a refreshingly dimensional and sympathetic character in her own right, it’s worth noting—seriously underestimated her handpicked patsy.  With his folksy enthusiasm and full Tom Selleck ’stache, Coach Ted Lasso unironically exemplifies big-heartedness, open-mindedness, kindness, courtesy, chivalry, civility, forgiveness, wisdom, teamwork, cultural sensitivity, and prosocial values—all with good humor, to boot.  His infectious optimism eventually converts even the most jaded characters on the show into true believers, and his innate goodness inspires everyone in his orbit—often despite themselves—to be a better person.  And if, like me, you watch the first season waiting for the show to at some point subject Ted’s heart-on-his-sleeve earnestness to postmodern mockery or ridicule—“spoiler alert”—it doesn’t.


Too Much Perspective: On Writing with Moral Imagination

Practicing morally imaginative storytelling means scrutinizing the values and messages encoded in the fiction we produce—but it does not mean passing a “purity test.”


In Marty Di Bergi’s 1984 rockumentary This Is Spinal Tap, the titular British heavy-metal band, faced with ebbing popularity and flagging album sales, embarks on a disaster-prone tour of North America in support of its latest release, the critically savaged Smell the Glove.  During a stopover at Graceland to pay their respects to the King of Rock and Roll at his gravesite, lead vocalist David St. Hubbins comments, “Well, this is thoroughly depressing.”

To which bandmate and childhood best friend Nigel Tufnel responds, “It really puts perspective on things, though, doesn’t it?”

“Too much.  There’s too much fucking perspective now.”

It’s a sentiment to which we can all relate, collectively endowed as we’ve become with a migrainous case of “2020 vision.”  At the start of the pandemic, long before we had any sense of what we were in for, let alone any perspective on it, I, like many essayists, felt the urge or need or even the responsibility to say something about it, despite knowing I had no useful or meaningful insight.  I netted out with an acknowledgment that the months to come would present a rare Digital Age opportunity for quiet introspection and reflection—one in which we might expand our moral imagination of what’s possible, to invoke the exquisite wisdom of my mentor Al Gore, and perhaps envision a world on the other side appreciably more just, equitable, and sustainable than the one we had before the global shutdown.

Did we ever.  Here in the United States, we are now wrestling with issues of economic inequality, structural racism, police brutality, environmental justice, and fair access to affordable housing and healthcare with an awareness and an urgency not seen in generations, and President Joe Biden—responding to the social movements of his times like FDR and LBJ before him—has proposed a host of progressive legislation that matches the visionary, transformative ambition of the New Deal and the Great Society.


With heartening moral imagination (certainly more than this democratic eco-socialist expected from him), Biden is attempting to turn the page on the Randian, neoliberal narrative of the past forty years and write a new chapter in the American story—one founded on an ethos of sympathetic coexistence, not extractive exploitation.  With our continued grassroots support and, when necessary, pressure, he might even be the unlikely hero to pull it off, too—our Nixon in China.

As for me?  I spent most of the pandemic thinking about narrativity myself.  Doing nothing, after all, was a privilege of the privileged, among whom I am obliged to be counted.  So, I used the time in self-quarantine to think and to write about the stories we tell, and I arrived at the resolute conclusion that we—the storytellers—need to do a lot better.


In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 2

Editor’s note:  Owing to the length of “In the Multiverse of Madness,” I divided the essay into two posts.  If you haven’t already, I encourage you to read Part 1 first, and please feel welcome to offer feedback on that post, this one, or both in the comments section of Part 2 below.  Thank you.


Previously on “In the Multiverse of Madness,” we covered the three engagement strategies (and correlating tactics) transmedia mega-franchises deploy to keep us consuming each new offering in real time:  by leveraging FOMO via “spoilers”; by encouraging “forensic fandom” with Easter eggs and puzzle-boxing; and by reversing “figure and ground.”  Now let’s talk about why 1970s-born adults have been particularly susceptible to these narrative gimmicks—and what to do about it.

X Marks the Spot

Mega-franchises are dependent on a very particular demographic to invest in their elaborate and expanding multiverse continuities:  one that has both a strong contextual foundation in the storied histories of the IPs—meaning, viewers who are intimately familiar with (and, ideally, passionately opinionated about) all the varied iterations of Batman and Spider-Man from the last thirty or so years—and is also equipped with disposable income, as is typically the case in middle age, hence the reason Gen X has been the corporate multimedia initiative’s most loyal fan base.  Fortunately for them, we’d been groomed for this assignment from the time we learned to turn on the television.

Very quickly (if it isn’t already too late for that):  From 1946 through 1983, the FCC enforced stringent regulations limiting the commercial advertisements that could be run during or incorporated into children’s programming.  However:

Ronald W. Reagan did not much care for any regulations that unduly hindered business, and the selling of products to an entire nation of children was a big business indeed.  When Reagan appointed Mark S. Fowler as commissioner of the FCC on May 18, 1981, children’s television would change dramatically.  Fowler championed market forces as the determinant of broadcasting content, and thus oversaw the abolition of every advertising regulation that had served as a guide for broadcasters.  In Fowler’s estimation, the question of whether children had the ability to discriminate between the ads and the entertainment was a moot point; the free market, and not organizations such as [Actions for Children’s Television] would decide the matter.

Martin Goodman, “Dr. Toon:  When Reagan Met Optimus Prime,” Animation World Network, October 12, 2010

In the wake of Fowler’s appointment, a host of extremely popular animated series—beginning with He-Man and the Masters of the Universe but also notably including The Transformers, G.I. Joe:  A Real American Hero, and M.A.S.K. for the boys, and Care Bears, My Little Pony, and Jem for young girls—flooded the syndicated market with 65-episode seasons that aired daily.  All of these series had accompanying action figures, vehicles, and playsets—and many of them, in fact, were explicitly based on preexisting toylines; meaning, in a flagrant instance of figure-and-ground reversal, the manufacturers often dictated narrative content:

“These shows are not thought up by people trying to create characters or a story,” [Peggy Charren, president of Action for Children’s Television] explained, terming them “program-length advertisements.”  “They are created to sell things,” she said.  “Accessories in the toy line must be part of the program.  It reverses the traditional creative process.  The children are getting a manufacturer’s catalogue instead of real programming content.”

Glenn Collins, “Controversy about Toys, TV Violence,” New York Times, December 12, 1985

This was all happening at the same time Kenner was supplying an endless line of 3 ¾” action figures based on Star Wars, both the movies and cartoon spinoffs Droids and Ewoks.  Even Hanna-Barbera’s Super Friends, which predated Fowler’s tenure as FCC commissioner by nearly a decade, rebranded as The Super Powers Team, complete with its own line of toys (also courtesy of Kenner) and tie-in comics (published by DC), thereby creating a feedback loop in which each product in the franchise advertised for the others.  Meanwhile, feature films like Ghostbusters and even the wantonly violent, R-rated Rambo and RoboCop movies were reverse-engineered into kid-friendly cartoons, each with—no surprise here—their own action-figure lines.

I grew up on all that stuff and obsessed over the toys; you’d be hard-pressed to find a late-stage Xer that didn’t.  We devoured the cartoons, studied the comics, and envied classmates who were lucky enough to own the Voltron III Deluxe Lion Set or USS Flagg aircraft carrier.  To our young minds, there was no differentiating between enjoying the storyworlds of those series and collecting all the ancillary products in the franchise.  To watch those shows invariably meant to covet the toys.  At our most impressionable, seventies-born members of Gen X learned to love being “hostage buyers.”  Such is the reason I was still purchasing those goddamn Batman comics on the downslope to middle age.

