Writer of things that go bump in the night

Tag: Star Wars (Page 1 of 3)

The Last Walking Infinity Throne Corrupts Infinitely:  How the Mega-Franchise Format Warps Creative Storytelling Goals

“As a medium, stories have proven themselves great as a way of storing information and values, and then passing them on to future generations”—Douglas Rushkoff, Present Shock:  When Everything Happens Now (New York:  Penguin Group, 2013), 16.

Traditionally, stories have been organized around universal dramatic principles first identified by Aristotle in Poetics, later codified by Joseph Campbell in The Hero with a Thousand Faces, and most recently customized for screenwriters in programs like Blake Snyder’s Save the Cat!  But in recent decades, narrativity has taken on a new, shapeless, very possibly endless permutation:  the transmedia “mega-franchise”—that is, the intertextual and ever-expanding storyworlds of Marvel, Star Wars, The Conjuring, Harry Potter’s Wizarding World, et al.

In this month’s guest post, friend of the blog Dave Lerner returns to delineate the five creative objectives of storytelling—and how those have mutated, along with narrativity itself, in this era of branded-IP entertainment.


From the first cave paintings to the Homeric epics to the Globe Theatre to the multicamera sitcom, storytellers across the ages have told stories for reasons so obvious they often go unstated and unacknowledged.

Let’s take a look at the five creative goals that guide storytellers in any medium, whether it be a movie, novel, TV episode, comic book, or otherwise.  Commercial considerations such as “profit” and “being hired to do so” are omitted here, as these are not creative goals.

Storytelling Goal #1:  Entertainment

Elementary!  The storyteller intends for their audience to have fun, to relax, to take their minds off their problems, to experience another world, another life, for a while.  Pure escapism.  While some may decry “mindless entertainment,” I would argue that it has a necessary place in life—and I’m not the only one who sees the virtues of escapist stories:

Hence the uneasiness which they arouse in those who, for whatever reason, wish to keep us wholly imprisoned in the immediate conflict.  That perhaps is why people are so ready with the charge of “escape.”  I never fully understood it till my friend Professor Tolkien asked me the very simple question, “What class of men would you expect to be most preoccupied with, and hostile to, the idea of escape?” and gave the obvious answer:  jailers.

C. S. Lewis, On Stories:  And Other Essays on Literature

Storytelling Goal #2:  Artistic Expression

Although the definition of “Art” has been and will be debated endlessly, for the purpose of this category I will use the second definition from Wiktionary:

The creative and emotional expression of mental imagery, such as visual, auditory, social, etc.

To further specify, art is more about the feelings the artist is expressing and the statement the artist is making than the emotions they are attempting to evoke in their audience.

Arguments about whether or not a given piece is “art,” or a given medium is “capable of creating art,” though valid in other contexts, will be disregarded here.  I’ll assume that if you say your piece is art, then it’s art.  I am also ignoring the quality of the piece, whether it merits the term “a work of art.”  By my definition, a movie can be as much a piece of art as a painting, sculpture, symphony, literary novel, etc., though when it is, it’s usually called a “film” and not a “movie.”

Storytelling Goal #3:  Education

The storyteller aspires to teach their audience something they did not know before.  While documentaries and lectures are obvious examples, many read historical novels or hard science fiction for much the same purpose.  When I was a child, I first learned that water expands when it freezes from a Shazam! comic book.  Of course, a person may forget most of what they’ve learned almost immediately afterwards, but the learning experience itself is enjoyable.

“Young Indiana Jones,” recently studied here, incorporated biographical information about many early-20th-century historical figures, fulfilling the third of five storytelling goals

Even if the “facts” presented are deliberately inaccurate, as long as the intent is for people to believe them, this category applies.

Continue reading

“Young Indiana Jones” Turns 30:  Storytelling Lessons from George Lucas’ Other Prequel Series

A television series based on an immensely popular action-movie franchise shouldn’t have been a creative or commercial risk—quite the opposite.  But with The Young Indiana Jones Chronicles, which premiered on March 4, 1992, filmmaker George Lucas had no intention of producing a small-screen version of his big-screen blockbusters.  Here’s how Lucas provided a richly imaginative model for what a prequel can and should be—and why it would never be done that way again.


Though he more or less pioneered the contemporary blockbuster, George Lucas had intended—even yearned—to be an avant-garde filmmaker:

Lucas and his contemporaries came of age in the 1960s vowing to explode the complacency of the old Hollywood by abandoning traditional formulas for a new kind of filmmaking based on handheld cinematography and radically expressive use of graphics, animation, and sound.  But Lucas veered into commercial moviemaking, turning himself into the most financially successful director in history by marketing the ultimate popcorn fodder.

Steve Silberman, “Life After Darth,” Wired, May 1, 2005

After dropping the curtain on his two career- and era-defining action trilogies (Star Wars concluded in 1983, then Indiana Jones in ’89), then failing to launch a new franchise with Willow (his 1988 sword-and-sorcery fantasy fizzled at the box office, though even that would-be IP is getting a “legacy” successor later this year courtesy of the nostalgia–industrial complex), Lucas did in fact indulge his more experimental creative proclivities—through the unlikeliest of projects:  a pair of prequels to both Indiana Jones and Star Wars.  And while both arguably got made on the strength of the brands alone, the prequels themselves would, for better and worse, defy the sacrosanct conventions of blockbuster cinema—as well as the codified narrative patterns of Joseph Campbell’s “heroic journey”—that audiences had come to expect from Lucas.

A perfunctory scene in Return of the Jedi, in which Obi-Wan finally explains Darth Vader’s mysterious backstory to Luke (a piece of business that could’ve been easily handled in the first film, thereby sparing the hero considerable needless risk and disillusionment in The Empire Strikes Back, but whatever), served as the narrative foundation for Lucas’ Star Wars prequel trilogy (1999–2005), in which a precocious tyke (The Phantom Menace) matures into a sullen teenager (Attack of the Clones) before warping into a murderous tyrant (Revenge of the Sith).  Underpinning Anakin’s emo-fueled transformation to the dark side is a byzantine plotline about Palpatine’s Machiavellian takeover of the Republic.  Meanwhile, references to the original trilogy, from crucial plot points to fleeting sight gags, abound.

You’ve all seen the movies, so I’ll say no more other than to suggest the story arc—which is exactly what Obi-Wan summarized in Return of the Jedi, only (much) longer, appreciably harder to follow, and a tonally incongruous mix of gee-whiz dorkiness and somber political intrigue—is precisely the kind of creative approach to franchise filmmaking that would’ve been summarily nixed in any Hollywood pitch meeting, had Lucas been beholden to the corporate precepts of the studio system from which the colossal success of the original Star Wars afforded him his independence.

George Lucas on the set of the “Star Wars” prequels

Which is not to say Lucas’ artistic instincts were infallible.  Financially successful though the prequels were, audiences never really embraced his vision of an even longer time ago in a galaxy far, far away:  Gungans and midi-chlorians and trade disputes didn’t exactly inspire the wide-eyed amazement that Wookiees and lightsabers and the Death Star had.

Maybe by that point Star Wars was the wrong franchise with which to experiment creatively?  Perhaps it had become too culturally important, and audience expectations for new entries in the long-dormant saga were just too high?  In the intervening years, Star Wars had ceased to be the proprietary daydreams of its idiosyncratic creator; culturally if not legally, Star Wars kinda belonged to all of us on some level.  By explicitly starting the saga with Episode IV in 1977, Lucas had invited each of us to fill in the blanks; the backstory was arguably better off imagined than reified.

As an IP, however, Indiana Jones, popular as it was, carried far less expectation, as did the second-class medium of network television, which made Lucas’ intended brand extension more of an ancillary product in the franchise than a must-see cinematic event—more supplemental than it was compulsory, like a tie-in novel, or the Ewok telefilms of the mid-eighties.  The stakes of the project he envisioned were simply much lower, the spotlight on it comfortably dimmer.  In the event of its creative and/or commercial failure, Young Indiana Jones would be a franchise footnote in the inconsequential vein of the Star Wars Holiday Special, not an ill-conceived vanity project responsible for retroactively ruining the childhoods of millions of developmentally arrested Gen Xers.  Here Lucas expounds on the genesis of the series:

Continue reading

There He Was… and in He Walked: Lessons on Mythic Storytelling from the Mariachi Trilogy

In belated observation of Día de los Muertos, here’s an appreciation for the idiosyncratic storytelling of Robert Rodriguez’s Mariachi trilogy, a neo-Western action series that emerged from the indie-cinema scene of the 1990s and can only be deemed, by current Hollywood standards, an anti-franchise.  The movies and the manner in which they were made have a lot to teach us about what it means to be creative—and how to best practice creativity.


Before the shared cinematic universe became the holy grail of Hollywood, the coup d’éclat for any aspiring franchise—and we can probably credit Star Wars for this—was the trilogy.

In contrast with serialized IPs (James Bond and Jason Voorhees, for instance), the trilogy came to be viewed, rightly or wrongly, as something “complete”—a story arc with a tidy three-act design—and, accordingly, many filmmakers have leaned into this assumption, exaggerating a given series’ creative development post factum with their All part of the grand plan! assurances.

This peculiar compulsion we’ve cultivated in recent decades—storytellers and audiences alike—to reverse-engineer a “unified whole” from a series of related narratives, each of which developed independently and organically, is antithetical to how creativity works, and even to what storytelling is about.

Nowhere is the fluidity of the creative process on greater, more glorious display than in the experimental trilogy—that is, when a low-budget indie attains such commercial success that it begets a studio-financed remake that simultaneously functions as a de facto sequel, only to then be followed by a creatively emboldened third film that completely breaks from the established formula in favor of presenting an ambitiously gonzo epic.  Trilogies in this mode—and, alas, it’s a pretty exclusive club—include Sam Raimi’s Evil Dead, George Miller’s Mad Max, and Robert Rodriguez’s El Mariachi.

Robert Rodriguez at the world premiere of “Alita: Battle Angel” on January 31, 2019 in London (Eamonn M. McCormack/Getty)

A film student at the University of Texas at Austin in the early nineties, Rodriguez self-financed El Mariachi with a few thousand dollars he’d earned as a medical lab rat; the project wasn’t meant to be much more than a modest trial run at directing a feature film that he’d hoped to perhaps sell to the then-burgeoning Spanish-language home-video market.  He reasoned that practical experience would be the best teacher, and if he could sell El Mariachi, it would give him the confidence and funds to produce yet more projects—increasingly ambitious and polished efforts—that would allow him to make a living doing what he loved.  He had no aspirations of power lunches at The Ivy or red-carpet premieres at Mann’s Chinese Theatre, only pursuing the art of cinematic storytelling—not necessarily Hollywood filmmaking, a different beast—to the fullest extent possible.

If you want to be a filmmaker and you can’t afford film school, know that you don’t really learn anything in film school anyway.  They can never teach you how to tell a story.  You don’t want to learn that from them anyway, or all you’ll do is tell stories like everyone else.  You learn to tell stories by telling stories.  And you want to discover your own way of doing things.

In school they also don’t teach you how to make a movie when you have no money and no crew.  They teach you how to make a big movie with a big crew so that when you graduate you can go to Hollywood and get a job pulling cables on someone else’s movie.

Robert Rodriguez, Rebel without a Crew, or, How a 23-Year-Old Filmmaker with $7,000 Became a Hollywood Player (New York:  Plume, 1996), xiii–xiv

They don’t teach a lot of things about Hollywood in film school, like how so many of the industry’s power brokers—from producers and studio execs to agents and managers—are altogether unqualified for their jobs.  These folks think they understand cinematic storytelling because they’ve watched movies their entire lives, but they’ve never seriously tried their hand at screenwriting or filmmaking.  Accordingly, the town’s power structure is designed to keep its screenwriters and filmmakers subordinate, to make sure the storytellers understand they take their creative marching orders from people who are themselves utterly mystified by the craft (not that they’d ever admit to that).

It’s the only field I know of in which the qualified authorities are entirely subservient to desk-jockey dilettanti, but I suppose that’s what happens when a subjective art form underpins a multibillion-dollar industry.  Regardless, that upside-down hierarchy comes from a place of deep insecurity on both ends of the totem pole, and is in no way conducive to creativity, hence the premium on tried-and-true brands over original stories, on blockbusters over groundbreakers.  As I discovered the hard way—more on that in a minute—Hollywood is arguably the last place any ambitiously imaginative storyteller ought to aspire to be.  Rodriguez seemed to understand that long before he ever set foot in L.A.:

Continue reading

In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 2

Editor’s note:  Owing to the length of “In the Multiverse of Madness,” I divided the essay into two posts.  If you haven’t already, I encourage you to read Part 1 first, and please feel welcome to offer feedback on that post, this one, or both in the comments section of Part 2 below.  Thank you.


Previously on “In the Multiverse of Madness,” we covered the three engagement strategies (and correlating tactics) transmedia mega-franchises deploy to keep us consuming each new offering in real time:  by leveraging FOMO via “spoilers”; by encouraging “forensic fandom” with Easter eggs and puzzle-boxing; and by reversing “figure and ground.”  Now let’s talk about why 1970s-born adults have been particularly susceptible to these narrative gimmicks—and what to do about it.

X Marks the Spot

Mega-franchises are dependent on a very particular demographic to invest in their elaborate and expanding multiverse continuities:  one that has both a strong contextual foundation in the storied histories of the IPs—meaning, viewers who are intimately familiar with (and, ideally, passionately opinionated about) all the varied iterations of Batman and Spider-Man from the last thirty or so years—and is also equipped with disposable income, as is typically the case in middle age, hence the reason Gen X has been the corporate multimedia initiative’s most loyal fan base.  Fortunately for them, we’d been groomed for this assignment from the time we learned to turn on the television.

Very quickly (if it isn’t already too late for that):  From 1946 through 1983, the FCC enforced stringent regulations limiting the commercial advertisements that could be run during or incorporated into children’s programming.  However:

Ronald W. Reagan did not much care for any regulations that unduly hindered business, and the selling of products to an entire nation of children was a big business indeed.  When Reagan appointed Mark S. Fowler as commissioner of the FCC on May 18, 1981, children’s television would change dramatically.  Fowler championed market forces as the determinant of broadcasting content, and thus oversaw the abolition of every advertising regulation that had served as a guide for broadcasters.  In Fowler’s estimation, the question of whether children had the ability to discriminate between the ads and the entertainment was a moot point; the free market, and not organizations such as [Action for Children’s Television], would decide the matter.

Martin Goodman, “Dr. Toon:  When Reagan Met Optimus Prime,” Animation World Network, October 12, 2010

In the wake of Fowler’s appointment, a host of extremely popular animated series—beginning with He-Man and the Masters of the Universe but also notably including The Transformers, G.I. Joe:  A Real American Hero, and M.A.S.K. for the boys, and Care Bears, My Little Pony, and Jem for young girls—flooded the syndicated market with 65-episode seasons that aired daily.  All of these series had accompanying action figures, vehicles, and playsets—and many of them, in fact, were explicitly based on preexisting toylines; meaning, in a flagrant instance of figure-and-ground reversal, the manufacturers often dictated narrative content:

“These shows are not thought up by people trying to create characters or a story,” [Peggy Charren, president of Action for Children’s Television] explained, terming them “program-length advertisements.”  “They are created to sell things,” she said.  “Accessories in the toy line must be part of the program.  It reverses the traditional creative process.  The children are getting a manufacturer’s catalogue instead of real programming content.”

Glenn Collins, “Controversy about Toys, TV Violence,” New York Times, December 12, 1985

This was all happening at the same time Kenner was supplying an endless line of 3.75” action figures based on Star Wars, both the movies and cartoon spinoffs Droids and Ewoks.  Even Hanna-Barbera’s Super Friends, which predated Fowler’s tenure as FCC commissioner by nearly a decade, rebranded as The Super Powers Team, complete with its own line of toys (also courtesy of Kenner) and tie-in comics (published by DC), thereby creating a feedback loop in which each product in the franchise advertised for the other.  Meanwhile, feature films like Ghostbusters and even the wantonly violent, R-rated Rambo and RoboCop movies were reverse-engineered into kid-friendly cartoons, each with—no surprise here—their own action-figure lines.

I grew up on all that stuff and obsessed over the toys; you’d be hard-pressed to find a late-stage Xer that didn’t.  We devoured the cartoons, studied the comics, and envied classmates who were lucky enough to own the Voltron III Deluxe Lion Set or USS Flagg aircraft carrier.  To our young minds, there was no differentiating between enjoying the storyworlds of those series and collecting all the ancillary products in the franchise.  To watch those shows invariably meant to covet the toys.  At our most impressionable, seventies-born members of Gen X learned to love being “hostage buyers.”  Such is the reason I was still purchasing those goddamn Batman comics on the downslope to middle age.

Continue reading

In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 1

Editor’s note:  By even the indefensibly prolix standards of this blog, the following essay—an analytical piece on Hollywood mega-franchises and how audiences wind up serving them more than they serve us—is a lengthy one.  Accordingly, “In the Multiverse of Madness” will be published in two separate parts, with the concluding installment following this one by a week.  I thank you in advance for your time and attention, neither of which I take for granted.


In last month’s post, I proffered that when a fan-favorite media franchise no longer serves us—when we come to recognize that some of the popular fictions we’ve cherished embody values we no longer endorse, and potentially even threaten to stand in the way of where we need to go—often the best thing we can do for ourselves is to let it go, purposely and permanently.

Letting go is not about “canceling” (someone like disgraced geek god Joss Whedon) or boycotting (the films of, say, Woody Allen); it’s not about taking action at all.  Instead, letting go is not doing something any longer—not renting out any more space in your life or in your head to the likes of Whedon or Allen, or even to the culturally defining popular narratives whose very ubiquity we take as a God-given absolute:  Star Wars, Star Trek, Harry Potter, DC and Marvel, to name but a sampling.

Despite the universal prevalence of those transmedia brands—not merely the plethora of movies and TV shows, but the licensed apparel and iPhone cases, the die-cast collectibles and plush toys—we can, if we choose, be done with any or all of those franchises as of… right now.  To learn to live without them entirely.  And happily.  Even lifelong, hardcore superfans can learn to let go of their preferred multimedia pastimes.

It’s both easier and harder than you may think.

Just imagine never caring about ANY of this ever again…

But wait!  What if you happen to genuinely enjoy Star Wars or Star Trek or DC or Marvel?  If you’re a fan, and some or all of those entertainment franchises add value to your life’s experience, by all means, disregard this post’s advice.  Though perhaps first consider this:

For most of Hollywood history, the movie business has needed a hostage buyer, a customer with little choice but to purchase the product.  First, this was the theatre chains, which the studios owned, or controlled, until 1948, when the Supreme Court forced the studios to sell them on antitrust grounds.  In the eighties and nineties, video stores partly filled the role.  But, increasingly, the hostage buyer is us.

Today, the major franchises are commercially invulnerable because they offer up proprietary universes that their legions of fans are desperate to reënter on almost any terms.  These reliable sources of profit are now Hollywood’s financial bedrock.

Stephen Metcalf, “How Superheroes Made Movie Stars Expendable,” New Yorker, May 21, 2018

Consider:  How many of us are unwitting “hostage buyers”—fans who continue to subscribe to certain multimedia franchises no longer out of pleasure, but lately out of habit?  Out of decades-long conditioning?  We may watch Star Wars, for instance, simply because we’ve always watched Star Wars, even if we can’t truly recall the last time we actually enjoyed it the way we did when we were ten years old—with pure and wondrous abandon.  Bad word-of-mouth will steer us clear of a one-off bomb like Blackhat or King Arthur:  Legend of the Sword or The Happytime Murders, but it’ll merely lower our expectations for Star Wars:  The Rise of Skywalker and X-Men:  Dark Phoenix and Terminator:  Dark Fate, not deter us from seeing those umpteenth sequels for ourselves.

When that happens—when we’re willing to spend our money, time, and attention (our three primary modes of currency) on a product we know in advance is shit—we’re no longer fans of those franchises so much as brand loyalists.  Habit buyers, if not outright hostage buyers.  And it can be hard to recognize that in ourselves—harder than we might realize.  I was still reading Batman comics into my thirties, who-knows-how-many years after I stopped enjoying them—long after a once-joyful pleasure became an interminably joyless obligation.  So, why was I still reading and collecting them?

Because I’d always read comics, from the time I was a kid; I’d buy them at the corner candy store in my Bronx neighborhood with loose change I’d rummaged from the couch cushions and reread each one a thousand times.  I’d share them with my grade-school gang, and vice versa.  I’d collected them for as long as I could remember, so it truly never occurred to me a day might come when they no longer added value to my life—when they’d outlived their onetime reliable purpose.  And for years after I reached that point of terminally diminished returns, I’d continue to spend money, to say nothing of time and attention, on a habit I wasn’t enjoying—that did nothing but clutter my home with more worthless shit that went straight into indefinite “storage” in the closet.  Why the hell did I do that?

Because I’d ceased to be a fan and had instead become an obedient brand loyalist—an institutionalized hostage buyer.  And, to be sure, corporate multimedia initiatives—which is to say those so-called “mega-franchises” from which there is always one more must-see/must-have sequel, prequel, sidequel, spinoff, TV series, tie-in comic, videogame, and branded “collectible” being produced—very much count on our continued, unchallenged fidelity to once-beloved concepts and characters…

… and they are doubling down on the billion-dollar bet they’ve placed on it:

Continue reading

Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops, it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancelation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…

Continue reading

The End: Lessons for Storytellers from the Trump Saga

The election of Joseph R. Biden Jr. earlier this month offered the very thing our movie franchises and television series have denied us for two decades:  catharsis.


For a writer, it turns out I may suffer from a staggering lack of imagination.

I will confess to anxiously entertaining all the apocalyptic post–Election Day scenarios contemplated by even our most sober pundits and analysts:  the disillusion-fueled outrage on the left should Trump eke out a narrow Electoral College win despite losing the popular vote to Biden; or, the armed militias activated by the president in the event of his loss.  Like the set of a Snake Plissken movie, store windows on Fifth Avenue and Rodeo Drive were boarded up; correspondingly, I barricaded my own front and balcony doors as I watched, sick to my stomach, an endless caravan of MAGA-bannered pickup trucks roar past my home in the liberal bastion of Los Angeles the weekend before Election Day.  I girded for the possibility (if not inevitability) of social breakdown, fully aware I would not be cast in the part of uber-competent dystopian hero—the Rick Grimes or Mad Max—in that story.

What I never imagined—not once, even fleetingly—was that upon receiving official word of a Biden/Harris victory, cities across the country, and the world over, would spontaneously erupt into large-scale celebration worthy of an MGM musical.  Ding-dong!  The witch is dead!  It was a perfectly conventional—and conventionally predictable—Hollywood ending, yet I never saw it coming.

The galaxy celebrates the death of Darth Vader

Despite all the warnings I’ve issued about the unconscious maleficent messaging in our commercial fiction—stories in which messianic saviors redeem our inept/corrupt public institutions (Star Wars and superhero sagas), armed men with badges act without even the smallest measure of accountability (action movies and police procedurals), and environmental destruction/societal collapse are not merely inevitable but preferable (Mad Max:  Fury Road, The Walking Dead), because apocalypse absolves us from our burdensome civic responsibilities—this election season has exposed my own susceptibility to pop-cultural conditioning.

It wasn’t merely a spirit of doomism I nursed throughout October; it was an unchallenged assumption that the interminable Trump narrative would simply do what all our stories now do:  hold us in a state of real-time presentism (“We’ll have to wait and see” and “I will keep you in suspense” are common refrains from the outgoing president) rather than arrive at definitive conclusion.

The erosion of cathartic narrativity is a subject I’ve admittedly addressed a lot here on the blog since I first published “Journey’s End” over five years ago, but it’s essential to understanding how the Trump presidency came to be, and why we all felt such an atavistic sense of relief when it reached an end on November 7.

Around the turn of the millennium, storytellers mostly abandoned the Aristotelian narrative arc—with its rising tension, climax, and catharsis—in favor of “storyless” fiction that either pursues a satirical-deconstructionist agenda (Family Guy, Community) or emulates the kind of open-ended worldbuilding previously the exclusive province of tabletop RPGs and videogames (Game of Thrones, Westworld).

Continue reading

The Road Back: Revisiting “The Writer’s Journey”

On the twenty-fifth anniversary of Christopher Vogler’s industry-standard screenwriting instructional The Writer’s Journey:  Mythic Structure for Writers, here’s an in-depth look at why the time-honored storytelling principles it propounds are existentially endangered in our postnarrative world… and why they’re needed now more than ever.


In The Hero with a Thousand Faces (1949), comparative mythologist Joseph Campbell identified the “monomyth”—the universal narrative patterns and Jungian psychological archetypes that provide the shape, structure, and emotional resonance of virtually every story in the Western literary canon.

As it’s more commonly known, this is the “Hero’s Journey,” in which the status quo is disrupted, sending our protagonist on a perilous adventure—physically or emotionally or both—through a funhouse-mirror distortion of their everyday reality (think Marty McFly in 1950s Hill Valley, Dorothy in Oz) in which they encounter Mentors, Shadows, Allies, and Tricksters throughout a series of escalating challenges, culminating in a climactic test from which they finally return to the Ordinary World, ideally a bit wiser for their trouble.  From the Epic of Gilgamesh to a given episode of The Big Bang Theory, the Hero’s Journey is the foundational schema of storytelling.

The stages of the Hero’s Journey

The book’s influence on the visionary young filmmakers who came of age studying it was quantum:  George Lucas consciously applied Campbell’s theory to the development of Star Wars (1977), as did George Miller to Mad Max (1979), arguably transforming a pair of idiosyncratic, relatively low-budget sci-fi projects into global phenomena that are still begetting sequels over forty years later.  After serving Western culture for millennia, in the waning decades of the twentieth century, the Hero’s Journey became the blueprint for the Hollywood blockbuster.

In the 1990s, a story analyst at Disney by the name of Christopher Vogler wrote and circulated a seven-page internal memo titled “A Practical Guide to The Hero with a Thousand Faces,” a screenwriter-friendly crib sheet that was notably used in the development of The Lion King (a classic Hero’s Journey if ever there was one), evolving a few years later into a full-length book of its own:  The Writer’s Journey:  Mythic Structure for Writers, a twenty-fifth anniversary edition of which was published this past summer.  The nearly 500-page revised volume is partitioned into four sections:

  • MAPPING THE JOURNEY:  Here Mr. Vogler characterizes the mythic archetypes of the Hero, Mentor, Threshold Guardian, Herald, Shapeshifter, Shadow, Ally, and Trickster.
  • STAGES OF THE JOURNEY:  Each monomythic “beat”—The Call to Adventure, Crossing the First Threshold, Approach to the Inmost Cave, etc.—is given thorough explanation and illustration.
  • LOOKING BACK ON THE JOURNEY:  Using the tools he teaches, Mr. Vogler provides comprehensive analyses of Titanic, Pulp Fiction, The Lion King, The Shape of Water, and Lucas’ six-part Star Wars saga.
  • THE REST OF THE STORY:  ADDITIONAL TOOLS FOR MASTERING THE CRAFT:  The appendices are a series of essays on the history, nature, and cultural dynamics of the art and craft of storytelling.  After 350 pages of practical technique, Mr. Vogler earns the privilege of indulging a bit of literary theory here, and his insights are fascinating.  He devotes an entire chapter to the subject of catharsis, “comparing the emotional effect of a drama with the way the body rids itself of toxins and impurities” (Christopher Vogler, The Writer’s Journey:  Mythic Structure for Writers, 4th ed. [Studio City, California:  Michael Wiese Productions, 2020], 420).  Stories, in that sense, are medicinal; their alchemical compounds have healing properties—more on this point later.

Vogler’s The Writer’s Journey codifies mythic structure for contemporary storytellers, demonstrating its form, function, and versatility through more accessible terminology than Campbell’s densely academic nomenclature, and by drawing on examples from cinematic touchstones familiar to all:  The Wizard of Oz, Star Wars, Titanic, etc.  Like The Hero with a Thousand Faces before it, The Writer’s Journey has become, over the last quarter century, an essential catechism, affecting not merely its own generation of scribes (including yours truly), but the successive storytelling programs that stand on its shoulders, like Save the Cat!

Comparison of Vogler’s Hero’s Journey and Snyder’s “beat sheet”

But why is it essential?  If Campbell and Vogler and Blake Snyder have simply put different labels on narrative principles we all intuitively comprehend from thousands of years of unconscious conditioning, why study them at all?  Why not simply trust those precepts are already instinctive and immediately type FADE IN at the muse’s prompting?

Because just as a doctor requires an expert’s command of gross anatomy even if no two patients are exactly constitutionally alike, and an architect is expected to possess a mastery of structural engineering though every building is different, it behooves the storyteller—be they screenwriter, novelist, playwright, what have you—to consciously understand the fundamentals of the narrative arts:

The stages of the Hero’s Journey can be traced in all kinds of stories, not just those that feature “heroic” physical action and adventure.  The protagonist of every story is the hero of a journey, even if the path leads only into his own mind or into the realm of relationships.

The way stations of the Hero’s Journey emerge naturally even when the writer is unaware of them, but some knowledge of this most ancient guide to storytelling is useful in identifying problems and telling better stories.  Consider these twelve stages as a map of the Hero’s Journey, one of many ways to get from here to there, but one of the most flexible, durable, and dependable.

ibid., 7

I’ve read and reread previous versions of The Writer’s Journey endlessly, and I take new insight from it each time:  An excellent primer for aspirants, it yields yet richer dividends for experienced writers—those who can readily appreciate it vis-à-vis their own work.  Though this updated edition, which includes two brand-new essays in the appendices (“What’s the Big Deal?” and “It’s All About the Vibes, Man”), was certainly sufficient reason in its own right to revisit The Writer’s Journey, I had a more compelling motivation:  I wanted to see for myself how Mr. Vogler makes a case for the type of conventional story arc he extols in the face of mounting evidence of its cultural irrelevance in our postnarrative era.

Continue reading

Some Assembly Required: Why Disciplined Creativity Begets Better Fiction

Editor’s note:  “Some Assembly Required” was written and scheduled to post prior to COVID-19’s formal classification as a global pandemic and the ensuing social disruption it has caused here in the United States and around the world; in light of that, a thesis about storytelling craft seems to me somewhat inconsequential and irrelevant.

More broadly, however, the essay makes a case for slowing down, something we’re all doing out of admittedly unwelcome necessity at present, and learning to value the intellectual dividends of thoughtful rumination over the emotional gratification of kneejerk reaction; as such, I submit “Some Assembly Required” as planned—along with my best wishes to all for steadfast health and spirits through this crisis.


Castle Grayskull.  The Cobra Terror Drome.  The Batcave.  I didn’t have every 1980s action-figure playset, but, man, how I cherished the ones I got.  In those days of innocence, there was no visceral thrill quite like waking up to an oversized box under the Christmas tree, tearing off the wrapping to find this:

I had one just like it!

Or this:

Optimus Prime was both an action figure AND a playset! Didn’t have him, alas…

Or this:

The seven-foot G.I. Joe aircraft carrier! DEFINITELY didn’t have this one…

Oh, the possibilities!  Getting one of those glorious playsets was like being handed the keys to a magical kingdom of one’s very own.  After having been inspired by the adventures of G.I. Joe and the Transformers and the Ghostbusters at the movies, on their cartoon series, and in comics, now you had your very own “backlot” to stage your personal daydreams.  It was grand.

I am in no way indulging 1980s nostalgia here—surely you know me better than that by now.  Rather, I mean only to elicit the particular thrum of excitement the era’s playsets aroused, the imagination they unleashed.  It’s fair to say I became addicted to that sensation in my youth; even at midlife, I still need my fix.  Nowadays, though, I get it not through curated collections of overpriced memorabilia—retro-reproductions of the action figures of yore—but rather the surcharge-free creation of my own fiction.

CREATIVITY—ONE… TWO… THREE!

Getting a new playset as a kid and starting a new writing project as an adult arguably share the same three developmental phases.  The first is what I call Think about What You Might like for Christmas.  This is the stage when you experiment noncommittally with ideas, get a sense of what excites you, what takes hold of your imagination—maybe talk it over with friends—and then envision what it will look like.  Selling yourself on a new story idea, deciding it’s worth the intensive time and energy required to bring it to fruition, is much the same as furnishing your parents with a carefully considered wish list:  You’re cashing in your biannual Golden Ticket on this.  It’s a period of escalating anticipation, and of promise.  The “thing” isn’t real yet—it’s still a nebulous notion, not a tangible commodity—but it will be…

Stage two is Some Assembly Required:  This is the recognition that your personal paracosm doesn’t come ready-to-play out of the box.  You’ll need to snap the pieces in place, apply the decals; you need to give the forum structure first.  To use another analogy:  You don’t start decorating a Christmas tree that’s been arranged askew in its stand.  (More on Some Assembly Required in a minute.)

Stage three:  It’s Playtime!  You’ve done the hard, preparatory work of building your imaginary realm, and now you get to experience the pure joy of writing—to have fun, in other words, with your new toys.

Continue reading

The Nostalgist’s Guide to the Multiverse—and How We All Might Find Our Way Back Home

Gee, for someone who’s spent the past few years lecturing others on the hazards of living on Memory Lane—by way of curated collections of memorabilia, or the unconscionable expropriation of superheroes from children, or whatever your nostalgic opiate—I quite recently became starkly aware of my own crippling sentimental yearning for obsolete pleasures.  But I’ve also identified the precise agent of disorientation that’s led many of us down this dead-end path… and, with it, a way out.  First, some backstory.

I’ve had occasion this autumn to enjoy ample time back on the East Coast, both a season and region I can never get enough of.  I spent a weekend in Rehoboth Beach, Delaware, with a group of high-school friends, many of whom I hadn’t seen in a quarter century.  I visited my beautiful sister in Washington, D.C., where we took in a Nats game so I could get a firsthand look at the team my Dodgers were set to trounce in the playoffs.  I attended my closest cousin’s wedding (Bo to my Luke), and served as best man at my oldest friend’s—both in New Jersey.  I marched in Greta Thunberg’s #ClimateStrike rally at Battery Park, and took meetings with representatives from the Bronx and Manhattan borough presidents’ offices about bringing both districts into the County Climate Coalition.

(I also got chased out of Penn Station by a mutant rat, so it was about as complete a New York adventure as I could’ve hoped for.)

Wonderful and often productive as those experiences were, though—the subway run-in with Splinter from Teenage Mutant Ninja Turtles notwithstanding—my favorite moments were the ones where nothing so noteworthy occurred.  The pints at my favorite pubs.  The old faces I stopped to chat with “on the Avenue,” as we say back home.  The solitary strolls through the park amidst the holy silence of snowfall.

Brust Park in the Bronx, New York, on December 2, 2019 (photo credit: Sean P. Carlin)

More than any of that, though—the ballgames, the gatherings formal and informal, the walks down the street or into the woods—I did what I always do, regardless of site or circumstance:  entertained quixotic fantasies about moving back.

This has become, over the past half-decade, a personal pathological affliction, as my long-suffering friends and family can lamentably attest.  I mean, I left New York for Los Angeles eighteen years ago.  Eighteen years!  That’s years—not months.  Christ, Carlin, at what point does the former cease to feel like home in favor of the latter?

I can’t say what prompted my recent epiphany, but for the first time in all my exhausting, exhaustive ruminating on the matter, this simple, self-evident truth occurred to me:  I’ve never really left New York.

Continue reading

© 2024 Sean P Carlin
