Writer of things that go bump in the night

Tag: filmmaking

No, Virginia, “Die Hard” Is Not a Christmas Movie

Ah, it’s that magical time of year!  When the Hudson hawk nips at the nose, and the skyline over the New Jersey Palisades bruises by midafternoon.  When chimney smoke from private houses spices the air, and strings of colored lights adorn windows and fire escapes.  And, of course, when the Internet engages in its annual bullshit debate as to whether perennial holiday favorite Die Hard, currently celebrating its thirty-fifth anniversary, is a Christmas movie.  And since “bullshit debates” are my brand…


In fourth grade, I scored what was, by 1980s standards, the holy grail:  a best friend with HBO.  Over the following five years, I slept over at his house every weekend, where we watched R-rated action movies into the night.  Whatever HBO was showing that week, we delighted in it, no matter how idiotic (Action Jackson) or forgettable (Running Scared).  For a pair of preadolescent boys, that Saturday-night cinematic grab bag abounded with illicit wonders.

Much as we enjoyed those movies, though, they were for the most part—this isn’t a criticism—ephemeral crap.  We howled at their profane jokes and thrilled to their improbable set pieces, but seldom if ever revisited any of them (Beverly Hills Cop [1984] and its sequel [1987] being a rare exception), and certainly none inspired us to playact their scenarios as we had with PG-rated adventures Ghostbusters (1984) and Back to the Future (1985).  They entertained us, sure, but didn’t exactly impress themselves upon our imaginations in any lasting or meaningful way…

That is, not until an action thriller with the snarky guy from Moonlighting (1985–1989) and Blind Date (1987) came along.  I still remember seeing Die Hard (1988) for the first time, on a thirteen-inch television with side-mounted mono speaker at my friend’s Bronx apartment.  As a viewing experience, it was about as low-def as they come, but that didn’t diminish the white-knuckled hold the movie had on us; we watched it in astonished silence from beginning to end.  From that point on—and this was the year no less than Tim Burton’s Batman had seized the zeitgeist, and our longstanding favorites Ghostbusters and Back to the Future got their first sequels—Die Hard was almost all we could talk about.

At the time, Manhattan College was in the process of erecting a twelve-story student residence overlooking Van Cortlandt Park, and we would gather with our JHS pals at the construction site on weekends, running around the unfinished edifice with automatic squirt guns, playing out the movie’s gleefully violent plot.  Hell, at one point or another, every multistory building in the neighborhood with a labyrinthine basement and rooftop access became Nakatomi Plaza, the setting of a life-and-death battle staged and waged by a group of schoolboys, our imaginations captive to the elemental premise of Die Hard.

We obsessed over that fucking movie so exhaustively, we passed around this still-in-my-possession copy of the pulp-trash novel it was based on—Roderick Thorp’s Nothing Lasts Forever (1979)—until every one of us had had a chance to read it:

The now-battered copy of “Nothing Lasts Forever” I bought in 1989 at the long-gone Bronx bookstore Paperbacks Plus

The thirteen-year-old boys of the late ’80s were far from the only demographic taken with Die Hard.  The movie proved so hugely popular, it not only spawned an immediate sequel in 1990 (which we were first in line to see at an appallingly seedy theater on Valentine Avenue), but became its own subgenre throughout the rest of that decade.  Hollywood gave us Die Hard on a battleship (Under Siege), Die Hard on a plane (Passenger 57), Die Hard on a train (Under Siege 2:  Dark Territory), Die Hard on a mountain (Cliffhanger), Die Hard on a bus (Speed), Die Hard on a cruise ship (Speed 2:  Cruise Control), Die Hard in a hockey arena (Sudden Death), Die Hard on Rodeo Drive (The Taking of Beverly Hills), Die Hard at prep school (Toy Soldiers)…

Christ, things got so out of control, even Beverly Hills Cop, an established action franchise predating Die Hard, abandoned its own winning formula for the third outing (scripted by Steven E. de Souza, co-screenwriter of the first two Die Hards) in favor of a half-assed “Die Hard in an amusement park” scenario.  This actually happened:

Eddie Murphy returns as Axel Foley—sort of—in “Beverly Hills Cop III” (1994)

None of those films has had the staying power of the original Die Hard.  Mostly that’s owed to Die Hard being a superior specimen of filmmaking.  Director John McTiernan demonstrates uncommonly disciplined visual panache:  He expertly keeps the viewer spatially oriented in the movie’s confined setting, employing swish pans and sharp tilts to establish the positions of characters within a given scene, as well as imbue the cat-and-mouse of it all with breathless tension.

McTiernan consistently sends his hero scuttling to different locations within the building—stairwells, pump rooms, elevator shafts, air ducts, the rooftop helipad—evoking a rat-in-a-cage energy that leaves the viewer feeling trapped though never claustrophobic.  The narrative antithesis of the globetrotting exploits of Indiana Jones and James Bond, Die Hard is a locked-room thriller made with an ’80s action-movie sensibility.  It was and remains a masterclass in suspense storytelling—often imitated, as the old saying goes, never duplicated.

Perhaps another key reason for the movie’s durability, its sustained cultural relevance, is its (conditional) status as a celebrated Christmas classic.  Like It’s a Wonderful Life (1946) and National Lampoon’s Christmas Vacation (1989) and Love Actually (2003), Die Hard is a feel-good film—albeit with a considerably higher body count—one is almost compelled to watch each December.  Yet whereas nobody questions any of the aforementioned movies’ culturally enshrined place in the holiday-movie canon—nor that of cartoonishly violent Home Alone (1990)—Die Hard’s eligibility seems perennially under review.

Why does the debate around Die Hard die hard… and is it, in fact, a Christmas movie?

Continue reading

“Superman IV” at 35:  How the “Worst Comic-Book Movie Ever” Epitomizes What We Refuse to Admit about Superhero Fiction

Superman IV:  The Quest for Peace, unanimously reviled for both its unconvincing visuals and cornball story, inadvertently accomplished the theretofore unrealized dream of scores of nefarious supervillains when it was released on this date in 1987:  It killed Superman.  (Or at least put the cinematic franchise into a two-decade dormancy.)

But a closer examination of the film suggests its objectively subpar storytelling might in fact be far more faithful to the spirit of the source material than today’s fanboy culture would care to concede.


Thirty-five years ago today, my mother took me to see Superman IV:  The Quest for Peace (1987).  Afterwards, we met up with my father at Doubleday’s, a neighborhood bar and grill that was the last stop on Broadway before you’d officially crossed the city line into Westchester County.  The restaurant had a hot-oil popcorn machine in the far corner, and when I went to refill our basket, I spied a man seated at the bar, nose in a copy of USA Today, the back panel of which boasted a full-page color advertisement for Superman IV.

When he caught me studying the ad, he asked, “Gonna go see the new Superman?”

“I just did.”

“Yeah?  How was it?”

“It was amazing,” I said, and I absolutely meant it.  Sensing my sincerity, the gentleman pulled the ad from the bundle of folded pages and handed it to me as a souvenir.  When I got home, I taped it up on my bedroom wall.

The theatrical one-sheet for “Superman IV” looks like a textbook “Action Comics” cover from the ’80s

Sidney J. Furie’s Superman IV:  The Quest for Peace is not amazing.  It is, in fact, commonly regarded as one of the worst comic-book movies ever made—if not the worst—in eternal competition for last place with Batman & Robin (1997) and Catwoman (2004).  It suffered from a notoriously troubled production:  After the diminishing returns of Superman III (1983) and spin-off Supergirl (1984), series producers Alexander and Ilya Salkind sold their controlling interests in the IP to the Cannon Group, the schlockmeister studio responsible for the American Ninja, Missing in Action, Breakin’, and Death Wish franchises—not exactly the optimal custodians of a series that had started out, against all expectation, so magnificently.

Richard Donner’s Superman:  The Movie (1978) was and remains the finest specimen of superhero cinema ever presented, at once ambitiously epic and emotionally relatable.  It pulls off the impossible in so many ways, first and foremost that it absolutely made us believe a man could fly, which had never been credibly accomplished before.  Credit for that goes not only to the VFX team, which won the Academy Award for its efforts, but to Christopher Reeve, who delivered the movie’s most timeless special effect:  endowing a spandex-clad demigod with profound dignity and genuine vulnerability.  Even the lesser Superman films—and we’ll talk more about those soon enough—are elevated by Reeve’s extraordinary performance, which occupies a lofty position, right alongside Bela Lugosi’s Dracula, in the pantheon of defining interpretations of folkloric icons.

What’s also so remarkable about Superman is how many different tonal aesthetics it assimilates.  The opening sequences on Krypton with Marlon Brando feel downright Kubrickian; Donner somehow channels the cosmic splendor of 2001:  A Space Odyssey (1968), only to then transition us to Smallville, as warm and fertile as Krypton was cold and barren, which evokes the same spirit of sock-hop Americana George Lucas conjured to such success in American Graffiti (1973).

The remainder of the movie shifts fluidly from His Girl Friday–style newsroom comedy (the scenes at the Daily Planet) to urban action thriller à la The French Connection (the seedy streets of 1970s Metropolis) to Roger Moore–era 007 outing (Lex Luthor’s sub–Grand Central lair, complete with comically inept henchmen) to Irwin Allen disaster film (the missile that opens up the San Andreas Fault in the third act and sets off a chain reaction of devastation along the West Coast).

Somehow it coheres into a movie that feels like the best of all worlds rather than a derivative Frankenstein’s monster.  Up until that time, superhero features and television, hampered by juvenile subject matter and typically subpar production values, seemed inherently, inexorably campy.  The notion that a superhero movie could rise to the level of myth, or at least credibly dramatic science fiction, was unthinkable.  Superman is the proof-of-concept paradigm on which our contemporary superhero–industrial complex is predicated.

Continue reading

“Young Indiana Jones” Turns 30:  Storytelling Lessons from George Lucas’ Other Prequel Series

A television series based on an immensely popular action-movie franchise shouldn’t have been a creative or commercial risk—quite the opposite.  But with The Young Indiana Jones Chronicles, which premiered on March 4, 1992, filmmaker George Lucas had no intention of producing a small-screen version of his big-screen blockbusters.  Here’s how Lucas provided a richly imaginative model for what a prequel can and should be—and why it would never be done that way again.


Though he more or less invented the contemporary blockbuster, George Lucas had intended—even yearned—to be an avant-garde filmmaker:

Lucas and his contemporaries came of age in the 1960s vowing to explode the complacency of the old Hollywood by abandoning traditional formulas for a new kind of filmmaking based on handheld cinematography and radically expressive use of graphics, animation, and sound.  But Lucas veered into commercial moviemaking, turning himself into the most financially successful director in history by marketing the ultimate popcorn fodder.

Steve Silberman, “Life After Darth,” Wired, May 1, 2005

After dropping the curtain on his two career- and era-defining action trilogies (Star Wars concluded in 1983, then Indiana Jones in ’89), then failing to launch a new franchise with Willow (his 1988 sword-and-sorcery fantasy fizzled at the box office, though even that would-be IP is getting a “legacy” successor later this year courtesy of the nostalgia–industrial complex), Lucas did in fact indulge his more experimental creative proclivities—through the unlikeliest of projects:  a pair of prequels to both Indiana Jones and Star Wars.  And while both arguably got made on the strength of the brands alone, the prequels themselves would, for better and worse, defy the sacrosanct conventions of blockbuster cinema—as well as the codified narrative patterns of Joseph Campbell’s “heroic journey”—that audiences had come to expect from Lucas.

A perfunctory scene in Return of the Jedi, in which Obi-Wan finally explains Darth Vader’s mysterious backstory to Luke (a piece of business that could’ve been easily handled in the first film, thereby sparing the hero considerable needless risk and disillusionment in The Empire Strikes Back, but whatever), served as the narrative foundation for Lucas’ Star Wars prequel trilogy (1999–2005), in which a precocious tyke (The Phantom Menace) matures into a sullen teenager (Attack of the Clones) before warping into a murderous tyrant (Revenge of the Sith).  Underpinning Anakin’s emo-fueled transformation to the dark side is a byzantine plotline about Palpatine’s Machiavellian takeover of the Republic.  Meanwhile, references to the original trilogy, from crucial plot points to fleeting sight gags, abound.

You’ve all seen the movies, so I’ll say no more other than to suggest the story arc—which is exactly what Obi-Wan summarized in Return of the Jedi, only (much) longer, appreciably harder to follow, and a tonally incongruous mix of gee-whiz dorkiness and somber political intrigue—is precisely the kind of creative approach to franchise filmmaking that would’ve been summarily nixed in any Hollywood pitch meeting, had Lucas been beholden to the corporate precepts of the studio system from which the colossal success of the original Star Wars afforded him his independence.

George Lucas on the set of the “Star Wars” prequels

Which is not to say Lucas’ artistic instincts were infallible.  Financially successful though the prequels were, audiences never really embraced his vision of an even longer time ago in a galaxy far, far away:  Gungans and midi-chlorians and trade disputes didn’t exactly inspire the wide-eyed amazement that Wookiees and lightsabers and the Death Star had.

Maybe by that point Star Wars was the wrong franchise with which to experiment creatively?  Perhaps it had become too culturally important, and audience expectations for new entries in the long-dormant saga were just too high?  In the intervening years, Star Wars had ceased to be the proprietary daydreams of its idiosyncratic creator; culturally if not legally, Star Wars kinda belonged to all of us on some level.  By explicitly starting the saga with Episode IV in 1977, he’d invited each of us to fill in the blanks; the backstory was arguably better off imagined than reified.

As an IP, however, Indiana Jones, popular as it was, carried far less expectation, as did the second-class medium of network television, which made Lucas’ intended brand extension more of an ancillary product in the franchise than a must-see cinematic event—more supplemental than it was compulsory, like a tie-in novel, or the Ewok telefilms of the mid-eighties.  The stakes of the project he envisioned were simply much lower, the spotlight on it comfortably dimmer.  In the event of its creative and/or commercial failure, Young Indiana Jones would be a franchise footnote in the inconsequential vein of the Star Wars Holiday Special, not an ill-conceived vanity project responsible for retroactively ruining the childhoods of millions of developmentally arrested Gen Xers.  Here Lucas expounds on the genesis of the series:

Continue reading

A Hollywood Ending: Hopeful Reflections on a Failed Screenwriting Career

I’ve alluded to the irretrievable implosion of my screenwriting career in many a previous blog post.  I never felt ready to write about it at length before now.  So, since we were just recently discussing the artful revelation of backstory, here’s mine.


Given the long odds of a career in Hollywood, even under the most favorable of circumstances, the unexpressed question that looms ominously over every aspirant is:  How do I know when it’s time to call this quits?

My wife and I were having drinks at the S&P Oyster Co. in Mystic, Connecticut, when I knew I was done with Hollywood forever—that my ship wasn’t coming.  That was September 24, 2014, during a visit to the East Coast for her aunt and uncle’s golden-anniversary party, exactly thirteen years to the day after we’d relocated from our hometown of New York City to L.A.

Right out of college, I’d landed representation as a screenwriter—though that management company folded a few months prior to my move, catalyzing, at least in part, my decision to try my luck in Tinseltown—and I had a reel full of TV spots and short films I’d cut while working as an audiovisual editor in SoHo, so I felt certain I’d land on my feet in Hollywood, this despite having no contacts there.

So, in the predawn hours of Tuesday, September 11, 2001, I left the Bronx, the only home I’d ever known, and met my wife, though we weren’t married at the time, at JFK Airport to embark on our new adventure together.  Perhaps the cosmic timing of our departure (which was delayed by two weeks) should’ve been taken as a sign that the road ahead would be bumpier than I’d naïvely anticipated?

It took a full year in L.A. before I could even get a call returned, but finally I got some opportunities to edit a few independent shorts and features, and began networking my way into the industry.  But it would be another seven years yet before I procured representation as a screenwriter again, during which time I can’t tell you how many contemporaries I watched pack up their shit and abandon their dreams to move back home.  They’d decided it wasn’t worth it, that life was too short.  I’m certain I’d have been one of them were it not for my wife, who remained steadfastly supportive, and for a few friends—notably my buddy Mike—who were also Hollywood hopefuls determined to keep at it, too, through bad times and, well, less bad.  We were going to be the ones who hung in there and made it.

By 2009, things were looking up—considerably.  At long last I’d found representation once again with a management company, this time off a spec I’d written called Leapman, and all manner of opportunities soon followed:  to turn Leapman into a comic-book series; to sign with a big-letter talent agency; to vie for open screenwriting assignments; to develop an undersea sci-fi thriller (in the vein of The Abyss and Sphere) with a red-hot producer.

From “The Abyss” (1989), a movie about deep-sea extraterrestrials akin to the one I was developing

Around this same time, I got friendly with another up-and-coming screenwriter—we were repped by the same management—and he and I formed a critique group, enthusiastically enlisting half a dozen fledgling screenwriters we barely knew.  In short order, we all became close friends, meeting every other Tuesday night at one watering hole or another around Hollywood to trade script notes and war stories.  All unknowns at the time, some of those scribes have since gone on to write for shows including The Handmaid’s Tale and Women of the Movement, as well as WandaVision and Ted Lasso.

I was also, during this period, developing a short film with Mike.  He and I had met in 2003 on the postproduction crew of an indie film; we were on location in the redwoods of Marin County, right down the road from Skywalker Ranch, cutting dailies in a ramshackle cabin that looked for all the world like Ewok Village Hall.  Under those circumstances, it didn’t take long to become fast friends:  We were the same age, came up on the same cinematic influences, and—most notably—shared the same irreverent sense of humor, turning our verbal knives on all of Hollywood’s sacred cows, delighting in making one another howl with one progressively outrageous remark after the next.

Also like me, Mike was married to his teenage sweetheart, sans children, so we were both in the same place:  free to pursue our Hollywood dreams with the support of the women we loved.  It was and remains the closest male friendship I’ve ever made in my adult life.  As Mike continued to come into ever-more-promising editorial opportunities on studio features, my screenwriting career was kicking into high gear.  With aspirations to direct, he asked me if I wouldn’t mind taking one of my concepts—a horror/comedy I’d pitched him that reflected our mutual sensibilities—and scripting a short film for him to shoot.  So, there I was, developing a big-budget monster movie for a legit prodco by day, and a no-budget monster movie with my best friend by night.  After over a decade in Hollywood, everything had clicked into place.

And then came 2014.  Frustrated with the inexcusable lack of progress on the short—I’d written a script all of us were expressly happy with, and yet years had gone by and we were no closer to rolling camera—I put pressure on the project’s producer, Mike’s spouse, to do her part.  Consequently, for the first time in our decade-long association, our friendship grew strained, and once we both crossed the line and turned our caustic criticisms, the source of so many years of bonding and hilarity, on each other, our relationship eventually became irreversibly poisoned.  I’d lost my closest friend and ally in Hollywood, and that was only the beginning of my troubles.

Continue reading

“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when you’re only as many years old.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heart-wrenchingly credible performance as a young woman scared shitless—and this despite having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once so scary, clever, entertaining, influential, and of its moment the way Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiopathic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (1970) not only devoted pages upon pages in their screenplays to amusingly philosophical conversations about contemporary pop culture, but the characters across Tarantino and Smith’s various movies existed in their own respective shared universes, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic crosspollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted enrollment at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescence slackers from Clerks and nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.

Continue reading

There He Was… and in He Walked: Lessons on Mythic Storytelling from the Mariachi Trilogy

In belated observation of Día de los Muertos, here’s an appreciation for the idiosyncratic storytelling of Robert Rodriguez’s Mariachi trilogy, a neo-Western action series that emerged from the indie-cinema scene of the 1990s and can only be deemed, by current Hollywood standards, an anti-franchise.  The movies and the manner in which they were made have a lot to teach us about what it means to be creative—and how to best practice creativity.


Before the shared cinematic universe became the holy grail of Hollywood, the coup d’éclat for any aspiring franchise—and we can probably credit Star Wars for this—was the trilogy.

In contrast with serialized IPs (James Bond and Jason Voorhees, for instance), the trilogy came to be viewed, rightly or wrongly, as something “complete”—a story arc with a tidy three-act design—and, accordingly, many filmmakers have leaned into this assumption, exaggerating a given series’ creative development post factum with their All part of the grand plan! assurances.

This peculiar compulsion we’ve cultivated in recent decades—storytellers and audiences alike—to reverse-engineer a “unified whole” from a series of related narratives, each of which developed independently and organically, is antithetical to how creativity works, and even to what storytelling is about.

Nowhere is the fluidity of the creative process on greater, more glorious display than in the experimental trilogy—that is, when a low-budget indie attains such commercial success that it begets a studio-financed remake that simultaneously functions as a de facto sequel, only to then be followed by a creatively emboldened third film that completely breaks from the established formula in favor of presenting an ambitiously gonzo epic.  Trilogies in this mode—and, alas, it’s a pretty exclusive club—include Sam Raimi’s Evil Dead, George Miller’s Mad Max, and Robert Rodriguez’s El Mariachi.

Robert Rodriguez at the world premiere of “Alita: Battle Angel” on January 31, 2019 in London (Eamonn M. McCormack/Getty)

A film student at the University of Texas at Austin in the early nineties, Rodriguez self-financed El Mariachi with a few thousand dollars he’d earned as a medical lab rat; the project wasn’t meant to be much more than a modest trial run at directing a feature film that he’d hoped to perhaps sell to the then-burgeoning Spanish-language home-video market.  He reasoned that practical experience would be the best teacher, and if he could sell El Mariachi, it would give him the confidence and funds to produce yet more projects—increasingly ambitious and polished efforts—that would allow him to make a living doing what he loved.  He had no aspirations of power lunches at The Ivy or red-carpet premieres at Mann’s Chinese Theatre, only of pursuing the art of cinematic storytelling—not necessarily Hollywood filmmaking, a different beast—to the fullest extent possible.

If you want to be a filmmaker and you can’t afford film school, know that you don’t really learn anything in film school anyway.  They can never teach you how to tell a story.  You don’t want to learn that from them anyway, or all you’ll do is tell stories like everyone else.  You learn to tell stories by telling stories.  And you want to discover your own way of doing things.

In school they also don’t teach you how to make a movie when you have no money and no crew.  They teach you how to make a big movie with a big crew so that when you graduate you can go to Hollywood and get a job pulling cables on someone else’s movie.

Robert Rodriguez, Rebel without a Crew, or, How a 23-Year-Old Filmmaker with $7,000 Became a Hollywood Player (New York:  Plume, 1996), xiii–xiv

They don’t teach a lot of things about Hollywood in film school, like how so many of the industry’s power brokers—from producers and studio execs to agents and managers—are altogether unqualified for their jobs.  These folks think they understand cinematic storytelling because they’ve watched movies their entire lives, but they’ve never seriously tried their hand at screenwriting or filmmaking.  Accordingly, the town’s power structure is designed to keep its screenwriters and filmmakers subordinate, to make sure the storytellers understand they take their creative marching orders from people who are themselves utterly mystified by the craft (not that they’d ever admit to that).

It’s the only field I know of in which the qualified authorities are entirely subservient to desk-jockey dilettanti, but I suppose that’s what happens when a subjective art form underpins a multibillion-dollar industry.  Regardless, that upside-down hierarchy comes from a place of deep insecurity on both ends of the totem pole, and is in no way conducive to creativity, hence the premium on tried-and-true brands over original stories, on blockbusters over groundbreakers.  As I discovered the hard way—more on that in a minute—Hollywood is arguably the last place any ambitiously imaginative storyteller ought to aspire to be.  Rodriguez seemed to understand that long before he ever set foot in L.A.:

Continue reading

The Lost Boys of the Bronx: A Tribute to Joel Schumacher

Batman Forever and The Lost Boys director Joel Schumacher died on Monday, June 22, at the age of eighty after a yearlong battle with cancer.  In an industry where branding is sacrosanct, his brand, as it were, was his steadfast refusal to be artistically pigeonholed:  Hit-and-miss though his track record may be, he was a rare breed of filmmaker who worked in virtually every genre, from comedy (D.C. Cab; Bad Company) to drama (Cousins; Dying Young) to sci-fi/horror (Flatliners; Blood Creek) to crime thriller (Falling Down; 8mm) to legal thriller (The Client; A Time to Kill) to musical (The Phantom of the Opera).  His filmography is as winding and unconventional as was his path to commercial success:

Schumacher was born in New York City in 1939 and studied design at Parsons and the Fashion Institute of Technology. . . .

When Schumacher eventually left fashion for Hollywood, he put his original trade to good use, designing costumes for various films throughout the Seventies. . . .  He also started writing screenplays during this time, including the hit 1976 comedy Car Wash and the 1978 adaptation of the musical The Wiz.

In 1981, Schumacher made his directorial debut with The Incredible Shrinking Woman, a sci-fi comedy twist on Richard Matheson’s 1959 novel, The Shrinking Man, starring Lily Tomlin.  Fitting the pattern that would define his career, the film was a financial success but a flop with critics. . . .

Schumacher’s true breakout came a few years later in 1985, when he wrote and directed St. Elmo’s Fire, the classic post-grad flick with the Brat Pack cast, including Rob Lowe, Demi Moore and Judd Nelson.  Two years later, he wrote and directed The Lost Boys, a film about a group of teen vampires that marked the first film to star both Corey Feldman and Corey Haim, effectively launching the heartthrob duo known as “the Coreys.”

Jon Blistein, “Joel Schumacher, Director of ‘Batman & Robin,’ ‘St. Elmo’s Fire,’ Dead at 80,” Rolling Stone, June 22, 2020

Though Schumacher did not write The Lost Boys (1987), as the Rolling Stone piece erroneously asserts (the screenplay is credited to Janice Fischer & James Jeremias and Jeffrey Boam), neither his creative imprint on the project nor the cultural impact of the movie itself can in any way be overstated.  Sure, teenage vampires may be a dime-a-dozen cottage industry now, from Buffy the Vampire Slayer to Twilight to The Vampire Diaries, but if you happened to grow up on any of those Millennial staples, it’s worth knowing that pubescent bloodsuckers had never really been done prior to The Lost Boys—no, that celebrated iteration of the vampire’s pop-cultural evolution is entirely owed to the pioneering vision of Joel Schumacher.

Late filmmaker Joel Schumacher; photo by Gabriella Meros/Shutterstock, 2003 (498867t)

When Richard Donner left the project to direct Lethal Weapon instead, the script Schumacher inherited was essentially “The Goonies… with vampires.”  By aging up the characters from preteens to hormonal adolescents, Schumacher saw a creative opportunity to do something scarier—and sexier.  A cult classic was thusly born, and though The Lost Boys itself never became a franchise (save a pair of direct-to-video sequels two decades later, and the less said about them, the better), its fingerprints are all over the subgenre it begat.  We owe Schumacher a cultural debt for that.

Kiefer Sutherland’s David (second from left) leads a gang of teenage vampires in “The Lost Boys”

And I owe him a personal debt.  Over any other formative influence, The Lost Boys is directly and demonstrably responsible for my decision to study filmmaking in college and then to pursue a screenwriting career in Hollywood.  More than simply my professional trajectory, in point of fact, my very creative sensibilities were indelibly forged by that film:  The untold scripts and novels I’ve written over the past quarter century have almost exclusively been tales of the supernatural with a strong sense of both humor and setting—the very qualities The Lost Boys embodies so masterfully and memorably.  All of that can be traced to the summer of 1994.

Continue reading

Tim Burton’s “Batman” at 30—and the Cultural Legacy of the Summer of 1989

In order to appreciate the state of commercial adolescence to which Generation X has been disproportionately consigned, one needs to consider Tim Burton’s Batman in its sociocultural context:  how it inadvertently provided a blueprint to reconceptualize superheroes from innocent entertainment meant to inspire the imagination of children to hyperviolent wish-fulfillment fantasies for commercially infantilized adults.


The weekly theatrical debut of a new franchise tentpole, voraciously bulling aside the $200 million–budgeted blockbuster released a mere seven days prior, is par for the course nowadays, but back in 1989—thirty summers ago per the calendar, though seemingly as recently as yesterday by the nebulous barometer of memory—we’d never before experienced anything like that.

That was the year that gave us new entries in such ongoing adventures as Indiana Jones, Star Trek, Ghostbusters, The Karate Kid, Lethal Weapon, James Bond, and Back to the Future, lowbrow comedies Police Academy, Fletch, and Vacation, as well as slasher staples Friday the 13th, A Nightmare on Elm Street, and Halloween—to say nothing of launching all-new franchises with Bill & Ted’s Excellent Adventure, Major League, Pet Sematary, Honey, I Shrunk the Kids, Weekend at Bernie’s, and Look Who’s Talking.  To anyone who’d grown up in the nascent home-video era—that period in which all the aforementioned series (save 007) were born and could thusly be re-watched and obsessed-over ad infinitum—1989 was the Christmas of summer-movie seasons.

Michael Keaton in Tim Burton’s “Batman” (1989)

But none of those films, huge as many of them were, dominated the cultural spotlight that year as pervasively as Tim Burton’s Batman, released on this date in 1989.

Out of the Shadows

I can hear my thirteen-year-old nephew now:  “One superhero movie?  Wow—how’d you handle the excitement?”

Yeah, I know.  But it was exciting.  I was thirteen myself in 1989, spending most of my free time with my grade-school gang at the neighborhood comic shop down on Broadway, steeped in a subculture that hadn’t yet attained popular acceptance.  Richard Donner’s Superman (1978) had been the only previous attempt at a reverent comic-book adaptation, and, creatively and financially successful though it was, most of that goodwill had been squandered in the intervening decade by a succession of increasingly subpar sequels (through no fault of the marvelous Christopher Reeve, who makes even the worst of them watchable).

Christopher Reeve and Margot Kidder in “Superman: The Movie”

As for Batman:  It’s crucial to remember, and easy enough now to overlook, that in the late eighties, the prevailing public perception of the character was not Frank Miller’s Dark Knight, but rather Adam West’s “Bright Knight” from the self-consciously campy acid-trip of a TV series that had aired twenty years earlier.  In the wake of that show’s cancelation, a concerted effort was made by the character’s creative custodians at DC Comics—first Dennis O’Neil and Neal Adams, then Steve Englehart and Marshall Rogers, and most effectively Miller with his aptly titled The Dark Knight Returns—to reestablish Batman as the “nocturnal avenger” he was originally conceived to be.

“Dark Knight Triumphant” (July 1986); art by Frank Miller and Lynn Varley

But if you weren’t following the comics—and, in those days, few over thirteen years old were—the predominant impression the name “Batman” conjured wasn’t the ferocious Miller rendering above so much as this:

Continue reading

“Almost” Doesn’t Count: On Trying and Losing (Repeat as Needed)

In the event you don’t keep track of these things, the Los Angeles Dodgers lost the World Series last month, four games to one, to the Boston Red Sox.  It was the Dodgers’ second consecutive World Series appearance, and their second consecutive defeat.  From the point of view of many a long-suffering fan here in L.A., collapsing yet again mere inches from the finish line amounts to nothing more than another season-long strikeout, a yearlong exercise in futility, a squandered investment of time and emotional support.  “This is where baseball breaks your heart,” someone said to me in the waning days of the season.  To be sure, I share the sentiment:  It’s hard as hell to get so frustratingly close to the Golden Ring only to go home empty-handed.  A miss is as good as a mile, after all.  Close only matters in horseshoes and hand grenades.  “Almost” doesn’t count.

As recently as a few years ago, I wouldn’t have known, much less cared, who won or lost this Series—or even who played in it.  I came to baseball relatively late in life—around forty—as I recounted in “Spring Fever,” the gist of which was this:  For whatever reason, neither I nor any of my boyhood pals were born with the “sports gene.”  We were all pop-culture fanatics, more likely to be found at the local comic shop than the Little League field.  When we saw the Bronx Bombers play the Indians at Yankee Stadium in 1986, none of us knew what the hell to make of that abstract experience; when we watched them face off again in David S. Ward’s Major League in 1989, in the context of a Cinderella narrative, suddenly the rivalry had meaning.  We loved movies and comics; sports we simply had no use for.

A few years later, I found myself formally studying comics (under legendary Batman artist and DC Comics editor Carmine Infantino) and cinema (in college) in preparation for making a career in those fields.  What they don’t tell you in school, though, is that when you turn your passions into your profession, you often do so at the expense of the joy you once took in those pastimes.  Worse still, so many of the things that directly inspired me to be a screenwriter, from Star Wars to superheroes, I eventually grew to disdain.  And what Dodgers baseball restored in me, outside my conscious awareness as it was happening, was the innocent pleasure of being a fan of something again; it’s been a welcome, even analeptic, reprieve from the tyranny of passion.

Game 2 of the 2018 World Series

The Dodgers’ reentry into the World Series this fall, and the collective hope it kindled of their first world-championship win in precisely three decades, coincided with a sobering anniversary of my own:  It’s been exactly twenty years—October of 1998—since I signed with my first literary manager off a screenplay I’d written called BONE ORCHARD.  It occurs to me only now, as I type this, that the project was something of a creative precursor to Escape from Rikers Island, trafficking in many of the same themes and concepts:  an urban island left to rot and ruin, overrun with supernatural savages (demons, not zombies), with a neo–hardboiled detective at the center of the action.  (I’d studied Raymond Chandler in college and have since been heavily influenced by his fiction.)

Anyway, there I was, twenty-two years old and only a few months out of school, and everything was unfolding right on schedule.  The script would be taken to the spec marketplace and I would soon join the ranks of working screenwriters.  You study for a career in the arts, and you get one—simple as that.

Christ, if only.  BONE ORCHARD didn’t sell.  And while I was halfway through writing my follow-up, the management company repping me shuttered.  Young and naïve though I was, I nonetheless intuited I wasn’t likely to move the needle on my screenwriting career in New York—an ambition I was resolute about fulfilling—so I left the comforts of home behind for Los Angeles.

When you first arrive in Hollywood, good luck getting anyone with even a modicum of clout to give you the time of day.  Not gonna happen.  What you do—and what I did—is seek out aspiring filmmakers at the same level and pool resources.  In addition to screenwriting, I’d had experience as a film and video editor, so I started cutting USC thesis shorts pro bono.  Within a year or two, I’d established a circle of friends and colleagues, all in our twenties, who were collaborating on “portfolio projects.”  I was editing by day and writing by night, hoping to network my way to new representation—an objective that would, to my slowly percolating astonishment, take another half-dozen years to realize.

Continue reading

Signals in the Noise: Finding Meaning through Storytelling

It’s a strange thing, really, as anyone who knew me way back when can attest, that I now find myself in the predominantly solitary profession known as novelist.

Now, I don’t think any of them would find it the least bit surprising that I’m a creative; it’s only that I preferred to exercise my creativity as an agent of fellowship:  I was the kid who organized weekend games of “Christopher Columbus,” a large-scale, rough-and-tumble variant of hide-and-seek played on the streets of New York (its origins, so far as I know, derive from an obscure teen comedy from the eighties that I haven’t watched since, on the hunch that it’s likely better off remembered than revisited); I hosted annual “murder parties” along with my best friend, Chip, inspired by our love for Clue:  The Movie; and in senior year of high school, we enlisted half the neighborhood in a quixotic production of Lost Boys II, a handmade, feature-length sequel to one of our favorite horror films, itself a kind of ode to teamwork, that we shot on a state-of-the-art VHS-C camcorder.  To this day, I think we did a reasonably credible job of passing off the Bronx as Santa Cruz:  The Palisades along the Hudson River doubled for the coastal cliffs of the Pacific, and a cavernous subbasement I’d discovered beneath a 1970s luxury high-rise served as the vampires’ cave—not a bad bit of on-the-cheap production value, if I do say so!  (The acting and cinematography, on the other hand, from the limited footage that still actually exists, seem somewhat… unpolished.)

In retrospect, the Lost Boys project probably represents an inexorable turning point in my life:  Not only had I finally found a creative outlet that felt like a natural fit (after guitar lessons didn’t pan out and my enthusiasm for comic-book illustration somewhat outweighed my talent for it), but filmmaking would allow me and my friends to do something truly special—make movies!—and, more importantly, to do it together.  Of all the arts, this one embodied the spirit of fellowship I so cherished like none other.  It became one of the great loves of my life, and an obsessive—even tumultuous—twenty-year affair with it ensued.

Continue reading

© 2024 Sean P Carlin
