
Book Review:  “Heat 2” by Michael Mann + Meg Gardiner

This article discusses plot details and scene specifics from Michael Mann’s film Heat (1995) and his novel Heat 2 (2022).


John Carpenter’s dystopian classic Escape from New York (1981), set in 1997, opens with an expository intertitle:  “1988—The Crime Rate in the United States Rises Four Hundred Percent.”  Though that grim prognostication amounted to an exaggeration, the issue itself had nonetheless become a big deal here in the real world by the early 1990s:

In 1993, the year President Clinton took office, violent crime struck nearly 11 million Americans, and an additional 32 million suffered thefts or burglaries.  These staggering numbers put millions more in fear.  They also choked the economic vitality out of entire neighborhoods.

Politically, crime had become one of the most divisive issues in the country.  Republicans called for an ever more punitive “war on drugs,” while many Democrats offered little beyond nebulous calls to eliminate the “root causes” of crime.

David Yassky, “Unlocking the Truth About the Clinton Crime Bill,” Opinion, New York Times, April 9, 2016

Clinton’s response was the measurably effective (if still controversial) Violent Crime Control and Law Enforcement Act of 1994, otherwise known as the 1994 Crime Bill, coauthored by Joe Biden, the provisions of which—and this is just a sampling—added fifty new federal offenses, expanded capital punishment, led to the establishment of state sex-offender registries, and included the Federal Assault Weapons Ban (since expired) and the Violence Against Women Act.

It was an attempt to address a big issue in America at the time:  Crime, particularly violent crime, had been rising for decades, starting in the 1960s but continuing, on and off, through the 1990s (in part due to the crack cocaine epidemic).

Politically, the legislation was also a chance for Democrats—including the recently elected president, Bill Clinton—to wrestle the issue of crime away from Republicans.  Polling suggested Americans were very concerned about high crime back then.  And especially after George H.W. Bush defeated Michael Dukakis in the 1988 presidential election in part by painting Dukakis as “soft on crime,” Democrats were acutely worried that Republicans were beating them on the issue.

German Lopez, “The controversial 1994 crime law that Joe Biden helped write, explained,” Vox, September 29, 2020

Given the sociopolitical conditions of the era, it stands to reason—hell, it seems so obvious in hindsight—the 1990s would be a golden age of neo-noir crime cinema.  The death of Michael Corleone, as it happens, signified a rebirth of the genre itself; Martin Scorsese countered the elegiac lethargy—that’s not a criticism—of Francis Ford Coppola’s The Godfather, Part III with the coke-fueled kineticism of Goodfellas (both 1990).  Henry Hill shared none of Michael’s nagging reluctance about life in the Italian Mafia; he always wanted to be a gangster!

Reasoning that the same was probably true of audiences—as an author of horror stories, I certainly appreciate a healthy curiosity about the dark side—Hollywood offered vicarious trips into the criminal underworlds of Hell’s Kitchen, in Phil Joanou’s State of Grace (1990), and Harlem, in Mario Van Peebles’ New Jack City (1991), both of which feature undercover cops as major characters.  So does Bill Duke’s Deep Cover (1992), about a police officer (Laurence Fishburne) posing as an L.A. drug dealer as part of a broader West Coast sting operation.

The line between cop and criminal, so clearly drawn in the action-comedies of the previous decade (Lethal Weapon, Beverly Hills Cop, Stakeout, Running Scared), was becoming subject to greater ambiguity.  In no movie is that made more starkly apparent than in Abel Ferrara’s Bad Lieutenant (1992), about a corrupt, hedonistic, drug-addicted, gambling-indebted, intentionally nameless New York cop (Harvey Keitel) investigating the rape of a nun in the vain hope it will somehow redeem his pervasive rottenness.

And it wasn’t merely that new stories were being told; this is Hollywood, after all, so we have some remakes in the mix.  Classic crime thrillers were given contemporary makeovers, like Scorsese’s Cape Fear (1991), as well as Barbet Schroeder’s Kiss of Death (1995), which is mostly remembered, to the extent it’s remembered at all, as the beginning and end of David Caruso’s would-be movie career, but which is much better than its reputation, thanks in no small part to a sharp script by Richard Price (Clockers), full of memorably colorful Queens characters and his signature street-smart dialogue.

Creative experimentation was in full swing, too, as neo-noir films incorporated conventions of other genres, including erotic thriller (Paul Verhoeven’s Basic Instinct [1992]), black comedy (the Coen brothers’ Fargo [1996] and The Big Lebowski [1998]), period throwback (Carl Franklin’s Devil in a Blue Dress [1995]; Curtis Hanson’s L.A. Confidential [1997]), neo-Western (James Mangold’s Cop Land [1997]), and, well, total coffee-cup-shattering, head-in-a-box mindfuckery (Bryan Singer’s The Usual Suspects; David Fincher’s Seven [both 1995]).

Christ, at that point, Quentin Tarantino practically became a subgenre unto himself after the one-two punch of Reservoir Dogs (1992) and Pulp Fiction (1994), which in turn inspired an incessant succession of self-consciously “clever” knockoffs like John Herzfeld’s 2 Days in the Valley (1996) and Gary Fleder’s Things to Do in Denver When You’re Dead (1995).  By the mid-’90s, the crime rate, at least at the cinema, sure seemed like it had risen by 400%.

Tim Roth lies bleeding as Harvey Keitel comes to his aid in a scene from the film “Reservoir Dogs,” 1992 (photo by Miramax/Getty Images)

As different as they all are, those films can almost uniformly be viewed as a repudiation of the ethos of ’80s action movies, in which there were objectively good guys, like John McClane, in conflict with objectively bad guys, like Hans Gruber, in a zero-sum battle for justice, for victory.  It was all very simple and reassuring, in keeping with the archconservative, righteous-cowboy worldview of Ronald Reagan.  And while those kinds of movies continued to find a receptive audience—look no further than the Die Hard–industrial complex, which begat Under Siege (1992) and Cliffhanger (1993) and Speed (1994), among scores of others—filmmakers were increasingly opting for multilayered antiheroes over white hats versus black hats.

Which raised the question:  Given how blurred the lines had become between good guys and bad guys in crime cinema, could you ever go back to telling an earnest, old-school cops-and-robbers story—one with an unequivocally virtuous protagonist and nefarious antagonist—that nonetheless aspired to be something more dramatically credible, more psychologically nuanced, more thematically layered than a Steven Seagal star vehicle?

Enter Michael Mann’s Heat.

Continue reading

“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when we were barely that many years old ourselves.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heart-wrenchingly credible performance as a young woman scared shitless—and this despite having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once as scary, clever, entertaining, influential, and of its moment as Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiosyncratic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such a figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, not only did avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (1970) devote pages upon pages in their screenplays to amusingly philosophical conversations about contemporary pop culture, but the characters across their various movies existed in their own respective shared universes, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic crosspollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted enrollment at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescence slackers from Clerks and nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.

Continue reading

The Lost Boys of the Bronx: A Tribute to Joel Schumacher

Batman Forever and The Lost Boys director Joel Schumacher died on Monday, June 22, at the age of eighty after a yearlong battle with cancer.  In an industry where branding is sacrosanct, his brand, as it were, was his steadfast refusal to be artistically pigeonholed:  Hit-and-miss though his track record may have been, he was a rare breed of filmmaker who worked in virtually every genre, from comedy (D.C. Cab; Bad Company) to drama (Cousins; Dying Young) to sci-fi/horror (Flatliners; Blood Creek) to crime thriller (Falling Down; 8mm) to legal thriller (The Client; A Time to Kill) to musical (The Phantom of the Opera).  His filmography is as winding and unconventional as was his path to commercial success:

Schumacher was born in New York City in 1939 and studied design at Parsons and the Fashion Institute of Technology. . . .

When Schumacher eventually left fashion for Hollywood, he put his original trade to good use, designing costumes for various films throughout the Seventies. . . .  He also started writing screenplays during this time, including the hit 1976 comedy Car Wash and the 1978 adaptation of the musical The Wiz.

In 1981, Schumacher made his directorial debut with The Incredible Shrinking Woman, a sci-fi comedy twist on Richard Matheson’s 1959 novel, The Shrinking Man, starring Lily Tomlin.  Fitting the pattern that would define his career, the film was a financial success but a flop with critics. . . .

Schumacher’s true breakout came a few years later in 1985, when he wrote and directed St. Elmo’s Fire, the classic post-grad flick with the Brat Pack cast, including Rob Lowe, Demi Moore and Judd Nelson.  Two years later, he wrote and directed The Lost Boys, a film about a group of teen vampires that marked the first film to star both Corey Feldman and Corey Haim, effectively launching the heartthrob duo known as “the Coreys.”

Jon Blistein, “Joel Schumacher, Director of ‘Batman & Robin,’ ‘St. Elmo’s Fire,’ Dead at 80,” Rolling Stone, June 22, 2020

Though Schumacher did not write The Lost Boys (1987) as the Rolling Stone piece erroneously asserts (the screenplay is credited to Janice Fischer & James Jeremias and Jeffrey Boam), neither his creative imprint on the project nor the cultural impact of the movie itself can in any way be overstated.  Sure, teenage vampires may be a dime-a-dozen cottage industry now, from Buffy the Vampire Slayer to Twilight to The Vampire Diaries, but if you happened to grow up on any of those Millennial staples, it’s worth knowing that pubescent bloodsuckers had never really been done prior to The Lost Boys—no, that celebrated iteration of the vampire’s pop-cultural evolution is entirely owed to the pioneering vision of Joel Schumacher.

Late filmmaker Joel Schumacher; photo by Gabriella Meros/Shutterstock, 2003

When Richard Donner left the project to direct Lethal Weapon instead, the script Schumacher inherited was essentially “The Goonies… with vampires.”  By aging up the characters from preteens to hormonal adolescents, Schumacher saw a creative opportunity to do something scarier—and sexier.  A cult classic was thusly born, and though The Lost Boys itself never became a franchise (save a pair of direct-to-video sequels two decades later, and the less said about them, the better), its fingerprints are all over the subgenre it begat.  We owe Schumacher a cultural debt for that.

Kiefer Sutherland’s David (second from left) leads a gang of teenage vampires in “The Lost Boys”

And I owe him a personal debt.  Over any other formative influence, The Lost Boys is directly and demonstrably responsible for my decision to study filmmaking in college and then to pursue a screenwriting career in Hollywood.  More than simply my professional trajectory, in point of fact, my very creative sensibilities were indelibly forged by that film:  The untold scripts and novels I’ve written over the past quarter century have almost exclusively been tales of the supernatural with a strong sense of both humor and setting—the very qualities The Lost Boys embodies so masterfully and memorably.  All of that can be traced to the summer of 1994.

Continue reading

“It’s Over, Johnny”: The Thrill Is Gone in “Rambo: Last Blood”

The following article discusses story details of Rambo:  Last Blood.

In the lead-up to Creed (2015), the New Yorker published a fascinating analysis of the six Rocky movies, arguing that they can be viewed as a trilogy:  In Rocky (1976) and Rocky II (1979), the Italian Stallion goes from nobody to somebody; in III (1982) and IV (1985), he mutates once again, this time from hero to superhero; Sylvester Stallone then sought to extricate the champ from the excesses of Reagan’s America (the robot butler, anyone?), setting up Rocky’s ignoble return to the streets of Philly in Rocky V (1990), then credibly reestablishing him as an underdog in Rocky Balboa (2006).  It was this iteration of Rocky—the purest version—that Stallone reprised in Creed and Creed II (2018), in which an aging, widowed, streetwise Rocky acts (reluctantly at first) as mentor and trainer to a young protégé.

Sylvester Stallone in “Rambo: Last Blood” (2019)

Sly’s other signature role, troubled Vietnam vet John Rambo, has had no less of a winding road through the past five decades when it comes to his ever-evolving characterization:  The self-hating soldier of David Morrell’s 1972 novel First Blood was recast as a sympathetic hero in the 1982 movie of the same name, who in turn became the jingoistic superhero of Rambo:  First Blood, Part II (1985) and Rambo III (1988).  It was only in his belated fourth cinematic adventure, Rambo (2008), that his prototypal literary temperament atavistically asserted itself:

You know what you are, what you’re made of.  War is in your blood.  Don’t fight it.  You didn’t kill for your country—you killed for yourself.  God’s never gonna make that go away.  When you’re pushed, killing’s as easy as breathing.

Rambo’s inner monologue in Rambo (2008)

Upon ending the prolonged moratorium on both creatively depleted franchises in the aughts, Stallone didn’t “retcon” some of the lesser entries in the Rocky and Rambo series, but rather embraced them as part of both heroes’ long emotional arcs:  Just as Creed II redeems the hokey jingoism of Rocky IV, Rambo IV acknowledges that the previous sequels glorified violence—gleefully, even pornographically—and burdens the protagonist with the guilt of that indefensible carnage, refusing to let him off the hook for it.  The inconvenient mistakes of the past aren’t expunged from the hagiographies of either of these American icons for the sake of a cleaner narrative—an increasingly common (and inexcusably lazy) practice in franchise filmmaking, as evidenced by recent “do-over” sequels to Terminator and Halloween—but instead seed the conditions in which we find both Rocky and Rambo at the next stage of their ongoing sagas.

So, in Rambo:  Last Blood (2019), which sees the itinerant commando back home at his ranch in Arizona (per the coda of the last movie), the big question I had going into the film was this:  Which permutation of Rambo would we find in this story—the one about what happened after Johnny came marching home?  What might Rambo, who has always served as a cultural Rorschach—first as an expression of the political disillusionment of the seventies, then recruited in the eighties to serve as poster boy for the Reagan Doctrine—tell us about ourselves in the Trump era?

Continue reading

Artistic Originality: Is It Dead—or Was It Merely a Fallacy to Begin With?

Over the course of the many insightful conversations generated by the recent post on Star Wars:  The Last Jedi—sincerest thanks to all who shared their time and thoughts—the subject of artistic influence was discussed:  what role it played in the creation of some of Gen X’s most cherished movie franchises of yore, and what part, if any, it has in our now-institutionalized praxis of remaking those films wholesale—of “turning Hollywood into a glorified fan-fiction factory where filmmakers get to make their own versions of their childhood favorites.”

Because where is the line drawn, exactly, between inspiration and imitation?  If the narrative arts are a continuum in which every new entry owes, to a certain extent, a creative debt to a cinematic or literary antecedent, is originality even a thing?

If so, what is it, then?  How is one to construe it concretely, beyond simply “knowing it when we see it”?  And, as such, is there a way for us as artists to codify, or at the very least comprehend, the concept of originality as something more than an ill-defined abstraction, so that we might consciously strive for it in our own work?

 

THE HERO WITH A THOUSAND INFLUENCES

Since it was Star Wars that provoked those questions, let me start with this:  George Lucas is one of my eminent creative influences.  When I was in high school in the early nineties, during that long respite between Return of the Jedi and The Phantom Menace, when Star Wars was more or less placed by its creator in carbon-freezing, I became aware that the same mind had conceived two of my favorite franchises, and went to great lengths to study Lucas’ career:  how he learned the art of storytelling, where his ideas came from, how he managed to innovate the way in which blockbusters were created and marketed.

“Star Wars” and “Indiana Jones” mastermind George Lucas, my first creative idol

In order to more fully appreciate what Lucas created in 1977 when he made Star Wars—a work of fiction so thrilling and inspired it seemed to emerge fully realized from his singular imagination—it behooves us to consider the varied influences he drew from.  The 1936 Flash Gordon film serial Lucas watched as a child provided the inciting animus—a grand-scale space opera told as a series of high-adventure cliffhangers.  (It also later informed the movie’s visual vocabulary, with its reliance on old-fashioned cinematic techniques like opening crawls and optical wipes.)

In a case of east meets west, Joseph Campbell’s study of comparative mythology The Hero with a Thousand Faces provided a general mythic and archetypal blueprint to endow Lucas’ sprawling alien-world fantasy with psychological familiarity, while Akira Kurosawa’s The Hidden Fortress served as a direct model for the plot he eventually settled on (after at least three start-from-scratch rewrites).  Lucas ultimately patterned the series’ three-part narrative arc after Tolkien’s Lord of the Rings cycle (which later directly influenced his high-fantasy franchise-nonstarter Willow), because, prior to Star Wars, closed-ended “trilogies” weren’t really a thing in commercial cinema.

In addition to his cinematic and literary interests, Lucas is also a passionate scholar of world history (as evidenced by Indiana Jones, particularly the television series), and a direct line can be drawn from the X-wing assault on the Death Star to the aerial dogfights of World War II, to say nothing of the saga’s allusions to the Roman Republic, Nazi Germany, and the Vietnam War.  As for where the Force and lightsabers and the twin suns of Tatooine came from… who knows?  The sheer number of disparate interests that met, mated, and reproduced within the confines of Lucas’ brain can never be fully accounted for, even by the man himself.

Continue reading

Won’t Get Fooled Again: “The Last Jedi” Incites a Fan Rebellion against Disney’s “Star Wars” Empire

Well ahead of the release of The Last Jedi, I’d made a private resolution to stop being so goddamn grumpy about Star Wars and superheroes moving forward.  That’s not to suggest, mind you, that I rescind my cultural criticisms of them; it’s merely an acknowledgment that I’d said my piece, have nothing more to offer on the matter, and have no wish to spend 2018 mired in negativity.  There’s enough of that going around these days.

And yet here I find myself, first post of the New Year, compelled by fate—just like Obi-Wan, I suppose, and, more recently, Luke Skywalker himself—to crawl out of hiding.  Here’s what happened:

The week Last Jedi hit theaters, I was preoccupied with last-minute errands and arrangements for my trip home for the holidays, and Star Wars, frankly, was the last thing on my mind.  I was peripherally aware the movie was “in the air”—reviews were near-universally hailing it as “groundbreaking,” the best of the series since Empire—but altogether oblivious that it had already opened.

Until Saturday, December 16.  That’s when unsolicited text messages started pinging in rapid succession from friends and colleagues, decrying it as “the worst Star Wars ever,” “a betrayal,” “the death of the franchise,” etc.  (One old friend even suggested I stay away from the movie at all costs if I wanted to preserve any fondness I had left for Star Wars.)  I couldn’t quite reconcile any of that with the glowing critical notices, so I went to Rotten Tomatoes, and, sure enough, an overwhelming plurality of the audience was hating this movie.  Not strongly disliking it, mind you—despising it.  Some excerpts:

“I will pass on IX and it won’t make any difference in the grand scheme of things, but there is nowhere the plot can go in the final movie that I particularly would care for.  I have no investment in the characters, plot or universe anymore.”

“Steaming pile of bantha poodoo.”

“Easily the worst in the Saga.  Lifelong Star Wars fan.  It’s now all over.”

“Worst movie EVER.  I can’t begin to find the words that express how bad this was.  Guess it’s hard to say much without spoilers.  Just be warned it’s not the star wars you know.”

“You won’t fool me, nor my money, ever again.”

And then there was this succinct four-word review:

“Fuck you rian Johnson”

How to explain such opprobrium?  (Note:  There are those who suggest a vocal minority of haters has merely created the misleading illusion of substantial backlash—possibly that’s true—but the sampling of direct responses I’ve fielded for the most part ranges from faint praise at best to seething vitriol.)  I mean, these were the movies that were supposed to “redeem” Star Wars after creator George Lucas’ best malignant efforts to ruin all our childhoods with the prequels, right?

Epic fail—”Episode VIII” turned out to be something other than the glorious return of the Jedi many fans anticipated

So, what’s gone wrong? I wondered.  Were fans simply being oversensitive?  Or did filmmaker Rian Johnson, making his Star Wars debut, indeed deliver a credibly bad movie—a “franchise killer”?  How exactly did things reach such a fever pitch a mere two years after Disney’s much-anticipated brand-relaunch of Star Wars?

It’s a complicated answer with more than one determinant, but I can get to the heart of the problem for you.

Hold that thought, though.  We’ll get back to Star Wars shortly.

Continue reading

This Counts, That Does Not: On Canonicity in Media Franchises

It may surprise you to learn this, but the events of Star Wars never actually happened—the majority of them, anyway.  I mean that sincerely—not for a minute should that be interpreted as snide or condescending.  But perhaps I’m getting ahead of myself…

In 1983, George Lucas brought his Star Wars trilogy to a close with Return of the Jedi (oh, those bygone days when movie franchises actually reached—wait for it—a conclusive resolution).  Throughout the eighties, the series lived on by way of a pair of made-for-television Ewok movies and the Saturday-morning cartoons Droids and Ewoks, which continued to stoke interest in the franchise—and its lucrative action-figure line… for a while.  But by the end of the decade, with no new big-screen productions to energize the fan base, Star Wars had resigned its position at the top of the pop-cultural hierarchy.

George Lucas looks to the horizon

Lucas, who had always been a forward-thinking businessman as much as he was a visionary filmmaker (he negotiated a reduced fee for writing and directing the original Star Wars in return for ownership of sequel and merchandising rights, which the studio deemed worthless and was only too happy to relinquish), had plans to revisit the Star Wars galaxy in a prequel trilogy that had been part of his grand design when he was developing the earlier films—hence the reason, in case you never thought to ask, they are numbered Episodes IV through VI.  Even though the prequels themselves were some years off—production on The Phantom Menace wouldn’t commence until 1997—he began laying the groundwork to return Star Wars to its lofty place in the cultural consciousness by commissioning science-fiction author Timothy Zahn to write a trio of novels set five years after the events of Return of the Jedi—what later became commonly known as “the Thrawn trilogy” (named for its chief antagonist).

The books were released successively in ’91, ’92, and ’93 (my best friend Chip and I couldn’t get down to the local bookstore fast enough to buy a copy of each upon publication, though being a year older, he got to read them first); they were New York Times bestsellers that not only got their intended job done—reigniting public interest in a dormant media franchise—but also led to an endless, ongoing series of novels that explored every facet of the Star Wars galaxy:  No character or event was too small to be the focus of its own story.  Thus, the Star Wars Expanded Universe (SWEU) was born.  Han and Leia had twins!  Luke got married!  Chewbacca sacrificed himself for the Solos’ son Anakin!  A universe of stories, far beyond the contained narrative arc of the classic trilogy, took on a life of its own and captured the imagination of a generation that invested itself in the ongoing space opera collectively known as Star Wars: a vast, complex continuity that Lucasfilm maintained with curatorial oversight to prevent inconsistencies and contradictions in the expansive mythos, which comprised movies, books, comics, TV shows, RPGs, and video games.

The Force awakens? For many fans, it never went dormant

When Disney acquired Lucasfilm in 2012, however, they had their own ambitious plans to expand the franchise, and didn’t want to be tied down to every addendum in the extensive mythology.  And just like that, everything other than the feature films and then-current Clone Wars animated series was “retconned”—still commercially available, mind you, under the new “Legends” banner, but henceforth declared noncanonical.  This was an outrage to many of the longtime fans who considered these “expanded universe” adventures sacrosanct—who’d invested time, money, and interest in the world-building fictions of the Star Wars continuity that had been undone with the stroke of a pen.  Some of their favorite stories were now apocrypha, whereas the much-derided prequels, on the other hand, were still canonically official.  Where was the justice—the sense—in that?

Continue reading

Classifying the “Star Trek” Movies by Their “Save the Cat!” Genre Categories

Star Trek turned fifty this year (something older than me, mercifully), but you needn’t be a fan to appreciate some of the lessons writers of fiction can take from its successes and failures during its five-decade voyage.  I mean, I probably wouldn’t myself qualify as a “Trekkie”—I simply don’t get caught up in the minutiae.  What I’ve always responded to in Trek is its thoughtful storytelling and philosophical profundity.  “Even the original series, for all its chintziness,” someone told me when I was thirteen, “it was still the thinking man’s show.”

I recall watching The Original Series in syndication, and being swept away by the classic time-travel episode “The City on the Edge of Forever”; finally I understood that Trek was about ideas, and those could be just as thrilling as set pieces—more so, in fact.  Anyone who was around for it certainly remembers the excitement when The Next Generation premiered, unknowingly kicking off perhaps the first major-media “shared fictional universe” two decades before Marvel got there.  I watched the pilot with my father—which was a big deal, since television wasn’t his thing (the nightly news excepted)—and I haven’t forgotten his lovely, two-word appraisal of the first episode when it was over:  “It’s kind,” he said, with no further elaboration.

It took some years to fully appreciate that assessment.  Having grown up on the adventures of James T. Kirk, the original captain’s renegade spirit and cowboy diplomacy appealed to my juvenile worldview; Picard, on the other hand, seemed like a high-school principal in comparison.  But over time, I came to identify with Picard’s genteel, introspective mindset, and every line he uttered—even the technobabble—sounded like poetry from the mouth of Patrick Stewart, who endowed his performance with such dignity and conviction.  For me, the best part of Star Trek was getting Picard’s closing takeaway on the issue du jour.

The franchise continued to grow as I did.  My wife, whom I started dating at nineteen, turned out to be as much a fan as I was, and we looked forward every few years to the next feature film, until the series finally, against all expectation, sputtered out with Nemesis (2002) and Enterprise (2001–2005).  Among other reasons for that, Trek had been eclipsed by a new sci-fi franchise—The Matrix—that spoke to the ethos of our new Digital Age.  Perhaps more than any other genre, science fiction needs to reflect its times, and times change; finality is something to be accepted—embraced, even—not feared.  The Enterprise, thusly, had been decommissioned.

Continue reading

Collapse of the Tentpole: Why Hollywood’s Grim Summer Is Good News for the Rest of Us

Hope springs eternal—and by that I mean it was just this past spring I was lamenting Hollywood’s hopeless addiction to nostalgic, twentieth-century brands, from superheroes to Star Wars, and its incorrigible aversion to original genre works in favor of endless sequels and remakes (I will not cave to social pressure by calling them “reboots” just to assuage the egos of filmmakers too precious to be considered slumming with the likes of—heaven forbid—a remake).  And yet…

And yet what a difference a summer can make.  Let’s review the scorecard, shall we?

Batman v Superman took a critical beating (to say the least) and, despite sizable box-office returns, underperformed relative to expectations, an inauspicious opening salvo in Warners’ would-be mega-franchise (and something tells me, no matter how tepid the public response, they’re not going to take “no” for an answer on this one).  The follow-up, Suicide Squad, performed well even if it didn’t fare any better critically, though one could argue both movies actually did the health of the budding cinematic universe more harm than good in that they tarnished the integrity, such as it is, of the brand; DC is thus far not enjoying Marvel’s critical or popular cachet.  And you don’t build an ongoing franchise playing only to the base.

Other expensive underperformers:  Warcraft; X-Men:  Apocalypse; Teenage Mutant Ninja Turtles:  Out of the Shadows; Neighbors 2:  Sorority Rising; Star Trek BeyondJason Bourne opened well but suffered a steep second-week drop-off—it had no “legs,” in box-office parlance.

Who ya gonna call to exterminate the “ghosts” of a previous generation haunting the multiplex?

Plenty of other “surefire” sequels outright bombed:  Alice Through the Looking Glass, Ghostbusters (not a sequel, but it was promoted as one), The Huntsman:  Winter’s War, Zoolander 2, Independence Day:  Resurgence, and The Divergent Series:  Allegiant, the last of which has resulted in a particularly embarrassing—and unprecedented—predicament for its studio, Lionsgate, which, following in the footsteps of previous YA adaptations Harry Potter, Twilight, and The Hunger Games, unnecessarily split the last movie into two parts and is now stuck with a commitment to a final sequel (or half of one, anyway) without an audience anticipating its release.

Continue reading

The Great Escape: What the Ascendancy of Comic-Book Culture Tells Us about Ourselves

Lest anyone doubt the real-world superheroic capabilities of a fictional character, let me state for the record that Batman taught me how to read.

For in watching the syndicated reruns of the Adam West series in the late seventies—the kitschy opening credits, specifically—my not-yet-literate mind eventually recognized a correlation between the splashy title-card logo and repetitive choral chant that accompanied it, and “Batman” became the first word I could read and write.  Absolutely true story.

"Ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba! Batman!"

“Ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba, ba! Bat-man!”

I loved the old Batman show—the pop-art color scheme and Dutch angles (not that I took conscious note of such stylistics at the time) were like a cartoon come to life.  The camp humor?  Entirely lost on me:  When Batman and Robin slid down the Batpoles and zoomed off in the Batmobile—staged in that glorious life-sized playset of a Batcave—the sense of adventure was kinetic.  And when the villain-of-the-week left our heroes for dead in some Rube Goldbergian contraption—their fate to be determined in twenty-four agonizing hours!—the tension was excruciating.

Unlike most of my heroes at that time—Michael Knight, the Duke boys—the Dynamic Duo weren’t confined to the limited jurisdiction of their own fictional worlds, but rather popped up elsewhere, too, in animated form on The New Scooby-Doo Movies and Super Friends, and I never quite understood why no one had thought to put Adam West, Christopher Reeve, and Lynda Carter in a movie together; with no concept of copyright issues or irreconcilable aesthetics or what later came to be called “shared cinematic universes,” it seemed like a no-brainer to assemble an all-star superhero team from the preexisting talent pool.

Batman v Superman: Dawn of Justice

Thirty-five years after I—along with an entire generation raised on the same pop-cultural diet, it turns out—first dreamed it, the team formerly known as the Super Friends are getting the tent-pole treatment next month with the release of Batman v Superman:  Dawn of Justice, Warner Bros.’ opening-salvo attempt at the kind of license-to-print-cash shared cinematic universe Marvel has so deftly pioneered (to the envy of every studio in Hollywood).  Fanboy anticipation is at a full boil, if enthusiasm on social media is any barometer; many are counting down the days with a breadth of fanaticism ordinarily reserved for the Second Coming, others forecasting the would-be mega-franchise’s stillbirth, but all are anxiously awaiting Dawn.

Not me, though.  I can say with absolute and irrevocable certainty that I’ll be sitting out Batman v Superman—in theaters, on home video, on cable.  In perpetuity.

But, more on that shortly.

Continue reading

