Hope springs eternal—and by that I mean it was just this past spring I was lamenting Hollywood’s hopeless addiction to nostalgic, twentieth-century brands, from superheroes to Star Wars, and its incorrigible aversion to original genre works in favor of endless sequels and remakes (I will not cave to social pressure by calling them “reboots” just to assuage the egos of filmmakers too precious to be considered slumming with the likes of—heaven forbid—a remake). And yet…
And yet what a difference a summer can make. Let’s review the scorecard, shall we?
Batman v Superman took a critical beating (to say the least) and, despite sizable box-office returns, underperformed to expectations, an inauspicious opening salvo in Warners’ would-be mega-franchise (and something tells me, no matter how tepid the public response, they’re not going to take “no” for an answer on this one). The follow-up, Suicide Squad, performed well even if it didn’t fare any better critically, though one could argue both movies actually did the health of the budding cinematic universe more harm than good in that they tarnished the integrity, such as it is, of the brand; DC is thus far not enjoying Marvel’s critical or popular cachet. And you don’t build an ongoing franchise playing only to the base.
Other expensive underperformers: Warcraft; X-Men: Apocalypse; Teenage Mutant Ninja Turtles: Out of the Shadows; Neighbors 2: Sorority Rising; Star Trek Beyond. Jason Bourne opened well but suffered a steep second-week drop-off—it had no “legs,” in box-office parlance.
Plenty of other “surefire” sequels outright bombed: Alice Through the Looking Glass, Ghostbusters (not a sequel, but it was promoted as one), The Huntsman: Winter’s War, Zoolander 2, Independence Day: Resurgence, and The Divergent Series: Allegiant, the last of which has resulted in a particularly embarrassing—and unprecedented—predicament for its studio, Lionsgate, which, following in the footsteps of previous YA adaptations Harry Potter, Twilight, and The Hunger Games, unnecessarily split the last movie into two parts, and is now stuck with a commitment to a final sequel (or half of one, anyway) without an audience anticipating its release.
GHOSTBUSTERS RETURNS! BOURNE RETURNS! DIMINISHING RETURNS!
Don’t for a minute feel bad for Hollywood—they did this to themselves. Prior to the disastrous 2007–08 Writers Guild of America strike, the industry worked like this: Studios had deals with production companies, furnishing them with “war chests” with which they could then seek out and hire screenwriters to develop original projects; once a satisfactory screenplay was complete, the prodco would take it to their partnered studio and try to get it “set up”—i.e., greenlighted for production. That was the way the business operated for a long, long time; it’s how many of your favorite films found their way to the silver screen.
During the strike, however, it occurred to the studios that they already own enough “branded IPs” (intellectual properties)—from Star Wars to Fast & Furious to Transformers to Bond to Bourne to Planet of the Apes to the superheroes of Marvel and DC—to keep them in business through the end of time. Why pay to develop new material—why take a chance on an unknown entity—when they could simply sequelize and “reboot” proven franchises for an audience with an insatiable appetite for them? Culturally speaking, this was a troubling strategy, of course, but economically, it made a kind of sense—even to a no-name screenwriter like myself, whose very welfare depends upon a marketplace hospitable to new ideas, new stories. So long as audiences were coming back to see, time and again, “the ephemera of a previous century” (to borrow Watchmen scribe Alan Moore’s exquisite phrasing), why not keep serving up the same old shit on a fancier platter?
Only this year, audiences didn’t show up for it. Like the housing bubble of ’08, Hollywood’s avaricious overreliance on branded IPs may have finally reached its saturation point. Could it be that an entire movie-going public considered what was being offered—from Batman to X-Men to Star Trek to Ghostbusters—and collectively shrugged, “Already seen it”? Could we finally be hungering for new stories, and new heroes, that speak to the ethos and preoccupations of our new millennium? Is this the summer we declared, for good and all, our Independence Day?
I certainly hope so. I hope so for the sake of our culture, so mired in nostalgia right now, and I hope so, rather selfishly, for my own professional prosperity—my forthcoming novels, none of which feature cameo appearances by Iron Man or Wolverine. I don’t think Hollywood will take a lesson from any of this, mind you: In the midst of this dismal season at the box office, Disney announced a remake of The Rocketeer, a movie that bombed in theaters a quarter century ago, just to give you a sense, folks, of how deep the town’s denial runs. (Remind me, Disney, how that Tron rehash worked out for you?) But by digging in their heels they’re only digging their own graves—by clinging to a formula growing less and less effective every year. Next year’s franchise offerings, like Wonder Woman and Justice League, are already in the pipeline, so we’ll have to see if the box office rebounds—if 2016 was merely an anomalous dip—or if the downward trend continues. I’d like to think what we saw this summer is a referendum—on nostalgia-for-profit, on the inexcusable cooptation (and perversion) of children’s characters by a generation of middle-aged men, on the creative bankruptcy of Hollywood movie studios and their hermetic stable of go-to content creators—that will still hold true in a year’s time.
Granted, the sheer number of entertainment options may very well have played a part in eroding the audience for everything: Aside from the fact that there are now nearly 500 scripted television shows competing for our attention (a far cry from the measly trio of networks I grew up watching), the movie studios have come to rely so heavily on “tentpoles” that there’s more or less a new one being released every Friday—$200 million–budgeted movies get a single weekend to do all their business before being shoved aside for the next weekly must-see “event.” As Scott Mendelson from Forbes recently put it: “The mood for the last year regarding would-be tentpoles has not been ‘Woohoo, we’re going to hit it big!’ but more ‘Aww geez, I hope we don’t lose too badly!’” And that’s exactly the problem: We’re being force-fed more “must-see” media than we have the appetite or bandwidth to consume. We’re overwhelmed. And when that happens, at some point “must-see” becomes—out of sheer self-preservation—“Who cares?”
Adults, certainly, can’t muster sufficient enthusiasm to go out to the movies anymore. Only the teenage demographic still bothers to do that, because, despite their increasingly digitized socialization habits, they still need a physical place to go on a Saturday night that isn’t under the watchful eye of their parents, hence the reason movies now are geared almost exclusively to their juvenile sensibilities. There are no more Godfathers, no more Dances with Wolves, no Bravehearts or As Good as It Gets. In 1996, Tom Cruise introduced us to both Ethan Hunt, a dimensionless action figure not half as interesting as even the dumbest incarnations of 007, and Jerry Maguire, a complex, layered protagonist in a drama about a sports agent; since then, Hunt has been reprised four times (with more missions, impossibly, yet to come), while even Cruise’s considerable (if somewhat diminished) star power can’t get projects the likes of Jerry Maguire made anymore.
Which is not to say good stories aren’t still being told—we just have to go elsewhere for them. So while you won’t find the next GoodFellas at your local Cineplex, Netflix is offering Peaky Blinders. There are no more Shawshank Redemptions, but there is Orange Is the New Black. No new Russia House, but The Americans instead. No more Fargos, but at least there’s Fargo. Noticing a trend here?
Whereas movies are becoming an increasingly expensive—and correspondingly riskier—enterprise, ever more reliant on spectacle over nuanced storytelling, television is filling the narrative void. I’m not implying, as many of my colleagues would suggest, that TV is a better medium than cinema; it’s simply being utilized, for the most part, in more emotionally complex and resonant ways at present.
Among other innovations, television has become a haven for “postnarrative” fiction. Unlike conventional storytelling (the Aristotelian arc), with its beginning, middle, and end—and its takeaway “moral of the story”—postnarrativity is an open-ended, ongoing exercise in “problem solving,” in which the sprawling fictional worlds of shows like Game of Thrones “are like giant operating systems whose codes and intentions are unknown to the people living inside them. Characters must learn how their universes work. Narrativity is replaced by something more like putting together a puzzle by making connections and recognizing patterns” (Douglas Rushkoff, Present Shock: When Everything Happens Now [New York: Penguin Group, 2013], 34). In postnarrativity, there’s no emotional value at stake (like hope in Shawshank) or lesson to be learned (“There’s no place like home”). Consider:
- Unlike Michael Corleone, Tony Soprano neither struggles to be ethical nor seeks absolution for his transgressions. His “resolution,” therefore, is neither tragic nor redemptive: In the middle of a perfectly ordinary scene—he’s sitting with his family in a diner—the screen cuts abruptly to black and the series is over. The Sopranos ended, but it did not conclude, for that would have been beside the point.
- On Lost, the castaways spend all their time on the island trying to puzzle out what it all means: the mysterious sequence of numbers, the smoke monster, the polar bear on the beach. Note the final line of the first episode isn’t, “Guys, how do we get home?” Instead, rather tellingly, it’s “Guys—where are we?” Getting home was irrelevant; cracking the enigma (or interminable sequence of them) was the entire point of the epic narrative.
- On The Walking Dead, the rules in question are ones of moral parameters: In a civilization-has-fallen world without laws or government, what, if anything, constitutes right and wrong? The characters no longer know. Some cling to absolutes, like Morgan, while others, like Rick, prioritize survival above (obsolete?) principles of morality. Back and forth the pendulum swings, the survivors hardening or softening depending on their present circumstances and recent experiences, but no new set of laws or principles is ever established—sorting it all out is very much an ongoing work in progress.
- On Seinfeld, Jerry and his pals are preoccupied with decoding the unspoken and often ambiguous rules of etiquette in modern urban society; something as simple as a call-waiting alert becomes a social dilemma fraught with consequences, and is even given, in an attempt at codification, its own clever label: a phone-call face-off.
None of the above examples of postnarrativity are about providing resolution or catharsis like the “hero’s journey” arc, only about an endless game of pattern recognition. They’re about ongoing, sustainable plots, but they are not about story. Story concludes; story imparts values. (In a recent episode of Mr. Robot, itself an exemplar of postnarrativity, a character ruminates on the classic Seinfeld episode “The Chinese Restaurant”—that its real-time plot about a group of friends waiting interminably for a table, and debating as the night drags on the pros and cons of staying versus bailing, is not a story. He’s right. And just like Mr. Robot, it wasn’t intended to be one.) The classical form of narrative (And the moral of the story is…) isn’t resonating anymore; it doesn’t reflect the challenges and anxieties of our new Digital Age in which our sense of linearity has been fractured by telecommunications technologies that allow us to be—that demand that we be—multiple places at once. This is what has emerged in its place.
Postnarrative television has become so popular, so culturally resonant, that movies are now being produced by way of a similar model: Consider, for example, the Marvel Cinematic Universe, the first successful “mega-franchise” in which sequels, as we’ve traditionally understood them (a linear progression of subsequent adventures featuring James Bond or Indiana Jones or Freddy Krueger), are supplanted by installments in an expansive fictional universe of concurrent action where what happens in Captain America: Civil War has an immediate butterfly effect on the events of Agents of S.H.I.E.L.D. and Spider-Man: Homecoming and Avengers: Infinity War and so on and so on and so on. That’s not structurally different, really, from your average episode of Game of Thrones (or Gotham or Sleepy Hollow or Once Upon a Time, for that matter), with its umpteen simultaneous plotlines, many of which are only tangentially connected to one another outside of simply existing in the same vast fictional landscape. How it all connects is ultimately more important than how any of it resolves. This is how storytelling works in a postnarrative world in which endless connections—smartphones that urgently “hyperlink” us to the next happening before the conclusion of the current one—keep us from ever experiencing resolution.
Cinema and TV, therefore, once provinces as distinct as Winterfell and the Dothraki Sea, have formed a multimedia feedback loop: Movies, with their A-list talent and ambitious scope, influence television, and television in turn makes an evolutionary storytelling leap, thereby influencing movies—to the point now where the line between the two is becoming indistinguishable, certainly to a generation that consumes virtually all its media not on a screen in a darkened theater or family living room at a designated time, but on a phone or tablet whenever the mood strikes.
Let’s face it: Movie theaters are an outmoded presentational forum, anyway. Sure, they made sense before anyone had TVs in their home. And they still had a purpose as recently as the eighties and nineties, when I was a kid, because VCRs weren’t yet de rigueur, and even by the time they were, the quality certainly wasn’t up to theatrical snuff. But nowadays? C’mon. Most home-theater setups provide a comparable if not superior audio-visual experience to your average aging multiplex, minus the expense and inconveniences of a night out. And since fewer and fewer people are actually buying tickets anymore, and the theater-to-download release window is getting ever narrower, I have a hard time imagining theatrical exhibition will still be a viable thing in a decade, if even half that long. There will always be theaters, certainly in the big cities, but they will become niche outlets for special event screenings. The days of going to the movies are about to be relegated to one of those romantic notions of the past, like drive-ins.
THE ERA OF POSTTHEATRICALITY
Something revolutionary, however, is coming that will supplant the old twentieth-century media models—they will merge into an entity not quite cinema, but not quite television, either. Movies won’t get theatrical distribution and TV shows won’t air on the weekly installment plan any longer. We’ll download all our filmed media at our own convenience, and the line between what constitutes cinema and television will be moot to a generation reared on no-beginning/no-end postnarrative stories that were never projected or broadcast, but rather streamed. To them, The Avengers won’t be a movie series any more than Daredevil is a television series—it’ll all just be entertainment continua, available on demand, ever and always, in their bingeable totalities.
Storytellers are already adapting their fictions to suit both the cultural complexities of modern society and the new presentational modes. In discussing House of Cards, creator Beau Willimon observed that “we were able to take a novelistic approach to filmmaking, which you could call television, but we really saw as a long movie, I guess, with more of a resemblance to a novel than anything else.” Got all that? Is he, then, creating a movie? A TV series? A filmed novel? Something else altogether?
Who cares? He’s telling a good tale that’s resonating. Whether it earns him an Emmy or an Oscar is irrelevant—as irrelevant, in point of fact, as those two distinct awards ceremonies will soon, I suspect, find themselves. More and more narratives are going to start to emerge in the House of Cards mold, and one or more of them will make the next transmutative leap. Hell, even compact, closed-ended stories may return—and prosper—in this new paradigm. I’m betting they will—so much so that those are the kind I’m writing; I think we long for traditional narratives and old-fashioned heroes, actually: Audiences are often taken with the unpredictability of postnarrative series that don’t conform to the familiar three-act model (look at all the time spent analyzing their minutiae on pointless “aftershows” like Talking Dead), yet get frustrated when they fail to deliver on Aristotelian conventions (by which I mean a satisfying resolution). That’s not postnarrativity’s job, of course—it’s merely to keep “the adventure alive and as many threads going as possible” (ibid.)—but finality is nonetheless hardwired into our very apprehension of reality, and stories that provide closure help us find meaning in the unavoidable and often unpleasant truth of cessation.
Perhaps “movies,” then, unrestrained by the time limits of theatrical presentation—of show times—will run eight or ten or twelve continuous hours; imagine, if you will, a finite narrative like the recent Netflix original series Stranger Things, but without episode breaks: The viewer would then decide, at his own discretion, when to pause and resume the program, same as the way a novel is read. Something like that is going to happen, sooner than later. The different forms and formats will evolve into new permutations that take best advantage of the latest content-delivery technologies. Closed-ended stories won’t necessarily be confined to two-hour “movies” or one-hour “episodes,” just as open-ended narratives will no longer be beholden to “phases” (as they’re called by Marvel) or “seasons” (on television), but will instead serve as the testing ground for experimental, amorphous structuring from innovative storytellers; that which works—and not all of it will—will become the new structural standard in narrativity. But make no mistake: Storytelling criteria as we’ve understood them for nearly a century are going to metamorphose—and our notions of “movies” and “television” as separate entities, I predict, will soon be as antiquated as landlines and credit cards and wristwatches and photo albums and wall-hung calendars, and all of the other archaic independent apparatuses whose functions are now fulfilled by a single featherweight device. This is inevitable.
The movie studios don’t want to face this, hence the reason they’re still operating under the old model—which is fitting, considering most of their material is recycled from a bygone era, too. (Same with the networks and cable channels, whose content is still structured to support advertiser-sponsored programming.) They’re squeezing every last dollar they can out of a dying beast by overcrowding the market with event movies. That it’s unsustainable, as this past summer unequivocally demonstrated, is precisely the point: The old Hollywood institutions are effectively drilling for the last barrels of oil as fast as they can before the grid goes green, imminently and irrevocably.
So, I can’t believe I’m saying this a mere half a year after pleading for it, but the days of the “tentpole,” as we know it, are now very likely numbered, and with them Hollywood’s systemically stupid business model that favors stale corporate franchises—ones that employ but a small fraction of screenwriters vying for opportunities—over new ideas, and that reduces the shelf life of multimillion-dollar investments (blockbuster movies) to a 72-hour window (opening weekend) instead of valuing them as perennial digital assets that, like a Netflix series, can grow a following over months and even years and deliver a slow-but-consistent return on investment instead of a “crack high” frontloaded release followed by a sobering 80% plunge the following Saturday. This is a good thing, for both the industry and the culture. It’s evolutionary, in fact—a cosmic course-correction against a vested corporate interest in maintaining the status quo, no matter how outdated or inefficient, to ensure unsustainable financial growth over cultural health and prosperity.
The institution of storytelling, meanwhile, will thrive regardless of the medium, as it has since the days of cave-painting. It will, in fact, grow more emotionally and intellectually complex as its presentational mechanisms evolve—and as audiences grow ever savvier to the time-honored conventions of narrative—and may one day soon wrest free from the corporate stranglehold that’s been choking it of contemporary relevance and retarding its cultural and creative progression for the last fifteen years. Imagine it: commercial filmmaking transformed from the Gen X greatest-hits compilation it is now into the essential leading-edge art form of the twenty-first century. Better late than never.