Writer of things that go bump in the night


In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 1

Editor’s note:  By even the indefensibly prolix standards of this blog, the following essay—an analytical piece on Hollywood mega-franchises and how audiences wind up serving them more than they serve us—is a lengthy one.  Accordingly, “In the Multiverse of Madness” will be published in two separate parts, with the concluding installment following this one by a week.  I thank you in advance for your time and attention, neither of which I take for granted.


In last month’s post, I proffered that when a fan-favorite media franchise no longer serves us—when we come to recognize that some of the popular fictions we’ve cherished embody values we no longer endorse, and potentially even threaten to stand in the way of where we need to go—often the best thing we can do for ourselves is to let it go, purposely and permanently.

Letting go is not about “canceling” (someone like disgraced geek god Joss Whedon) or boycotting (the films of, say, Woody Allen); it’s not about taking action at all.  Instead, letting go is not doing something any longer—not renting out any more space in your life or in your head to the likes of Whedon or Allen, or even to the culturally defining popular narratives whose very ubiquity we take as a God-given absolute:  Star Wars, Star Trek, Harry Potter, DC and Marvel, to name but a sampling.

Despite the sheer prevalence of those transmedia brands—not merely the plethora of movies and TV shows, but the licensed apparel and iPhone cases, the die-cast collectibles and plush toys—we can, if we choose, be done with any or all of those franchises as of… right now.  To learn to live without them entirely.  And happily.  Even lifelong, hardcore superfans can learn to let go of their preferred multimedia pastimes.

It’s both easier and harder than you may think.

Just imagine never caring about ANY of this ever again…

But wait!  What if you happen to genuinely enjoy Star Wars or Star Trek or DC or Marvel?  If you’re a fan, and some or all of those entertainment franchises add value to your life’s experience, by all means, disregard this post’s advice.  Though perhaps first consider this:

For most of Hollywood history, the movie business has needed a hostage buyer, a customer with little choice but to purchase the product.  First, this was the theatre chains, which the studios owned, or controlled, until 1948, when the Supreme Court forced the studios to sell them on antitrust grounds.  In the eighties and nineties, video stores partly filled the role.  But, increasingly, the hostage buyer is us.

Today, the major franchises are commercially invulnerable because they offer up proprietary universes that their legions of fans are desperate to reënter on almost any terms.  These reliable sources of profit are now Hollywood’s financial bedrock.

Stephen Metcalf, “How Superheroes Made Movie Stars Expendable,” New Yorker, May 21, 2018

Consider:  How many of us are unwitting “hostage buyers”—fans who continue to subscribe to certain multimedia franchises no longer out of pleasure, but lately out of habit?  Out of decades-long conditioning?  We may watch Star Wars, for instance, simply because we’ve always watched Star Wars, even if we can’t truly recall the last time we actually enjoyed it the way we did when we were ten years old—with pure and wondrous abandon.  Bad word-of-mouth will steer us clear of a one-off bomb like Blackhat or King Arthur:  Legend of the Sword or The Happytime Murders, but it’ll merely lower our expectations for Star Wars:  The Rise of Skywalker and X-Men:  Dark Phoenix and Terminator:  Dark Fate, not deter us from seeing those umpteenth sequels for ourselves.

When that happens—when we’re willing to spend our money, time, and attention (our three primary modes of currency) on a product we know in advance is shit—we’re no longer fans of those franchises so much as brand loyalists.  Habit buyers, if not outright hostage buyers.  And it can be hard to recognize that in ourselves—harder than we might realize.  I was still reading Batman comics into my thirties, who-knows-how-many years after I stopped enjoying them—long after a once-joyful pleasure became an interminably joyless obligation.  So, why was I still reading and collecting them?

Because I’d always read comics, from the time I was a kid; I’d buy them at the corner candy store in my Bronx neighborhood with loose change I’d rummaged from the couch cushions and reread each one a thousand times.  I’d share them with my grade-school gang, and vice versa.  I’d collected them for as long as I could remember, so it truly never occurred to me a day might come when they no longer added value to my life—when they’d outlived their onetime reliable purpose.  And for years after I reached that point of terminally diminished returns, I’d continue to spend money, to say nothing of time and attention, on a habit I wasn’t enjoying—that did nothing but clutter my home with more worthless shit that went straight into indefinite “storage” in the closet.  Why the hell did I do that?

Because I’d ceased to be a fan and had instead become an obedient brand loyalist—an institutionalized hostage buyer.  And, to be sure, corporate multimedia initiatives—which is to say those so-called “mega-franchises” from which there is always one more must-see/must-have sequel, prequel, sidequel, spinoff, TV series, tie-in comic, videogame, and branded “collectible” forthcoming—very much count on our continued, unchallenged fidelity to once-beloved concepts and characters…

… and they are doubling down on the billion-dollar bet they’ve placed on it:


Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between my sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops, it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancelation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…


The End: Lessons for Storytellers from the Trump Saga

The election of Joseph R. Biden Jr. earlier this month offered the very thing our movie franchises and television series have denied us for two decades:  catharsis.


For a writer, it turns out I may suffer from a staggering lack of imagination.

I will confess to anxiously entertaining all the apocalyptic post–Election Day scenarios contemplated by even our most sober pundits and analysts:  the disillusion-fueled outrage on the left should Trump eke out a narrow Electoral College win despite losing the popular vote to Biden; or, the armed militias activated by the president in the event of his loss.  Like the set of a Snake Plissken movie, store windows on Fifth Avenue and Rodeo Drive were boarded up; correspondingly, I barricaded my own front and balcony doors as I watched, sick to my stomach, an endless caravan of MAGA-bannered pickup trucks roar past my home in the liberal bastion of Los Angeles the weekend before Election Day.  I girded for the possibility (if not inevitability) of social breakdown, fully aware I would not be cast in the part of uber-competent dystopian hero—the Rick Grimes or Mad Max—in that story.

What I never imagined—not once, even fleetingly—was that upon receiving official word of a Biden/Harris victory, cities across the country, and the world over, would spontaneously erupt into large-scale celebration worthy of an MGM musical.  Ding-dong!  The witch is dead!  It was a perfectly conventional—and conventionally predictable—Hollywood ending, yet I never saw it coming.

The galaxy celebrates the death of Darth Vader

Despite all the warnings I’ve issued about the unconscious maleficent messaging in our commercial fiction—stories in which messianic saviors redeem our inept/corrupt public institutions (Star Wars and superhero sagas), armed men with badges act without even the smallest measure of accountability (action movies and police procedurals), and environmental destruction/societal collapse are not merely inevitable but preferable (Mad Max:  Fury Road, The Walking Dead), because apocalypse absolves us from our burdensome civic responsibilities—this election season has exposed my own susceptibility to pop-cultural conditioning.

It wasn’t merely a spirit of doomism I nursed throughout October; it was an unchallenged assumption that the interminable Trump narrative would simply do what all our stories now do:  hold us in a state of real-time presentism (“We’ll have to wait and see” and “I will keep you in suspense” are common refrains from the outgoing president) rather than arrive at a definitive conclusion.

The erosion of cathartic narrativity is a subject I’ve admittedly addressed a lot here on the blog since I first published “Journey’s End” over five years ago, but it’s essential to understanding how the Trump presidency came to be, and why we all felt such an atavistic sense of relief when it reached an end on November 7.

Around the turn of the millennium, storytellers mostly abandoned the Aristotelian narrative arc—with its rising tension, climax, and catharsis—in favor of “storyless” fiction that either pursues a satirical-deconstructionist agenda (Family Guy, Community) or emulates the kind of open-ended worldbuilding previously the exclusive province of tabletop RPGs and videogames (Game of Thrones, Westworld).


Trick-or-Treating Is Canceled? Why Disrupted Halloween Traditions Are Nothing to Fear

Owing to my Romantic proclivities, the most spiritually challenging aspect of living in Los Angeles is its seasonal monotony.  I am never so acutely aware of it as at this time of year, when my biorhythms, still calibrated for the East Coast after nearly two decades, anticipate the cooling of the air and the coloring of the foliage.  With only gentle reminders, at best, from Mother Nature of the Earth’s shifting axial tilt, a greater metric burden is placed on holidays:  Celebrating St. Patrick’s Day is how I make the mental transition to spring; Fourth of July reminds me summertime has commenced in earnest; Thanksgiving heralds the coming Christmas season, when those who are dear to me will be near to me once more.

In that way, holidays do more than merely mark the passage of time—another birthday, another Mother’s Day, another New Year’s Eve—they in fact give the year its very structure.  With the exception of August, which itself is traditionally a time for family vacations, every month has at least one official holiday that helps define it.  The particular aesthetics of each one, from its foods to its music to its very color palette, conjure a fully immersive sensory experience all its own.  Sure, we may prefer some holidays over others, or celebrate some more than others, but where would we be without them?

I guess we’d be in 2020.  I don’t know about you, but the only friends I got drunk with on St. Paddy’s were Sean Penn and Gary Oldman; the only baseball games I got out to this past spring featured Cleveland Indians starting pitcher Charlie Sheen; the only beach I visited this summer was out on Amity Island.  My cousin’s son turned twelve this past May, and I couldn’t help but lament that he wouldn’t be spending what will likely be his last summer of innocence on the streets with his friends as I did; I sent him a copy of Stephen King’s The Body so he could at least have a vicarious boyhood adventure.  We’ve all made do however we must this year, “celebrating” seasonal occasions in our living rooms or backyards, clinging to the semblance of normality those traditions provide in these traumatically abnormal times.

But when the Los Angeles County Department of Public Health prohibited trick-or-treating last month, that was a bridge too far.  Parents—not kids, mind you—went apeshit, and the very next day L.A. softened its position substantially, merely recommending against the time-honored practice, so cease-and-desist with the hate-tweets, please!  Banning trick-or-treating was perceived as canceling Halloween—an unacceptable sacrifice in a year full of previously unthinkable compromises.

The Peanuts gang goes trick-or-treating in “It’s the Great Pumpkin, Charlie Brown” (1966)

It’s impossible to imagine my own parents, who always made the holidays special, reacting so histrionically.  The first decade of my late father’s life, after all, coincided with the Great Depression; I don’t think he would’ve felt particularly sorry for us had trick-or-treating been suspended on account of a major public-health crisis.  And not because he was unkind or unsympathetic, but rather because he wouldn’t have viewed it as an impediment to celebration.


The Nostalgist’s Guide to the Multiverse—and How We All Might Find Our Way Back Home

Gee, for someone who’s spent the past few years lecturing others on the hazards of living on Memory Lane—by way of curated collections of memorabilia, or the unconscionable expropriation of superheroes from children, or whatever your nostalgic opiate—I quite recently became starkly aware of my own crippling sentimental yearning for obsolete pleasures.  But I’ve also identified the precise agent of disorientation that’s led many of us down this dead-end path… and, with it, a way out.  First, some backstory.

I’ve had occasion this autumn to enjoy ample time back on the East Coast, both a season and region I can never get enough of.  I spent a weekend in Rehoboth Beach, Delaware, with a group of high-school friends, many of whom I hadn’t seen in a quarter century.  I visited my beautiful sister in Washington, D.C., where we took in a Nats game so I could get a firsthand look at the team my Dodgers were set to trounce in the playoffs.  I attended my closest cousin’s wedding (Bo to my Luke), and served as best man at my oldest friend’s—both in New Jersey.  I marched in Greta Thunberg’s #ClimateStrike rally at Battery Park, and took meetings with representatives from the Bronx and Manhattan borough presidents’ offices about bringing both districts into the County Climate Coalition.

(I also got chased out of Penn Station by a mutant rat, so it was about as complete a New York adventure as I could’ve hoped for.)

Wonderful and often productive as those experiences were, though—the subway run-in with Splinter from Teenage Mutant Ninja Turtles notwithstanding—my favorite moments were the ones where nothing so noteworthy occurred.  The pints at my favorite pubs.  The old faces I stopped to chat with “on the Avenue,” as we say back home.  The solitary strolls through the park amidst the holy silence of snowfall.

Brust Park in the Bronx, New York, on December 2, 2019 (photo credit: Sean P. Carlin)

More than any of that, though—the ballgames, the gatherings formal and informal, the walks down the street or into the woods—I did what I always do, regardless of site or circumstance:  entertained quixotic fantasies about moving back.

This has become, over the past half-decade, a personal pathological affliction, as my long-suffering friends and family can lamentably attest.  I mean, I left New York for Los Angeles eighteen years ago.  Eighteen years!  That’s years—not months.  Christ, Carlin, at what point does the former cease to feel like home in favor of the latter?

I can’t say what prompted my recent epiphany, but for the first time in all my exhaustive ruminating on the matter, this simple, self-evident truth occurred to me:  I’ve never really left New York.


Naomi Klein’s “On Fire” (Book Review)

Since I trained under former vice president Al Gore to serve in his Climate Reality Leadership Corps just over a year ago—a period in which no fewer than eighty-five federal environmental regulations have been rolled back, greenhouse-gas emissions have spiked (after leveling off in years prior), polar-ice melt is outpacing predictive modeling, and the Intergovernmental Panel on Climate Change has strenuously warned us we have a mere decade to halve our current rate of carbon-burning if we hope to avoid the most catastrophic effects of climate change—there is one distinct emotional state that has been entirely absent from my life.

Despair.

I might, in fact, be happier and more optimistic than at any other point in my adult life.

Activism, I’ve discovered, is the antidote to despair, to doomism.  Over the past year, I’ve given public presentations on the Energy Innovation and Carbon Dividend Act, a bipartisan bill in Congress that would charge fossil-fuel extractors for the privilege of polluting—of treating the public commons of our atmosphere like an open sewer—a privilege they’ve thus far enjoyed free of charge.

This past March, my Climate Reality chapter was proud to enlist Los Angeles into the County Climate Coalition, an alliance of jurisdictions across the United States, formed by Santa Clara County Supervisor Dave Cortese, that have formally pledged to uphold the standards of the Paris Accord.  Less than six months later, we were in attendance as the L.A. County Board of Supervisors voted to adopt the OurCounty sustainability plan, one of the most ambitious green initiatives in the United States.

And just last month, I joined 300,000 activists in Lower Manhattan for the Global Climate Strike as we swarmed the streets of City Hall, marched down Broadway, and rallied at Battery Park—where no less than Greta Thunberg addressed the crowd.  None of that, as it happens, has left much time to actually worry about the climate breakdown.

Greta Thunberg at the Global Climate Strike in New York City on September 20, 2019 (photo credit: Sean P. Carlin)

But that level of activism, I acknowledge, isn’t something to which everyone can readily commit.  So, if you want to share my profound hopefulness about the solutions to the climate crisis—if you want to appreciate the world-changing opportunity humanity has been handed by history—do yourself a favor and read a book that might admittedly be outside your comfort zone:  Naomi Klein’s On Fire:  The (Burning) Case for a Green New Deal.

Naomi Klein’s “On Fire: The (Burning) Case for a Green New Deal”

I promise:  You won’t be inundated with scientific facts and figures; if you want to understand the basic science of global warming, Mr. Gore’s documentaries An Inconvenient Truth (2006) and An Inconvenient Sequel:  Truth to Power (2017) are both excellent primers.  Naomi Klein’s On Fire is a recently published collection of her essays and lectures from the past decade, bookended by all-new opening and closing statements on why a Global Green New Deal is the blueprint for an ecologically sustainable and socially equitable twenty-first century:

The idea is a simple one:  in the process of transforming the infrastructure of our societies at the speed and scale that scientists have called for, humanity has a once-in-a-century chance to fix an economic model that is failing the majority of people on multiple fronts.  Because the factors that are destroying our planet are also destroying people’s quality of life in many other ways, from wage stagnation to gaping inequalities to crumbling services to the breakdown of any semblance of social cohesion.  Challenging these underlying forces is an opportunity to solve several interlocking crises at once. . . .

. . . In scale if not specifics, the Green New Deal proposal takes its inspiration from Franklin Delano Roosevelt’s original New Deal, which responded to the misery and breakdown of the Great Depression with a flurry of policies and public investments, from introducing Social Security and minimum wage laws, to breaking up the banks, to electrifying rural America and building a wave of low-cost housing in cities, to planting more than two billion trees and launching soil protection programs in regions ravaged by the Dust Bowl.

Naomi Klein, On Fire:  The (Burning) Case for a Green New Deal (New York:  Simon & Schuster, 2019), 26

Oh, Snap! The Nostalgia-Industrial Complex — ’90s Edition

Et tu, Millennials?  The old nostalgia-industrial complex got its hooks into you, too, huh?  I must ask:  Have you not witnessed in firsthand horror what pining for the good old days has done to Generation X…?

To recap:  We Xers have thus far spent the twenty-first century reliving all our childhood favorites—Star Wars, Super Friends, Karate Kid, Ghostbusters, Lethal Weapon, Halloween, Bill & Ted, Tron, Transformers, Terminator, Top Gun—a pathological exercise in self-infantilization that has catastrophically retarded both the culture and a generation of middle-aged adults who are at this point more passionately invested in Skywalkers and superheroes than are the juvenile audiences for whom those characters were originally intended.

Always keen to recognize—and replicate—a winning formula, Hollywood has recently seized on a new permutation of forward-thinking backward-gazing:  Sell nineties-era nostalgia to the generation that came of age in that decade!  Over the past few years, we got a pair of Jurassic Park remakes-masquerading-as-sequels that didn’t inspire a single word of enthusiasm (certainly not a second viewing), but nonetheless earned over a billion dollars apiece, while our last conventional movie star, Dwayne Johnson, used his considerable clout (or more aptly muscle?) to resurrect both Jumanji and Baywatch.  As for this year?  Hope you’re excited for warmed-over helpings of The Lion King, Men in Black, Toy Story, Aladdin, and yet more Jumanji.  And while we’re at it, let’s welcome back slacker duo Jay and Silent Bob, because surely their grunge-era stoner humor still holds up in middle age—

Our sentiments exactly, fellas…

—as well as Will Smith and Martin Lawrence, back from buddy-cop purgatory for more Bad Boys badassery!  You know damn well whatcha gonna do when they come for you:  Buy a ticket!

For an indeterminate, but clearly not immeasurable, swath of moviegoers, there is no marketing campaign more alluring than one that taps into foggy childhood memories. . . .

. . . The great nostalgia-industrial complex will [continue] steamrollering us against our better judgment into multiplexes, hoping for a simulacrum of the first high we felt watching great characters years ago.

Tom Philip, “Summer ’19 Brought To You By Nostalgia-Bait Movies,” Opinion, New York Times, July 4, 2019

Not just multiplexes.  (And how are those even still a thing?)  On the small screen, VH1 revived game-changing nineties slasher franchise Scream this summer (how, for that matter, is VH1 still a thing?), and new iterations of decade-defining teen melodramas 90210 and Party of Five are on the way.  Dope.


All That You Can’t Leave Behind: On Memories, Memorabilia, and Minimalism

Here’s the story of a lifelong packrat’s unlikely conversion to minimalism.


Concert tickets.  Refrigerator magnets.  Christmas ornaments.  Comic books.  Trading cards.  Greeting cards.  Bobbleheads.  Bank statements.  Photo albums.  Vinyl records.  Shoes.  Shot glasses.  Jewelry.  Blu-rays.

What does the stuff we collect, consciously or unconsciously, contribute to the story of our lives?

And… what does it mean for us when there’s less of it?

Photo credit: Ticketmaster blog, June 26, 2015

In an opinion piece that appeared in the New York Times earlier this month, columnist Peter Funt laments the obsolescence of analog mementoes in a Digital Age:

And so ticket stubs join theater playbills, picture postcards, handwritten letters and framed photos as fading forms of preserving our memories.  It raises the question, Is our view of the past, of our own personal history, somehow different without hard copies?

Peter Funt, “Does Anyone Collect Old Emails?,” Opinion, New York Times, April 5, 2019

In recent years, I’ve expanded this blog from its initial scope, an exclusively academic forum on storytelling craft, to chronicle my own personal history, often in no particular order.  I am ever and always in search of a clearer, more complete, more honest perspective on my past, and how it has shaped the narrative arc of my life; I mine my memories regularly for content, and for truth.

I have also routinely expressed apprehension about the practices we’ve lost in a Digital Age, the kind to which Mr. Funt refers, particularly as that applies to the corrupted discipline of storytelling itself:  From the superhero crossovers of the “Arrowverse,” to the literary Easter-egg hunt of Castle Rock, to the expansive franchising of Star Wars, today’s popular entertainments are less concerned with saying something meaningful about the human condition than they are with challenging the viewer to catch all their internal cross-references.  Whereas stories once rewarded audiences with insight, now the reward is the esteemed privilege of calling oneself a superfan—a participatory designation earned by following all the breadcrumbs and connecting all the dots… an assignment only achievable if one never misses a new installment:

In a nod to the subscription model of consumption—where we lease cars or pay monthly to a music service—the extended narratives of prestige TV series spread out their climaxes over several years rather than building to a single, motion picture explosion at the end.  But this means energizing the audience and online fan base with puzzles and “spoilers”. . . .

. . . The superfan of commercial entertainment gets rewarded for going to all the associated websites and fan forums, and reading all the official novels.  Superfans know all the answers because they have purchased all the products in the franchise.  Like one of those card games where you keep buying new, expensive packs in order to assemble a powerful team of monsters, all it takes to master a TV show is work and money.

Douglas Rushkoff, Team Human (New York:  W. W. Norton & Company, 2019), 163

Fanboys and -girls thought they were legitimized when the geek subculture went mainstream—when superheroes and sci-fi went from niche hobby to pop-cultural monopoly—but they were really just commodified:  “geek” shifted from a stigmatized social category to a lucrative economic one.  Leveraging our telecommunications-induced FOMO, Hollywood contrived a new permutation of commercial narrative:  the “mega-franchise,” which seeks not our intermittent audience, but rather our habitual obedience.  Sure, you may not have even liked the last four Star Wars or Terminator or Transformers movies… but do you really wanna run the risk of skipping this one?

More is more: Every “Star Wars” character has its own backstory and action figure—collect them all!

So, given those two ongoing preoccupations—personal history and receding traditions in the Digital Age—the thesis of “Does Anyone Collect Old Emails?” would’ve spoken to me regardless, but the timing of it was nonetheless uncanny, as I have devoted no small degree of consideration in recent months to the matter of the physical objects we amass, wittingly or otherwise, and how they tether us to the past.  Here’s the story.


© 2024 Sean P Carlin
