Writer of things that go bump in the night

Tag: Minimalism

A History of the Blog (So Far)—and a Programming Update

Since launching this blog eight years ago, I have maintained a consistent publishing schedule of one new post per month.  However, given the ways in which this ongoing project has evolved, that level of output is no longer sustainable.  Here’s a brief chronicle of the blog’s creative progression—and a statement on what comes next.


From the time I signed with my first literary manager in 1998 through the ignominious end of my career in Hollywood in 2014, I was exclusively focused on one form of creative expression:  screenwriting.

Though ultimately unproduced, my scripts nonetheless earned praise from producers and development execs for their uncommon visual suggestiveness and sharp sense of pace, which I controlled through deliberate syntactic arrangement of the one element audiences never get to appreciate in the finished film:  the stage description.

Screenwriters, if you’re unaware, are not by and large particularly skillful wordsmiths.  And, to be fair, it’s not required of them.  Plot structure, characterization, and dialogue are what the screenwriter is there to provide for a motion picture.  Why waste time and creative energy on pretty prose in a blueprint, which is all a screenplay really is?

A rarefied handful of pro screenwriters, Shane Black and James Cameron among them, paint immersive pictures with their words, imparting how the world of the story feels rather than merely reporting, in sequence, what happens.  Such is the dynamic mode of screenwriting for which I strove.

Most screenplays—and I’m talking about scripts to produced films, written by Hollywood’s A-list scribes—aren’t much more than utilitarian laundry lists of things we’ll see and hear onscreen, conveyed without any visceral impression of style or tempo, and are, accordingly, nigh unreadable.  The director, after all, is going to make the movie he sees in his head; the script is just a means to get all the above- and below-the-line talent quite literally on the same page.

Excerpted from “Indiana Jones and the Kingdom of the Crystal Skull” by David Koepp.  Mind-numbing, no?

I actually like words, however.  I like how they sound, and the infinite combinations of meaning that can be made from them.  Truth is, I never should’ve aspired to be a screenwriter.  It was the wrong medium for my talents and interests.  “Author” and “essayist” were always a better fit for my writerly sensibilities.  It took the implosion of my career to finally embrace that.

So, when I started this blog at the encouragement of my wife—one of her many good ideas—I didn’t know quite what to write about except screenwriting.  Accordingly, my first two dozen posts are almost entirely devoted to matters of narrative craft, from my customized Storytelling 101 curriculum to the violation of the Double Hocus Pocus principle in Ghostbusters II to character deconstructions of Jack Bauer and John Rambo and a comparative analysis of the Jack Nicholson and Heath Ledger interpretations of the Joker.

One year into this blogging project, all my notions about narrativity were challenged—perhaps even shattered—by a book I’d read called Present Shock:  When Everything Happens Now (2013) by Douglas Rushkoff, which argued that Joseph Campbell’s “heroic journey,” the dramatic schema that has served as the structural basis for nearly every story in the Western literary canon, had collapsed around the turn of the millennium, as evidenced by the fanatical popularity of “storyless” fiction like Lost, The X-Files, The Sopranos, CSI:  Crime Scene Investigation, The Walking Dead, and Game of Thrones.

Rushkoff’s premise inspired a yearslong scholarly investigation on my part, which began in earnest with a post called “Journey’s End:  Rushkoff and the Collapse of Narrative,” and turned the blog in a new, more complex direction.  This intellectual project would never be the same.

Continue reading

Sorting through the Clutter:  How “The Girl Before” Misrepresents Minimalism

The Girl Before depicts minimalism as an obsessive-compulsive symptom of emotional instability, in contrast with what I can attest it to be from years of committed practice:  a versatile set of tools/techniques to promote emotional balance—that is, to attain not merely a clutter-free home, but a clutter-free head.


In the BBC One/HBO Max thriller The Girl Before, created by JP Delaney (based on his novel), brilliant-but-troubled architect Edward Monkford (David Oyelowo)—ah, “brilliant but troubled,” Hollywood’s favorite compound adjective; it’s right up there with “grounded and elevated”—is designer and owner of a postmodern, polished-concrete, minimalist home in suburban London, One Folgate Street, which he rents out, with extreme selectivity, at an affordable rate to “people who live [t]here the way he intended.”  Prospective tenants are required to submit to an uncomfortably aloof interview with Edward, whose otherwise inscrutable mien lapses into occasional expressions of condescending disapproval, and then fill out an interminable questionnaire, which includes itemizing every personal possession the candidate considers “essential.”

The rarefied few who meet with Edward’s approval must consent to the 200-odd rules that come with living in the house (no pictures; no ornaments; no carpets/rugs; no books; no children; no planting in the garden), enforced through contractual onsite inspections of the premises.  Meanwhile, One Folgate Street is openly monitored 24/7 by an AI automation system that tracks movements, polices violations of maximum-occupancy restrictions, regulates usage of water and electricity, sets time limits on tooth-brushing, and preselects “mood playlists”—just for that personal touch.  All of this is a reflection of Edward’s catholic minimalist philosophy:  “When you relentlessly eradicate everything unnecessary or imperfect, it’s surprising how little is left.”

“The Girl Before,” starring David Oyelowo, Gugu Mbatha-Raw, and Jessica Plummer

The Girl Before—and I’ve only seen the miniseries, not read the book—intercuts between two time periods, set three years apart, dramatizing the experiences of the current tenant, Jane Cavendish (Gugu Mbatha-Raw), grief-stricken over a recent stillbirth at 39 weeks, and the home’s previous occupant, Emma Matthews (Jessica Plummer), victim of a sexual assault during a home invasion at her flat.  (Emma, we soon learn, has since died at One Folgate Street under ambiguous circumstances that may or may not have something to do with Edward…?)  Edward’s minimalist dogma appeals to both women for the “blank slate” it offers—the opportunity to quite literally shed unwanted baggage.

This being a psychological thriller, it isn’t incidental that Jane and Emma bear an uncanny physical resemblance not merely to one another, but also to Edward’s late wife, who herself died at One Folgate Street along with their child, casualties of an accident that occurred during the construction of the home originally intended for the site before Edward scrapped those plans and went psychoneurotically minimalistic.  Everyone in The Girl Before is traumatized, and it is the imposition of or submission to minimalist living that provides an unhealthy coping mechanism for Edward, Jane, and Emma, each in their own way:

In this novel, [Delaney] wanted to explore the “weird and deeply obsessive” psychology of minimalism, evident in the fad for [Marie] Kondo and her KonMari system of organizing.  “On the face of it,” he wrote, “the KonMari trend is baffling—all that focus on folding and possessions.  But I think it speaks to something that runs deep in all of us:  the desire to live a more perfect, beautiful life, and the belief that a method, or a place, or even a diet, is going to help us achieve that.  I understand that impulse.  But my book is about what happens when people follow it too far.  As one of my characters says, you can tidy all you like, but you can’t run away from the mess in your own head.”

Gregory Cowles, “Behind the Best Sellers:  ‘Girl Before’ Author JP Delaney on Pseudonyms and the Limits of Marie Kondo,” New York Times, February 3, 2017

Indeed.  And if only The Girl Before had been a good-faith exploration of what minimalism, the psychology and practice of it, actually is.

Continue reading

“Superman IV” at 35:  How the “Worst Comic-Book Movie Ever” Epitomizes What We Refuse to Admit about Superhero Fiction

Superman IV:  The Quest for Peace, unanimously reviled for both its unconvincing visuals and cornball story, inadvertently accomplished the theretofore unrealized dream of scores of nefarious supervillains when it was released on this date in 1987:  It killed Superman.  (Or at least put the cinematic franchise into a two-decade dormancy.)

But a closer examination of the film suggests its objectively subpar storytelling might in fact be far more faithful to the spirit of the source material than today’s fanboy culture would care to concede.


Thirty-five years ago today, my mother took me to see Superman IV:  The Quest for Peace (1987).  Afterwards, we met up with my father at Doubleday’s, a neighborhood bar and grill that was the last stop on Broadway before you’d officially crossed the city line into Westchester County.  The restaurant had a hot-oil popcorn machine in the far corner, and when I went to refill our basket, I spied a man seated at the bar, nose in a copy of USA Today, the back panel of which boasted a full-page color advertisement for Superman IV.

When he caught me studying the ad, he asked, “Gonna go see the new Superman?”

“I just did.”

“Yeah?  How was it?”

“It was amazing,” I said, and I absolutely meant it.  Sensing my sincerity, the gentleman pulled the ad from the bundle of folded pages and handed it to me as a souvenir.  When I got home, I taped it up on my bedroom wall.

The theatrical one-sheet for “Superman IV” looks like a textbook “Action Comics” cover from the ’80s

Sidney J. Furie’s Superman IV:  The Quest for Peace is not amazing.  It is, in fact, commonly regarded as one of the worst comic-book movies ever made—if not the worst—in eternal competition for last place with Batman & Robin (1997) and Catwoman (2004).  It suffered from a notoriously troubled production:  After the diminishing returns of Superman III (1983) and spin-off Supergirl (1984), series producers Alexander and Ilya Salkind sold their controlling interests in the IP to the Cannon Group, the schlockmeister studio responsible for the American Ninja, Missing in Action, Breakin’, and Death Wish franchises—not exactly the optimal custodians of a series that had started out, against all expectation, so magnificently.

Richard Donner’s Superman:  The Movie (1978) was and remains the finest specimen of superhero cinema ever presented, at once ambitiously epic and emotionally relatable.  It pulls off the impossible in so many ways, first and foremost that it absolutely made us believe a man could fly, which had never been credibly accomplished before.  Credit for that goes not only to the VFX team, which won the Academy Award for its efforts, but to Christopher Reeve, who delivered the movie’s most timeless special effect:  endowing a spandex-clad demigod with profound dignity and genuine vulnerability.  Even the lesser Superman films—and we’ll talk more about those soon enough—are elevated by Reeve’s extraordinary performance, which occupies a lofty position, right alongside Bela Lugosi’s Dracula, in the pantheon of defining interpretations of folkloric icons.

What’s also so remarkable about Superman is how many different tonal aesthetics it assimilates.  The opening sequences on Krypton with Marlon Brando feel downright Kubrickian; Donner somehow channels the cosmic splendor of 2001:  A Space Odyssey (1968), only to then transition us to Smallville, as warm and fertile as Krypton was cold and barren, which evokes the same spirit of sock-hop Americana George Lucas conjured to such success in American Graffiti (1973).

The remainder of the movie shifts fluidly from His Girl Friday–style newsroom comedy (the scenes at the Daily Planet) to urban action thriller à la The French Connection (the seedy streets of 1970s Metropolis) to Roger Moore–era 007 outing (Lex Luthor’s sub–Grand Central lair, complete with comically inept henchmen) to Irwin Allen disaster film (the missile that opens up the San Andreas Fault in the third act and sets off a chain reaction of devastation along the West Coast).

Somehow it coheres into a movie that feels like the best of all worlds rather than a derivative Frankenstein’s monster.  Up until that time, superhero features and television, hampered by juvenile subject matter and typically subpar production values, seemed inherently, inexorably campy.  The notion that a superhero movie could rise to the level of myth, or at least credibly dramatic science fiction, was unthinkable.  Superman is the proof-of-concept paradigm on which our contemporary superhero–industrial complex is predicated.

Continue reading

Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason that it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops, it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancellation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…

Continue reading

One Good Idea: Reflections on My Longest-Running Project—and Most Successful Creative Collaboration

My wife and I are celebrating twenty-five years together this winter.  God, that’s three impeachments ago.  To place it in even more sobering perspective, the January morning we’d met for our first date at the AMC on Third Avenue at 86th Street, I’d never in my life sent an e-mail.  At best, I had peripheral awareness of the “World Wide Web”—whatever that was—and certainly no idea how to access it (not that I’d ever need to).  I definitely didn’t have a cell phone—which, admittedly, would’ve come in handy, seeing how I was running late to meet her.

But we met that morning just the same; in those days—don’t ask me how, for this secret, like the whereabouts of Cleopatra’s tomb, is permanently lost to history—folks somehow met up at a prearranged location without real-time text updates.  It’s true.  We met many more times over the month that followed, in many more locales around the city:  Theodore Roosevelt Park on the Upper West Side; Washington Square Park in the Village.  (Public parks are a godsend for penniless students at a commuter college.)  It was cold as hell that winter, but I never cared; I was happy to sit outside in the bitter temperatures for hours—and we did—just to be with her.  By February, we were officially inseparable—and have remained so ever since.

A lifetime has passed since then, one in which, hand in hand, we’ve graduated college, traveled to Europe on several occasions, moved across the country (on September 11, 2001 of all cosmic dates), weathered the deaths of a parent apiece, eloped in Vegas, cared for twenty-four different pets (mostly fosters), consciously practiced patience with and developed deeper appreciation for one another during this indefinite interval of self-quarantine (we haven’t seen our immediate family on the East Coast since Thanksgiving of 2019), and perennially quoted lines from GoodFellas to one another… because, well, GoodFellas might be the only thing that’s aged as well as we have.

Goodfellas Joe Pesci and Ray Liotta at the Jackson Hole diner in Queens, the site of many of our dates in the college years

She’s certainly aged preternaturally well in her own right.  She was the prettiest girl at school—no minor triumph, given that there were almost 20,000 students at Hunter College at the time—but she’s impossibly more beautiful today.  When I glance in the mirror, however, I in no way recognize the kid that fell in love with her all those years ago.  This is a good thing.  I think—I hope—I’m a much better man today than I was then.  Kinder; more compassionate; more sensitive; more patient.  Certainly wiser.  And hopefully more deserving of the love she’s given so freely and steadfastly.  Indeed, hopefully that above all else.

All the best ideas to grace my life over the past quarter century have been hers, without so much as a lone exception.  Long before I took up blogging, she’d encouraged me to do so.  I was such an incorrigible Luddite, however; furthermore, I reasoned it would be a distraction from my “real” work:  screenwriting.  Now I wish I’d swapped screenwriting for blogging years earlier.  The former—my bright idea—made me miserable; the latter has helped me know myself immeasurably better.  I am an exponentially better writer for this continuing project—the one she had the wisdom to suggest years before I could see the value in it myself.  She didn’t hold that against me, though; she even set up my WordPress domain.  Now you get to read all about the esoteric bullshit she alone used to entertain over dinner.

Continue reading

All That You Can’t Leave Behind: On Memories, Memorabilia, and Minimalism

Here’s the story of a lifelong packrat’s unlikely conversion to minimalism.


Concert tickets.  Refrigerator magnets.  Christmas ornaments.  Comic books.  Trading cards.  Greeting cards.  Bobbleheads.  Bank statements.  Photo albums.  Vinyl records.  Shoes.  Shot glasses.  Jewelry.  Blu-rays.

What does the stuff we collect, consciously or unconsciously, contribute to the story of our lives?

And… what does it mean for us when there’s less of it?

Photo credit: Ticketmaster blog, June 26, 2015

In an opinion piece that appeared in the New York Times earlier this month, columnist Peter Funt laments the obsolescence of analog mementoes in a Digital Age:

And so ticket stubs join theater playbills, picture postcards, handwritten letters and framed photos as fading forms of preserving our memories.  It raises the question, Is our view of the past, of our own personal history, somehow different without hard copies?

Peter Funt, “Does Anyone Collect Old Emails?,” Opinion, New York Times, April 5, 2019

In recent years, I’ve expanded this blog from its initial scope, an exclusively academic forum on storytelling craft, to chronicle my own personal history, often in no particular order.  I am ever and always in search of a clearer, more complete, more honest perspective on my past, and how it has shaped the narrative arc of my life; I mine my memories regularly for content, and for truth.

I have also routinely expressed apprehension about the practices we’ve lost in a Digital Age, the kind to which Mr. Funt refers, particularly as that applies to the corrupted discipline of storytelling itself:  From the superhero crossovers of the “Arrowverse,” to the literary Easter-egg hunt of Castle Rock, to the expansive franchising of Star Wars, today’s popular entertainments are less concerned with saying something meaningful about the human condition than they are with challenging the viewer to catch all their internal cross-references.  Whereas stories once rewarded audiences with insight, now the reward is the esteemed privilege of calling oneself a superfan—a participatory designation earned by following all the breadcrumbs and connecting all the dots… an assignment only achievable if one never misses a new installment:

In a nod to the subscription model of consumption—where we lease cars or pay monthly to a music service—the extended narratives of prestige TV series spread out their climaxes over several years rather than building to a single, motion picture explosion at the end.  But this means energizing the audience and online fan base with puzzles and “spoilers”. . . .

. . . The superfan of commercial entertainment gets rewarded for going to all the associated websites and fan forums, and reading all the official novels.  Superfans know all the answers because they have purchased all the products in the franchise.  Like one of those card games where you keep buying new, expensive packs in order to assemble a powerful team of monsters, all it takes to master a TV show is work and money.

Douglas Rushkoff, Team Human (New York:  W. W. Norton & Company, 2019), 163

Fanboys and -girls thought they were legitimized when the geek subculture went mainstream—when superheroes and sci-fi went from niche hobby to pop-cultural monopoly—but they were really just commodified:  “geek” shifted from a stigmatized social category to a lucrative economic one.  Leveraging our telecommunications-induced FOMO, a new permutation of commercial narrative was contrived:  the “mega-franchise,” which seeks not our intermittent audience, but rather our habitual obedience.  Sure, you may not have even liked the last four Star Wars or Terminator or Transformers movies… but do you really wanna run the risk of skipping this one?

More is more: Every “Star Wars” character has its own backstory and action figure—collect them all!

So, given those two ongoing preoccupations—personal history and receding traditions in the Digital Age—the thesis of “Does Anyone Collect Old Emails?” would’ve spoken to me regardless, but the timing of it was nonetheless uncanny, as I have devoted no small degree of consideration in recent months to the matter of the physical objects we amass, wittingly or otherwise, and how they tether us to the past.  Here’s the story.

Continue reading

© 2024 Sean P Carlin
