Charlotte Howell – Antenna: Responses to Media and Culture (http://blog.commarts.wisc.edu)

Straddling the “Edge”: The Invisible Trend of Religion on TV
http://blog.commarts.wisc.edu/2015/03/04/straddling-the-edge-the-invisible-trend-of-religion-on-tv/
Wed, 04 Mar 2015 15:00:54 +0000

[Image: Lost’s Last Supper]
One of the most compelling trends in American television programming at the moment is almost never even seen as a trend. Shows in various stages of development or production that feature religious topics and imagery include Constantine on NBC, Dig on USA, A.D. on NBC, Preacher on AMC, Lucifer for Fox, Black Jesus on Cartoon Network, a Ten Commandments-based series for WGN and another for NBC, American Gods on Starz, Daredevil on Netflix, and Hand of God on Amazon; the list goes on and on. Across broadcast, basic and pay cable, and online streaming platforms, there is a wealth of series dealing with spiritual stories, using specific religions’ dogma, featuring Biblical characters, and translating religion into mythos.

So why are these elements ignored in trade news and minimized in promotional materials? Have the press and industry failed to recognize this as a trend, or are they deliberately downplaying a widespread development across the TV landscape? With religion growing on fictional television, why is this shift so difficult for the press and PR to acknowledge?

We regularly hear talk of television’s greater edginess: its willingness to engage with more explicit language, sexuality, and violence. Yet when it comes to religion, things get more complicated. Since the neo-network era, “edge” has been a leading logic of the television industry, a way to gain the attention of desirable, affluent, niche audiences who are thought to seek programs distinctive in some way from the mediocre mainstream. Since the 1980s, the concept of “edgy” has accumulated many additional markers of distinction. From NYPD Blue’s notable nudity and curse words to South Park’s free-for-all offensiveness, the taboos of language, representation, violence, and sexuality have faded. Religion, however, has remained an exception. When religion appeared, it did so in generic, sanitized terms or in sensational single-episode stories that avoided faith-based specificity.

In 1990, Horace Newcomb described religion as represented on television as “the deeply, powerfully embedded notions of the good that must come from . . . somewhere” that nevertheless avoided specifics of belief. Little changed in how religion was featured on television until the mid-2000s, when Battlestar Galactica, Lost, and the long arm of The Passion of the Christ’s success enabled a period of multiple attempts at religiously themed television shows. At that moment, the press noticed the pattern: Variety and The Hollywood Reporter both ran articles examining the “hot topic” of religious content for television, putting shows like Wonderfalls, Joan of Arcadia, Miracles, and Revelations in relation to each other and to wider industrial vicissitudes. Aside from a few successful shows that lasted multiple seasons, however, this mid-decade trend died, and so too did the industry’s willingness to discuss religious content as a programming trend. It is unclear why an industry that was able to make these links stopped drawing the connections explicitly and chose instead to ignore the trend, but the big gamble and big loss of Kings seems to have been the turning point toward skittishness.

Significantly, whereas Deadline has no problem identifying new trends pertaining to romantic comedies, movie adaptations, and medical dramas—regardless of how many of these series get greenlit or survive for longer than a handful of episodes—few articles appear regarding the increasingly widespread presence of religious series across the television landscape. If such series are discussed, as in this TV Guide article, Biblical series are foregrounded while most science fiction series are left out. (Whither the Sleepy Hollow mention, TV Guide?)

Religion may be perceived as “edgy,” or at least risky, in a business sense: it is cast as somewhat dangerous in an industrial context. Many industry workers either don’t want to talk about it or deflect to bigger “spiritual/humanist” questions. Even when writers use Revelation in a specifically Protestant iteration as the key to a show’s ongoing mythology, they remain careful to couch it among other mythologies that appear only once. But religion on TV is the wrong kind of edgy for how the shows, studios, and networks conceive of their target audience. As young, wealthy, and coastal Americans are identified as increasingly secular, spiritual, or non-religious by Pew research and through anecdotal encounters, religion, and particularly Christianity, which is the main wellspring for this content, is increasingly thought of as belonging to old, poor, Heartland Americans, i.e., not the desired consuming audience for many of these shows. Moreover, appealing to such an audience is cast in opposition to “edge.” Thus, the industry straddles a fine “edge”: on the one hand, networks use Biblical adaptations to capture the ratings of Heartland viewers; on the other, they downplay the religious elements to make the case to advertisers that the “right” kind of audience can be attracted to their other shows, while maintaining that those viewers won’t be alienated.

In this recent spate of shows, the only notable example of a series exploiting its religious content to foreground its edgy bona fides is on Amazon. Continuing to cast itself as the place to go for television that could not appear anywhere else, Amazon Studios picked up Hand of God during its August 2014 pilot season. The series wins at edgy bingo: the pilot’s main plot features a corrupt judge who becomes a born-again Christian following the brutal beating of his son and the rape of his daughter-in-law by an assailant whom he then identifies via “visions” from God. The judge then conscripts a violent disciple to kill in the name of God. The characters curse freely, the violence is graphic, and drug use is commonplace. Yet it is the exploration of corruption in religion that sets this show apart from others in this recent trend. In bucking the industry’s insistence on downplaying religion as a key narrative element, Hand of God found the “edge” in religion. But you wouldn’t know it from the trade press coverage of it.

Report from: Generation(s) of Television Studies
http://blog.commarts.wisc.edu/2013/04/16/report-from-generations-of-television-studies/
Tue, 16 Apr 2013 14:00:42 +0000

The Generation(s) of Television Studies symposium, held at the University of Georgia last Friday, made visible just how influential Horace Newcomb has been to the field. Over the course of the afternoon presentations, his TV: The Most Popular Art was invoked as a “heretical text” in the context of a film studies program, as an important intervention, and as the ur-text of American television studies. His students and colleagues spoke of Newcomb’s generosity and humility, even as his influence over the field was apparent in every speaker and every presentation. As Tom Schatz pointed out, though Newcomb may be embarrassed to be called a “father of television studies,” his former students now have their own generation of advisees to whom Horace Newcomb is an academic grandfather.

The generations of television studies scholars formed the structure of the day’s events: the morning was devoted to small-group and two-on-one workshopping of a few select graduate student papers with the visiting scholars. The afternoon session of scholarly presentations was devoted to the generation of television studies, a genealogy of the field with Horace Newcomb at the center. Organized by Jay Hamilton and UGA graduate students Evan L. Kropp, Mark Lashley, and Brian Creech, the symposium marked Newcomb’s retirement as both a full-time faculty member and director of the Peabody Awards. The slate of presenters reflected the celebration of both the man and the scholar: colleagues David Thorburn and Tom Schatz, and former students Amanda Lotz, James Hay, Alisa Perren, and Jeff Jones.

Lotz and Jones, both former students, focused on the integration of Newcomb’s contributions into a common sense of television. Lotz began the afternoon session by discussing Newcomb’s article “Magnum: The Champagne of TV?” as a useful map to the field of television studies. She cited the article as the first use of the term “cumulative narrative” to describe the metaplot that extends over a full series yet is separate from seriality, and she used that concept to articulate the cumulative narrative of television studies, beginning from “Magnum: The Champagne of TV?” Within the article she could see the metaplot of the field, including the politics of pleasure, the negotiation of narrative technique within a production economy, the provision of an alternative to rigid ideological analysis, and the way the various aspects of television under study exist in conversation with one another. Where Lotz reflected on how the Magnum article constructed a way of studying television that has become generally normalized for her, Jeff Jones expanded that normalization of Newcomb’s ideas into a general ontology of television through his focus on the concept of the cultural forum. Although the idea of television as a cultural forum has become so commonsense that it can sometimes seem irrelevant, Jones argued that it remains central to the way we understand television, citing recent attributions of changing popular sentiment on LGBT rights to its televisual representations.

Schatz and Perren took up Newcomb and Alley’s The Producer’s Medium as a significant influence on how television studies negotiates questions of authorship. Schatz focused on the tension between the film-studies mode of auteurism and the importance of writers and producers in television, and he peppered his talk with anecdotes of his own friendship and colleagueship with Newcomb as an example of how film studies and television studies converge more than is sometimes thought. Perren likewise articulated the significance of The Producer’s Medium while calling for contemporary scholars and discussions of showrunners to continue to learn from that text. She argued that The Producer’s Medium positions producers/showrunners as a baseline for understanding broad continuities and changes in television, and she noted that many issues of cultural gatekeeping still surround that powerful position.

Hay and Thorburn turned to the future of television studies and how Newcomb’s contributions to the field have paved the way for many possible avenues of media studies. Hay focused on how the intellectual formulation of television studies might lead the way to media studies of formerly invisible media, like smart appliances. He laid a hypothetical path from TV: The Most Popular Art toward a “critical refrigerator studies.” Thorburn, however, saw a full stop to the television that Newcomb had studied. He said that the first great age of television is over and that the next great age will be profoundly different. Regardless of what the next age of television looks like, however, he positioned Newcomb as a great scholastic gardener, never trying to recreate himself but instead cultivating a forest of scholars who will be able to tackle this new era.

At the close of the session, Horace Newcomb spoke about his history with television, both as a scholar and as a teacher. As one of those “grandchildren” of television studies, I found it inspirational and electrifying to see Newcomb speak so passionately about himself, the medium, his work, the field, and his position in it. “I write about television because it changed my life,” he said; “growing up in Mississippi in the 1940s and 1950s, television gave me a different world . . . television is practical politics.” He ended the day by articulating his educational theory by way of Walt Whitman’s “Song of Myself”:

I am the teacher of athletes,

He that by me spreads a wider breast than my own proves the width of my own,

He most honors my style who learns under it to destroy the teacher.

Between the morning workshops and the afternoon papers, the colleagues, scholars, mentors, and mentees gathered in Athens, Georgia, stood as proof of that width, honoring Horace Newcomb by spending the day engaging with the field he helped shape.

This post is part of an ongoing partnership between the University of Wisconsin-Madison’s Antenna: Responses to Media & Culture and the Society for Cinema & Media Studies’ Cinema Journal.

Mom Enough?: The Return of the Absentee Mother as Threat
http://blog.commarts.wisc.edu/2012/05/29/mom-enough-the-return-of-the-absentee-mother-as-threat/
Tue, 29 May 2012 13:00:50 +0000

[Note: The following post discusses the first season finales of Alias, Grimm, and Revenge, and thus contains spoilers for those episodes.]

There is an unwritten rule in dramatic television, particularly in shows whose genres create unstable realities for their characters: no one is really dead until you see a body. Through supernatural or soap-operatic machinations, characters previously believed to be dead can act as a Chekhov’s gun waiting to go off, upending a protagonist’s worldview and often destabilizing their essential sense of self.

Ten years ago, Alias pulled the trigger on that narrative gun by ending its first season with a shadowy figure in a doorway and a handcuffed and beaten Sydney Bristow looking to her captor and asking, “Mom?” Audience members had known that Sydney’s supposedly dead mother, Irina Derevko, was a Soviet spy and potentially very much alive, but the final moments of the season revealed her as a direct and ongoing threat to Sydney. She was “The Man,” the season’s big bad. The question “Mom?” mixed hope and terror, as the revivification of the maternal was wrapped in violence, a threat left unclear over the four-month summer hiatus.

Now, almost ten years later, two more first seasons of television have ended with similar revelations: NBC’s Grimm and ABC’s Revenge. All three shows are set in narrative worlds where twists, threats, and threatening twists are commonplace, tied to ongoing serial mysteries and generic conventions. There is nothing necessarily new about a character’s surprising return, but the particular attention to the absent mother’s return in threatening form in last week’s two finales taps into a current and contentious discourse of motherhood: attachment parenting.

This recent TIME Magazine cover image and the accompanying story discuss attachment parenting as both physical and emotional closeness between mother and child during the child’s formative years. The image on the cover represents an extreme example of that method, in which a child is breastfeeding well past the normative time frame. Underlying this form of parenting is a reaction against absenteeism and an implicit critique of distance between mother and child. It is this criticism that links to the threatening fictional mothers on Alias, Grimm, and Revenge. Death appears to be the only legitimate reason for an absent mother, and when that death is revealed as a lie, the mother becomes a threat to the child.

Grimm’s first season finale, “The Woman in Black,” follows protagonist detective and creature-hunting Grimm Nick Burkhardt as he is threatened by a man who was involved in his parents’ murder. The eponymous woman operates one step ahead of Nick, the police, and the assassin, outwitting, outrunning, and outfighting all of them before revealing her identity as Nick’s supposedly dead mother. Although the reveal tempers her threatening characteristics, at least toward Nick, the majority of the episode portrays her as a powerful, shadowy figure not to be trusted. She poses a potential physical threat to Nick by being a clearly better fighter than he is (in a few seconds she fells the man he had battled for the previous five minutes), but she also represents a threat to his understanding of self and purpose. If his parents, and particularly his mother, through whose blood the gift/duty of being a Grimm was passed to him, were not sacrifices to the Grimm duty and name but were/are instead hiding from it or waging their own separate war, how can Nick reconcile his recent acceptance of the mantle? It is as yet unclear whether Nick’s mother will live up to her threatening title as the Woman in Black and join the other monstrous women of the show, or whether the reference to Susan Hill’s recently adapted novel is merely happenstance. The implication, however, seems to be that there is something seriously wrong with her if she would distance herself from her son when he was a child.

Unlike the reveals of the two mothers discussed above, the revelation on Revenge that Amanda Clarke’s mother may still be alive does not pose a physical or immediate threat to her daughter, but it shows a similar potential for existential crisis. What happens to Amanda’s singularly focused drive for revenge when she may have family yet to love and lose? Alternatively, what role might Mrs. Clarke play in the vast conspiracy that instigated Amanda’s vengeance? Might she be “The Man” behind the Initiative? How bad could she have been to warrant a faked death and total isolation from her young daughter? We are only told that there is more to Mrs. Clarke’s alleged death than we know, and that David Clarke didn’t even bring a picture of his allegedly dead wife when he moved to the Hamptons. Until next season, she is an empty vessel for viewer supposition, the equivalent of a shadow in a doorway, a maternal threat cultivated through absence, a bomb waiting to go off.

Award Winning: The 84th Annual Academy Awards
http://blog.commarts.wisc.edu/2012/02/28/award-winning-the-84th-annual-academy-awards/
Tue, 28 Feb 2012 14:08:31 +0000

I suddenly have a strong desire to go buy tickets to a movie at my local theater and ask everyone I know to go with me and cancel my Netflix account and–wait a minute. Maybe that’s just the Oscars talking, literally. Though I watch the Academy Awards telecast every year, this year seemed more explicit about its message than any in recent memory: go (back) to the movies, America.

The motif of Hollywood’s glory days comprised a large part of the Oscar discourse between the nominations and the awards ceremony, mostly centering on the multiple nominations and odds of winning for The Artist and Hugo (though there was also some discussion of Viola Davis in relation to the legacy and legend of Hattie McDaniel’s Oscar experience). Many heralded this year as indicative of Hollywood eating its own tail, creating an ouroboros of self-congratulation and nostalgia for the medium’s history. This claim was further supported by the LA Times investigation into the demographic composition of the Academy. The results were unsurprising: it’s mostly white men who were alive when cinema was still the dominant American entertainment form. All of this led us to last night, an Oscar ceremony that seemed to hammer one clear message into the audience at home: “Movies are great, but they’re even better when experienced in a movie theater. But don’t take our word for it; instead, take the word of thirty or so movie stars, a jaw-dropping Cirque du Soleil act, and a bevy of blue-silk-clad, leggy cigarette girls-cum-ushers who will entice you with free popcorn before the ad break.”

Through Billy Crystal’s continual references to his eight other hosting gigs, the admittedly gorgeous art deco “movie palace” set design, and a decidedly skewed attention to pre-1990s films in the various salute-to-the-movies montages, this year’s Oscars felt like they were desperately seeking a halcyon past. While this is often the case with the Oscars, perhaps more than with any other major awards ceremony, this year’s nostalgia carried a strongly economic undercurrent. It wasn’t so much about the movies of that bygone era as about the mode of exhibition and the patronage of a mass audience who treated a trip to the movies as a unique and desired cultural experience and, more importantly, who paid for that experience.

The Oscars have never seemed so baldly self-promotional to me before, which is perhaps why the irreverent moments, though few and far between, seemed all the more charming. Such moments make these awards shows the cultural events they are, drawing on the promise of liveness. The Oscars broke out of their commercial shell (or at least acted enough like they had) when Octavia Spencer was so overcome at her win that she could barely make it to the stage, when the winners for best editing didn’t try to fill in for their speechlessness and instead said thank you and “let’s get out of here” and did just that, when Emma Stone swayed onstage enticing, “Let’s dance. Let’s dance,” to convey her excitement, and when some wonderful audience plants shouted “Scorsese” during a Bridesmaids cast presentation, forcing Melissa McCarthy and Rose Byrne to pull mini-bottles of vodka from their decolletages and swig, per their SAG drinking game.

SAG Awards Drink to Scorsese, Celebrate Union Merger
http://blog.commarts.wisc.edu/2012/01/30/sag-awards-drink-to-scorsese-celebrate-union-merger/
Mon, 30 Jan 2012 17:07:51 +0000

The Screen Actors Guild Awards are a bit of an oddity among the standard awards shows. They don’t have the glamour of the Oscars–though perhaps they do have a higher concentration of movie stars–and following so closely after the Golden Globes, their revelry appears more in the realm of office party than gala event. This year, the SAG awards seemed a bit looser, sporting an atmosphere that made the Golden Globes seem uptight in retrospect. Perhaps this is because there was no “outsider” like Ricky Gervais that the crowd felt they had to guard themselves against, or perhaps Guild members perceive the difference between airing on NBC versus on TBS and TNT as license to let their collective hair down. Or maybe the Bridesmaids cast drinking game was put to good use at the dining tables (at least by the end of the night, a number of attendees raised their glasses at Steve Buscemi’s legitimate mention of Martin Scorsese).

Whatever the reason, this year’s Screen Actors Guild Awards ceremony crystallized its role during awards season: it’s a semi-insular, half-sober, self-congratulatory vocational celebration that embraces the paradox of a union made visible through millionaires. The show kicked off with a strange example of this paradox as Jon Cryer, Demian Bichir, Emily Watson, and Jim Parsons proclaimed “I am an actor.” They were celebrating their profession while simultaneously seeming to argue against some unseen attack on their chosen field, surrounded by opulence but implying labor solidarity.

More than anything else, what sets the Screen Actors Guild Awards apart from their red-carpeted brethren is their celebration of the union, a narrative that seemed especially emphasized during this year’s ceremony. At its base, the Screen Actors Guild is a professional union with a mission to “enhance actors’ working conditions, compensation and benefits and to be a powerful, unified voice on behalf of artists’ rights.” The first half of that mission statement sometimes gets lost in the much more visible second half, but this year’s awards emphasized the Guild’s efforts for working actors (not just the stars seated in the audience) with a “note of appreciation” to all the branches of the Guild, called the SAG story. In the video presentation that played during the first half of the ceremony, brief scenes from a variety of (mainstream, Hollywood, and critically acclaimed) movies featured local actors and the big-name directors who lauded their work. Ben Affleck described his Boston actors as “so perfect and so real,” and Robert Redford spoke of two Georgia actresses who were “right on” with their characters in The Conspirator. The air of authenticity hung heavy over the segment until the final “local” actor appeared: “Mike Tyson, Las Vegas.” The implications of this apparently comedic turn are unclear but seem to speak to the idea of the SAG Awards having it both ways: movie stars and labor solidarity. Admittedly, I haven’t watched the SAG Awards every year, but the union themes seemed especially pronounced this year, a year in which labor unions across the country faced threats and recalled strength from solidarity. SAG even created a number of brief videos affirming its support of labor last April, one of which I’ve included below.

In addition to the general state of labor over the last year, the focus on it during the ceremony was a particular consequence of the state of the union itself, specifically its merger with the American Federation of Television and Radio Artists, which was approved by both unions’ boards this weekend. SAG President Ken Howard announced this progress during the ceremony and noted that the next and final stage would be ratification by the members. It was a moment that couldn’t help but remind the audience that SAG is a union, complete with bureaucracy, power positioning, and coalition building.

Yet amongst all the reminders and defenses of labor, the looseness of the awards prevailed in small moments: the brilliant cast of Bridesmaids contributing to the generally boozy atmosphere, Dick Van Dyke’s affable surprise at an ovation for his mere presence, Jean-Ralphio (Ben Schwartz) whispering words of encouragement to Michael C. Hall after the latter’s loss, Tina Fey drinking Steve Buscemi’s wine, and, of course, Larry Hagman’s comically large hat.
