Antenna – Responses to Media and Culture
http://blog.commarts.wisc.edu

Magical Realism and Fictional Verisimilitude in Medellín
http://blog.commarts.wisc.edu/2015/09/07/magical-realism-and-fictional-verisimilitude-in-medellin/
Mon, 07 Sep 2015

poster for Narcos with Central and South America outlined in cocaine

Narcos is a new Netflix original series that premiered all 10 of its episodes just before the start of the network TV fall season. It centers on a U.S. DEA agent in the 1980s who is sent to Colombia to follow the constantly expanding, multi-million-dollar cocaine trade to its sources. He soon encounters the infamous Colombian drug lord Pablo Escobar, and the show follows the police and DEA agents’ attempts to capture and kill him. Three things struck me in the first ten minutes, all of them indicative of the type of show Narcos is:

split image: on left, Pablo Escobar as played by Wagner Moura; on right, Pablo Escobar, historical shot

(1) the layered framing: We start with a voiceover comparing present-day NSA surveillance with the much more primitive methods used in the ’80s, only to move back yet again to the early ’70s and a sharply abbreviated history of Pinochet’s military coup. The voiceover narration evokes films like Goodfellas and Casino, gangster narratives told after the fact. In the end, however, the voiceover keeps the viewer at a distance and often carries the burden of exposition for the highly complex historical and political realities that form the backdrop to the moments we see on screen. In fact, at times it feels like the complexities of reality are at odds with the need for characterization, if not sympathetic protagonists, within this fiction.

tanks and soldiers in urban setting ready to strike. seemingly historical footage

(2) the original footage: In its brief historical summaries, the text uses archival footage to suggest the veracity of the narrative. In fact, the pacing combined with the voiceover often resembles a documentary more than a television drama. Unlike historical fiction that relies on fictional or composite characters to elucidate the truth of an era or a phenomenon, Narcos places Pablo Escobar front and center, telling his story as fact. And yet this sense of reality is undermined from the start, as the first episode opens by defining magic realism as “a realistic setting…invaded by something too strange to believe.” The layering and the verisimilitude suggest that the show is an epistemological inquiry into the truths of Pablo Escobar, truths the show endeavors to unravel slowly and carefully. The leisurely pacing and often gorgeous backdrop scenery, with its clearly marked ’80s fashion, add a level of care that supports this endeavor, all the while foregrounding and reinforcing the show’s fictionality.

face shot of Escobar with clear English subtitle, reading I am Pablo Emilio Escobar Gaviria.

(3) the language: As a native German speaker, few things throw me out of a show more quickly than Germans in Germany speaking accented English to one another (as in the recent Sense8). If I am to suspend my disbelief, they should either speak fluent English, just like any Star Trek alien does, or they should speak German and simply be subtitled. Narcos yet again tries to invoke a sense of authenticity and documentary evidence by presenting the large majority of the show in Spanish with subtitles. I use the term “tries” purposefully, because while I was initially very excited about the choice to present a show primarily in subtitles, it turns out verisimilitude only goes so far: Wagner Moura, who plays Pablo Escobar, is Brazilian (apparently he and director José Padilha are BFFs). My native Spanish-speaking friends tell me that the show is surprisingly good in its linguistic authenticity, but that cannot make up for the fact that the central character speaks Colombian Spanish with a clearly noticeable Brazilian accent. Apparently, US monolinguals or, at least, non-Spanish speakers are the primary, if not only, intended audience.

mockup web page for Medellin with Vincent Chase and Billy Walsh named. At bottom, image of dead on market square. At right Vincent Chase playing Escobar holding gun in relief.

Last but not least: I’m glad I’m not the only one who immediately thought back to Entourage’s Vincent Chase playing Pablo Escobar.


Losing Our Heads for the Tudors: The Unquiet Pleasures of Quixotic History in The Tudors and Wolf Hall
http://blog.commarts.wisc.edu/2015/06/23/losing-our-heads-for-the-tudors-the-unquiet-pleasures-of-quixotic-history-in-the-tudors-and-wolf-hall/
Tue, 23 Jun 2015

Post by T.J. West, Syracuse University

If we are, indeed, living in the Golden Age of Television, we can also be said to be living in the Golden Age of Tudorphilia (or at least a golden age, as the Tudors seem to bubble to the surface of popular consciousness periodically). From the runaway success of Philippa Gregory’s The Other Boleyn Girl to Hilary Mantel’s award-winning and critically lauded books Wolf Hall and Bring Up The Bodies, the exploits of Henry VIII and his six wives, as well as everyone caught in the crossfire, have re-entered the popular cultural landscape with a vengeance. We seemingly cannot get enough of the Tudors. In this essay, I would like to explore some of the aesthetic and ideological functions of two particular iterations of this obsession with England’s most (in)famous dynasty: Showtime’s The Tudors (2007-2010) and the BBC and Masterpiece Theatre’s Wolf Hall (2015), the latter based on Mantel’s two books on the life of Thomas Cromwell. Rather than chiding these series as mere escapism or condemning them for distorting Tudor history (both of which may be true to some degree), I would like to argue that they can actually tell us a great deal about not only how the early modern appears in contemporary popular culture, but also why it appears and what it can tell us about how we engage with the historical past.

Of course, both The Tudors and Wolf Hall partake in a long tradition of re-imagining the Tudor court for the contemporary imagination. Alexander Korda’s The Private Life of Henry VIII (1933) solidified the image of Henry as a villainous glutton who devours both chicken legs and wives with the same abandon (an image due, in no small part, to the corpulent persona assiduously cultivated by Charles Laughton). Other actors would bring different levels of complexity to the role, including Richard Burton’s brooding and Byronic persona in Anne of the Thousand Days (1969) and Eric Bana’s gruff and dangerously handsome interpretation in The Other Boleyn Girl (2008).

Cue Jonathan Rhys-Meyers, who strides onto the set of The Tudors chewing scenery and shedding clothes. Exuding his signature mix of sultry sexuality and brat-prince antics, Rhys-Meyers portrays Henry as less the erudite and thoughtful scholar-king and more the unruly id that constantly threatens to overwhelm the bounds of the narrative designed to contain him. His excessive and sometimes capricious sexual desires cause chaos at the personal, social, and political levels, leading to more than one ignominious death on the scaffold.


While men certainly do fall victim to Henry’s mercurial changes of temper, it is the women who truly bear the brunt of his sexual whims. While The Tudors contains many scenes of female nudity, the camera often focuses just as intently on the anguished expressions of Henry’s various consorts, particularly Katherine of Aragon and Anne Boleyn (portrayed by the immensely talented Maria Doyle Kennedy and Natalie Dormer, respectively). The first two seasons in particular draw conspicuous attention to the ways in which female bodies and sexuality serve the double-edged function of allowing access to power while also becoming points of vulnerability, for in the world of The Tudors—as in so many other dramas in the cable television world—women’s bodies remain a commodity that can be easily acquired and just as easily cast aside once their “usefulness” is expended. The many close-ups of Katherine’s face register the emotional and mental anguish she suffers as a result of her eclipse by Anne, and the camera likewise focuses on the latter’s face after her eventual fall from grace. While the series alludes to the momentous political and social changes that surround the events of Henry’s court—the annulment, after all, eventually became part of the broader Protestant Reformation—these changes are mapped onto the suffering female body.

If bare flesh, sexual romps, and the anguished female body stand as the aesthetic markers of The Tudors, dim lighting, claustrophobically tight spaces and sinister whispers are those of Wolf Hall. While less explicitly concerned with the rampant sexual escapades of the Tudor dynasty, this latter drama remains just as invested in digging into the grim, dark underbelly of Tudor glamour. Death and a general precariousness of life are a consistent feature of this tightly-plotted vision of Henry’s reign. Death here can come in many forms, whether as the sweating sickness that claims the lives of Cromwell’s wife and daughters within the first episode or the despair that takes hold of Cromwell’s mentor Cardinal Wolsey as he tumbles out of Henry’s orbit and into ignominy.

While Damian Lewis may not have the smoky, pin-up good looks of Rhys-Meyers, he has his own brand of handsomeness, and it is worth noting that he actually looks as Henry is supposed to have looked, with his fiery-gold hair and fair skin. Likewise, Lewis makes for a more charismatic and likeable Henry, one who does not fall so easily into the realm of sultry camp that always threatens the seriousness of The Tudors. However, it is precisely this charisma that makes this Henry so dangerous and that makes him the perfect foil for Mark Rylance’s more dour and dark Cromwell. This Henry can turn from laughing and light-hearted to dangerously lethal in the blink of an eye, his radiant and sunny personality a mask covering a truly sinister persona just awaiting its chance to strike. As the series progresses, we see that caprice strike down several men and, while Cromwell has so far managed to rise above the bodies, anyone who knows their Tudor history knows that, inevitably, Henry’s sexual desires will once again destroy one of his most faithful councilors.

Clearly, both The Tudors and Wolf Hall remain invested in depicting the Early Modern world as dangerously and exotically other than the world that we currently inhabit. In their own ways, each of these series attempts to tame that dangerousness—to render it intelligible and contained—through the moral codes of melodrama (The Tudors) or the explanatory power of narrative and “literary” historical fiction (Wolf Hall). At the same time, however, they also contain within them a (perhaps unwitting) acknowledgment of the perilously undisciplined nature of both the past and sexuality. While both appear to have been tamed by the discourses we have designed to discipline them and to render them intelligible, there always remains something about them that slips away from us, unknowable, ungraspable, and ultimately ineffable. It is precisely these elements that make the Tudor period so unquietly pleasurable to watch, reminding us of the perilous Quixotism of history.


Why Care About Radio Broadcast History in the On-Demand Digital Age?
http://blog.commarts.wisc.edu/2014/11/17/why-care-about-radio-broadcast-history-in-the-on-demand-digital-age/
Mon, 17 Nov 2014

Post by John McMurria

As the Radio Preservation Task Force embarks on a collective effort to identify and make publicly accessible radio broadcast recordings and the documents that inform their contexts of creation and use, it is worth asking why we should care about these historical archives beyond their value as traces of the past. Indeed, a pervasive discourse among cultural commentators and media scholars defines the significance and status of our contemporary media culture as a “post-broadcast” or “post-network” temporal break from a media culture that emerged out of the radio broadcasting era of the first half of the 20th century. Frequent tropes invoked to describe this temporal break as progressive, liberating, or even revolutionary include those pertaining to media source (from a few to many), media quality (from a lowest-common-denominator mass culture to a plethora of taste-diversified niche cultures), and media use (from passive reception to active engagement). Yet despite this increasingly prevalent temporal narrative, many scholars, including those who invoke the “post” to examine contemporary media culture, have increasingly problematized it, whether by recognizing the on-going prevalence of broadcast network programming in contemporary media culture or by questioning the liberatory state of our socially networked, on-demand media culture. Questioning this temporal narrative shifts the emphasis away from a technology-centric focus on tropes of progressive social change toward understanding media as material locations situated in particular places and times. Locating and making publicly accessible radio broadcasts and their supporting archival documents helps place our media past within its particular material locations while mitigating the generalized understanding that radio broadcasting’s past was a “mass” medium of little variety, low quality, and limited engagement.

Radio Preservation Task Force organizers Josh Shepperd and Chris Sterling have foregrounded the importance of place by organizing the search for radio archives on a geographical basis, so that researchers in specific locations can develop a situated knowledge of radio history in their designated areas and build relationships with the institutions and private collectors who might house radio archives. This localized research aims to expand our understanding of local and regional radio programming, an area that has been subordinated to the study of national network programming, and to reveal how localized contexts informed perceptions of national network programming. Thomas Conner, a doctoral student in the Communications Department here at UCSD, has begun a search for radio broadcasting in our neck of the woods. Of particular importance will be finding Spanish-language broadcasts that have emanated from both sides of the border. Also significant are the military broadcasts that have aired in San Diego, a city where the military has had a prevalent place in culture and the economy. While the search for these broadcasts continues, Thomas has already made exciting finds, such as 100 hours of LGBT programming on local public radio from the early 1980s, located at The Lambda Archives of San Diego. The search takes perseverance, as the vast majority of solicitations turn up no records or no response at all. Sometimes one’s own passions for radio history sustain the search: an avid Woody Guthrie fan, Thomas hopes to find recordings of the American folk singer and socialist’s local radio broadcasts from Los Angeles in the late 1930s, none of which are known to survive.

Another way in which radio broadcast history can complicate technology-centric narratives of media progress is in the area of media policy. Defining the stakes of media policy today is the debate over applying “network neutrality” regulations to broadband internet service, which would prevent internet service providers from charging for faster data speeds. But network neutrality talk draws heavily on its conceptual origins in internet utopianism and romantic individualism: the idea that if networked digital technologies for communication remained open to everyone, society would evolve beyond the corporate-controlled and homogeneous mass media of the industrial era toward a more liberated networked information era in which individuals were free to innovate, create, consume, and prosper. Evident in statements from Lawrence Lessig and Robert McChesney, two prominent public intellectuals of the media, is a Horatio Alger mythology: “most of the great innovators in the history of the Internet started out in their garages with great ideas and little capital” because “network neutrality protections minimized control by the network owners, maximized competition and… guaranteed a free and competitive market for Internet content.”

Though present in broadcast policy history, this mythology of the liberating forces of market competition was couched within a broader discourse of public ownership of the airwaves. In my research on the emergence of cable television, I found a similar discourse of the liberating forces of “pay-TV” to free viewers from the low culture of network television by creating a competitive market. But opposed to pay-TV were low-income and rural residents, who resisted a stratification of access to television that required subscribers to pay. African Americans, too, disputed the assumption that cable technology would stimulate competitive markets open to all. After two decades of having almost no opportunities to participate in the economic ownership of broadcast stations or television networks, African Americans faced similar barriers to owning cable systems, particularly once market-competition logics drove cable policies by the 1970s. These challenges to the mythology that free markets could redress class and race inequalities included rights or entitlement claims for equal access to television. Because many of these rights claims were motivated by the active pleasures that television provided in everyday life, pleasures that had been established in radio broadcasting, they also disputed official meanings of the public interest in broadcasting, including paternalistic notions that citizens needed ethical guidance to participate fully in democratic governance.

Restoring these historical contexts of pleasure and entitlement to broadcasting as a medium of public ownership is important not only for revising on-going “post-broadcasting” references to a prior era of limited variety and low quality, but also for intervening in today’s media policy debates. One voice in the broadband internet policy debate that has been subordinated to the organizing efforts of media activists and internet social networking companies supporting network neutrality is a broad coalition of civil rights organizations who support an open internet but oppose the logic and limitations of network neutrality legislation. Organized through the Minority Media and Telecommunications Council, an advocacy group that has fought to open opportunities in media for historically disenfranchised people, this coalition is skeptical that network neutrality rules, with their promise of an open marketplace, can address issues of class, gender, and race discrimination. The coalition advocates for more affirmative policies that would intervene in market forces, such as Section 706 of the Telecommunications Act, which gives the FCC authority to take “immediate action” if broadband is not “deployed to all Americans in a reasonable and timely fashion.” The coalition also recommends modeling the procedures for resolving complaints on Title VII of the 1964 Civil Rights Act, which prohibits discrimination on the basis of race, religion, and sex. This prioritizes the rights of historically disempowered people to equal opportunity, rights that are not accommodated by the free-market promises of network neutrality legislation.

Just as these rights claims of historically disenfranchised groups should not be dismissed in policy debates over our media future, we should not dismiss radio broadcasting’s past as a period that contrasts with the revolutionary status of our media present. Instead, a renewed focus on the material histories of radio broadcasting’s past can challenge us to suspend universal assumptions about open markets and attend to the localized material practices and rights claims of the historically disenfranchised.


Crowdsourcing as Consultation: Branding History at Canada’s Museum of Civilization (Part II)
http://blog.commarts.wisc.edu/2012/12/19/crowdsourcing-as-consultation-branding-history-at-canadas-museum-of-civilization-part-ii/
Wed, 19 Dec 2012

In Part I of this entry, we discussed the recent announcement by the Ministry of Canadian Heritage that the flagship Canadian Museum of Civilization (CMC) will undergo a major transformation by 2017, becoming the Canadian Museum of History (CMH). The newly revamped museum will discontinue three longstanding installations to make way for a large exhibition focused on national history, which will constitute about half the museum’s total exhibit space. As we noted in yesterday’s post, the museum’s new focus on showcasing national “achievements,” “accomplishments,” and “treasures” appears to be a way to elide objects and events from the country’s past that tell a less positive or celebratory narrative.

What is also clear from the estimated completion date of 2017 is that the government is placing symbolic considerations over practical ones. The insistence by the CMC and Heritage Minister James Moore that the renovated CMH will be open to the public by Canada’s 150th birthday indicates that they either already have significant plans in place or have dramatically underestimated the amount of time it takes to design and curate a major exhibition. This only highlights the lack of serious consideration given to the process of public deliberation. While museum staff have been holding community meetings in cities across the country to solicit ideas, their timeline makes it unlikely that such ideas can be taken seriously. Moreover, it is unclear who exactly will be included in the community meetings: citizens must express interest by clicking a link on the museum’s website and then wait to be invited.

Consider the consultation process that took place during the development of the First Peoples Hall. When initial plans were deemed “too traditional” as a result of the controversy surrounding “The Spirit Sings” exhibit in the 1980s, museum planners went back to the drawing board. While the CMC’s inaugural exhibits opened in 1989, it took the museum about 14 more years to finally open the First Peoples Hall in 2003. Among the factors contributing to the delay was the formation of a consultation committee with First Nations people in accordance with recommendations from the Assembly of First Nations and Canadian Museums Association. In the case of the First Peoples Hall, First Nations participants actually helped to shape the themes and content of the exhibit from the planning stages.

As another point of comparison, the Museum of Anthropology of British Columbia (MOA) was similarly involved in a major design and renewal project titled “A Partnership of Peoples.” Collaboration was key in MOA’s case as museum staff worked with Musqueam and other local communities to design physical and digital museum spaces. Plans were in place in the 1990s for an expected completion in 2010 to allow ample time for collaboration and creation of spaces that would be useful and accessible to those communities represented within them.

CMC curators know very well what it means to participate in meaningful consultation and the benefits it can have for curation, if taken seriously. Many of the museum’s curators are talented, creative, and in tune with some of the most important museological shifts of the last 30 years, including the sensitive issues concerning exhibition and Canada’s First Peoples. However, it is not necessarily those talented museum staff who are driving the CMC-to-CMH transition, and whether those staff will remain throughout or after this transition is uncertain.

This too is troubling. Although the CMH plans to continue to include Aboriginal histories in the new exhibit, it is likely to include them only as they pertain to the teleological drive and accomplishment-oriented focus emphasized by the Canadian Government. The Heritage Minister’s vision for the museum includes a history told from the perspective of a conservative, non-First Nations majority. This suggests a convenient amnesia about past curatorial shifts in Canada toward meaningful consultation and collaboration with First Peoples and with Canadian citizens more broadly.

In the meantime, the Truth and Reconciliation Commission (TRC) has threatened to take the federal government to court for failing to hand over documents related to residential schools in violation of the Indian Residential Schools Settlement Agreement. Despite the millions of dollars allocated to the CMH reorganization, the National Research Centre, which will house some one million records related to Canada’s Indian Residential Schools legacy in accordance with the TRC’s Mandate, remains unfunded by the government. While the CMC’s “What is the Canadian Story?” timeline currently lists the closing of the last residential school in the 1990s as a seminal event marking the nation’s history, what of the 120 years prior to that when the schools were in operation? The testimonials of residential school and intergenerational survivors about abuses in these state-mandated institutions are not in line with the discourse of national “accomplishments,” “achievements” and “treasures” which are apparently to constitute Canadian history in the new exhibit. Perhaps this is why the CMC claimed they did not have the resources to support the TRC’s collection. In this context, it seems that meaningful conversations about historical issues that are actually formative of Canadian culture are less compelling than the $25 million incentive that comes with the tunnel vision of the Ministry of Heritage.


Crowdsourcing as Consultation: Branding History at Canada’s Museum of Civilization (Part I)
http://blog.commarts.wisc.edu/2012/12/18/crowdsourcing-as-consultation-branding-history-at-canadas-museum-of-civilization-part-i/
Tue, 18 Dec 2012

The year 2017 will mark Canada’s sesquicentennial: 150 years since the British colonies in North America came together to form the Dominion of Canada. The date is eagerly anticipated by Canada’s Conservative government, which is planning a series of commemorative events. The trouble is, these events are contrived to commemorate the Conservative government far more than the nation’s glorious (or inglorious) pasts.

History appears to be a pet project of Prime Minister Stephen Harper and his elected officials. Since the Conservatives came to power in 2006, several cultural institutions have been pushed into service to articulate the government’s particular conception of Canadian culture: the twin pillars of monarchy and military. The 200th anniversary of the War of 1812, for example, has become an opportunity for the Conservatives to reframe the battle as a signal moment in Canada’s nation-building project. A budget of $28 million was earmarked for dramatic re-enactments, PSAs, a website and grade school curriculum, and an elaborate physical exhibit at the War Museum in the nation’s capital, all aiming to retrospectively situate the war as a pillar of Canadian identity. Never mind that Canada was little more than a frigid British outpost at the time; or that the outcome of that war remains a matter of scholarly and public debate. Suddenly, the government’s commitments in Afghanistan, its plans to purchase sixty-five F-35 fighter jets, and its desperate desire to thumb its nose at its American neighbor are placed on a teleological timeline whose origins can be traced to the bravery and dedication of those not-yet-Canadians in 1812.

In October 2012, Heritage Minister James Moore announced yet another project in the run-up to 2017: the rebranding and renovation of the Canadian Museum of Civilization, one of the most highly attended museums in the world, and arguably the most symbolically significant museum of Canadian heritage in the country (its collections stem from the mid-19th century and predate the founding of Canada). Its new name, the Canadian Museum of History, signals a novel mandate for the institution: Half the museum space, currently occupied by the dated Canada Hall, the Personalities Hall, and the Postal Museum, will be cleared to make way for a selection of Canadian “accomplishments” and “achievements.” So far, Mr. Moore has focused on objects as major drivers of the exhibit’s themes, pointing to iconic national “treasures” like The Last Spike, which completed Canada’s transcontinental railway in 1885, signifying the conquest of nature through human ingenuity; and items from Terry Fox’s Marathon of Hope, demonstrating the strength of the human spirit in the face of overwhelming adversity. Both, of course, are powerful, self-edifying signifiers for the nation itself. Moore’s vision for the museum has been heavily criticized in the local press, with journalists calling the rebranding “The end of civilization” and decrying the new collection as an uncritical “Hall of Fame.”

Even more egregious than this Whig version of history is the suggestion that the museum’s new role is to let visitors decide on the tenets of Canadian history. The museum’s Web site invites users to “Be part of its creation!” by clicking through to the “My History Museum” site. “What would you put in your national history museum? What stories would you tell? How would you reach Canadians across the country?” asks the site. Users are then presented with an array of options to participate in the creation of their “very own” museum. You can take the “Public Engagement” survey, which asks you to choose how you most like “to connect with history” (“Seeing real artifacts,” “Consulting websites,” Asking “people I know”?). You can scroll through the “What is the Canadian Story?” timeline, clicking the heart icon under images of the Calgary Winter Olympics or Expo ’67 to “like” different events listed on the timeline (with some of the milestones purposely removed to allow users’ suggestions to populate the gaps). You can make a video for the museum’s YouTube channel, telling the world which Canadian “you consider to be an icon for your generation” (recent votes: The Dionne Quintuplets and one user’s grandmother).

The museum is not alone in its attempts to use personal appeals to power public engagement. The digital media company ChinaOnTV recently launched “My Channel,” where visitors upload personal videos and stories about their trips to China, forming a kind of user-generated cultural diplomacy portal. Or witness the “Curators of Sweden” Twitter campaign, a fascinating if misguided effort by the Swedish government to encourage its citizens to tell the “true” story of Sweden by turning over the @Sweden channel to different, hand-picked citizens each week. Citizens duly took on their role as Swedish ambassadors by sleeping in, tweeting about their sexual proclivities, and in one extreme case, making racist and vulgar comments.

Other Canadian institutions have made similar crowdsourcing efforts. The Canadian Tourism Commission (CTC) ran a “35 Million Directors” contest, in which the “entire country” was invited to upload videos of “their” Canada. One Talkback comment on the CTC YouTube channel puts the problem in relief:

User: Although this is a very beautiful video, and I am a very proud Canadian, there was only half a frame on aboriginals…it was their beautiful land to start with.

CTC: … We’re sensitive to the concerns you’ve raised, and we can assure you that visible ethnicity in this video is solely a function of the content that was submitted by Canadians during the 35 Million Directors contest period. All Canadians were equally able to submit content for consideration in the contest.

Beyond the obvious exclusions crowdsourcing can perpetuate, there is much else wrong with this tactic. Such social media strategies are increasingly seen by government institutions as a panacea for problems of civic participation, public deliberation, and transparency. Online users not only provide the museum with content for its website; they also give it the appearance of consensus. "Soft" participation platforms like Facebook comments and interactive timelines mask the hard reality that all of the really consequential decisions – the removal of archival material, the intensely problematic indemnification of the collection, the several-million-dollar budget that will require cutting other parts of the Heritage Ministry portfolio – have already been made. If citizens were really meant to be central in deciding which themes are important in Canadian history, the government would have included them in the decision-making process at a much earlier stage. And lest we see social media as Democracy 2.0, we would do well to recall Evgeny Morozov's (2011) observation that oppressive totalitarian regimes also employ strategies of crowdsourcing in the process of nation-building, where "netizens" are made to feel as though they're participating in important decisions. It doesn't take a Hill & Knowlton associate to tell you that making people feel involved in politics is an excellent way to paper over the denial of actual political participation.

In the case of a museum, and in the case of the contested terrain of history, having the whole public involved might seem like a great idea. But in the brave new world of Governance 2.0, the invitation to Canadians to “be part of the creation” of Canadian history is an invitation to say very little that matters. And it’s the only invitation they’re likely to get.

Part 2 can be found here.


Mediating the Past: Licensing History, One Game At a Time http://blog.commarts.wisc.edu/2012/11/30/mediating-the-past-licensing-history-one-game-at-a-time/ Fri, 30 Nov 2012 16:43:54 +0000 http://blog.commarts.wisc.edu/?p=16850 **This post is part of our series, Mediating the Past, which focuses on how history is produced, constructed, distributed, branded and received through various media.

Ubisoft’s Assassin’s Creed series has always been invested in history: its central conceit is a machine, the Animus, that allows hero Desmond Miles to travel through his ancestral past unlocking details of an epic battle between the Assassins and the evil Templars. This overarching story has taken players through a collection of cities such as Florence, Venice, Rome, and Constantinople, each painstakingly recreated based on extensive research on the time periods in question.

However, with Assassin’s Creed III the series enters new territory, literally, moving its focus to North America with a game primarily set in Boston and New York during the American Revolution. The result has been an increase in discourse around the series’ approach to history, best captured by Slate’s “The American Revolution: The Game.” Lamenting the lack of realistic portrayals of the colonial era, Erik Sofge writes of Ubisoft Montreal’s bravery in embracing the underwhelming aesthetics of the period with a Colonial Boston that is “boldly, fascinatingly ugly.” Arguing that the general monotony makes the game’s standout moments all the more remarkable, Sofge commends the game, calling it “one of the best, most visceral examinations of the history of the Revolutionary War.”

When considering the Assassin’s Creed series in regards to mediating the past, its relationship to actual history—as in history as it happened—is limited. While heroes like Connor or Ezio might intersect with famous historical figures like George Washington or Pope Alexander VI, or weave their way through a particular historical event, their exploits—science fictional as they are—remain distinct enough that any label other than historical fiction would be undeserved, even if the games’ broader narratives stick largely to basic historical fact (unlike some planned downloadable content).

Indeed, the game’s developers have made a key distinction in the past: in an interview prior to the release of Assassin’s Creed II, creative director Patrice Désilets revealed, “I like to say that histories are licensed. So, yes we can take some liberties, but not too much, otherwise…what’s the point, right?” While we normally consider licensed video games as those based on other media properties—movies, television shows, comic books—the idea of licensing history is similar: the basic details of the licensed property are mediated through the principles of game design and the result is a product undoubtedly based on, but unlikely to be an exact recreation of, the original.

Of course, much as licensed games are often judged based on their authenticity (or lack thereof), Ubisoft Montreal must face questions of historical accuracy; this could lead to Sofge’s effusive praise of the developer’s commitment to representing this period in history, or The Globe and Mail’s objections to the game’s alleged suggestion that “indigenous peoples rallied to the side of the colonists.” Regardless of whether either argument has merit—ignoring that the Globe and Mail argument shows zero evidence the authors have played or even researched the game in question—they reflect the risk and reward in licensing something as meaningful as history, particularly American history: put another way, these same conversations weren’t as visible when the series was licensing European histories less familiar to the world’s largest gaming market.

Although the final judgment on Assassin’s Creed III’s historical accuracy lies in the game itself (which I haven’t played yet), I’m more interested in how the licensing of history is understood within pre-release hype surrounding the release of each game. Ubisoft’s claim to pure historical capital is tenuous, particularly before people have played the game, but they can more readily make a claim to gaming capital as it relates to historical accuracy. The accomplishment is not in the end result—the accuracy of which we can’t even really judge, given we didn’t live in the 18th century—but rather in the painstaking efforts necessary to recreate the minutiae of 18th-century Boston on an immense scale worthy of today’s high-powered consoles. Previews and interviews allow the developers to make this labor visible, detailing the process whereby historical capital is translated through—or, more critically, disciplined by—the logics of game design in order to create the most satisfying experience for the player while nonetheless maintaining that direct line to historical capital.

Concept art for the younger version of Connor featured in ACIII.

This notion of discipline is important here, though: while history brings potential cultural capital, disciplining that history brings cultural consequences, particularly in a game that features a Native American hero (Connor's mother was of Mohawk descent). Ubisoft's narrative, however, firmly encloses the discussion of Native American representation within the rigor of the development process rather than the gameplay experience of the final product. Sofge's article details the studio's cultural sensitivity through its use of a Mohawk liaison and its commitment to authentic Mohawk dialogue, effectively reporting Ubisoft's due diligence rather than how it manifests in gameplay; a similar emphasis on process, rather than product, is evident in Matt Clark's profile of ACIII writer Corey May. Within game development this level of cultural accommodation is considered remarkable, allowing Ubisoft's stated commitment to accuracy to potentially circumvent any criticism of how the game's design complicates its representation of indigenous peoples (the details of May's diligence alone seem to impress the professor of Native American studies consulted for Clark's article, who has not played the game in question).

As with all licensed properties, then, Ubisoft carefully manages the place of history within the Assassin's Creed series, much as the series itself is closely managed as the developer's most successful franchise. When Ryan Smith suggested a roaming art installation featuring works inspired by the game's historical period was more apt to comment on the revolution of the Occupy movement than the generic "Get Out The Vote" connection on offer, a Ubisoft representative responded, "I think that we have tried not to read into this that much, to be honest. At the end of the day, this is just a video game." The cultural capital of licensing history has value up until the point that it becomes politicized or problematized, at which point Assassin's Creed III ceases being about history and focuses exclusively on the gaming capital to which Ubisoft can more comfortably claim ownership.


NYFF 2012: History Has Many Cunning Passages [Part Three] http://blog.commarts.wisc.edu/2012/11/01/nyff-2012-history-has-many-cunning-passages-part-three/ Thu, 01 Nov 2012 16:12:02 +0000 http://blog.commarts.wisc.edu/?p=16015 This post is part of a new, ongoing partnership between the University of Wisconsin-Madison's Antenna: Responses to Media & Culture and the Society for Cinema & Media Studies' Cinema Journal.

Study history, we are told, and avoid pitfalls. Easier said than done; its blind alleys are legion. At the New York Film Festival 2012, Pablo Larrain’s NO; Sally Potter’s Ginger and Rosa; and David Chase’s Not Fade Away each suggests that our best option is to see the world in a grain of sand, the macroscopic in the microscopic.

NO, Larrain’s third film about the Chilean dictatorship of Augusto Pinochet, like its predecessors Tony Manero and Post-Mortem, chooses the Chilean in the streets rather than the obvious power brokers to reveal the tenor of troubled times. Tony Manero evokes the brutality of Pinochet’s government through an anonymous psychopath; Post-Mortem views Pinochet’s coup from inside the morgue to which Salvador Allende’s corpse was delivered; and, on a brighter note, NO traces the end of the dictatorship through the transformation of the apolitical Rene Saavedra (Gael Garcia Bernal), a young advertising executive. But Saavedra’s conversion is not the occasion for a feelgood populist potboiler.

In 1988, under international pressure to prove his legitimacy, Pinochet reluctantly permitted a national “Si or No” referendum on his regime, believing he could control the results, as did most Chileans, including Saavedra. NO begins with Saavedra pitching a jazzy Cola commercial and touting its originality, lost in triviality as his country approaches a turning point. When Pinochet’s vulnerabilities become visible, Saavedra switches his loyalties to the “No” coalition, and it prospers. Rejecting the preference of the Communist Party faithful for a serious discussion of Pinochet’s deadly politics, he rocks Chile with rainbow logos and light-hearted singing/dancing commercials predicting happy times through a “No” vote. The popular response is so overwhelming that the military deserts Pinochet as the “No” responses roll in. And Saavedra exults? No. While the country celebrates, clutching his son’s hand, he walks dazedly through the streets.

Saavedra has helped to oust a dictator; however, for Larrain, despite its heroism, "the [manipulative] 'No' campaign is the first step toward the consolidation of capitalism as the only viable system in Chile." Case in point: Lucho Guzman (Alfredo Castro), Saavedra's boss, who initially opposes Saavedra's activism but ultimately claims credit for his success. The film ends the same way it began, with Saavedra pitching a new, inane commercial for Guzman's agency. What has been won? What lost?

Sally Potter's Ginger and Rosa, too, is a bittersweet snapshot of an era through one life. It's 1962; feminism is preparing for its second wave; and British teenager Ginger (Elle Fanning) is determined to avoid the domesticity that has crippled her mother's ambitions. Her journey brings her in range of police brutality at a peaceful anti-war demonstration and her best friend Rosa's (Alice Englert) hypocritical religiosity, finally forcing her discovery that her charming father, Roland (Alessandro Nivola), while right about his social beliefs, is wrong almost every time he acts. Prattling about social justice, he cravenly betrays his gorgeous wife Natalie (Christina Hendricks) and breaks his daughter's heart by plunging into an affair with fourteen-year-old Rosa instead of dealing maturely with his impulses.

The plot is familiar, but Ginger and Rosa takes its melodrama up a notch through Potter’s brilliant creation of onscreen intimacy. While Roland is all about words, Potter is about silences, treating us to numerous glorious explorations of faces, feelingly lit and framed, that reveal everything the dialogue frequently conceals in order to set the stage for the film’s final moments, when Ginger is visited by the spirit of forgiveness behind the letter of both religion and philosophy, perhaps a hopeful omen in a dark present.

David Chase chronicles the same period in Not Fade Away through an unexceptional New Jersey suburb, as it is buffeted by sharply contrasting emancipatory and annihilating historical forces. Some years ago, he told me he thought the two greatest American contributions to the twentieth century were rock and roll and nuclear technology, and his movie begins with a television screen that sets up this dichotomy: a rock-and-roll number is interrupted by a "test of the emergency broadcast system," Cold War code for hysteria about a potential nuclear attack. Thus, while the plot of Not Fade Away follows the attempts of four boys to form a rock-and-roll band and to break the mold of their parents' lives, it's the counterpoint between the exhilarating influence of rock and roll and the nuclear threat that frames the film. The power of music over the lives of Douglas (John Magaro), the film's hero, his close friends, and his preternaturally lovely and self-assured girlfriend Grace (Bella Heathcote) gets the lion's share of screen time, but music is not a cure-all. Douglas ultimately fulfills his father Pat's (James Gandolfini) repressed dreams, and Grace transcends her father's abusive, albeit darkly comic, treatment of his daughters' aspirations. But Grace's sister collapses into druggy oblivion, the band falls apart, and, after Douglas and Grace make their freedom trek to Los Angeles, Grace disappears mysteriously at a party in a Hollywood mansion.

At this point, liberation becomes slippery, and the new America spawned by the 60s fades into an unrealized dream. When Douglas, now on his own, tries to hitch a ride home from the party, an old jalopy stops to pick him up. He is invited in by an eerie girl whose face is painted with black tear drops, while, beyond her, a sinister, partly visible driver looks on. Douglas wavers between the seductions of the abyss and a powerful sense of foreboding–and walks. As he leaves the screen, Douglas’s sister appears surreally, and explicitly asks whether the nuclear option or rock will determine the future. There’s a rush of galvanizing music, but that’s not an answer. Rather, a hope? Or a heartfelt exclamation? We don’t know.

In the poem “Gerontion,” T. S. Eliot invokes history as the cemetery of human striving through a desiccated character who has rejected life, as inevitably “adulterated.” Also recognizing the alloyed nature of reality, Larrain, Potter, and Chase do not do likewise, but, rather, imaginatively appraise imperfect options and elusive ideals.

Stay tuned, as Part Four of this series about the New York Film Festival is on the way.


Mediating the Past: Sacred History and Sacrilegious Television Comedy http://blog.commarts.wisc.edu/2012/08/22/mediating-the-past-sacred-history-and-sacrilegious-television-comedy/ Wed, 22 Aug 2012 13:23:37 +0000 http://blog.commarts.wisc.edu/?p=15029

**This post is part of our series, Mediating the Past, which focuses on how history is produced, constructed, distributed, branded and received through various media.

In a 2009 episode, Family Guy joked about a world in which JFK had never been shot. This was not an earnest exploration of historical causality, however, but a setup for a gruesome sight gag replacing JFK's assassination with Mayor McCheese's. To make matters worse, after the parody's eerily accurate recreation of the event, Jacqueline Kennedy climbs onto the back of the car not to flee, but to eat the McViscera. Of course, offensive humor is Family Guy's stock in trade, and a spin around the young-skewing dial from FOX to Comedy Central to Adult Swim reveals a host of gags about collective traumas, from the assassinations of the 1960s to the catastrophes of 9/11 and Katrina in the 2000s. Even more recent events like the Penn State scandal and the Aurora shootings have comics laughingly asking, "Too soon?"

Joking on these topics offends because they are sacred moments in popular culture's understanding of its own history. Reportage defines these events with a combination of extreme seriousness and emotion, marking them as sacred in their importance, their uniqueness, and their role as common touchstones for popular memory. Indeed, the two most archetypal "national traumas" remain the Kennedy assassination and 9/11; the cliché that everyone recalls where they were upon first hearing of either highlights their importance to both individual and national history. The rules governing humor in bad taste highlight the complex and often ambiguous conflicts between different values in culture. In these instances, for example, solemnity regarding the events runs against the respect for free expression tested by sick jokes.

More practically for television, this humor represents an attempt to corner valuable young demographics, but risks public, advertiser, and regulatory flak. So while this humor is governed by the time elapsed since the initial event and decorum of the period, its appearance on television gives particular insight into the growth of narrowcasting and sick humor in the last half century. As the archetypal national trauma of the television era, the JFK assassination not only demonstrates this growth, but the ways in which television comedy has come to play with these events, in a sense rejecting the sacred framework.

In 1983, Eddie Murphy grew tired of his signature SNL character and decided to kill Buckwheat. While obviously in dialogue with the shootings of John Lennon and the pope, the bit alluded most directly to the recent Reagan shooting. Certainly, Reagan's survival helped make this event available for SNL's humor. Buckwheat's assassin, John David Stutts (also played by Murphy), was a composite of Reagan's and Lennon's mentally ill shooters, and was himself killed while being led down a hallway in handcuffs. SNL thus played it relatively safe by limiting its most direct references to Oswald's death and not JFK's. Nevertheless, this is one of the first (possibly the very first) examples of a comedy show parodying the assassination in any way, and it occurred, notably, on a late-night show known, even as late as 1983, for its edginess.

When the cultural zeitgeist of the early 1990s turned towards conspiracy theorists' view of history, JFK's death figured heavily. Along with The X-Files, Oliver Stone's 1991 film JFK was arguably conspiracy culture's central text. Television's growing penchant for parody in the 1990s meant shows like The Simpsons, The Critic, and The Ben Stiller Show would reference the film. But a 1992 episode of Seinfeld left the greatest mark on pop culture. In explaining why they "despise" Keith Hernandez, Kramer and Newman recount the story of having been spat on, launching into an extended stylistic parody of JFK.

Although airing during prime time, Seinfeld skewed young, urban, and liberal–especially in 1992 when it had yet to dominate the ratings. During the season in question, the program aired at 9:30 Eastern in between the risque, if juvenile, humor of Night Court and Quantum Leap, a show often about working through historical trauma. More importantly, though playing with the imagery of the assassination, Seinfeld acts more as a parody of Stone’s stylistic excess rather than a joke about Kennedy’s death.

Despite its apparent edginess, the magic loogie bit would pale in comparison to the ways in which parodists like the self-consciously sick Family Guy played with this imagery later. Despite rocky beginnings, this program has surpassed The Simpsons as the crown jewel in FOX's valuably young-skewing Sunday night lineup and acts as the centerpiece of a growing cadre of Seth MacFarlane productions. In a 1999 gag, since cut from reruns, a young boy holds up his "JFK Pez Dispenser" just as a stray bullet shatters its head. Ominously, the child consoles himself with his Bobby Kennedy dispenser.

Like the 2009 Mayor McCheese gag, this joke plays on juxtapositions between sacred politicians and childhood trifles. But they also elicit “I-can’t-believe-they-just-did-that” laughter. They stack uncomfortable humor on top of the fundamental joke. Even by Family Guy‘s standards though, the 1999 gag was edgy. But the shattered plastic of 1999 is downright tame compared to Jackie Kennedy eating Mayor McCheese’s head.

Since the early 80s then, this type of sacrilegious humor has not only grown more extreme, but has moved from fringe programming hours into prime time. To some extent, general social factors like generational shift, the time elapsed since 1963, and broadly-labeled “permissiveness” account for these examples’ increasingly flippant attitude towards sacred history. More pointedly, the network tendency towards ever-more-specific demographics has allayed standards & practices, network, and FCC fears with the assumption that easily offended audiences would not be watching. For a particular demographic, often one too young to remember the moment directly, moments of common historical importance are increasingly being inflected with the flippant attitude of sick humor.


Mediating the Past: History and Ancestry in NBC’s Who Do You Think You Are? http://blog.commarts.wisc.edu/2012/07/12/mediating-the-past-history-and-ancestry-in-nbcs-who-do-you-think-you-are/ http://blog.commarts.wisc.edu/2012/07/12/mediating-the-past-history-and-ancestry-in-nbcs-who-do-you-think-you-are/#comments Thu, 12 Jul 2012 14:45:46 +0000 http://blog.commarts.wisc.edu/?p=13994 **This post is part of our series, Mediating the Past, which focuses on how history is produced, constructed, distributed, branded and received through various media.

NBC’s Who Do You Think You Are? (WDYTYA?) transports viewers through time and space by tracing American celebrities’ ancestries. The show is one of ten international offshoots from the BBC’s show of the same name. Each episode of NBC’s version follows the genealogy of a well-known celebrity. In its three seasons, the show has featured the family trees of Steve Buscemi, Rashida Jones, Gwyneth Paltrow, Sarah Jessica Parker, and Martin Sheen, among others. For each episode, the show’s research team works with corporate sponsor Ancestry.com’s genealogists to trace one or more lines of the celebrities’ family trees. The documentary-style show takes the celebrities to the locations of their historic family moments, both in the United States and abroad, and chronicles their reactions to the historians and historical documents that reveal their ancestors’ pasts. The show is based upon a premise that is repeated in the show’s title sequence, “To know who you are, you have to know where you came from.”

WDYTYA? is fascinating because it tells histories that are different from those most Americans learn from textbooks and celebrated national stories. To help viewers understand the triumphs and injustices uncovered while researching each celebrity’s history, the series regularly includes brief segments that give viewers relevant historical context. Though critics and viewers have largely overlooked the series (its cancellation in May 2012 received no fanfare), its stories are rich and compelling historical lessons about immigration, assimilation, gender, race, and class—their emphasis on both diversity and unity echoes the discourse of the American “melting pot.” Some of the most compelling episodes, however, are those where relatively little information about a celebrity’s ancestors can be found. These episodes shine a light on the fact that the histories of Americans with little economic, political, or cultural power were recorded unevenly (or not recorded at all), and usually focus on one of the United States’ most shameful and devastating practices: slavery.

Through the ancestries of Spike Lee, Emmitt Smith, Vanessa Williams, Lionel Richie, Blair Underwood, and Jerome Bettis, we learn about “the wall” many African Americans hit when trying to uncover their histories. In each of these episodes, we watch disappointed and frustrated Black celebrities learn that their ancestors’ histories are segregated in documents like the “Slave Schedule” (a census-like document used to collect information about slaves), which lists the number and characteristics (sex, age, color, literacy, mental capacity) of slaves owned, but infrequently contains their names. Instead of the typical “go-to” sources white celebrities use to locate their family histories, Black celebrities must read the diaries and wills of slave-owners to find faint traces of their family members’ fates. And unlike the other celebrities WDYTYA? features, many Black celebrities cannot visit their family members’ graves because their burial sites were only marked provisionally, if at all.

The most compelling moments in episodes of WDYTYA? that feature Black celebrities come when they can break through "the wall." For example, Bettis feels pride when he learns that his three-times-great-grandfather, hit by a steam engine, successfully sued the Illinois Central Railroad; and Williams is humbled to learn her great-great-grandfather risked his life as a Black Union soldier fighting in the American South. Further, WDYTYA? uses DNA tests to trace Underwood's and Smith's ancestors to particular villages in Africa, and in Underwood's case, introduces him to a distant relative. These episodes end with cathartic pride as Black celebrities reclaim pieces of their family histories that at first seemed unrecoverable. These inspiring emotional moments, however, overshadow the fact that the experiences of the Black women who shaped and, quite literally, birthed these histories are absent.

This is particularly frustrating given that many celebrities, like Lee and Underwood, begin their journey hoping to explore the ancestry of a beloved female relative. Bettis’ episode, for example, is driven by his desire to learn more about his mother’s ancestry, yet glosses over evidence that his great grandmother and grandmother were abandoned in the early 1900s, and focuses instead on his great grandfather’s bravery for taking legal action against his abusive white employer at a spoke factory. WDYTYA? sidesteps discussion of enslaved Black women’s experiences of master-slave rape with euphemisms like “commingling” in Smith’s episode. Similarly, Richie passes over evidence that his enslaved great-great-grandmother was raped and praises his ancestor’s slave master for providing for his own mulatto children, saying, “to protect what was his was just the greatest gift.” Only Lee acknowledges his mulatto heritage is a product of slave rape, but neither he nor the show explore what that reality might have been like for his ancestors or Black women more generally.

WDYTYA?‘s efforts to break through “the wall” by uncovering and narrating our ancestors’ lost stories are vitally important. Still, as compelling and significant as Black men’s narratives are, their histories are incomplete without Black women’s voices. And, considering the show’s inclusion of white women’s historical narratives (notably in episodes featuring Susan Sarandon and Helen Hunt), the omission of Black women’s experiences is all the more problematic. Though our female ancestors’ stories have long been overshadowed by their husbands’ and fathers’ histories, Black women’s historical narratives tell equally important and compelling stories–and they should be told on WDYTYA? and elsewhere, no matter how fragmented or uncomfortable. Without Black women’s stories, we can never really know “who we are” or “where we came from.”


An Incomplete History: “Women Who Rock” at the Rock and Roll Hall of Fame http://blog.commarts.wisc.edu/2012/03/01/an-incomplete-history-women-who-rock-at-the-rock-and-roll-hall-of-fame/ Thu, 01 Mar 2012 14:20:15 +0000 http://blog.commarts.wisc.edu/?p=12349 I found myself in Cleveland last week.  My friend Amy Rigby, a musician who plies her trade in one of the parallel music industries that I talked about in my recent post about the Grammy Awards, had things to do in Cleveland.  I’d been threatening for months to take advantage of being on sabbatical to see the “Women Who Rock” exhibit at the Rock and Roll Hall of Fame, a few hours from where I live, before it closed on February 26, although the thought of it made me queasy.  A road trip was born and Amy and I visited the exhibit.  It was all that I expected, which is to say, not much.

Critiquing it is like shooting fish in a barrel; it’s just too easy.  The Rock and Roll Hall of Fame reflects the worldview of Rolling Stone magazine and major players in the commercial music industry.  The “meta” issue at play is how to put rock and roll, or any type of music, in a museum.  What good is looking and reading about music when you haven’t heard it?  Posting individual listening stations at each display would make progress through any exhibit impossibly slow, compromising an institution’s ability to remain financially solvent.  Curators have to assume that we know what the music, at least some of it, sounds like.  The cultural authorization that results from exhibits of this type is also problematic, as are issues of inclusion and exclusion.  Who gets recognized, who doesn’t, and why?

These problems are especially germane to an exhibit about women and rock.  I imagine that although some of the people who saw the exhibit were music nerds like me and Amy, most were not.  Those of us who are already familiar with the music could “play” it in our heads.  What would everyone else do? We were there on the President’s Day holiday, and the majority of visitors in the galleries were either middle-aged couples or moms with their pre- or just-teen daughters.  The solution proffered by the Rock Hall was to focus on performers who visitors might be familiar with.  Hence a couple of displays devoted to Lady Gaga (the meat dress!), pop stars from the 80s on, the most well-known punk, new wave and “alternative” acts, several rap, hip-hop, and nouveau girl group artists, and some appropriately reverential educational videos about early blueswomen and R&B singers from the 50s and 60s.  Emphasis was on singers.  “Wait,” you may say, “I thought you were talking about rock?” Indeed.  “Rock” here is stretched so thin as to be a meaningless term – a common discursive move.  “Rock,” the exhibit claimed, is an attitude, and apparently these selected artists all have it.

We were a bit amazed by some of the performers who were barely recognized, or left out entirely. Whither the Slits? The Raincoats? Poly Styrene?  Patsy Cline? Emmylou Harris? Exene? Moe Tucker?  Oh yeah, listed in a paragraph on a small sign, maybe.  Patti Smith, Debbie Harry, Kim Gordon, Kathleen Hanna, and Liz Phair were there, but where were the rest of the women from the 70s to the 90s and beyond who made or continue to toil in the rock trenches? Where were the non-Americans? (Joni Mitchell was included, but she’s lived south of the 49th parallel for a long time.) Pioneering female rock group Fanny was represented with a group photo on a swinging door that led to a side exhibit; Amy and I were relieved that it was not the entrance to the restroom, which is what it looked like.

After a while, it seemed that artists who were able to donate “stuff,” usually in the form of stage attire identified by the name of the designer, were the de facto focal points of the exhibit.  As it turns out, though, donation wasn’t the whole story.  Upon my return, I asked June Millington of Fanny whether she’d been contacted about the exhibit. She hadn’t, although she knew that her photo was being used and that the curatorial staff had her contact information.  Too bad, as Millington has plenty of stuff, in the way of guitars, other instruments, and memorabilia, that I’m sure she’d have been happy to lend and that could have formed part of more interesting exhibits. (I also emailed one of the Raincoats with the same question; if I get a response I’ll post her answer in the comments to this piece.)

The exhibit was accompanied by a PBS documentary, so upon my return home I watched that, hoping that it would fill in some blanks.  I’m sorry to say that it did not.  (It streams here: http://video.pbs.org/video/2168854975). Although narrated by leading male and female critics, and hosted by Cyndi Lauper, it omitted even more artists in order to create a smooth narrative from Bessie Smith to Janelle Monae, culminating in an implied celebration of “poptimism,” currently in vogue among writers, bloggers and some academics who think critically about popular music.  I think that the critical turn to poptimism is well intentioned, as it attempts to break down hackneyed binaries that, much as we don’t want them to, continue to inform discussions of popular music (e.g., authentic/commercial, male/female, white/black, rock/pop), but I believe it does not deal adequately with other things, for example: the political economy of the industries; entrenched sexism; the tyranny of playlists, Pitchfork, and tightly constructed radio formats that shut down possibilities for artists like Amy, a long-time critical favorite whose music has always fallen between the cracks; the tracking of women away from rock and into softer or more “appropriate” pop listening practices as they age; the myth of the “middle-class musician” who can actually afford things like health care and a guarantee of a decent living; and the politics of representation and identity in their myriad configurations.[1]

Calling it all “rock” does not attenuate or explore these and other issues.  Ultimately, the story of “women who rock” is not all about clothes and good feelings. Writing so much and so many out of what could, because of its institutional status, become the sanctioned and commonsense history of women and rock threatens to erase or trivialize their past, present and future contributions.  We scholars and critics have our work cut out for us if we want to capture the more inclusive and nuanced history that the subject deserves.

[1] Yes, this is a bit of a shameless advertisement for Amy, a woman who most definitely rocks yet is, in my opinion, criminally unknown.  Check out her music and blog at www.amyrigby.com, or legally download her albums from Emusic.com or iTunes.