Representation – Antenna (Responses to Media and Culture)
http://blog.commarts.wisc.edu

Unpacking Rust, Race, and Player Reactions to Change
http://blog.commarts.wisc.edu/2015/06/15/unpacking-rust-race-and-player-reactions-to-change/
Mon, 15 Jun 2015

Rust courted controversy by assigning players unchangeable, racialized avatars. Adrienne Shaw unpacks how game design helped produce some of that player outrage.

Post by Adrienne Shaw, Temple University

Since I recently published a book on representation in video games, several people have asked me about the “Rust controversy” (and a blog post is easier to manage than multiple email threads). One of the more surprising findings from my book and prior audience studies projects is just how little some people (take note, internet: some people) say they care about representation in games. The actual core argument of the book, however, is that media scholars (among others) need to be more attentive to when and how people come to care about representation. Looking at when and how people care about representation helps us better interrogate the limits of the kinds of diversity we have seen in games. And fights over representation, moments when people really care or militantly don’t care about representation, illustrate that really well.

So Rust… The original story broke back in March, when the post-apocalyptic massively multiplayer online (MMO) game released an update that assigned a randomly raced avatar to all players, which could not be changed. Prior to this, all the avatars looked the same: a bald white guy. Responses to this change varied. Some welcomed the injection of aesthetic diversity in the game; others were pissed. Some of that anger took the form of racist language, some players felt the change was “social justice” activism through design, and many just wanted to know how to change what their avatar looked like.

A lot of other smart people have already written about these various player reactions: go read these great pieces by Megan Condis, Kishonna Gray, and Tauriq Moosa now! I want to focus on a slightly different issue than they do, however: the role the design of Rust played in helping create those negative reactions.


First, I think it’s a mistake to say that Facepunch Studios experimented here. They took an existing game and changed it pretty dramatically and suddenly. There is a long history of gamers (terminology note) reacting poorly to changes in their favorite franchises (example). Most of the coverage of Rust’s change conflates the effect of making people play as a specific avatar with the effect of changing an existing game. MMO players, especially, become really attached to their avatars; there are decades of research on this (start here). Certainly, players of Rust before this update didn’t have choices for what their avatar looked like, but now that there are appearance options, I suspect players think they should have more choice (bracketing out for a moment the fair critique that they were willing to accept a default white male option, because that’s what many games typically offer). Self-representation — that is, having the chance to represent yourself how you wish, whether the thing on the screen looks like you or not — is a longstanding part of MMOs. That people took the Rust change so hard, and manifested those emotions as racist chat and play behavior, is unsurprising (which is not to condone the racism expressed in those comments).

Second, in my book I talk about the distinction between characters and avatars, and in online spaces especially, people are known through their avatars. Comments on the matter from Garry Newman, Rust’s lead developer and the owner of Facepunch Studios, demonstrate a misunderstanding of how the context of avatar/player-character embodiment shapes when and how people care: “People have a strange need to play someone similar to themselves in games,” he said. “That’s not something I understand. I don’t think I’d have enjoyed Half-Life more if Gordon Freeman didn’t have glasses or a beard.” From my own research, those games (narrative-driven, solo-player games) are certainly the ones in which players do not always care much about playing as a character “like them,” because there are other ways (mostly narrative) for them to connect to those characters (or not). People who feel emboldened to demand things of games do, moreover, wish that on a broad level there was more diversity within those narrative-driven, assigned-character games. Players do often care about how they are being represented in contexts in which they are represented to others through an avatar, like an MMO. And they really care in games that imply they have a choice, which is among the many reasons people care strongly about what relationship options are available in games.


Finally, the way race was introduced in the game actually helped make it feel arbitrary. Indeed, the announcement of the change calls race arbitrary: “It’s quite pleasing to see different races working together in game, and makes you realise how arbitrary race is.” Race in the game is an aesthetic addition so people can tell each other apart visually. That isn’t what race is, which is why “color-blindness” has never been an actual anti-racist goal. Robert Yang discusses his own approach to this issue in designing Cobra Club. What would be even more interesting than randomized races is a game in which you are born into a body that affects the way you interact with the world. Now that would be an interesting experiment in how people react to being thrust into an identity that may not be like their own. There is, in fact, a model for this in Marsha Kinder’s Runaways, and if anyone has info on what happened to that game please leave a comment.

None of this is to say that Facepunch Studios should be condemned for trying something new. New players will come to the game expecting to be assigned a body. And that’s interesting, and might lead to some unique in-game interactions that change how we understand avatar-player relationships (I sense a dissertation being formed in the distance). The danger, though, is that more risk-averse studios will see the negative response as evidence that players aren’t ready for more diversity in games. There are plenty of games out there for those players who aren’t ready for more diversity; I think the rest of us are ready for something new.

Beyond the Nominations: The Emmys and Representation
http://blog.commarts.wisc.edu/2014/07/01/beyond-the-nominations-the-emmys-and-representation/
Tue, 01 Jul 2014

During last year’s Emmy Awards ceremony, Kerry Washington was feted by Diahann Carroll as a beacon for diversity in an award show dominated by white actors and actresses, particularly in lead categories. The nominations for categories like Lead Actress in a Drama Series, where Washington competed for her role on Scandal, are typically where discussions of award-show diversity take place.

However, this discussion truly begins in the months ahead of the Emmys. The numerous roundtable interviews and photo shoots organized by trade publications and wrestled over by publicists are where the politics of representation of awards season begin to form, while the Emmy submissions themselves offer a subsequent space in which such politics are negotiated.

This is particularly clear this year, given that Washington was notably absent from early Emmy campaigning, despite having been part of roundtables for both The Hollywood Reporter and Variety the previous year. This led to a Hollywood Reporter cover featuring the year’s top actresses, all of whom were white, and a Variety roundtable featuring six different lead actress contenders, all of whom were also white. Washington’s absence—likely tied to the fact she gave birth to her first child earlier this spring—offers a stark reminder that if not for Washington, there would be no women of color competitive in her category.

This is not to say that there are no other women of color submitted in the category: the official Emmy ballots revealed six, including Being Mary Jane’s Gabrielle Union, Sleepy Hollow’s Nicole Beharie, Elementary’s Lucy Liu, Nikita’s Maggie Q, and The Fosters’ Sherri Saum and Cierra Ramirez. Eliding for a moment the depressing statistic that only seven of the fifty-six women in the category are women of color, there are other reasons these women have been less visible than their counterparts. Issues of genre, network/channel branding, and cultural hierarchies of taste all make series like these less likely to draw Emmys in a dramatic field dominated by prestige cable dramas or network dramas with prestige cable auspices.

However, this does not necessarily exclude these women from participating in roundtables with major trade publications, provided their publicists—whether associated with the network, the studio, or the actress herself—work hard to get them there. Features like the Hollywood Reporter cover are competitive by nature, a coup for a publicist working hard to prove their worth to their client. But if you don’t have a publicist or agent who has played the Emmy game, or if you’re part of a show on a cable channel like BET with limited experience in Emmy campaigning, there’s a good chance you will not be represented. And even if Gabrielle Union’s publicist had pushed for her to be included in one of these roundtables, would anyone have taken Union as a serious contender, given the low cultural standing of BET compared to the networks and channels dominant in the roundtables?

As Dear Black Woman reminds us, these realities do not render these situations ideologically neutral, because the optics they create are real, and offer a stark reminder of the state of diversity not only in the Emmys but in television more broadly. Rather, such considerations highlight how the Emmy nominating process functions as the intersection of multiple spaces of industry practice, each equally uninterested in confronting issues of diversity in a meaningful way unless someone like Washington emerges who fits the other requirements—a successful series, a reputable network, a strong publicity team—dominant in those spaces.

We can extend this into the submissions process itself. Transitioning to issues of gender, three examples stand out among the series and performers submitted for consideration. Amy Schumer, star and executive producer of Comedy Central’s sketch comedy series Inside Amy Schumer, is forced to compete in the Supporting Actress in a Comedy Series category due to rules surrounding variety series. Although she is unlikely to garner a nomination in either category, the optics of the ballot push her into a supporting role on a show based around her comedy, and one whose appeal to female viewers signifies a meaningful shift in Comedy Central’s brand identity.

In other cases, Emmy campaigns reinforce broader readings of a series’ gender politics. HBO’s Silicon Valley focuses its satire on the male-dominated technology field, and in its short first season featured only one supporting female character in Monica, played by Amanda Crew. And although the show drew significant criticism for its engagement with gender, HBO nonetheless chose to submit every other credited actor in the series for Emmy consideration without submitting Crew, making Silicon Valley its only series without acting submissions from both genders. The chances of Crew being nominated are slim to none, but the optics of not even submitting her hardly seem worth the money saved with one less submission among HBO’s extensive slate.

In the case of FX’s Fargo, broader channel strategy intersects with gender in problematic ways. Although Allison Tolman has been cited as the series’ lead actress in interviews with its creator, she is submitted as a supporting actress, a category she won at the recent Critics’ Choice Television Awards. There is an awards logic to this decision: Tolman is a newcomer without the name recognition of those likely to compete in Lead Actress, plus FX has a better shot in that category with American Horror Story’s Jessica Lange. And yet this strategy relegates Tolman to a lesser category (one that was nearly eliminated last year), and pitches Fargo as a show with two male leads (Billy Bob Thornton and Martin Freeman), which is notable given that some critics felt the finale worked to marginalize her character.

While the ideological dimensions of Emmy campaigning are made visible in trade publications, the same dimensions of the Emmy submissions must be excavated, and depending on the nominations they may never make it past the ballots. However, exploring these questions reinforces that our understanding of the politics of the Emmys is driven not only by who is nominated or wins, but by how issues of race and gender are negotiated in the processes that lead to those results.

Sucks to Be Ru: America’s new Russian Other
http://blog.commarts.wisc.edu/2014/02/21/sucks-to-be-ru-americas-new-russian-other/
Fri, 21 Feb 2014

For years, America has lacked a true constitutive “other”—the sort of competing entity that can represent everything we are not and thus help us agree on what we are. Yes, the Muslim world has played the role of Enemy of the State for the past decade, but the varied, decentralized and complex nature of that entity has prevented anything approaching a consensus among the American populace. What we have missed is the simple (and, of course, oversimplified) Soviet—a militarized, soulless Ivan Drago to our informal, scrappy Rocky Balboa. And while that pre-millennial foil is gone forever, the Sochi Olympics have brought us something perhaps even better. The unending string of hilarious #SochiProblems and daily stories of government gluttony filling our Facebook feeds have positioned Russia not so much as America’s polar opposite, but instead as a sort of shadow version of the American Way of Life.

In putting on the Sochi winter games, Russia has expended an absurd amount of resources—about $50 billion worth—with the expectation of fundamentally repositioning the country’s place in the global imagination.  And although much of the media coverage surrounding Sochi has focused on the tremendous amount of graft and waste it took to ring up such a bill, the Russian government is unlikely to view the event as anything but a success.  Above all, President Vladimir Putin wanted to use the Sochi spotlight to disrupt the unipolar, American-centric geopolitical map that has emerged since the fall of the Soviet Union.  And in some small way he has.  It takes a terribly powerful man to waste such a terrible sum of money.  The cost may have been surrealistically high, but for Putin it was a one-time-only opportunity to demonstrate that he has the surplus of power and the utter lack of conscience one needs to make a play for international hegemony.

The United States, however, has gotten a much better deal, at least in terms of buying an improved sense of national identity. Russia, in no small part due to Sochi, no longer embodies a set of virtues—extreme discipline, ideological orthodoxy, etc.—that we choose to reject. Instead it has come to stand in for all of those vices that we fear we may have but would rather not face. No longer Drago, Russia has become America’s drunken Uncle Paulie, a bumbling, wasteful reminder of what we could become but never will. Whenever Rocky is down, he can always look to Paulie’s blubbery ineptitude and realize he’s not in such bad shape. And now, whenever Americans fear their government may be doling out favors to corporations or spending money on all the wrong things, well, at least we don’t build Dadaist toilets when the whole world is watching.

This narrative received a wholly unexpected boost last Sunday when Michael Sam, a mid-level professional football prospect, announced that he is gay. The reaction from the National Football League has been predictably tepid, with nearly every team echoing the standard neo-liberal take on gay rights: discrimination cannot be tolerated if it is going to get in the way of profits or, in the case of an NFL team, winning football games. Of course, America in general and its sports culture in particular still have a long way to go when it comes to eliminating discrimination over sexuality. The discussion should be less the sports media’s preferred “where will Sam be drafted?” and more “why does only one player feel safe enough to be open about being gay?” However, in comparison to our Russian Other, America looks positively enlightened. When faced with the medieval anti-gay laws and mockable public statements (“We don’t have [gays] in our town”) on display in Sochi, it’s easy to give America a pass. Russia’s backwardness on the issue provides the perfect backdrop against which to avoid asking truly tough questions about ourselves.

The Washington Post’s Max Fisher has called for Americans to avoid the temptation of Russophobia when engaging in discourse about Russia. It can be all too easy to mock a population that has long been the target of so many stereotypes and Internet memes. In general this is good advice. However, it by no means suggests that we should hesitate to condemn the authoritarianism and corruption that Putin’s Russia has put on display in its effort to make the world take it seriously. Just as important is that we use the opportunity as a means of interrogating the weaknesses of our own society, not as an excuse to ignore them.

Julie D’Acci on the Emergent Qualities of Sublimating Circuits
http://blog.commarts.wisc.edu/2014/02/18/julie-dacci-on-the-emergent-qualities-of-sublimating-circuits/
Tue, 18 Feb 2014

On (the) Wisconsin Discourses: Julie D’Acci (Part Two)

Part One: Here

Does circulating information influence, inflect, or inhibit material relations in empirically verifiable ways? And do strategic interventions in the super-structural sphere actually promote sustainable social effects?

These are easily the two most vexing problems facing cultural studies research. In a previous post, this series argued that Julie D’Acci’s work has attempted to empirically apply the theoretical assumptions of the Birmingham School without emulating a social science method.

Such an approach begins with the observation that cultural processes require evaluation beyond taxonomic description. Mass communication is effective at describing media events by focusing on the relationship between evaluative (after) and predicative (before) time. In contrast, cultural studies seeks to contribute to social change along multiple lines of tactical intervention – industrial content production, information circulation, identity representation, and, most difficult, consciousness formation. Put differently, the when of social change is a central distinction between these two forms of communication analysis.

By focusing on concurrent characteristics of tangible discursive processes, opportune moments to plant seeds for democratic thought are revealed as internal adjustments to qualitative phenomena by discursive blocs. Attention to the temporality of how blocs shift their positional presence among heavily circulating codes and representations, otherwise called emergence, was a central concentration of cultural studies between the 1960s and 1980s.

Two major methodological strains came out of this period, “bottom-up” research that extrapolates scale practices through empirical study, and critical/cultural theory that evaluates circulating concepts regarding perspective, embodiment, and aspiration. Julie D’Acci’s major contribution—her refinement of Richard Johnson’s “circuit model”—can be viewed as a kind of intellectual diplomacy intent on reconciling empirical and conceptual approaches innovated from shared political investments.

The Paradox of Strong Effects in Cultural Studies

As discussed in the last post, the cultural study of media differs from mass communications over philosophical orientation regarding how social change occurs and is elicited. A communications researcher begins by identifying an event, asking a directed question about the event, measuring a response to that question across demographic categories, identifying the legacy of the event in its opinion effects, and then proposing a short-gain solution based upon the political will of the discursive bloc queried. This model of media event analysis, usually associated with Paul Lazarsfeld and Elihu Katz, has been successful at communicating social processes to the hard sciences, and is accurate at accounting for opinion and predicting political outcomes that are opinion-based. A quantitative study builds context through a reverse gestalt that assembles parts into a constituted whole. A saturation of directed questions applied across demographic categories can be graphed as a topography of belief, in which utterances after an event provide a glimpse of analytical certainty.

In contrast, media and cultural studies has resisted adopting a strong effects model precisely because qualitative data, such as imagination, intuition, and residual belief, tend to be minimized when survey findings are translated into quantified results. Survey research cannot assess how the historicity of implicitly coded information is received. Cultural studies begins with all constitutive characteristics of a phenomenon, which are then critically bracketed, detailed, and mobilized in line with a specific tactical project. A cultural analysis resists reactive delegation of a problem or position into a pre-constituted category. And, uncomfortable as it is to the social sciences, cultural research also requires acknowledgement of a kind of existential paradox, little debated in media studies since the 1980s:

Qualitative analysis resists strong effects correlations, specifically because analytic research questions predetermine a field of vision. Yet cultural studies researchers have simultaneously held the belief that media have an important role in the struggle over equity and representation. Put differently, cultural analysis holds that reception is more complex than transmission or measurement models permit, yet the method rests on the faith that circulating messages carry “strong” social effects central to the struggle over democratic participation.

A common recent response has been simply to evade the paradox and continue with empirical mapping as a descriptive form with no intended outcome. This has led to what might be called a “third wave” of de-theorized cultural studies, in which some researchers have argued for increased attention to production blocs that carry emergent characteristics similar to discourses. In some cases, the “third wave” has taken an additional step toward de-politicization, arguing that business practices are even more promising sites for eliciting social change than previously favored grass-roots interventions in public forums. While industries are worthy of close analysis, they are rarely if ever engines for equity.

Observing that businesses aren’t invested in social justice research does not reconcile a further existential dilemma that faces cultural studies. Media industries are an inseparable and permanent node in the production and circulation of representations. Part of the field must consist of analyses of the logic of capitalist entities; to do this requires a maxim for political investment, as well as healthy debate over the strengths and weaknesses of conceptual methods. Descriptive work is doomed to make evaluative claims after a social process or media event has already occurred. Cultural analysis that follows a “circuit model” centers on the continuous evaluation of discursive reflexivity, as groups refine and begin to circulate their own messages.

The Circuit Model as Sublimation of Relational Distanciation

How to avoid the accidental reproduction of official knowledge? One appropriate method is to reveal inequalities as they occur. Another strong option is to genealogically analyze source concepts and practices as predicates for future social relations. A more difficult option is provided by D’Acci’s circuit model (below): to identify emergences as they’re presented among the constellation of limits of a specific context. Examining cultural emergence requires an understanding of how productive, distributive, and receptive qualities consist of series of exchanges that build valuation. Valuation of a specific coded representation increases circulatory momentum. Heavily circulating codes sublimate cultural processes and become what’s commonly referred to as a “determinant”, or a reference point for future code making.

[Figure: D’Acci’s circuit model]

D’Acci’s proposition is to account for this process by first identifying a 4-tiered but inter-subjective system of circulation. While its spheres—social, industrial, receptive, textual—come from Johnson, D’Acci’s system also implicates an additional methodological fourfold: empirical, conceptual, ethical, and aesthetic analysis, as a way to identify exchanges between interrelated determinants, including the internal organization of each sphere. Under such a rubric, a determinant further acts as a connective that facilitates a continuum of heavily circulating codes, while exposing contradictions opportune for change action. Circulating messages interact with cultural determinants, and the result, as argued by Stuart Hall and John Fiske, is that we can clearly identify discursive emergence.

If one sphere is cut from analysis, it recedes from sublimation into isolated divisions. Over-formalization of one node carries the further danger of producing reproductive analysis of “official knowledge”, because representation has been cut off from the process in which a code has been circulating.

A successful analysis of the sublimation between actants, objects, and processes causes even ephemeral relationships to become empirically revealed as an identifiable communicative relation that can be ethnographically described. Over time, mapping of cultural representations might result in a sublated grasp of relational distanciation between circulating concepts. Sublimation of exchanges within a contextual field also reveals how degrees of duration might help to understand constitutive degrees of discursive positioning.

In terms of the ethical investments of cultural studies, devising a method to capture ephemera, subjective interpretation, and anticipatory aspirations is central to understanding how the performative dimensions of discourses respond to dominance. These are the sites of resistance, change, and possibility, and they are the first significations exorcised by methods of standardization.

To respond to the second question posed at the beginning of this piece: we can’t be certain whether cultural analysis is able to promote sustainable cultural change in the U.S., because researchers haven’t organized in a sustained way toward these stated goals. But we do have working theories from which to begin.

Mediating the Past: Licensing History, One Game At a Time
http://blog.commarts.wisc.edu/2012/11/30/mediating-the-past-licensing-history-one-game-at-a-time/
Fri, 30 Nov 2012

**This post is part of our series, Mediating the Past, which focuses on how history is produced, constructed, distributed, branded and received through various media.**

Ubisoft’s Assassin’s Creed series has always been invested in history: its central conceit is a machine, the Animus, that allows hero Desmond Miles to travel through his ancestral past unlocking details of an epic battle between the Assassins and the evil Templars. This overarching story has taken players through a collection of cities such as Florence, Venice, Rome, and Constantinople, each painstakingly recreated based on extensive research on the time periods in question.

However, with Assassin’s Creed III the series enters new territory, literally, moving its focus to North America with a game primarily set in Boston and New York during the American Revolution. The result has been an increase in discourse around the series’ approach to history, best captured by Slate’s “The American Revolution: The Game.” Lamenting the lack of realistic portrayals of the colonial era, Erik Sofge writes of Ubisoft Montreal’s bravery in embracing the underwhelming aesthetics of the period with a Colonial Boston that is “boldly, fascinatingly ugly.” Arguing that the general monotony makes the game’s standout moments all the more remarkable, Sofge commends the game, calling it “one of the best, most visceral examinations of the history of the Revolutionary War.”

When considering the Assassin’s Creed series in regards to mediating the past, its relationship to actual history—as in history as it happened—is limited. While heroes like Connor or Ezio might intersect with famous historical figures like George Washington or Pope Alexander VI, or weave their way through a particular historical event, their exploits—science fictional as they are—remain distinct enough that any label other than historical fiction would be undeserved, even if the games’ broader narratives stick largely to basic historical fact (unlike some planned downloadable content).

Indeed, the game’s developers have made a key distinction in the past: in an interview prior to the release of Assassin’s Creed II, creative director Patrice Désilets revealed, “I like to say that histories are licensed. So, yes we can take some liberties, but not too much, otherwise…what’s the point, right?” While we normally consider licensed video games as those based on other media properties—movies, television shows, comic books—the idea of licensing history is similar: the basic details of the licensed property are mediated through the principles of game design and the result is a product undoubtedly based on, but unlikely to be an exact recreation of, the original.

Of course, much as licensed games are often judged based on their authenticity (or lack thereof), Ubisoft Montreal must face questions of historical accuracy; this could lead to Sofge’s effusive praise of the developer’s commitment to representing this period in history, or The Globe and Mail’s objections to the game’s alleged suggestion that “indigenous peoples rallied to the side of the colonists.” Regardless of whether either argument has merit—ignoring that the Globe and Mail argument shows zero evidence the authors have played or even researched the game in question—they reflect the risk and reward in licensing something as meaningful as history, particularly American history: put another way, these same conversations weren’t as visible when the series was licensing European histories less familiar to the world’s largest gaming market.

Although the final judgment on Assassin’s Creed III’s historical accuracy lies in the game itself (which I haven’t played yet), I’m more interested in how the licensing of history is understood within pre-release hype surrounding the release of each game. Ubisoft’s claim to pure historical capital is tenuous, particularly before people have played the game, but they can more readily make a claim to gaming capital as it relates to historical accuracy. The accomplishment is not in the end result—the accuracy of which we can’t even really judge, given we didn’t live in the 18th century—but rather in the painstaking efforts necessary to recreate the minutiae of 18th-century Boston on an immense scale worthy of today’s high-powered consoles. Previews and interviews allow the developers to make this labor visible, detailing the process whereby historical capital is translated through—or, more critically, disciplined by—the logics of game design in order to create the most satisfying experience for the player while nonetheless maintaining that direct line to historical capital.

Concept art for the younger version of Connor featured in ACIII.

This notion of discipline is important here, though: while history brings potential cultural capital, disciplining that history brings cultural consequences, particularly in a game that features a Native American hero (Connor’s mother was of Mohawk descent). Ubisoft’s narrative, however, firmly encloses the discussion of Native American representation within the rigor of the development process rather than the gameplay experience of the final product. Sofge’s article details the studio’s cultural sensitivity through the use of a Mohawk liaison and its commitment to authentic Mohawk dialogue, effectively reporting Ubisoft’s due diligence rather than how it manifests in gameplay; a similar process, rather than product, is revealed in Matt Clark’s profile of ACIII writer Corey May. Within game development this level of cultural accommodation is considered remarkable, allowing Ubisoft’s stated commitment to accuracy to potentially circumvent any criticism of how the game’s design complicates its representation of indigenous peoples (indeed, the details of May’s diligence alone seem to impress the professor of Native American studies consulted for Clark’s article, who has not played the game in question).

As with all licensed properties, then, Ubisoft carefully manages the place of history within the Assassin’s Creed series, much as the series itself is closely managed as the developer’s most successful franchise. When Ryan Smith suggested a roaming art installation featuring works inspired by the game’s historical period was more apt to comment on the revolution of the Occupy movement than the generic “Get Out The Vote” connection on offer, Ubisoft responded: “I think that we have tried not to read into this that much, to be honest. At the end of the day, this is just a video game.” The cultural capital of licensing history has value up until the point that cultural capital becomes politicized or problematized, at which point Assassin’s Creed III ceases being about history, and focuses exclusively on the gaming capital to which Ubisoft can more comfortably claim ownership.

Grimm and the Monstrous Feminine
http://blog.commarts.wisc.edu/2012/05/22/grimm-and-the-monstrous-feminine/
Tue, 22 May 2012 13:00:33 +0000

Once upon a time, a new genre of fairy-tale-based American media emerged. Instead of Disney’s dancing teapots and talking birds, thrillers like Red Riding Hood, The Brothers Grimm, and Hanna point out that “Little Red Riding Hood” and “Snow White” are actually stories about young girls devoured by wild animals and ordered gutted by monstrous queens, respectively. This darker side of fairy tale culture is the spirit of NBC’s newly renewed Grimm. I initially shared Kyra Hunting’s skepticism about the series, but the gorgeous cinematography and cleverly adapted fairy tale plotlines hooked me (and the other 5.3 million viewers who tuned in for Friday night’s season finale), despite a nagging feeling that Grimm’s monstrous women told a politically problematic tale.

Grimm’s weekly plotlines, developed around Jacob and Wilhelm Grimm’s 1812 Children’s and Household Tales, follow the modern crime genre format. The series follows Detective Nick Burkhardt, a Portland police officer dismayed by his great aunt’s revelation that he is a “Grimm” – a descendant of the Brothers Grimm whose powers allow him to see pseudo-shape-shifting Wesen whose otherwise human faces transform into grotesque configurations reminiscent of big bad wolves, evil witches, giants, and the like. What’s worse, these creatures-in-disguise often partake in inter-species violence, resulting in many of Portland PD’s murder cases. The series is as much police drama as fairy tale, featuring – like CSI, Criminal Minds, and NCIS – dimly lit camera shots, stealthy detectives with guns and flashlights, and, of course, women’s bloody and broken bodies.

Dead women are standard set dressing on most crime dramas, so I wasn’t surprised by the ill-fated red-hooded coed in Grimm’s pilot. But the more I watched, the more I realized the women in this series aren’t usually homicide victims – they’re monsters. The first morphing face belongs to a beautiful Hexenbiest (loosely translated “witch bitch”) named Adalind whose Barbie-esque blond exterior hideously contorts to reveal that Grimm’s beauty is only skin deep. Adalind makes consistently dangerous choices, using her beauty—and the monstrous truth beneath it—to ruin men. Not only does she cast a love spell on Nick’s partner Hank, she also kills an elderly cancer patient and unleashes an impressive physical attack on Nick – she uses her beauty, NBC muses, to “put any man completely at her mercy.” Adalind is the monstrous feminine who seduces men before castrating them, at least figuratively, with her power.

I can’t say that I’m surprised by Grimm’s monstrous femininity. Fairy tales (and crime dramas, for that matter) are morally instructive, recycling cultural fears into cautionary tales, and Grimm funnels the mythos of women’s irresponsibility and cold indifference into a crime drama. “Tarantella,” for example, features a Spinnetod (or “black widow”) – a mother who seduces men before sucking out their internal organs through their abdomen – and the Cinderella-turned-Murciélago (“hideous bat-like creature”) in “Happily Ever Aftermath” emits a shrieking sound that explodes her entire family’s eyeballs, leaving herself heir to her father’s fortune. It is from these stories that we learn what a “good mother” looks like, and she certainly wouldn’t seduce men in the name of eternal youth. And “good women” like Cinderella are rewarded through quiet suffering, not monstrous murder.

Grimm emerged from a cultural climate particularly interested in moral instruction, as evidenced by recent legislative fervor over women’s choice. Last week, Kansas legally allowed pharmacists to withhold prescriptions they “reasonably believe” could terminate pregnancy; the “Protect Life Act” allows hospitals to “let women die” rather than perform life-saving abortions; and, of course, transvaginal ultrasound legislation requires women to be probed vaginally before terminating a pregnancy. These bills are just as terrifying as, say, tales of fire-breathing lady-Dämonfeuer, which also come from the assumption that women’s free (and presumed irresponsible) choice destroys American morality in a fury of fire and brimstone. Just as Grimm’s monstrous women threaten men, GOP politics frames women’s rights as a threat to family values, “fetal rights,” and men’s sovereignty. Grimm naturally channels this milieu, borrowing from traditional anxiety about strong, independent women encapsulated by the Brothers Grimm.

As Grimm’s first season wrapped up, the monstrous women were back. Adalind sicced her cat on Nick’s fiancée, turning her into a modern-day (comatose) sleeping beauty, and the mysterious “woman in black” unleashed a flurry of ninja-like moves before revealing her identity as (spoiler alert) Nick’s mother, long thought dead. Missing mothers are common in fairy tales, but Grimm’s finale raises the question: if Nick’s mother has been alive all these years, why hasn’t she been mothering him? Even though the woman in black may not be the archetypal “evil stepmother,” I’m not holding my breath for a “happily ever after” moment in Grimm’s second season – what kind of a boring fairy tale leaves a “good mother” alive? We’ll have to wait until fall to see if her crime was drinking children’s blood or simply disappearing from her child’s life. Or maybe Grimm will give us a truly updated fairy tale – you know, one with progressive female characters.

Glee: The Countertenor and The Crooner
http://blog.commarts.wisc.edu/2011/05/03/glee-the-countertenor-and-the-crooner/
Tue, 03 May 2011 11:00:40 +0000

This is the first in a series of articles on these male voices in Glee.

Part 1: The Trouble with Male Pop Singing

What immediately struck me about this still of Glee’s Chris Colfer (as Kurt Hummel) and Darren Criss (as Blaine Anderson) from Entertainment Weekly’s January 28, 2011 cover story is that this image might easily have been taken in the mid-to-late 1920s, but it would have been unlikely to appear in the mainstream press since that time. Attractive young men in collegiate attire, sporting ukuleles or megaphones, singing to each other and to their adoring publics in high-pitched voices were a mainstay of 1920s American popular culture, then vanished during the Depression. Even the easy homoeroticism of a boy positioned between another boy’s legs dates back to popular images of the 1920s. In the early 1930s, a combination of greater media nationalization and censorship, increasing homophobia, and panic regarding the emasculating effects of male unemployment formed the context for the first national public attack on male popular singers as effeminate and as cultural degenerates. As a result, new, restrictive gender conventions became entrenched regarding male vocalizing, and the feminine stigma has remained. Until now, that is. The popularity of Glee, and, in particular, these two singers, has made me think that American culture may finally be starting to break with the gender norms of male singing performance that have persisted for the last 80 years. Since much of my research has focused on the establishment of these gendered conventions, I would like to offer some historical context and share some of the reasons why I find Glee’s representation of male popular singing so potentially groundbreaking.

Male singing has not always been so inextricably tangled up with assumptions about the gender/sexuality of the performer. Before the reactionary gender policing of popular singing, men who sang in falsetto or “double” voice were greatly prized. Song styles such as blues, torch, and crooning were sung by both sexes and all races; lyrics were generally not changed to conform to the sex of the singer or to reinforce heterosexual norms, so that men often sang to men and women to women. Crooners became huge stars for their emotional intensity, intimate microphone delivery, and devotion to romantic love. While they sang primarily to women, they had legions of male fans as well, and both sexes wept listening to their songs.

When a range of cultural authorities condemned crooners, the media industries developed new standards of male vocal performance to quell the controversy. Any gender ambiguity in vocalizing was erased; the popular male countertenor/falsetto voice virtually disappeared, song styles were gender-coded (crooning coded male), female altos were hired to replace the many popular tenors, and all song lyrics were appropriately gendered in performance, so that men sang to and about women, and vice-versa. Bing Crosby epitomized the new standard for males: lower-pitched singing, a lack of emotional vulnerability, and a patriarchal star image. Since then, although young male singers have always remained popular and profitable, their cultural clout has been consistently undermined by masculinist evaluative standards in which the singers themselves have been regularly ridiculed as immature and inauthentic, and their fans dismissed as moronic young females.

From its beginnings, however, Glee has actively worked to challenge this conception. The show’s recognition and critique of dominant cultural constructions of performance and identity has always been one of its great strengths. Glee has continually acknowledged the emasculating stigma of male singing (the jocks regularly assert that “singing is gay”) while providing a compelling counter-narrative that promotes pop singing as liberating and empowering for both men and society at large. Glee‘s audience has in many ways been understood to be reflective of the socially marginalized types represented on the show, and one of the recurring narrative struggles is determining who gets to speak or, rather, sing. Singing on Glee is thus frequently linked to acts of self-determination in the face of social oppression, a connection that has been most explicitly and forcefully made through gay teen Kurt’s storyline this past season, which has challenged societal homophobia both narratively and musically. In the narrative, Kurt transfers to Dalton Academy to escape bullying and joins the Warblers, an all-male a cappella group fronted by gay crooner Blaine. Musically, Glee also takes a big leap, shifting from exposing the homophobic, misogynist stigma surrounding male singing to actively shattering it and singing on its grave.

From the very first moment Kurt is introduced to Blaine and the Warblers, as they perform a cover of Katy Perry’s “Teenage Dream” to a group of equally enthusiastic young men, we know we’re not in Kansas anymore. The song choice is appropriate in that it posits future-boyfriend Blaine as both a romantic and erotic dream object for Kurt, and it presents Dalton as a fantasy space in which the feminine associations of male singing are both desired and regularly celebrated. “Teenage Dream” was the first Glee single to debut at #1 on iTunes, immediately making Criss a star and indicating that a good portion of the American public was eager to embrace the change in vocal politics.

And “Teenage Dream” was only the beginning. This fantasy moment has become a recurring, naturalized fixture of the series. Just as Kurt turned his fantasy of boyfriend Blaine into a reality, so did Glee effectively realize its own redesign of male singing through a multitude of scenes that I never thought I would see on American network television: young men un-ironically singing pop songs to other young men, both gay and straight; teen boys falling in love with other boys as they sing to them; males singing popular songs without changing the lyrics from “him” to “her” to accommodate gender norms; and the restoration and celebration of the countertenor (male alto) sound and singer in American popular culture (I will address Chris Colfer’s celebrated countertenor voice in the next installment of this series). And instead of becoming subjects of cultural ridicule, Colfer’s rapturous countertenor and Criss’s velvety crooner have become Glee’s most popular couple, its stars largely celebrated as role models of a new order of male performer. It’s about time.

Watching the World’s Amazing Races
http://blog.commarts.wisc.edu/2011/03/30/watching-the-worlds-amazing-races/
Wed, 30 Mar 2011 21:26:03 +0000

I’m teaching Othering right now in my Media and National Identity class, and so once more Amazing Race is in my mind. Functionally, next to no other primetime shows spend as much time outside the United States, thereby making Amazing Race one of the most prominent, widely seen sites on American television for the depiction of foreign countries and peoples. And thus its representation of the world stands to “weigh” a lot more than, for instance, CSI: New York’s depiction of New York City, given the vast number of televisual depictions of the Big Apple.

What I find so frustrating about the show is not simply that it ends up Othering again and again, but that it’s a format that could allow for such interesting challenges to ideas of Othering, and that occasionally does so. It’s like a B student who writes occasionally brilliant sentences, and hence who you know could do better if s/he really applied him/herself, yet who isn’t trying hard enough.

A key problem with televisual representations of other countries and their peoples is precisely that other countries and their people are so actively represented, by which I mean the writers and directors have very certain ideas of who they want on camera. Think of Survivor here, as perhaps the only other show on primetime American television that films overseas. The locals have been evacuated from the filming site, and are only encountered as a “reward,” and as accompaniment to the nice meal that serves as centerpiece for the reward (screaming out for bell hooks’ “Eating the Other”!). They are usually chosen for their stunning primitiveness, grass-skirts, ability to dance with a smile for the cast, and/or perhaps to impart ancient tribal lore.

By contrast, Amazing Race holds great promise as a site for encountering the world. The format sees teams racing through towns, cities, and countryside and encountering random individuals who have not been selected by the directors (cabbie luck in particular playing a key role in who wins or loses). Especially when we’re in cities and places that the crew simply cannot stage manage, we therefore see an eclectic mix of foreigners. Their comments are of course heavily edited, and selectively translated, but they hold more power to speak for themselves, and to represent themselves. This may take place through quotidian acts like giving directions, refusing a team member’s requests to buy something in a challenge, or so forth, but it frees them from the need to appear solely as “reward,” and as dancing, cooking primitives.

Yet the Amazing Race still falls back into tired, old set pieces. Phil’s mat serves as an especially contentious site, somewhere for smiling, costumed locals to sit and wait for hours for the pleasure of welcoming Americans to their country. Phil’s allowed to look pissed off at having his time wasted, but they just sit there and smile. Oddly, we don’t even see Phil talk to them (I’m not looking for a Benetton ad, but are they that odious?). And once they’ve said “welcome,” it’s time to shut up and let Phil speak again, as their agency is so severely restricted.

Then there are the tasks, many of which spectacularly reduce a nation to two predominant activities (“Beg or Boogie”!), and that hire a cast of colorful locals to be their very best cover-of-the-tour-book stereotypes. When the race went to Kenya, we had Masai warriors leaping up and down, in Russia it was babushkas planting potatoes (more on them in a second, though), and so forth.

I’m also constantly both fascinated and depressed by the battle of looking, and of the imperial gaze, that goes on in many episodes. On one hand, the show often conforms to a “Heart of Darkness”-esque rendering of foreigners as painted onto a backdrop, mere props to draw the attention back to the American subjects, who constantly speak of and for the locals. See Chinua Achebe’s famous broadside attack on Conrad for more details on how insidious this kind of Othering is. On the other hand, the photographers often treat us to images of the foreigners staring at the American racers, and occasionally offer us delicious soundbites of locals criticizing the racers (as when, in a recent season, a group of babushkas engaged in wonderfully wry commentary on the racers’ plowing techniques and general physique). We’re also shown egregiously bad behavior from some racers, and the editing usually chastises the offending, offensive team. It might be easy to see this as a reminder that we’re looked at as much as the foreigners are, and at times it encourages us to look with the locals’ eyes, not the racers’. Yet there is no problematization of our own looking and gaze as viewers. The suggestion is a classically white liberal feel-good one: some travelers are bad, but we’re not – our own motivations for watching, and investment in or at least culpability with the exoticization and spectacularization of difference, are never really questioned.

Despite all my criticism, though, I keep watching. The simple fact is that the show is doing more than most to at least engage with the world at large. We non-Americans don’t come out of this process looking all that good, and I’d love to reform the program in many ways (Sorry, Phil, but you’re not needed: let’s replace you with locals who can say more. How about international racing teams? And please, please, let’s do something about the challenges). But there’s potential, which is met at times. There are no tribal elimination scenes or fauxthentic team names. The soundtrack is rarely a lost recording session from Peter Gabriel. Nobody’s in jail at the hands of a brutal foreign government. The countries are more than just an amalgam of their lovely wildlife and pitiable slums. And none of them are being bombed or supposedly plotting the downfall of the USA en masse. In the radically culturally chauvinist landscape of American television, that alone puts Amazing Race in a rare position.

Charlie Chan and Contemporary B-Movie Fandom
http://blog.commarts.wisc.edu/2011/01/15/charlie-chan-and-contemporary-b-movie-fandom/
Sat, 15 Jan 2011 15:00:34 +0000

Professor Yunte Huang’s recent book Charlie Chan: The Untold Story of the Honorable Detective and His Rendezvous with American History (W.W. Norton & Co., 2010) attempts to rehabilitate the reputation of a character that has for decades been considered a degrading racist stereotype.  Huang’s efforts are particularly timely in light of the Chinese-American sleuth’s revival on DVD; the numerous boxed sets have sold very well, even in a depressed market for classic films on video.  However, the fact that today’s Charlie Chan fans tend to be middle-aged (or older) white males does nothing to banish the specter of racism that surrounds the character.  On internet fan forums dedicated to Chan, accusations of racism are usually quickly dismissed as hyper-sensitivity or failure to take into account historical context.  As a Chan fan who is also a Chinese immigrant and scholar specializing in the historical intersections between Asian and American literature, Huang is in a unique position to defend the “honorable detective.”

First, Huang argues that Chan’s critics present a reductive, caricatured view of the character, who is in fact “a multilayered Chinese box” that “encapsulates both the racial tensions and creative energies of a multicultural nation” (xvii, 280).  Huang acknowledges Chan’s stereotypical qualities but asserts that the stereotype is a positive one when compared to contemporaneous representations like Fu Manchu.  Where critics like playwright Frank Chin see Chan as a reprehensible symbol of subservient acculturation, Huang understands Chan’s “cultural miscegenation” as “epitomiz[ing] the creative genius of American culture” (282-283).  Referring to Huck Finn, hip-hop, and George Carlin, Huang argues that racism has a role to play in art.  He positions Chan as an example of “a peculiar American brand of trickster prevalent in ethnic literature” that flourishes “in spite of as well as because of racism” (287).

Huang’s arguments, erudite as they are, resemble the typical response of the Chan fan in their call to recognize but “look past” the racism.  Yet for all the time Huang spends on his personal relationship with Chan (including details of his research trips for the book), he never fully explains why the character appeals to him.  And for all of Huang’s references to “creative genius”, he never discusses the aesthetic merits of the books and films (xx).  Isn’t Chan’s lingering popularity due in part to the fact that the films are lively, enjoyable mysteries?  Does Huang especially enjoy the books because he identifies with Chan in a way that a white American-born fan cannot?  He doesn’t say.

In any event, it’s unlikely that Huang’s assortment of arguments will do much to convince Charlie Chan’s most vocal critics.  (Listen to Frank Chin’s highly critical response here, and note the listener comment that refers to Huang as a “Chinese Uncle Tom.”)  But if Huang’s defense of Chan is not entirely satisfying, it remains perhaps the only reasonable defense, as it does not ignore the way in which the character embodies the racism of its era, while also arguing that one’s enjoyment of Charlie Chan need not be grounded in a racist condescension.  But the questions remain: is it possible to enjoy a racist media text in good conscience?  Can (and should) a text’s problematic racial representations be divorced from its other qualities, such as its generic appeals?

Despite what this review might imply, Huang’s book spends relatively little time defending Chan.  Rather, it is a breezy pop history that tells the Chan story (including the fascinating life of Chang Apana, the Honolulu police officer that inspired the character) while also using Chan as a structuring device for the book, a highly digressive work that deals with the history of institutionalized racism against the Chinese in America and pre-statehood Hawaii.  Huang deals with a tremendous variety of topics (albeit in fairly shallow fashion), including the history of Chinese immigration, Chinese representation in the American media, the notorious Massie Case, and Huang’s own personal history.  Huang’s original research seems to have centered around Apana and Chan creator Earl Derr Biggers; the research in the film section is quite thin.  Here Huang relies on old Chan fan histories and internet sources, making the book a fun, interesting read for the general reader, but not particularly valuable for the film scholar.

Kids Today
http://blog.commarts.wisc.edu/2010/08/25/kids-today-2/
Wed, 25 Aug 2010 13:44:04 +0000

By Alan McKee & Emmy-Lou Quirke

Kids are being sexualized these days. And, wouldn’t you know it, popular culture is to blame:

RAUNCHY pop stars, including Kylie Minogue, have been blasted by their own industry for going too far with sexual imagery. Kylie’s former producer, Mike Stock, has slammed saucy film clips as “sexualising” children, saying modern pop stars are going “too far”

You might think this is just another example of knee-jerk prejudice against the culture of the masses… But is it? Research shows that boys as young as two are displaying sexual behaviour such as masturbating, touching their genitals in public, and undressing in front of others. Similar research produced the alarming finding that 20% of boys had had intercourse with a prostitute by the age of 18.

These are alarming statistics.  What’s happening to children today? ‘The X-rated generation’, some call them, are out of control and the media MUST be to blame.

Except, of course, that the statistics in question come from 1943. Young people are not suddenly becoming ‘sexualised’. Sexual development is a normal, healthy part of childhood, and for as long as sexology has been examining it, sexual exploration has been a part of children’s development.

Research on the sexual development of children makes for fascinating reading.  A 1928 study of infant boys reports that 55% of the cohort had masturbated before the age of 36 months. In 1933 ‘games involving undressing or sexual exploration (often under the guise of “mothers and fathers” or “doctors” [were] common by age 4 years’. As noted above, in 1943 more than 20% of boys had visited a prostitute by age 18. In 1957 about half of pre-school children exhibited ‘sex play’ or ‘genital handling’.

Not only has this been going on for at least 90 years, the research suggests that it’s perfectly healthy. In a 1993 retrospective study, 85% of women described ‘a childhood sexual game experience … [and] statistical analysis showed that these subjects did not differ from those who did not remember any childhood sexual games’.

A common phrase in the Australian media when discussing this topic is ‘let kids be kids’. ‘Kids Free 2B Kids’ is even the name of a local lobby group protesting about child sexualisation. We agree – kids should be free to be kids. And part of being a kid is sexual development. That is what kids do – what they have always done. It’s normal, and it’s healthy. Some people don’t like it, and would like it to stop. But let’s take it seriously – let kids be kids – and do what kids have always done.
