We all look to Disney movies for the villains we love to hate, such as Hades or Scar, and we’ve been blown away by some of TV’s complex antagonists, such as The Handmaid’s Tale’s Serena Joy or half the characters in Game of Thrones. Yet some of the best TV villains aren’t the particularly tortured or ambiguous ones, but those who are clearly bad news yet adorable anyway.
Science fiction has long explored, and warned of, our obsession with media and its power to control our thoughts. Given that propaganda efforts successfully encouraged complacency among citizens of Nazi-led Germany, and that countless Americans today willfully consume fake news, these concerns seem justified. Even Black Mirror has done its part to control our behavior, making some of us (ahem) sit for hours to finish the episode “Bandersnatch.” It appears that we’re simply unwitting sheep dragged along by the crook of mass media.
But does propaganda really motivate people to do things they wouldn’t normally do? Can people really not tell the difference between reality television and reality? Do violent video games cause shooting sprees? Can someone be programmed with a musical trigger to assassinate someone? Are we being controlled by social media?
In the golden age of Hollywood, movie producers were largely responsible for obtaining funding and personnel for productions; within the confines of the studio system, and without a big-name producer (or a member of the United Artists), a screenplay would rarely make it to the screen. From the mid-1940s to the present, the output of independent filmmakers has increased tremendously, and the dissolution of the studio system allowed more people to break into the industry; films produced independently could even be distributed by a major studio.
Sadly, the same thing has not happened for television. There, producers are still largely responsible for creative output; a pilot script, if it does not originate within the company, must be pitched to network executives, and production remains in the company’s hands. Worse, a show’s success is largely determined by Nielsen ratings, an increasingly flawed measure of consumption (and, implicit in sustained numbers, enjoyment).
Naturally, with the rise of video publishing websites and social networks, there are a number of Web shows that are independently produced and distributed. Some, like “The Guild,” are highly acclaimed and well-suited to their medium.
There have been a few experiments in crossover: in 2008, during the writers’ strike, producer Joss Whedon went indie with “Dr. Horrible’s Sing-Along Blog.” After streaming as a web mini-series, the show was distributed by iTunes, Hulu, Netflix, and eventually on DVD through Amazon.com and New Video Group. This summer, the show was picked up for rerelease on The CW network. Despite high consumption online and in DVD sales, the broadcast on Oct. 9, 2012, drew a paltry 566,000 viewers. While it may be said that fans of the show already had digital or hard copies and didn’t need to tune in, it’s more likely that “Dr. Horrible” was simply better suited to viral distribution. It does raise questions about the marketability of network TV shows in the reverse direction.
Or, for that matter, about pitching studio projects to viewers before their production. Acclaimed director David Fincher is attempting to recruit the media-hungry Internet masses to an avenue traditionally not open to the public: funding a movie production. Unable to get studio backing for an adaptation of The Goon and unable to fund it himself, Fincher has turned to Kickstarter to crowdsource the funds:
So far the project has just shy of 2,500 backers raising about $162,000 with 19 days left to raise the funds. In response to the question of why Fincher does not just fund it himself, Miller said: “Hollywood is filled with the ‘vanity projects’ of successful movie stars and producers. It really is not as easy to get a film made no matter who you are.”
If Fincher succeeds, the line between the studio-made and the homegrown will be irreversibly blurred. TV, however, remains firmly stratified.
Dystopian settings have been a staple of speculative and science fiction for over a century. While Jules Verne was penning his tales of great feats of engineering, incredible new worlds, and the future of human expression and technology, H.G. Wells envisioned bleak scenarios of alien takeovers, the division of the human race, the devastating effects of nuclear weapons, and the dangers of genetic modification. A few decades later, Ray Bradbury and George Orwell wrote stories of oppression and destruction set in postwar, tyrannical regimes. High school students may not have appreciated their teachers’ efforts to introduce them to these cautionary tales, but as the apocalypse obsession gains steam, coupled with a surge of dystopian and postapocalyptic TV shows and movies, one wonders what the demand for and popularity of these stories indicate about the American and British states of mind. One also wonders why science fiction in particular has largely shifted in the past twenty-odd years: from triumphant tales of resistance to invasion (“Independence Day,” the latter two “Alien” films) and exploration of new worlds and species (“E.T.,” “Contact”), to despairing stories of crumbling societies (“V for Vendetta”), overwhelming paradigm shifts (“Children of Men,” “Minority Report”), and incredible biological (“28 Days Later”) and environmental destruction (“Waterworld”). While not all of these films involve a major event of destruction, they all involve a complete change in the fundamental nature of the world and, often, the revelation of underlying truths, which is the etymological meaning of “apocalypse” and the purpose of most speculative fiction.
As Rapturists secure post-Rapture arrangements for their pets, 2012 enthusiasts piddle away their savings in anticipation of the end times, and extremist conspiracy theorists stock their fallout shelters, anxiety about the apocalypse, whether or not it is followed by dystopia, makes for hot news, yet isn’t limited to the “nutjobs.” The popularity of what we might call “downfall news,” such as the now infamous Miami face-eating attack, and the constant stream of horrible stories involving rapes and murders, is plainly demonstrated even in sources as mundane as Yahoo!’s “Top Emailed Stories” feed. The fictional apocalypse and the chronicles of the days thereafter have broad appeal: children’s movies such as “Wall-E” and the increasingly apocalyptic “Ice Age” franchise, blockbusters like “2012” and “Battle: Los Angeles,” science fiction drama-thrillers such as “Gattaca” and “Daybreakers,” the endless stream of zombie apocalypse entertainment, the Katniss novels, and TV shows such as “Revolution,” “FlashForward,” “Terra Nova,” and “The Walking Dead.” Even the past season of “Dexter” and the current season of “Glee” involve characters who are Rapture-obsessed.
I would suggest that this prevailing interest stems not only from an urge to caution in an age where widespread environmental degradation and the threat of nuclear warfare are part of the global reality, but also from a sense of disenchantment and isolation, and the desire to experience a unity of humankind, even if through traumatic events. I have heard many critics complain that post-apocalyptic entertainment focuses on the trials of a small group of people, especially if they are all white/related/rich/etc., but this is done not only out of production practicalities, but also as a reflection of the viewer’s primary frame of reference: “What would I do? What connects me to this event?” In the end, a post-apocalyptic, dystopian world needs to be accessible to the pre-apocalyptic, not-utopian viewer: a sense of cynical detachment can only be resolved through the catharsis of moral certainty and confident survival attained through this genre of fiction.
As most anthropologists know, the roles that people inhabit and are assigned in society are neither inherent nor permanent. Categories and classes of people are historically built, and their membership and definition both change with economic shifts, new ideologies, and technological development. However, these psychocultural systems and biases are perpetuated through both language and praxis. The labeling hypothesis, developed largely by Erving Goffman, maintains that the connotations, expectations, and implications of a label form the scope of the role inhabited by a person with that label. Moreover, class, racial, sexual, and other distinctions are drawn in part by opposing labels.
The construction of “us vs. them” is accomplished in myriad ways and in almost every social venue: in workplaces, schools, families, and the public sphere. It is used by politicians and pundits to draw lines between the audience and the party’s opposition. It is used by religious leaders to explain why followers are privileged over the non-believers and the wrong-believers. It is even used by reality show hosts and teen romance writers to divide the audience into two camps who can compete and thus increase viewer- or readership. I would like to briefly review some recent news items to demonstrate how “us” is divided from “them” via the media.
According to Megan Reback of the Women’s Media Center:
More than a decade has passed, yet the deep hatred in the United States of those who practice Islam has not subsided. In fact, the radical right – which has increasingly become part of the GOP’s status quo – has held onto these beliefs both proudly and shamelessly.
Some weeks ago, I confronted startling evidence of this mindset as I disembarked from a Metro North train at the end of my weekday commute to and from New York City. Amid the familiar army of black and navy blue suits eager to join families for dinner, I noticed a stark, black advertisement with red, white, and blue type: “*19,250 DEADLY ISLAMIC ATTACKS SINCE 9/11/01 *AND COUNTING. IT’S NOT ISLAMOPHOBIA, IT’S ISLAMOREALISM.”
According to Mother Jones magazine, the ad and others targeting New York and San Francisco commuters are sponsored by the anti-Muslim blogger Pamela Geller. She made headlines last year when she backed other ads castigating a proposed Islamic community center near ground zero, calling it a “mega mosque” and a “victory mosque” that celebrated 9/11. … By mid-2010, Geller became a fixture on Fox News, commenting on U.S. foreign policy in the Middle East and the threat of Muslims and Shariah law in the United States. The Southern Poverty Law Center considers Geller’s organization, Stop Islamization of America, a hate group.
The anti-Islam ads, however, are not random outliers or radical statements. Instead, they represent a fear and hatred of Muslims and Islam that has been particularly rife of late.
Reback recaps some anti-Islam statements by GOP members and the attack on the Sikh temple in Wisconsin, and concludes by characterizing the ad described above as unproblematic except in its reflection and perpetuation of hate by a major political party. Which is a pretty big problem, to say the least. The force of the ad’s language is hard to deny, as well. Even a person ignorant of the details, but who still gets nauseated thinking of 9/11, would see that statistic, consider its impact, and be more inclined to think of Muslims as dangerous, violent people. The pejoration is accomplished very simply, through repeated exposure to blanket statements that play on the emotions. This is the entire purpose of rhetoric.
Here’s a more mundane example:
Women don’t get to enjoy this military action movie because it is “our” movie. And this is “our” drink. The us-them division couldn’t be clearer. This ad plays on stereotypes of movie and drink preferences to (over)compensate for the suggestion—already socially constructed—that a low-calorie drink is feminine. The ad ends by proudly exclaiming that women could keep their “romantic comedies and lady drinks,” because Dr Pepper Ten is a low-calorie drink that’s appropriate for men.
Of course, these ads are very plain in their intent. The anti-Islam ad was clearly intended to incite anger towards a group of people deemed “other,” in an attempt to curry favor for a particular group of organizations. The Dr Pepper Ten ad was clearly intended to be ridiculous enough to sell a product. However, its affirmation of gender stereotypes is distressing in an era where women are earning more and more yet comprise only 17% of Congress, are vastly underrepresented in Hollywood, and remain at risk of domestic violence, a range of crimes whose reported cases in Florida alone number in the thousands.
We also cannot ignore the rhetorical power of non-advertising visual media. Stories are excellent vehicles for ideology, and both tools and venues for social construction, as the audience absorbs, reacts with, and retells the story. In 1998, “Will & Grace” popularized the first flamboyant gay characters on television. Unfortunately, the gay comic relief became somewhat of a trope, repeated endlessly on various sitcoms and on dramedies such as “Sex and the City.” A new class of “homosexual person” had been formed, and while the characters were likable, their unrealism stood in stark contrast to news reports of various violent or pedophilic acts by gay men, encouraged by and conflated with anti-gay campaigns by conservative and/or religious outfits.
Eventually, non-flamboyant gay characters were featured on longer-form shows that allowed for character development, including “Glee” and “Modern Family.” Now, as though to trumpet the progress of positive gay representation in entertainment television, NBC’s “The New Normal” has hit the small screen, following the lives of a gay male couple exploring their options for children. Unfortunately, the rhetorical intent of the production, however important, may not be as salient as the symbolic content, which involves a certain exoticization (“them!”), as Frank Bua of The Huffington Post explains:
[M]any of the show’s generalizations are likely more damaging than entertaining: Gays are wealthy, materialistic effetes with crazy disposable income. Gay men randomly wake up and decide that they want a child as the latest must-have accessory. Prospective parents look through a catalog of egg donors like they are recruiting for the HJ. A gay couple? One part effeminate man-boy, the other part a football-watching handsome dude
While Bua bemoans the show’s shortcomings, it is nonetheless clear that social change is happening, more visibly and perhaps more quickly, thanks to mass and entertainment media. Furthermore, each of these examples should demonstrate why studying the media is so revelatory of the process of construction; the understanding of these processes allows us to deconstruct enough to put the spare bits towards change. If we want it.