Chamber Horror: A New Film Subgenre

Chamber theatre is a style of theatrical production in which there is typically little to no set (any set pieces are moved by the performers as part of the show), and the emphasis is on the text and character development rather than on theatrics and special effects. One of the best-known examples is the chamber drama 12 Angry Men, in which all of the action occurs within a single room where jurors deliberate over a man’s fate. Naturally, this format evolved into film, where small-scale, low-budget movies emphasized dialogue and character dynamics (usually tension and conflict) over big, showy cinema.

Small-scale and low-budget? Sounds perfect for horror filmmakers! I jest, of course… there are some high-budget chamber horror films, as I’ll discuss below. Yet the chamber style empowers the horror genre to tap into its philosophical roots: What does it mean to be human? What does it mean to be good… or evil?

Read my analysis of chamber horror films on Medium.

TV’s Quirkiest Villains

We all look to Disney movies for the villains we love to hate, such as Hades or Scar, and we’ve been blown away by some of TV’s complex antagonists, such as The Handmaid’s Tale’s Serena Joy or half the characters in Game of Thrones. Yet some of the best TV villains aren’t the particularly tortured or ambiguous ones; they’re the ones who are clearly bad news, but adorable anyway.

Read my list on Medium.

Does the Media Control Our Minds?

Science fiction has long explored, and warned of, our obsession with media and its power to control our thoughts. Given that propaganda efforts successfully encouraged complacency among citizens of Nazi-led Germany, and that countless Americans today willfully consume fake news, these concerns seem justified. Even Black Mirror has done its part to control our behavior, making some of us (ahem) sit for hours to finish the episode “Bandersnatch.” It appears that we’re simply unwitting sheep dragged along by the crook of mass media.

But does propaganda really motivate people to do things they wouldn’t normally do? Can people really not tell the difference between reality television and reality? Do violent video games cause shooting sprees? Can a person be programmed with a musical trigger to assassinate a target? Are we being controlled by social media?

Let’s find out.

Cultural Exchange through the Movies (LINK)

Americans may feel privileged to have such access to films and television. Indeed, we’ve reached (perhaps even surpassed) a saturation point in entertainment media. What we forget is that audiovisual media is a major cultural conduit—or rather a network of connective fibers that generate and shape our social consciousness—and its immersive qualities are well suited to cultural exchange. A few months ago, I attended a campus screening of a film about HIV, filmed and produced by the Datoga in partnership with anthropologists. The film is ethnographic in a grassroots sort of way: its target audience elected its own informants, those the community deemed trustworthy, and used prevailing cultural symbols and expressions to communicate the often Western-centric rhetoric of HIV/AIDS awareness. (The film is here.) The idea of a shared cultural consciousness permeating a culture’s works was a hallmark of Lévi-Straussian structuralist anthropology, but in a post-postmodernist age, the understanding of the lattice effect of structure, ritual, symbol, and ethos has proved particularly fruitful in applied visual anthropology. While the Datoga project was an example of applied visual ethnography, with an explicit educational purpose, a recent NYTimes article discussed the approach from a different standpoint: culturally applied filmmaking.

What [the Iraqi filmmakers] definitely don’t have at home is a film industry, something being addressed, at least to a degree, by the nonprofit International Film Exchange. The exchange brought the students over from Baghdad where, several weeks before, the filmmaker Bill Megalos of Los Angeles had conducted a 10-day workshop on storytelling and editing. The exchange is devoted primarily to cultural give and take and international understanding. But in the case of the Iraqis, it may help create a base of knowledgeable filmmakers, a “crew” as the young men themselves called it. Since the economic sanctions imposed after the first gulf war, making films in Iraq has become all but impossible.

“It was my family business,” said the bearish Salam S. Mazeel, 35, whose mother was a sound designer, and who wants to be a cinematographer like his father. “But in the ’90s, everything stopped. We go to the hard times. No money, no hope.”

After his father died, his mother quit the business to raise her children; there was no cinema anyway. “That’s how it was,” Mr. Mazeel said. “Now, maybe something is different and we come to America and there are a few things in our minds. Like how to apply American rules to Iraqi movies.”

The article goes on to discuss how Iraqi films could take a cue from Hollywood movies and move away from their earlier emphasis on the style that European cinema demonstrates:

Years ago, Iraqi filmmakers would regularly attend VGIK, the Moscow film school; Iraqi film was influenced far more by European than American cinema. In Los Angeles, the Iraqi visitors were being advised by almost everyone to make their stories clear, to emphasize narrative over style.

That’s interesting advice, considering the woefully incomplete or slapdash plots seen in much Hollywood fare. But truth be told, the expressionistic, avant-garde philosophy, seen in Soviet cinema and developed in later German and French films, has been relegated to the indie circuit in the U.S., the fortress of solitude for disillusioned American film buffs. The interest in plot in the United States derives partly from the well-made-play tradition that was popular in Britain and the U.S. around the same time the film industry was developing, and partly, I believe, from a capitalist ethos. But that’s a topic for another blog.

What’s intriguing about Megalos’ workshop for Iraqi filmmakers is its prescriptive purpose. It is a shade of the cultural imperialism the U.S. exerts around the world. Are our films successful overseas because of their effective narratives, as the article suggests? Or because of the corollary economic influence? Films are products, as we know. Moreover, after congratulating ourselves on bringing democracy and peace to Iraq (at least for a moment or two), it seems an echo of that self-congratulation to claim we bring the Iraqis artistic benefits as well.

However, the infusion of the Iraqi filmmakers’ films with their distinctive ethos, under the auspices of Hollywood’s economic, political, and aesthetic structures, is not only the product but the method of cultural exchange. It is a new kind of ethnographic filmmaking, in which the individuals’ culture is writ large through collaborative works, nestled within a historical portrait of fluctuating, overlapping sociocultural conditions. It is why films are of interest to anthropologists, and why anthropologists continue to use films to communicate ideas. This is probably clear to the International Film Exchange: thanks to the U.S.’s economic power, Hollywood has the tools of the trade to empower all filmmaking cultures, with the end goal being understanding of humanity, not imperialism.

Related: Activist Filmmaking

The Impossible: How Important Is Ethnicity in “True Story” Films? (LINK)

The Oscar-nominated The Impossible has received a lot of flak for casting white British actors in the story of a Spanish family who experienced the 2004 tsunami while on vacation in Thailand. Accusations of racism, indifference, and a simple lack of effort have been hurled at the (Spanish) production team. According to this article on HuffPost,

Though perhaps seemingly a bit harsh, the real answer might not be that far off. When asked by the Spanish daily El Mundo about the reason why he didn’t cast Spanish actors for his film, Bayona admits it all came down to one factor: money.

“I would have loved to tell this story with Spanish actors. We tried, but it proved impossible to raise funding without international actors. The first version of the screenplay was written in Spanish and then we realized that 80% of the dialogue was also in English. So it was natural that we chose European actors who speak English. But, without revealing the nationality of the protagonists. This is not a film of nationality, race or social class. All that was swept by the wave,” the director said.

Once again we see the conflict between marketing needs and cultural realism. Is the film dishonest or harmful for using white actors, in particular British actors, considering that the tsunami affected areas formerly part of the British Empire? How different would the film have been if Spanish actors had been cast? Is it possible to successfully promote a film in an international market using unknown (read: non-white) actors? I would note that Life of Pi did not turn its main character white.

The Role of Competition in American Movies (LINK)

This short essay on Sociological Images discusses a trope in American film. Since ours is a capitalistic society, one would expect themes of competition and conquest to dominate our culture. Interesting, isn’t it, how the predestined fate of the tragic hero has been supplanted by the possibility of absolution through social conquest?

 In British films of the sixties – “The Loneliness of the Long Distance Runner” or “This Sporting Life” for example – athletic contests bring a heightened consciousness of the class system.  But in American movies, regardless of the setting – the boxing ring, the pool hall, the poker game, the karate dojo, the dance floor, etc. – competition works its magic and allows the heroes to overcome all personal and interpersonal problems.

via The Role of Competition in American Movies » Sociological Images.

Women in Hollywood: What the Awards Season Tells Us

The Oscar nominations have been announced, and the honeymoon period of a post-“Bridesmaids” and “The Help” Hollywood seems to be over. As a Women’s Media Center feature notes, the number of women nominated has decreased (not that it was terribly high last year); moreover, most of the nominations for women reflect the gender roles expressed de facto in the film industry, i.e., that women work in, and are acknowledged for, Makeup, Costume Design, and the Art Department. Tech-heavy categories such as Sound and F/X are dominated by men, as are the longtime boys’ clubs of Directing and Screenwriting.

The awards season is a contentious time for both film buffs and those in the industry. The former complain that the films they’ve actually seen, or that had popular appeal, are not represented in the nominations; the latter complain that the folks nominated earned their spots only through obscurity or specificity of role, popularity among Academy members, or seniority in the industry. The Academy has responded by changing the rules governing the number of Best Picture nominees, first increasing it from 5 to 10 for the 82nd (2010) Oscars, then requiring that nominees get at least 5 percent of the first-place votes. Understandably, the Academy is reluctant to change the categories themselves, although it did respond to pressure by creating the Best Animated Feature category. The newer specialized categories were created in the ’60s and, interestingly, reflect the male-heavy tech fields, such as Best Visual Effects and Best Sound Mixing. Meanwhile, the Academy continues to generalize the categories for which women are more often nominated and that encompass a wide range of traditionally female positions in film production: last year, the Art Direction category was generalized into Production Design; this year, the Makeup category was expanded to include Hairstyling.


Dystopian TV Shows and Films: An Introduction

Dystopian settings have been a staple of speculative and science fiction for over a century. While Jules Verne was penning his tales of great feats of engineering, incredible new worlds, and the future of human expression and technology, H.G. Wells envisioned bleak scenarios of alien takeovers, the division of the human race, the devastating effects of nuclear weapons, and the dangers of genetic modification. A few decades later, Ray Bradbury and George Orwell wrote stories of oppression and destruction set in postwar, tyrannical regimes. High school students may not have appreciated their teachers’ efforts to introduce them to these cautionary tales, but as the apocalypse obsession gains steam, coupled with a surge of dystopian and postapocalyptic TV shows and movies, one wonders what the demand for and popularity of these stories indicate about the American and British states of mind. One also wonders why science fiction in particular has largely shifted in the past twenty-odd years from triumphant tales of resistance to invasion (“Independence Day,” the latter two “Alien” films) and exploration of new worlds and species (“ET,” “Contact”) to despairing stories of crumbling societies (“V for Vendetta”), overwhelming paradigm shifts (“Children of Men,” “Minority Report”), and incredible biological (“28 Days Later”) and environmental destruction (“Waterworld”). While not all of these films involve a major event of destruction, they all involve a complete change in the fundamental nature of the world, and, often, the revelation of underlying truths, which is the etymological meaning of “apocalypse” and the purpose of most speculative fiction.

As Rapturists secure post-Rapture arrangements for their pets, 2012 enthusiasts piddle away their savings in anticipation of the end times, and extremist conspiracy theorists stock their fallout shelters, anxiety about the apocalypse, whether or not it is followed by dystopia, makes for hot news, yet it isn’t limited to the “nutjobs.” The popularity of what we might call “downfall news,” such as the now-infamous Miami face-eating attack and the constant stream of horrible stories involving rapes and murders, is plainly demonstrated even in sources as mundane as Yahoo!’s “Top Emailed Stories” feed. The fictional apocalypse and the chronicles of the days thereafter have broad appeal: children’s movies (“Wall-E,” the increasingly apocalyptic “Ice Age” franchise), blockbusters like “2012” and “Battle: Los Angeles,” science fiction drama-thrillers such as “Gattaca” and “Daybreakers,” the endless stream of zombie apocalypse entertainment, the Katniss novels, and TV shows such as “Revolution,” “FlashForward,” “Terra Nova,” and “The Walking Dead.” Even the past season of “Dexter” and the current season of “Glee” involve characters who are Rapture-obsessed.

I would suggest that this prevailing interest stems not only from an urge to caution in an age where widespread environmental degradation and the threat of nuclear warfare are part of the global reality, but also from a sense of disenchantment and isolation and the desire to experience a unity of humankind, even if through traumatic events. I have heard many critics complain that post-apocalyptic entertainment focuses on the trials of a small group of people, especially if they are all white/related/rich/etc., but this is done not only out of production realities but as a reflection of the viewer’s primary frame of reference: “What would I do? What connects me to this event?” In the end, a post-apocalyptic, dystopian world needs to be accessible to the pre-apocalyptic, not-utopian viewer: a sense of cynical detachment can only be resolved through the catharsis of moral certainty and confident survival attained through this genre of fiction.

“Us vs. Them”: How the Media Can Breed Hate and Inequality…or Love and Social Change

As most anthropologists know, the roles that people inhabit and are assigned in society are neither inherent nor permanent. Categories and classes of people are historically built, and both their membership and their definition change with economic shifts, new ideologies, or technological development. However, these psychocultural systems and biases are perpetuated through both language and praxis. The labeling hypothesis, developed largely by Erving Goffman, maintains that the connotations, expectations, and implications of a label form the scope of the role inhabited by a person with that label. Moreover, class, racial, sexual, and other distinctions are drawn in part by opposing labels.

The construction of “us vs. them” is accomplished in myriad ways and in almost every social venue: in workplaces, schools, families, and the public sphere. It is used by politicians and pundits to draw lines between the audience and the party’s opposition. It is used by religious leaders to explain why followers are privileged over the non-believers and the wrong-believers. It is even used by reality show hosts and teen romance writers to divide the audience into two camps who can compete and thus increase viewership or readership. I would like to briefly review some recent news items to demonstrate how “us” is divided from “them” via the media.

According to Megan Reback of the Women’s Media Center:

More than a decade has passed, yet the deep hatred in the United States of those who practice Islam has not subsided. In fact, the radical right – which has increasingly become part of the GOP’s status quo – has held onto these beliefs both proudly and shamelessly.

Some weeks ago, I confronted startling evidence of this mindset as I disembarked from a Metro North train at the end of my weekday commute to and from New York City. Amid the familiar army of black and navy blue suits eager to join families for dinner, I noticed a stark, black advertisement with red, white, and blue type: “*19,250 DEADLY ISLAMIC ATTACKS SINCE 9/11/01 *AND COUNTING. IT’S NOT ISLAMOPHOBIA, IT’S ISLAMOREALISM.”

According to Mother Jones magazine, the ad and others targeting New York and San Francisco commuters are sponsored by the anti-Muslim blogger Pamela Geller. She made headlines last year when she backed other ads castigating a proposed Islamic community center near ground zero, calling it a “mega mosque” and a “victory mosque” that celebrated 9/11. … By mid-2010, Geller became a fixture on Fox News, commenting on U.S. foreign policy in the Middle East and the threat of Muslims and Shariah law in the United States. The Southern Poverty Law Center considers Geller’s organization, Stop Islamization of America, a hate group.

The anti-Islam ads, however, are not random outliers or radical statements. Instead, they represent a fear and hatred of Muslims and Islam that has been particularly rife of late.

Reback recaps some anti-Islam statements by GOP members and the attack on the Sikh temple in Wisconsin, and concludes by characterizing the ad described above as unproblematic except in its reflection and perpetuation of hate by a major political party. Which is a pretty big problem, to say the least. The power of the ad’s language is hard to deny, as well. Even a person ignorant of the details, but who still feels nauseated thinking of 9/11, would see that statistic, consider its impact, and be more inclined to think of Muslims as dangerous, violent people. The pejoration is accomplished very simply, through repeated exposure to blanket statements that play on the emotions. This is the entire purpose of rhetoric.

Here’s a more mundane example:
[youtube http://www.youtube.com/watch?v=3iuG1OpnHP8?rel=0]

Women won’t enjoy this military action movie, the ad tells us, because it is “our” movie; and this is “our” drink. The us-them division couldn’t be clearer. This ad plays on stereotypes of movie and drink preferences to (over)compensate for the suggestion—already socially constructed—that a low-calorie drink is feminine. The ad ends by proudly exclaiming that women could keep their “romantic comedies and lady drinks,” because Dr. Pepper Ten is a low-calorie drink that’s appropriate for men.

Of course, these ads are very plain in their intent. The anti-Islam ad was clearly intended to incite anger towards a group of people deemed “other,” in an attempt to curry favor for a particular set of organizations. The Dr. Pepper Ten ad was clearly intended to be ridiculous enough to sell a product. However, its affirmation of gender stereotypes is distressing in an era where women are earning more and more yet comprise only 17% of Congress, are vastly underrepresented in Hollywood, and are at risk for domestic violence, which encompasses a range of crimes whose reported cases in Florida alone number in the thousands.

We also cannot ignore the rhetorical power of non-advertising visual media. Stories are excellent vehicles for ideology, and both tools and venues for social construction, as the audience absorbs, reacts to, and retells the story. In 1998, “Will & Grace” popularized the flamboyant gay character on television. Unfortunately, the gay comic relief became something of a trope, repeated endlessly on various sitcoms and on dramedies such as “Sex and the City.” A new class of “homosexual person” had been formed, and while the characters were likable, their unrealism stood in stark contrast to news reports of various violent or pedophilic acts by gay men, encouraged by and conflated with anti-gay campaigns by conservative and/or religious outfits.

Eventually, non-flamboyant gay characters were featured on longer-form shows that allowed for character development, including “Glee” and “Modern Family.” Now, as though to trumpet the progress of positive gay representation in entertainment television, NBC’s “The New Normal” has hit the small screen, following the lives of a gay male couple exploring their options for having children. Unfortunately, the rhetorical intent of the production, however important, may not be as salient as its symbolic content, which involves a certain exoticization (“them!”), explains Frank Bua of The Huffington Post:

[M]any of the show’s generalizations are likely more damaging than entertaining: Gays are wealthy, materialistic effetes with crazy disposable income. Gay men randomly wake up and decide that they want a child as the latest must-have accessory. Prospective parents look through a catalog of egg donors like they are recruiting for the HJ. A gay couple? One part effeminate man-boy, the other part a football-watching handsome dude

While Bua bemoans the show’s shortcomings, it is nonetheless clear that social change is happening, more visibly and perhaps more quickly, thanks to mass and entertainment media. Furthermore, each of these examples should demonstrate why studying the media is so revelatory of the process of construction; the understanding of these processes allows us to deconstruct enough to put the spare bits towards change. If we want it.

The Art of Immersion: Found Footage and Classical Film

The “found footage” genre (style?) of filmmaking was wildly popularized for horror beginning with the (in)famous Sundance entrant “The Blair Witch Project,” and judging by the apparently exhaustive list here, horror has become its home; one might even describe films such as “Paranormal Activity,” “V/H/S,” and “The Last Exorcism” as a subgenre of horror. All of these films present themselves as a recording of horrific events—some “intentionally” shot as a documentary, some as a loose narrative involving some supernatural terror. Most revolve around young people who mirror the films’ demographic: the share-all, tech-obsessed Generation Me.

Beyond its obvious reflection of the rise of YouTube artists, confessional “reality” TV programming, and the emphasis on social media as a means of establishing relationships, the “found footage” style marks a new type of filmmaking, one that eschews not only the special f/x and big names of modern cinema but also the grand art and epic tales of classic cinema.

In fact, reading recently about French New Wave cinema and François Truffaut’s skewering of the literary, artificial cinematic tradition in “A Certain Tendency of the French Cinema,” I couldn’t help but think that a similar paradigm shift is taking place now in the United States. Of course, the range of U.S. films includes the emotional indies, the riveting biopics, the sweeping historical pieces, the wide range of comedy, the disturbing dramas, the provocative speculative fiction pieces, and the grandiose adventure tales. Even in the next few months, we have “2 Days in New York,” “Lincoln,” “Argo,” “Bachelorette,” “Compliance,” “Looper,” and “The Hobbit.” None of these are in the “found footage” format.

However, comparing the construction of certain classic films with these new styles reveals several interesting similarities. To begin, I will discuss two exceptional found-footage sci-fi movies. The first, “Cloverfield,” was released in 2008, produced by J.J. Abrams, directed by Matt Reeves, and written by Drew Goddard, one of Abrams’ “Lost” writers and one of “Buffy’s” and “Angel’s” latter-season writers. The film is presented as a tape found in the aftermath of an attack on New York City, and it uses the “film-within-a-film” device to contrast the lives of its subjects before the attack with the horror they experience. (The beginning of the tape survives from before one of the characters accidentally recorded over it with footage of a party and, later, full documentation of their attempted escape; it provides a rather clever means of exposition.) This approach allows for a very natural exploration of character and a very relaxed method of storytelling: pure causality, in that the characters are shoved into one situation or another according to primarily external factors, but with the overriding impulse of the central character to save his girlfriend. By using a handheld camera partly operated by one of the actors, the wonderfully blunt and sometimes inappropriately funny T.J. Miller, the film gives us a greater sense of realism than we get with the self-elected documentary style. Rather than filming what’s “important,” and thus destroying the illusion of the film (a problem I had with “The Blair Witch Project”), the camera is simply always on, with only some necessary cuts when Miller’s character is forced to turn it off. The film never marks itself as a “film”; its fast-paced, unforgiving, accidental documentation of the monster and its associated terrors, of the trauma experienced by the characters, and of the general horrific atmosphere is so convincingly “found” that we lose ourselves in it. We become immersed…and scared.

More recently, “Chronicle” (2012), directed by “The Kill Point” alumnus Josh Trank and written by horror TV scribe Max Landis, from a story by both, appears to be in the found-footage style but takes it to the next level: rather than limiting itself to one camera and presenting itself as a found recording, it is told entirely through cameras that we are made aware exist. While one would expect this to produce a distracting barrier, it accomplishes the opposite.

Firstly, with multiple cameras, the story may be told through different points of view. This may sound obvious, but consider the tremendous bias imposed by true “found footage” movies. When the actors are the cameramen (or when the cameramen adopt only the POVs of the actors), it limits the audience’s involvement. The viewer enjoys being in a privileged place of knowing more than the actors do; it’s actually surprising that this style is popular with horror movies, since the viewer loses the privilege of shouting “Look out!” Imagine the classic horror films in found-footage style: there would be no sense of dread as the shadow of Norman Bates appears beyond the shower curtain, no sense of doom as the Alien unfolds behind Brett, no sense of panic when the shark edges up to the skinny-dipper’s legs. What is more frightening than something unseen?*

Yet “Chronicle” manages to frighten by going in the opposite direction: freed from the documentarian bias and couched in the entirely realistic exploits of three teenagers who happily videotape themselves engaging in various mischief courtesy of their newfound telekinesis, it oversaturates us with endless videos of their (sometimes harmful) pranks, then, at the climax, when the powers are (of course) all-consuming, turns the requisite “movie media” into unwitting storytellers (as they are…). The plethora of cameraphones, security feeds, news cameras, and personal camcorders captures this tremendous event from a dizzying number of perspectives, edited together seemingly at random, with unheard-of angles as the equipment falls and flies, and it completely sucks in the viewer. We are fully aware that we are watching through cameras, but the effect is so overstimulating that the illusion is reverse-engineered: we cannot help but believe what we are seeing.

In a completely different genre and decade, we notice a similarly collective approach. In fact, this same philosophy applies to many films whose common denominator is their use of multiple perspectives and ambient emotion; Quentin Tarantino’s and Wes Anderson’s films are prime examples. Recently, for a class, I watched Otto Preminger’s “Bonjour Tristesse” and was fascinated by the use of subtle nonverbal acting, the construction of the main story within an uneventful night in the central character’s life, and the extensive use of wide perspective and long takes. Moreover, much happens in the film that is not directly related to the central plot, which details a teenage girl’s initial support for her father to take up with a woman she has admired, and its turn into a plot to expel her supposedly wicked new stepmother from her life. The minor stories surrounding this main plotline have no impact on those events, nor do they all concern the central characters. However, they express themes or reveal character such that we have a strong sense of who the central characters are, who they are not, who they are pretending to be, and, in addition, what we’re supposed to be considering while we watch the drama unfold. Preminger’s style is very detached, very deliberate, and very skeptical, yet it immerses us in a world in which we can watch (and criticize) the characters’ choices.

Despite the complete difference in genre and topic, the same thing is accomplished in Tarantino’s and Anderson’s films, among others, and in “Chronicle.” This phenomenon certainly points to viewers’ complicity in constructing the story they watch, and it may have implications for research into story therapy—perhaps the new immersion therapy?

*As Stephen King wrote in Entertainment Weekly #1001 (2008):

“Horror is an intimate experience, something that occurs mostly within oneself…the event films that pack the plexes in the summer…blast our emotions and imaginations, instead of caressing them with a knife edge. […] Horror is not spectacle, and never will be.”