It bothers me so much when people say that Beauty and the Beast is about Stockholm Syndrome. It is about a girl’s love for her father and being able to look beyond appearances — and about how amazing books are. The Beast has lessons to give too. He is a symbol of self-loathing toxic masculinity who then discovers what true masculinity, bravery, and selflessness involve.
We all look to Disney movies for the villains we love to hate, such as Hades or Scar, and we’ve been blown away by some of TV’s complex antagonists, such as The Handmaid’s Tale’s Serena Joy or half the characters in Game of Thrones. Yet some of the best TV villains aren’t the particularly tortured or ambiguous ones; they’re the ones who are clearly bad news yet adorable anyway.
Science fiction has long explored, and warned of, our obsession with media and its power to control our thoughts. Propaganda successfully encouraged complacency among citizens of Nazi-led Germany, and countless Americans today willfully consume fake news, so these concerns seem justified. Even Black Mirror has done its part to control our behavior, making some of us (ahem) sit for hours to finish the episode “Bandersnatch.” It appears that we’re simply unwitting sheep dragged along by the crook of mass media.
But does propaganda really motivate people to do things they wouldn’t normally do? Can people really not tell the difference between reality television and reality? Do violent video games cause shooting sprees? Can someone be programmed with a musical trigger to assassinate someone? Are we being controlled by social media?
Exciting news: My semantic analysis of news articles on bullying, which I presented at the IASESP conference in April, has been accepted for publication in the Journal of Contemporary Anthropology Vol. 4! The article, “The Social Construction of Bullying in U.S. News Media,” presents the contextualizing research for my upcoming documentary. I am thrilled to have this validation and exposure for an anthropological approach to bullying, and for the boost it will give to the future stages of my research.
Some highlights from the reviews:
The author has provided a thought-provoking and well-written paper on the topic of bullying and the application of folklore and media studies methodologies in the study and prevention of the phenomenon. I think that the paper adds much to the discipline with respect to its multidisciplinary scope. The author does an excellent job of backing up the use of the folklore/media studies approach. The paper is also an important addition to applied anthropology and can serve as a catalyst for further studies related to bullying and other social phenomena.
This article discusses an interesting topic relevant to our contemporary society, cleverly set against the backdrop of folklore studies and media culture. Overall the article has strong potential and displays a good understanding of the related theoretical and contextual frameworks.
Americans may feel privileged to have such access to films and television. Indeed, we’ve reached (perhaps even surpassed) a saturation point in entertainment media. What we forget is that audiovisual media is a major cultural conduit—or rather a network of connective fibers that generate and shape our social consciousness—and its immersive qualities are well suited to cultural exchange. A few months ago, I attended a screening on campus of a film about HIV, filmed and produced by the Datoga in partnership with anthropologists. The film is particularly ethnographic in a grassroots sort of way, in that its target audience elected its own informants, those the community deemed trustworthy, and used prevailing cultural symbols and expressions to communicate the often Western-centric rhetoric of HIV/AIDS awareness. (The film is here.) The idea of a shared cultural consciousness permeating a culture’s works was a hallmark of Lévi-Straussian structuralist anthropology, but in a post-postmodernist age, the understanding of the lattice effect of structure, ritual, symbol, and ethos has proved particularly fruitful in applied visual anthropology. While the Datoga project was an example of applied visual ethnography, with an explicit educational purpose, a recent NYTimes article discussed the approach from a different standpoint: culturally applied filmmaking.
What [the Iraqi filmmakers] definitely don’t have at home is a film industry, something being addressed, at least to a degree, by the nonprofit International Film Exchange. The exchange brought the students over from Baghdad where, several weeks before, the filmmaker Bill Megalos of Los Angeles had conducted a 10-day workshop on storytelling and editing. The exchange is devoted primarily to cultural give and take and international understanding. But in the case of the Iraqis, it may help create a base of knowledgeable filmmakers, a “crew” as the young men themselves called it. Since the economic sanctions imposed after the first gulf war, making films in Iraq has become all but impossible.
“It was my family business,” said the bearish Salam S. Mazeel, 35, whose mother was a sound designer, and who wants to be a cinematographer like his father. “But in the ’90s, everything stopped. We go to the hard times. No money, no hope.”
After his father died, his mother quit the business to raise her children; there was no cinema anyway. “That’s how it was,” Mr. Mazeel said. “Now, maybe something is different and we come to America and there are a few things in our minds. Like how to apply American rules to Iraqi movies.”
The article goes on to discuss how Iraqi films could take a cue from Hollywood movies and move away from the emphasis on style that European cinema had previously demonstrated:
Years ago, Iraqi filmmakers would regularly attend VGIK, the Moscow film school; Iraqi film was influenced far more by European than American cinema. In Los Angeles, the Iraqi visitors were being advised by almost everyone to make their stories clear, to emphasize narrative over style.
That’s interesting, considering the woefully incomplete or slapdash plots seen in much Hollywood fare. But truth be told, the expressionistic, avant-garde philosophy, seen in Soviet cinema and developed in later German and French films, has been relegated to the indie circuit in the U.S., the fortress of solitude for disillusioned American film buffs. The interest in plot in the United States derives partly from the “well-made play” tradition that was popular in Britain and the U.S. around the same time the film industry was developing, and partly, I believe, from a capitalist ethos. But that’s a topic for another blog.
What’s intriguing about Megalos’ workshop for Iraqi filmmakers is its prescriptive purpose. It is a shade of the cultural imperialism the U.S. exercises around the world. Are our films successful overseas because of their effective narratives, as the article suggests? Or because of the corollary economic influence? Films are products, after all. Moreover, after congratulating ourselves on bringing democracy and peace to Iraq (at least for a moment or two), claiming to bring artistic benefits as well seems like an echo of the same attitude.
However, the infusion of the Iraqi filmmakers’ films with their distinctive ethos, under the auspices of Hollywood economic, political, and aesthetic structures, is not only the product, but the method, of cultural exchange. It is a new kind of ethnographic filmmaking, in which the individuals’ culture is writ large through collaborative works, nestled within a historical portrait of fluctuating, overlapping sociocultural conditions. It is why films are of interest to anthropologists, and why anthropologists continue to use films to communicate ideas. It is probably clear to the International Film Exchange; thanks to the U.S.’s economic power, Hollywood has the tools of the trade to empower all filmmaking cultures, with the end goal being understanding of humanity, not imperialism.
Related: Activist Filmmaking
Somewhat of a departure from the topics we’ve been discussing of late, but interesting: A Knox College study of young girls brings to light factors of self-sexualization:
Media consumption alone didn’t influence girls to prefer the sexy doll. But girls who watched a lot of TV and movies and who had mothers who reported self-objectifying tendencies, such as worrying about their clothes and appearance many times a day, in the study were more likely to say the sexy doll was popular.
The authors suggest that the media or moms who sexualize women may predispose girls toward objectifying themselves; then, the other factor (mom or media) reinforces the messages, amplifying the effect. On the other hand, mothers who reported often using TV and movies as teaching moments about bad behaviors and unrealistic scenarios were much less likely to have daughters who said they looked like the sexy doll. The power of maternal instruction during media viewing may explain why every additional hour of TV- or movie-watching actually decreased the odds by 7 percent that a girl would choose the sexy doll as popular, Starr said. “As maternal TV instruction served as a protective factor for sexualization, it’s possible that higher media usage simply allowed for more instruction.”
Mothers’ religious beliefs also emerged as an important factor in how girls see themselves. Girls who consumed a lot of media but who had religious mothers were protected against self-sexualizing, perhaps because these moms “may be more likely to model higher body-esteem and communicate values such as modesty,” the authors wrote, which could mitigate the images portrayed on TV or in the movies.
However, girls who didn’t consume a lot of media but who had religious mothers were much more likely to say they wanted to look like the sexy doll. “This pattern of results may reflect a case of ‘forbidden fruit’ or reactance, whereby young girls who are overprotected from the perceived ills of media by highly religious parents … begin to idealize the forbidden due to their underexposure,” the authors wrote.
The authors [of the 2007 APA study] cited examples like “advertisements (e.g. the Sketchers naughty and nice ad that featured Christina Aguilera dressed as a schoolgirl in pigtails, with her shirt unbuttoned, licking a lollipop), dolls (e.g. Bratz dolls dressed in sexualized clothing such as miniskirts, fishnet stockings and feather boas), clothing (e.g. thong underwear sized for 7- to 10-year-olds, some printed with slogans such as ‘wink wink’), and television programs (e.g. a televised fashion show in which adult models in lingerie were presented as young girls).”
I will say that I think adults dressing as children is probably less of an influence on girls’ self-sexualization than the plethora of kid-size adult clothing styles. Years ago, I saw girls at the pool dressed in halter-top swimsuits…with nothing to halter! I see girls in miniskirts, mini cowboy boots, spaghetti-strap tops, mini-heels, the works.
I have been so busy with papers that I haven’t had enough creative juices left to write a blog post. But the documentary on graffiti artists reminded me of something I’ve been wanting to discuss for a while: branding. Graffiti artists will “tag” their artwork (or just the walls of public restrooms). It may not be known to all who view it, but it is a pictographic signature. As graffiti artists work in a visual medium, this is not surprising. However, a similar process may be seen in social media, as people construct multimedia “signatures” that import and transmit their personality (or what they desire their personality to be; we are what we do).
The modern concept of brands dates back to the 19th century, when manufacturers imprinted their goods before shipping them miles away. Around the turn of the 20th century, companies began to develop advertising based on their trademark. The rise of radio and broadcast television was a natural boon to advertisers, who drafted audiovisual texts to accompany their slogans.
Brand identity describes the psychological associations of a product that purportedly mirror the interests and emotions of the target audience. While the effectiveness of this technique has been demonstrated over and over again in market research studies, few have considered how consumers reappropriate brand identities to describe themselves. Except, of course, for certain advertisers who observed this brand fandom (e.g. “I’m a Mac”).
In a previous iteration of Facebook, users could install modules on their page that incorporated logos, religious symbols, celebrity images, witty sayings, TV/film quotes and musical lyrics, and other such cultural memes. This capacity is gone from Facebook, but has been renewed with force by Pinterest. Similar to the “biographical collage” projects we had in grade school, but heavily incorporating advertising logos, images from mass media, and TV/film stills, these collages show that users do not merely fall for brand identity, but construct a branded identity themselves.
What symbols do you surround yourself with? Are you a Coke or a Pepsi? A donkey or an elephant? A Trekkie or a Lucas nerd? A Mac or a PC? McD’s or BK? …the possibilities are endless.
In the golden age of Hollywood, movie producers were largely responsible for obtaining funding and personnel for productions; within the confines of the studio system and without a big-name producer (or membership in United Artists), a screenplay would rarely be realized as a movie. From the mid-1940s to the present, the output of independent filmmakers has increased tremendously, and the dissolution of the studio system allowed more people to break into the industry; films produced independently could even be distributed by a major studio.
Sadly, the same thing has not happened for television. There, producers are still largely responsible for creative output; a pilot script must be pitched to network executives, if not coming from within the company, and production remains in the company’s hands. Worse, a show’s success is largely determined by Nielsen ratings, an increasingly flawed measure of consumption (and, implicit in sustained numbers, enjoyment).
Naturally, with the rise of video publishing websites and social networks, there are a number of Web shows that are independently produced and distributed. Some, like “The Guild,” are highly acclaimed and well-suited to their medium.
There have been a few experiments in crossover: in 2008, during the writers’ strike, producer Joss Whedon went indie with “Dr. Horrible’s Sing-Along Blog.” After streaming as a web mini-series, the show was distributed by iTunes, Hulu, Netflix, and eventually on DVD through Amazon.com and New Video Group. This summer, the show was picked up for rerelease on The CW network. Despite high consumption online and in DVD sales, the broadcast on Oct. 9, 2012, drew a paltry 566,000 viewers. While it may be said that fans of the show already had digital or hard copies and didn’t need to tune in, it’s more likely that “Dr. Horrible” was simply better suited to viral distribution. It does raise questions about the marketability of network TV shows in the reverse direction.
Or for that matter, of studio projects to viewers before their production. Acclaimed director David Fincher is attempting to recruit the media-hungry Internet masses to an avenue traditionally not open to the public: funding a movie production. Unable to get studio backing for an adaptation of The Goon and unable to fund it himself, Fincher has turned to Kickstarter to crowdsource the funds:
So far the project has just shy of 2,500 backers raising about $162,000 with 19 days left to raise the funds. In response to the question of why Fincher does not just fund it himself, Miller said: “Hollywood is filled with the ‘vanity projects’ of successful movie stars and producers. It really is not as easy to get a film made no matter who you are.”
If Fincher succeeds, the line between the studio-made and the homegrown will be irreversibly blurred. TV, however, remains firmly stratified.
Check out this stirring article on the death of theatregoing culture and the rise of new media. A couple of excerpts.
Film culture, at least in the sense people once used that phrase, is dead or dying. Back in what we might call the Susan Sontag era, discussion and debate about movies was often perceived as the icy-cool cutting edge of American intellectual life. Today it’s a moribund and desiccated leftover that’s been cut off from ordinary life, from the mainstream of pop culture and even from what remains of highbrow or intellectual culture.
One could argue that, in our era of consumer capitalism, films have been revealed as manufactured commodities rather than works of art, and people root for certain film franchises or producers or studios in the same way they root for Apple over Samsung, GM over Ford, or the Red Sox over the Yankees.
As most anthropologists know, the roles that people inhabit and are assigned in society are neither inherent nor permanent. Categories and classes of people are historically constructed, and their membership and definition change with economic shifts, new ideologies, and technological developments. However, these psychocultural systems and biases are perpetuated through both language and praxis. Labeling theory, developed largely by sociologists such as Howard Becker, maintains that the connotations, expectations, and implications of a label form the scope of the role inhabited by a person with that label. Moreover, class, racial, sexual, and other distinctions are drawn in part by opposing labels.
The construction of “us vs. them” is accomplished in myriad ways and in almost every social venue. It is done in workplaces, schools, families, and in the public sphere. It is used by politicians and pundits to draw lines between the audience and the party’s opposition. It is used by religious leaders to explain why followers are privileged over the non-believers and the wrong-believers. It is even used by reality show hosts and teen romance writers to divide the audience into two camps who can compete and thus increase viewer- or readership. I would like to briefly review some recent news items to demonstrate how “us” is divided from “them” via the media.
According to Megan Reback of the Women’s Media Center:
More than a decade has passed, yet the deep hatred in the United States of those who practice Islam has not subsided. In fact, the radical right – which has increasingly become part of the GOP’s status quo – has held onto these beliefs both proudly and shamelessly.
Some weeks ago, I confronted startling evidence of this mindset as I disembarked from a Metro North train at the end of my weekday commute to and from New York City. Amid the familiar army of black and navy blue suits eager to join families for dinner, I noticed a stark, black advertisement with red, white, and blue type: “*19,250 DEADLY ISLAMIC ATTACKS SINCE 9/11/01 *AND COUNTING. IT’S NOT ISLAMOPHOBIA, IT’S ISLAMOREALISM.”
According to Mother Jones magazine, the ad and others targeting New York and San Francisco commuters are sponsored by the anti-Muslim blogger Pamela Geller. She made headlines last year when she backed other ads castigating a proposed Islamic community center near ground zero, calling it a “mega mosque” and a “victory mosque” that celebrated 9/11. … By mid-2010, Geller became a fixture on Fox News, commenting on U.S. foreign policy in the Middle East and the threat of Muslims and Shariah law in the United States. The Southern Poverty Law Center considers Geller’s organization, Stop Islamization of America, a hate group.
The anti-Islam ads, however, are not random outliers or radical statements. Instead, they represent a fear and hatred of Muslims and Islam that has been particularly rife of late.
Reback recaps some anti-Islam statements by GOP members and the attack on the Sikh temple in Wisconsin, and concludes by characterizing the ad described above as unproblematic except in its reflection and perpetuation of hate by a major political party. Which is a pretty big problem, to say the least. The power of the ad’s language is hard to deny, as well. Even a person ignorant of the details, but who still gets nauseous thinking of 9/11, would see that statistic, register its impact, and be more inclined to think of Muslims as dangerous, violent people. The pejoration is accomplished very simply, through repeated exposure to blanket statements that play on the emotions. This is the entire purpose of rhetoric.
Here’s a more mundane example:
Women don’t enjoy this military action movie because it is “our” movie. And this is “our” drink. The us-them division couldn’t be more clear. This ad plays on stereotypes of movie and drink preferences to (over)compensate for the suggestion—already socially constructed—that a low-calorie drink is feminine. The ad ends by proudly exclaiming that women could keep their “romantic comedies and lady drinks,” because Dr. Pepper Ten is a low-calorie drink that’s appropriate for men.
Of course, these ads are very plain in their intent. The anti-Islam ad was clearly intended to incite anger toward a group of people deemed “other,” in an attempt to curry favor for a particular set of organizations. The Dr. Pepper Ten ad was clearly intended to be ridiculous enough to sell a product. However, its affirmation of gender stereotypes is distressing in an era when women are earning more and more yet comprise only 17% of Congress, are vastly underrepresented in Hollywood, and are at risk for domestic violence, which encompasses a range of crimes numbering, in reported cases in Florida alone, in the thousands.
We also cannot ignore the rhetorical power of non-advertising visual media. Stories are excellent vehicles for ideology and both tools and venues for social construction as the audience absorbs, reacts with, and retells the story. In 1998, “Will & Grace” popularized the first flamboyant gay characters on television. Unfortunately, the gay comic relief became somewhat of a trope, repeated endlessly on various sitcoms and on dramedies such as “Sex and the City.” A new class of “homosexual person” had been formed, and while the characters were likable, their unrealism stood in stark contrast to news reports of various violent or pedophilic acts by gay men, encouraged by and conflated with anti-gay campaigns by conservative and/or religious outfits.
Eventually, non-flamboyant gay characters were featured on longer-form shows that allowed for character development, including “Glee” and “Modern Family.” Now, as though to trumpet the progress of positive gay representation in entertainment television, NBC’s “The New Normal” has hit the small screen, following the lives of a gay male couple exploring their options for children. Unfortunately, the rhetorical intent of the production, however important, may not be as salient as its symbolic content, which involves a certain exoticization (“them!”), explains Frank Bua of The Huffington Post:
[M]any of the show’s generalizations are likely more damaging than entertaining: Gays are wealthy, materialistic effetes with crazy disposable income. Gay men randomly wake up and decide that they want a child as the latest must-have accessory. Prospective parents look through a catalog of egg donors like they are recruiting for the HJ. A gay couple? One part effeminate man-boy, the other part a football-watching handsome dude.
While Bua bemoans the show’s shortcomings, it is nonetheless clear that social change is happening, more visibly and perhaps more quickly, thanks to mass and entertainment media. Furthermore, each of these examples should demonstrate why studying the media is so revelatory of the process of construction; the understanding of these processes allows us to deconstruct enough to put the spare bits towards change. If we want it.