Anthropology is… Huh??? The National Geographic Effect

When I tell people I am studying anthropology, I quite frequently get one of these responses:

  • “So you dig up pots?”
  • “So you study dead people?”
  • “So you look for dinosaur bones?” (yes, really)

It is probably symptomatic of a highly professionalized culture, in which academic discipline rarely matches job title, that the term “anthropologist” means little to people yet smacks of something archaic enough that they assume its subject must be archaic as well. It’s also partly due to the paucity of anthropologist characters in popular films and TV shows who even vaguely resemble real anthropologists (i.e., not the hapless characters in sci-fi flicks on whom the writers have slapped a random scientist label; hello, Prometheus). And the few that exist, of course, engage in wildly unusual quests and work in exotic or hyper-nerdy settings: Temperance Brennan, Indiana Jones, Robert Langdon (symbologist…?).

An anthropology major is number 15 on a list of the majors with the highest unemployment (link).* That doesn’t sound too bad until you review how many majors a large university actually offers (here’s my school’s list). I wonder if the widening gap between academia and job placement in the social sciences is at least partly due to a misunderstanding of anthropology. How many jobs have I been turned down for because someone thought I looked for things in the ground? (Hence my necessary return to school.)

Anthropology has a long and, sadly, somewhat sketchy history. It wasn’t until the late 19th century and the likes of Lewis Henry Morgan and John Wesley Powell that anthropology became a matter of ethnology, rather than armchair anthropology of the exotic or evolutionist comparative biological anthropology. Later, the burgeoning generations of Boasian anthropologists began to work for various government agencies, such as the Bureau of Indian Affairs. This was a good moment to be an anthropologist, if only because their insights into other cultures proved useful in negotiations with Native nations and in wartime (e.g., The Chrysanthemum and the Sword).

In the postmodernist 1980s in America, a new hyper-relativist, activist trend emerged in anthropology that marked the final phase of the discipline’s standing in the job market. We had gone from the self-assured, racist, positivist ethnographer, to the state-sanctioned, exoticist, empirical ethnologist, to the doubtful, self-reflective cultural detective. Bolstered by people like George Marcus and Michael Fischer, anthropology in its postmodernist flavor, like the literature and philosophy of the time, questioned everything in order to answer a few things, and managed to insult American sensibilities (of anthropologists and non-anthropologists alike) in the meantime. My guess is that this trend greatly shaped laypeople’s attitudes toward anthropology; my religious, conservative aunt, for example, once railed against political correctness and the expected acceptance of “sinful” lifestyles she blamed on anthropologists (without ever having asked me what I was studying in school). It probably also contributed to the characterization of anthropologists as weird or even non-cultured (like Bones, who has little psychocultural connection to the society in which she lives).

Fact is, in an increasingly globalized yet politicized world, in which most people have anywhere from an occasional to a constant connection to global markets of information and products, it’s more important than ever to understand modern anthropology’s fundamental question: why we do the things we do. The same question permeates every field of anthropology and gives us a scientific yet practical approach to all pursuits. It may not be reflected in the résumés that land on the hiring manager’s desk, in university curricula, or in popular culture, but the ability to reflect on one’s own behavior and society, to imagine oneself in another’s shoes, to communicate effectively with someone from a different walk of life, and to understand the purpose of one’s own and others’ actions can benefit people in the top-hired areas (business, medicine, education) as well as people in the supposedly unhireable majors of art, architecture, liberal arts, humanities, and history.

After all, anthropology is the study of humanity, and last time I checked, every aspect of your life involves just that.

* Doesn’t look too good in this breakdown either.

Self-Justification: Why People Do Crappy Things, Part 1

Cross-posted with Eponymy’s Delight:

In a recent post I offered a hypothetical example of a bullied teenager whose diagnosis of “mental disorder” absolves his tormentors of responsibility, and even bolsters their behavior. Labeling theory holds that the label itself is culturally salient, in part because of both parties’ ability to rewrite experiences in their heads. Self-justification is a mental process described by the dissonance theory of psychology, which holds that humans will say or do contradictory, unsavory, or dishonest things in order to reduce cognitive dissonance: the perceived disconnect between an expectation and an experience, and the discomfort that results from it.

In Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, Carol Tavris and Elliot Aronson compile hundreds of sociological studies, politician snafus, cultural documents, and pop culture dramas demonstrating the reign of self-justification in all aspects of human existence:

Dissonance reduction operates like a thermostat, keeping our self-esteem bubbling on high. … [W]e are constantly interpreting the things that happen to us through the filter of those core beliefs [about ourselves]. When they are violated, even by a good experience, it causes us discomfort. (pp. 30-31)

Discomfort that is easily resolved by little lies or excuses to ourselves, ones that happen so frequently we don’t even process them. I couldn’t even guess when I last did this. I can say that within the past week I have likely pretended a text was lost to cover for not responding to it immediately, swiped a pen because mine kept getting stolen, and cut someone off on the road because they tailed me for a long time before getting ahead of me. These aren’t great moral missteps, but in a situation with patterned stimuli (an office where the same boss makes ridiculous demands of you, say, or a school where you pass the same awkward-looking kid at his locker every day), it’s not difficult to (a) project insecurities and stress onto others and (b) justify further bad behavior, to the point where you must justify that pen swipe or locker shove by convincing yourself of your rightness so thoroughly that you eventually have no problem stealing money from the company or beating the kid with a baseball bat behind the school.

Indeed, in both the (in)famous Milgram experiment and a similar experiment by Ellen Berscheid, the perpetrators in the artificial social situation belittled their victims and, in post-study interviews, explained that the victims “deserved” the shocks. In the Berscheid experiment, half the perpetrators were told that they would later be the recipients of shocks, and as dissonance theory would predict, “when victims are armed and able to strike back, perpetrators [would] feel less need to reduce dissonance by belittling them. … [T]he only participants who denigrated their victims were those who believed the victims were helpless” (200).

Interestingly, the authors don’t address the realms of customer service or office workplaces, or the experience of bullying, but it isn’t difficult to apply their observations. In a store, for example, a customer feels dissonance when they expect one price and get another, and decides to belittle the employee to resolve it. Dissonance theory explains why otherwise rational people deal with such confusion this way, rather than politely asking the staff to check or change the price. Furthermore, they feel free to insult the employee because the rules of customer service dictate that employees are not allowed to retaliate. (This doesn’t seem to be true in other countries.)

Needless to say, it’s much more serious when dissonance reduction of this type occurs among adolescents and leads to bullying, which has a much greater capacity to be fatal or at least permanently psychologically damaging. Self-justification seems to be a natural process empowered by the human brain’s complexity and plasticity, and has many positive effects, including transmitting culture, allowing forgiveness, and bolstering imagination, but we must understand that of all the factors of bullying and harassment, this is likely the only one that’s unchangeable.

Thanksgiving…let’s be thankful we’re the best country in the world

It is commonly taught and widely believed that Thanksgiving is the anniversary of a 1621 feast between the native population and the Pilgrims, who had landed at Plymouth Rock the year before. Yet many of us know by now that the first such feast didn’t take place in 1621, let alone on the last Thursday of November. In fact, there are multiple instances of great celebrations held by groups of wearied settlers in different parts of the country.

The Thanksgiving feast is constructed as a peaceful alliance among people of different races celebrating their wealth and common blessings. Yet the Pilgrim settlers, like all other groups of settlers, have a complicated history with the Native Americans. The Wampanoag suffered a steep population drop after contracting leptospirosis from earlier settlers, and in an attempted alliance with the Pilgrims signed over significant lands to them. The alliance proved fruitful in the Wampanoag’s struggles with the Narragansett. Thus one could say that Thanksgiving, if attributed to the feast of 1621, is in a sense a celebration of alliance, but certainly not of peace. Which raises the question: how many holidays, rituals, and artifacts meant to bring or celebrate peace and understanding really promote cultural dominance or ethnocentrism, or honor instances of genocide or conquest? Do we equate pacifism with the absence of violence, or compassion with the absence of criticism or missionary activity?

Like most incidents in American history, the first Thanksgiving has taken on new meaning through the repetition that established it as an American tradition. It doesn’t specifically emphasize America or the dominant culture, if there is one, and doesn’t commemorate a specific incident in which America or Americans won or otherwise reigned supreme. That it is contextualized in this alliance between Pilgrims and Wampanoag, however, certainly informs the system of meanings activated each Thanksgiving. Dysfunctional families force themselves to eat together in the name of togetherness, good neighbors open their tables to the weak and hungry, and speeches are made about cultural acceptance and working together to improve the world. Why this is so, I cannot say without further research, but I suspect that, over generations, this incident of peace between settlers and natives has been played up to downplay the settlers’ subsequent offenses against natives. The alliance was temporary, and it allowed the Pilgrims to take over native lands easily as the Wampanoag population declined; that is neither compassionate nor violent, but it is certainly not an act of cultural acceptance.

Thanksgiving, the Celebration of Consumerism and Cultural Dominance

I don’t mean to smash Thanksgiving as it exists for a lot of people: a time for families and friends to gather and give thanks for each other and for their good fortune. However, in a weak economy with ever-increasing Black Friday sales and store hours, Thanksgiving has become the National Shopping Holiday, second only to the Saturday before Christmas according to the data since 2002*. Last year 212 million people went shopping on Black Friday, well above predictions of 138 million and above the previous year’s turnout of 195 million; the total spent on Friday alone was $10.66 billion, an average of $50.28 per shopper. Total spending for the whole post-Thanksgiving weekend of 2010 was $41.2 billion, or $365.34 per shopper. Not every holiday shopper shops that weekend, but it suggests, if not proves, how concentrated holiday shopping is on Black Friday to note that the average amount spent during the entire 2010 holiday season was $112.20. This year the annual NRF survey reports an expected 152 million people planning to shop tomorrow (data from this year’s event will follow on Nov. 27) and an anticipated $130.43 per shopper (see also here).
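For the curious, the per-shopper averages above are just totals divided by headcounts. Here’s a minimal sketch in Python that re-derives them from the figures exactly as reported; the implied 2011 total at the end is my own back-of-the-envelope projection, not an NRF number:

```python
# Re-deriving the per-shopper averages quoted above from the reported figures.
# All inputs come from the post itself; nothing here is independently verified.

friday_total_2010 = 10.66e9      # dollars spent on Black Friday 2010
friday_shoppers_2010 = 212e6     # reported Black Friday 2010 shoppers

avg_2010 = friday_total_2010 / friday_shoppers_2010
print(f"2010 average per shopper: ${avg_2010:.2f}")   # -> $50.28, as reported

# Hypothetical: implied 2011 Black Friday total if both NRF projections
# (152 million shoppers, $130.43 each) were to hold exactly.
implied_total_2011 = 152e6 * 130.43
print(f"Implied 2011 total: ${implied_total_2011 / 1e9:.2f} billion")  # -> ~$19.83 billion
```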

Here in Gainesville, people had begun camping outside the Best Buy by Wednesday morning, presumably for the Sharp 42″ HDTV selling for $200. Similar stakeouts are happening around the country.

Black Friday has its dangers, most notably the 2008 death of an employee at the Valley Stream Wal-Mart in Nassau County, who was asphyxiated after being trampled by hordes of shoppers. The incident triggered a lawsuit and an OSHA investigation, with Wal-Mart insisting it was not culpable. The company ended up settling, and spent more on good publicity and donations than it would have on a fine. Additional info here.

While death by Black Friday is certainly not a leading cause of death around the holidays, it is common for both employees and shoppers to be injured. The national shock over the Valley Stream death prompted many to curse the frantic shoppers for giving way to a herd mentality, or for being willing to harm others for their own sake. Normally I am more cynical and willing to dismiss humans as, after all, animals, but as anthropology teaches us, humans are animals that live in a constructed world.

Observers of herds of perissodactyls and artiodactyls, or of rodents traveling in large groups, might surmise, as several fall off cliffs or stumble, that herding is a blind activity in which all participants have lost any sense of self-preservation or the ability to calculate the movement and speed needed to maneuver around obstacles. And yet a human crowd is nothing like a herd of wildebeest. When wildebeest and other creatures move in a herd, they are identifying similar creatures, improving their chances of survival by staying in a group large enough to deter would-be predators, and navigating changes in the landscape as a large, fluid mass. Humans, however, pick focal points and move toward them. A cultural predisposition toward keeping to the right side of a landscape or to the left determines some patterns of crowd movement, but humans don’t navigate as groups.

This is why crowds can easily reach a crush point: the masses, unable to adapt to the shape of a landscape, bottleneck themselves or push into a barrier, and do not leave safe distance between themselves and others. The bigger the crowd and the smaller the space, the more likely it is that people will die in the crowd (and of asphyxia, not of being crushed), because humans are extremely reliant on artifacts as landmarks in their visual field and apply an understanding of a basic shape from memory (think of those line illusions you saw in grade school). So a mass of shoppers will orient themselves toward big glass doors, tall shelves and big signs, or any available open space, and when everyone rushes toward these things, injuries occur.
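To put a rough number on the “bigger crowd, smaller space” point, here is a minimal sketch of a density check. The scenario, function names, and figures are hypothetical; the roughly five-to-six-people-per-square-meter danger threshold is a commonly cited rule of thumb in crowd-safety work, not something from this post:

```python
# Minimal sketch: crowd density as a rough crush-risk indicator.
# The bands below follow the commonly cited rule of thumb that trouble
# starts around 5-6 people per square meter; exact thresholds vary by source.

def crowd_density(people: int, area_m2: float) -> float:
    """People per square meter of available floor space."""
    return people / area_m2

def crush_risk(density: float) -> str:
    """Rough qualitative risk band for a given crowd density."""
    if density < 2.0:
        return "comfortable: free movement"
    if density < 4.0:
        return "dense: movement is constrained"
    if density < 6.0:
        return "dangerous: individuals can no longer move independently"
    return "critical: crowd pressure can cause compressive asphyxia"

# Hypothetical doorbuster scenario: 800 shoppers funneled into a
# 20 m x 8 m entrance area as the doors open.
density = crowd_density(800, 20 * 8)
print(f"{density:.1f} people/m^2 -> {crush_risk(density)}")
# -> 5.0 people/m^2 -> dangerous: individuals can no longer move independently
```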

This is hardly the fault of shoppers or even retailers. Once the practice of Black Friday sales began, it perpetuated itself despite the dangers because it manages to escape its own self-destruction. Rather than skip the sales because of the dangers, shoppers brace themselves: they arrive as early as possible (sometimes three days early!), use carts as buffers, and occasionally become defensive to the point of physical aggression (just search YouTube for “Black Friday” to see some of this violence). They do this not because they are selfish, animalistic, or evil, but because the ethos of a consumerist culture emphasizes bargain-hunting, obtaining valuable objects, and boosting the economy.

More on the cultural dominance aspects of Thanksgiving later…

* With one exception; see source here.

Related:

Shoppers should put purchases into perspective

NRF predictions for 2011

Fewer plan to shop on Black Friday 2011

Fountains Are Silly

n. fountain: a soothing or exciting spray of water upward from a body of water for aesthetic and relaxation reasons

Fountains are so ubiquitous that they have become part of our landscape schema: we might not notice that they’re there, but we would notice if they were gone from malls, universities, hospitals, doctors’ offices, hotels, government buildings, and conference centers. We would also notice if they were placed somewhere like a daycare center, a Wal-Mart, or a public park. And why are they not there?

Fountains were a common decorative feature in ancient Rome and have remained one ever since. Along the way, they have taken on an added dimension of promoting relaxation and contemplation. And yet fountains are expensive. Why do we need an expensive spray of water to relax? According to structuration theory, the practice of installing fountains, upon repetition, creates a structure, social or mental, through which fountains become expected or required. My guess is that the rules for when and where fountains “should be” stem from the common function of the places where they tend to appear: places where a lot of money is spent. Fountains are, appropriately, a sign of decadence and wealth. Wouldn’t you rather cut an important business deal at a conference center with a fountain that costs as much as the deal itself? Or go to a hospital that can afford the best staff and equipment, because clearly it has enough to blow on a fountain?

The Misanthropologist Descends

My experiences as an American and my training as a cultural anthropologist have led me to one conclusion: People suck. It’s become custom among modern anthropologists to emphasize cultural relativism and downplay dogma-based assessments of morality and worthiness, but let’s face it: People suck. Human beings are notoriously selfish, overly preoccupied with their own immortality, stuck in their heads to the point of complete exclusion from physical reality, and kill each other not for reasons of survival, competition over mates (usually), or food, but because of beliefs in abstractions.

So why study anthropology? Because it explains so much of human behavior, which, let’s face it, is usually perplexing even to the most open-minded and compassionate individual. Americans in particular are subject to perhaps the most damning of questions, “Who are we?” simply on account of our multiethnic history and our unprecedented reliance on technology and digital personas. So can honest self-exploration of our human flaws help us break free? That is the purpose of this blog. More to come….

The study of anthropology enables us to understand why people treat us the way they do, what we can do to be better, and what we can do to improve the world. Anthropology is, after all, in part, the pitting of neurosis against guilt. It is necessary for self-understanding, to understand the cultural sources of our behavior and the scope of our behavior in the grand scheme of human activity (hint: we tend to believe it’s more than we say it is).