I Do: Examining Expensive Weddings through Three Sociological Theories
It’s June, the month of love… and expensive weddings. Chances are you have been to a June wedding, you were married in June, or you know that June is a popular month for nuptials. In this post, Ami Stearns examines the increasing costs of wedding ceremonies through the “big three” sociological theories: conflict, functionalism, and symbolic interactionism.
June is still one of the more popular months for weddings, either because the month is named after Juno, the Roman goddess of marriage and childbirth and wife of Jupiter, or because June was the one month, centuries ago, when people smelled really good.
Weddings, in our culture, are extremely significant. The significance can be shown by examining the cost of an average wedding, which continues to skyrocket. Simultaneously, however, the desire to wed has fallen. Why is this? How can we explain the seemingly contradictory practice of exorbitantly priced nuptials alongside the decreasing importance of marriage itself? Sociology can give us a few hints (you knew I'd go there, right?), especially when examining the reasons for these skyrocketing prices for a couple of "I Do's."
Wedding Price Tags
If you’ve been married recently, you may already know this. The rest of you need to hang on to something as I tell you this. The average cost of a wedding is $32,641, according to a recent CNN report. That money would buy a brand new car, provide a 10-15% down payment on an average house purchase, or contribute substantially to a future child’s college education. Some couples elect to take out loans to pay for their wedding, while others rely on parents to pitch in. Frugal and DIY weddings are definitely a trend, but we’re still talking in the realm of $5-6 grand, by some estimates. Clearly, spending over $5,000 on a few hours’ activity indicates there is a huge importance placed upon the exchange of vows in our society. (This dollar amount, by the way, is before taking into account the cost of a honeymoon.)…
The Name Game: Ex-offenders and Labeling Theory
This post applies a basic concept found in labeling theory to the Office of Justice Programs’ new initiative to erase certain language from their references to individuals released from prison. In this post, Ami Stearns argues that a rose by any other name may not be a rose anymore at all, at least not within the context of the criminal justice system.
In Romeo and Juliet, Shakespeare famously wrote, “That which we call a rose by any other name would smell as sweet” which is paraphrased often as “A rose by any other name is still a rose.” This saying implies that what we label or name things doesn’t have an effect on what they really are. Is this true, though? This month, the Office of Justice Programs announced that the words “felons” and “convicts” would no longer be used. Instead, people-first language will become standard in speeches and in written communication for the agency.
People-First Language and Incarceration
People-first language has become the standard for writers and journalists seeking to place the person before the disability. For example, instead of referring to someone as a handicapped person, which places the emphasis on the disability, the individual should be identified as a person with physical challenges. Rather than calling someone “homeless,” they should be referred to as “a person without a home” or “a person living on the streets.”
The people-first language movement has spread to the criminal justice system with the Office of Justice Programs’ recent proclamation. Suggested phrases when referring to someone recently released include “person formerly incarcerated” and “citizen returning to society,” for example, instead of “convict” and “felon.” This change in policy was the logical next step from “banning the box,” the move to eliminate the question on hiring forms about past criminal convictions, discussed in this Sociology in Focus post from last fall.
It’s important to note that the people-first language movement is not without its critics. Both the autistic community and the deaf community have voiced opposition to people-first language, arguing that it frames the disability itself as something negative to be set apart from the person. Many invoke “a rose by any other name…”: that it does not matter what you call someone and, further, that you cannot separate the person from the person’s condition. However, do these critiques still apply in the context of the criminal justice system, where terms like “ex-con” are nearly always negative and rarely embraced as identities, as they are in some communities?…
The Good, the Bad, and the Ugly: The Social Construction of Cosmetically Challenged Food
Have you ever bought a potato that looked like Abraham Lincoln’s face, an eggplant that had arms to hug you with, or an orange with a strange growth? Why is it that in America, we pass over “ugly” produce that is nutritionally sound in favor of pretty produce? In this post, Ami Stearns argues that the designation of our fruit and vegetables as edible or non-edible has been socially constructed.
In America, a quarter to a third of all food grown is simply never eaten. In fact, much of it is discarded before it even has the chance to reach the grocery store shelves. When organic “trash” is added to a landfill, dangerous methane gas is produced, which is a significant contributor to the greenhouse effect. Not only is trashing organic matter dangerous, it is a moral and ethical issue as well. According to Jonathan Bloom’s American Wasteland, twenty-five percent of the food that Americans waste would provide three daily meals to over 40 million people. We are a culture that celebrates throwing things away as a symbol of high status. American sociologist Thorstein Veblen wrote about our consumer culture in the late 1800s, arguing that the higher one’s social standing, the more one should consume, discard, and consume again.
Our throw-away culture replaces phones not when they wear out but when a new one is introduced, and it re-creates this behavior with food. Because we live in an environment of excess, we are free to be wasteful. Food waste is an issue at all levels, from individual families to mega-corporations. The average family throws out food that costs them between $1,365 and $2,275 per year. Food waste discarded by restaurants now equals approximately 15% of average landfill waste. A full 20% of crops grown are turned away by grocery stores based on the cosmetic appearance of the fruits and vegetables.
The Ugly Fruit Movement
Grocery stores insist on a certain “perfect” look for their produce. Skin discolorations and other unsightly blemishes, shapes, or textures are deemed not fit for the fruit and vegetable aisle. Many good-looking tomatoes make it all the way to the grocery store loading dock only to be rejected due to bumping and bruising from the long trip. Because this practice leads to so much waste, food activists have begun raising awareness about the waste of perfectly good, unattractive food. Let’s call it the ugly fruit movement….
“Glamping” & The Gendering of Outdoor Recreation
Glamping (glamorous + camping) is one of the trendiest new outdoor activities. In this essay Ami Stearns argues that glamping is an attempt to overcome the stereotype that camping is manly and in the process Stearns asks us to consider why we gender any activity in the first place.
What do grilling hot dogs on a stick, pooping in the woods, gathering sticks for a fire, and working for hours to set up a sagging tent have in common? Camping! While it’s true that both males and females go camping, this is an activity that, in our culture, particularly embodies masculinity. Camping brings to mind the rugged outdoors and the sense of manly survivalism seen in movies like The Revenant (do NOT watch that movie without a big, warm blanket and a copious amount of beef jerky).
Gendering Outdoor Recreation
Outdoor life is typically associated with men, while indoor activities are considered the domain of women. For example, women tend to bicycle indoors while men ride “real” bicycles outdoors more often. Women are still doing most of the indoor chores while men work outside mowing lawns and carrying the trashcans out to the curb. Fishing, hunting, and camping are typically related to men’s activities more often than they are related to women’s activities. Many behaviors, like camping, are gendered, whether we are aware of it or not.
Keep in mind I’m not saying that women don’t camp or don’t enjoy camping. I am arguing that in our culture, we stereotype camping and other outdoorsy activities as more masculine than feminine. In terms of gendering an activity, it is much easier to create new, alternative versions of a masculine activity (like shaving) than to convince women to completely “inhabit” a masculine behavior. Plus, any good Marxist would state that alternative versions of products (like his-and-hers body wash) are merely marketing gimmicks to double the consumer pool. With all these things in mind, I give you the trendy concept of glamping. Glamping is the word you get when you mash up glamour and camping. It calls to mind Victorian safaris and elegant getaways. Glamping is luxurious (some glampsites come with a butler and chef) and “authentic, effortless, and inspiring.”…
Can’t Buy Me Love (Or Can You?)
You know Valentine’s Day is just around the corner when the stores are filled with pink and red signs, along with chocolates, cards, jewelry, and flowers. In this post, Ami Stearns argues that love, an abstract concept, is bought and sold on Valentine’s Day through the purchase of goods and services. Whether we realize it or not, we commodify our love on Valentine’s Day. Those who “opt out” may face the wrath of a significant other.
In middle school, a boyfriend promised he had sent me a bouquet on Valentine’s Day that never showed. While the school hallways echoed with screeches of appreciation over other deliveries of cold roses and baby’s breath, I felt like I had been slapped in the face. It later turned out that there was a mix-up at the florist and I received my flowers the next day. I’ll never forget this thirteen year-old boy saying, “It’s the thought that counts” and me thinking, “That is NOT how this day works!” In my view, I had overestimated our relationship.
In America, many of our holidays have taken on a life of their own: the guts of stores literally change color depending upon the season. We have rituals, meals, and expectations surrounding nearly every holiday. Sociology offers several useful lenses through which to view these holidays. In previous pieces on Sociology In Focus, we have analyzed Halloween, Christmas, Father’s Day, and Thanksgiving. This post will explore Valentine’s Day through one of Marx’s core concepts: commodification.
The Stuff of Valentine’s Day
You love it or you hate it, but Valentine’s Day is an inescapable part of our culture. It’s risky to ignore this holiday if you’re part of a couple. The day affects our romantic decision-making as well: would you start dating a new person right before Valentine’s Day? I wouldn’t! Not only do we avoid getting entangled right before V-Day, we sometimes find ourselves re-evaluating relationships during this love-filled month: mid-February actually boasts one of the highest break-up rates of the year. Valentine’s Day is clearly more than “just” a holiday; it is in fact a socially constructed expectation of what love looks like and how we express it to others through gifts such as chocolate, candies, romantic dinners, or flowers.
On Valentine’s Day, particularly, beleaguered florists hope just to break even (once a sure thing, mom-and-pop flower shop “owners” these days are becoming laborers themselves and may even lose money on the historic flower-buying holiday). The obligation of buying that special something means large profits for a handful of corporations. Not very romantic, is it?…
Throw Me Something, Mister! Mardi Gras and Deviance
Mardi Gras is typically a time when deviant behavior is expected and encouraged. Lewdness, public intoxication, and begging are all commonplace during this centuries-old festival. Within all societies, certain behaviors that would ordinarily be considered inappropriate are allowed in some contexts. In this post, Ami Stearns examines Mardi Gras with a Durkheimian lens to suggest that deviance is relative.
Mardi Gras is serious business in Louisiana. This will be my second Mardi Gras in South Louisiana, but the first in which I will brave the crowds in New Orleans for a firsthand look at this over-the-top festival with ancient roots in Western history. Many cities across the nation also celebrate Mardi Gras, and they share in the typical problems that accompany the revelry, including shootings (both at people and at Centaur floats), partiers falling from rooftops, “suspicious” activity, and general mayhem.
The Situational Normality of Begging at Mardi Gras
What comes to mind first when conjuring up a Mardi Gras scene are probably the beads thrown en masse from passing floats, but beads are not thrown exclusively to women who bare their breasts. Beads and coins are thrown to those who beg the loudest, basically rewarding the best beggars. At the Mardi Gras parades in Lafayette (the second largest Mardi Gras celebration in the world and, most notably, fairly devoid of nudity), I noticed that the people who shouted “Throw me something, mister!” the loudest received the most beads.
The “urban” Mardi Gras that is most well known demands that the revelers scream and beg for things, usually in return for nothing. But there is also a more rural version that is still practiced vibrantly in small communities all over South Louisiana. During these Mardi Gras celebrations, the members of the parade do the begging, running from house to house in tattered clothes to beg for food. The items that are collected, including live chickens, ultimately end up in a delicious gumbo….
Snow’s Coming! Panic at the Bread Aisle!
When winter storms threaten, it’s time for a race to the grocery store for milk, bread, and eggs. You know you’ve done it. In this post, Ami Stearns uses this yearly panic to illustrate Durkheim’s concept of mechanical versus organic society.
Comedian Vic Dibitetto’s “Milk and Bread” video resonated with so many people that this former bus driver was propelled to internet fame and is now appearing in movies like “Paul Blart: Mall Cop 2.” We have all been there: racing to the store for staples like bread, milk, and eggs just in case. (Just in case of what? A French toast emergency?)
Snow’s coming!
Once the weatherman or weatherwoman makes a pronouncement, it’s time for you and everyone in your city to stock up on the holy trinity of bad-weather provisions: milk, bread, and eggs. It’s common sense that households will need to eat something while snowed in, but why the utter panic?
Some psychologists suggest that we stockpile goods in order to feel in control. After all, what is more uncontrollable than Mother Nature? It could also be that we perceive a shortage of food and are simply hard-wired to go into panic mode, clearing out the bread aisle like a plague of locusts.
But out of all the thousands of food choices in a store, why milk, bread, and eggs? Grocery store bread is not particularly nutritious, and if the power goes out, how will you cook those eggs? It seems like a more logical choice would be cans of potted meat or tuna, powdered milk, peanut butter, and beef jerky: items that won’t spoil quickly, don’t need refrigeration, and don’t have to be cooked.
Some stories allege that the milk, bread, and eggs panic began in the Northeast during a 1978 New England snowstorm that trapped people inside their homes for over a week. Another apocryphal story reports that it was the onset of a 1950s Pittsburgh storm that saw stores running short of bread and milk. While the origins of this odd modern panic may be in question, there is no doubt that the impulse to clean out the store in advance of a storm warning unites us all, especially when it comes to those three items….
The Racialized Panopticon: There’s an App for That
This post is Part One of a two-part discussion addressing this October 13th article in The Washington Post. The story describes the effects of a private app called GroupMe that enables users to send out real-time notices of suspicious activity in the neighborhood. In this first post, Ami Stearns suggests that the concept of the Panopticon can be applied to the racialized nature of this smartphone surveillance app.
“Big Brother is Watching.” That’s the famous phrase from George Orwell’s dystopian novel, 1984, and the theme of a popular TV series where every move of the cast is recorded every moment of the day. In the late 1700s, the British philosopher Jeremy Bentham envisioned a building that enabled a single, invisible watchman to monitor everyone. This building design could be applied to prisons, schools, factories, asylums, and hospitals. Bentham theorized that this “Panopticon,” as it became known, would confer power on those performing the surveillance, largely because those inside the facility would know they could be watched at any moment but could never be sure exactly when the eyes were upon them. He argued that this would address any behavior problems. A more modern example can be seen in most retail establishments. The shopper may see a sign reading something like “Smile, you’re on camera,” or may see the large cameras themselves in the ceiling. Whether or not anyone is actually monitoring the consumer at that very second is unknown, but it is this threat of being watched that works to convince people not to steal or misbehave.
In some countries, closed-circuit television (CCTV) covers much more than an individual store or restaurant: these cameras capture streets, sidewalks, the subway, and entire neighborhoods. So, a heavily CCTV-saturated place like the UK should be the safest place on earth, right? Actually, an evaluation undertaken by the highly regarded Campbell Collaboration suggests this mass surveillance has only a “modest” impact on crime rates….
Judging a Book By More Than the Cover: Book Banning and Structural Functionalism
How is it possible that books are still challenged in an era when porn, beheadings, and shootings are just a few clicks of the keyboard away? What could possibly be within the pages of a novel like The Catcher in the Rye that causes concern these days? Instead, we should ask why attempted book bans occur at all. Could they benefit the community in some way? In this post, Ami Stearns uses structural functionalism to examine the true functions of book bannings in communities across America.
When I tell people I research banned books, they are always quite stunned. Not at my choice of study, but at the fact that books are, yes, actually still banned. Not only that, but when I rattle off a few banned books (Hunger Games, Of Mice and Men, The Great Gatsby, The Absolutely True Diary of a Part-Time Indian, In the Night Kitchen, Captain Underpants, Where’s Waldo, and basically everything Judy Blume and J.K. Rowling ever wrote), people are perplexed. We can use sociological theory to explain not only why books are banned but how they are still considered harmful in the Internet age.
Here is the quick story behind frequently banned and challenged books. The U.S. government no longer bans books, and hasn’t since the 1940s. Instead, the “task” of forbidding certain books falls to local jurisdictions, usually schools and libraries. This means that, technically, the government does not ban us from reading any materials, but any citizen can issue a “challenge” to a book on the shelves of a library or assigned by a schoolteacher. Then, the city or school can decide whether or not to censor, that is, ban, the book. From Texas schools issuing challenges to a total of 32 different books in 2013-2014, to Idaho schools pulling one controversial book from the state school curriculum, books are still relevant and, clearly, still considered powerful….
Active Shooters and Masculinity
A common denominator among active shooter events is the gender of the shooter. In this post, Ami Stearns talks about the theater shooting that occurred right across town two months ago. She illuminates the association between males and gun violence using sociological theory.
I had been living in Lafayette, Louisiana, for less than a year when the theater shooting occurred. In the aftermath, two young women, along with the gunman, had been killed. My new community erupted in shock. How could this happen here? Unfortunately, active shooter incidents, such as the one in Lafayette, seem to be a part of the American landscape. Federal agencies describe an active shooter as “an individual actively engaged in killing or attempting to kill people in a confined and populated area.” We recognize these massacres by their geographical names: Columbine, Newtown, Aurora, the University of Texas… these proper nouns of everyday places are now metaphors for senseless, indiscriminate, and horrific violence.
We search for explanations in the face of these types of crimes. Public figures swoop in to the scene of the tragedy to deliver comforting words and promise that such an event will never happen again. Talking heads debate access to mental health resources and suggest stricter gun laws. Religious leaders lament the breakdown of family values while historians suggest America was founded on a subculture of violence that makes mass shootings inevitable. Often, pop culture is blamed- video games, music, and films. Earlier articles on Sociology In Focus centered on explaining mass shootings through the theoretical lenses of structural functionalism and symbolic interactionism. While entire books could be written on many of the factors above, this particular post will focus on one factor so obvious, it is frequently overlooked: gender.
FBI data on active shooting events within the years 2000-2012 found that ninety-four percent of the perpetrators were male. Other studies, using a broader time period, estimate that as many as ninety-seven percent of the shooters were male. (A Mother Jones data set also reveals that approximately sixty-two percent of the shooters between 1982 and 2015 were white, but I will leave the discussion of race within these tragedies for a separate post.) While these active shooter events are, admittedly, extremely rare, the extraordinarily high rates of male involvement demand a more critical exploration….