Between Sko and Skole (Shoe and School)

The Danish word for shoe is sko and the Danish word for school is skole. These words are related insofar as they contain similar letters, but what matters for this post is that I am utterly fascinated with both as locations of cross-cultural analysis of people’s use of objects and spaces. Indulge me as I take a stab, after returning from a stint of consulting work on kids and education in Denmark (along with some shoe shopping), at cultural understanding using foot fashion and education as my sites for some casual sociological investigation.

I spend what may be an inappropriate amount of time looking at people’s shoes (sko), especially in less familiar places. Inappropriate because the looking can make me a bit judgy or jealous or fixated on consumption or miss my stop on the Copenhagen Metro because I’ve been looking down for seven stops in a row. But this looking is nonetheless useful, partly because I find shoes to be interesting markers of gender, culture, class, age, and ability to predict weather patterns. And partly because of my own feet. When I was a kid, doctors told me that my feet were in so much trouble that I might grow up unable to walk. I had to wear special built-up ugly shoes that were never quite normal looking enough for my desire to fit in on the school playground, despite my addition of shoelaces with cartoon balloons on them in the third grade. And so, my adult overconsumption of nice shoes, and fascination with others’ footwear, may just be me compensating for a childhood of self-consciousness in school and shoe ugliness.

When I work overseas, I find myself not only looking at people’s shoes (“Those New Balance shoes on the hip European teenagers are a slightly different style than the ones found on American endurance athletes and 50-year-old men with wide feet”), but also wondering if it is at all possible to make comparisons across cultures about anything, given that I have not actually walked a mile in a Danish person’s shoes. And I only know the language well enough to read “The Little Red Hen” to kids in Danish, not to translate Kierkegaard (“Ikke mig?” gryntede den eksistentialistiske gris. “Not I?” grunted the existentialist pig). But the comparisons kept popping into my head as I looked down at the array of footwear on the streets of Copenhagen.

Here is what I tentatively noticed about Danish shoes:

  1. What I saw in Denmark a few years ago is what I see in the U.S. today, with regard to the popularity of certain styles. Having worked in Denmark a few times over the last few years, I come back to the U.S. and keep an eye out for how long it takes the fashion there to reach the shores of Eastern Washington. Turns out, about two years. This means the boots I bought will look great in Walla Walla in 2017.
  2. Every other pair of shoes is black. Black tennis shoes. Black heels. Black loafers. Black boots. All appropriate for bike riding, by the way. This may be an urban thing, but when you live in a country dedicated practically and politically to the good of the collective, and where standing out as an individual is frowned upon (just look up “Jante Law”), it makes sense that a relatively uniform color palette is present in footwear.
  3. No sandals until it’s at least 75 degrees. This may also be an urban thing, but it’s probably more about the weather. Bare feet among the grown-up urbanites seem to be taboo until the sun comes out for that one day that locals affectionately refer to as “Danish Summer — The Best Day of the Year.”  I had to buy black boots in June because it was 55 degrees and raining. But hey, new black boots. To be unveiled in Walla Walla in 2017, when they’ll be in style here.

In addition to the shoe thing, I spend what I believe to be a perfectly appropriate amount of time thinking about schools. Appropriate because schools are important. Also appropriate because I have a kid in a school. I went to school. I teach in a school. My parents and my husband’s parents worked in schools. I’ve done consulting work for schools. I teach a class about how schools are sociologically interesting. In a celebrity academic death match, I could totally “school” someone if the category was “schools.” But, as anyone reading this who is a K-12 classroom teacher would tell me, I should learn more. Walk a mile in a teacher’s shoes…

In an effort to learn more, my fascination led me to some consulting work that allowed me to visit Danish schools (skoler), talk to Danish teachers and pedagogues (this term refers to education professionals who work more with the social and emotional well-being of kids, often in preschools and afterschool programs, both of which a majority of Danish children attend), and interact with Danish schoolchildren.

But given my tentativeness about the aforementioned shoe claims, how can I make claims about something as important as schools (not that shoes aren’t important — they’re very important — I wear shoes every day!)?  About how Danish schools, for example, are better or worse than American schools?  For that matter, how can I make claims about *all* schools in each respective country given the vast variety even within national borders? Sociology is about comparisons, but sometimes we create problems when we make the categories we’re comparing a bit more real than they actually are. For vocab nerds, this is called reification. And it’s a more subjective process than pointing out differences between cross-national New Balance shoes. But you can’t take the sociologist out of the girl, so…

Here is what I noticed about Danish schools:

  1. What I see in Denmark today is what I saw in the U.S. a few years ago, with regard to national revisions to how schools and their corresponding professions operate.  Recent nationwide Danish education reform (The National Reform Programme), now a year old, was the subject of much conversation and controversy among education professionals with whom I spoke. The reform means changed mandates from the state regarding teachers’ roles, more time at school for kids, and more collaboration between teachers and pedagogues to promote holistic and child-centered learning, all in an effort to improve children’s academic success and enhance collaboration across different education professions. Opinions on this depended on people’s position in the system, but everyone’s reaction was intimately related to the fact that change is difficult, and autonomy in work is important for job satisfaction. Much of this sounds familiar: Politicians saying reform is better to compete internationally. Teachers saying they are disrespected. Pedagogues saying they’ve got to take on new tasks that they’re not trained for. Administrators saying that the overall picture is better for kids even with the bumps of the last year. And kids saying it’s hard to be in school until 3 p.m.
  2. Every other child has blond hair, and the continued preservation and promotion of Danish culture in governmental policy (the new government saw a huge increase in parliamentary seats among the Danish People’s Party, which is decidedly anti-immigration), means that schools in Denmark are still, and will likely still be, quite homogeneous. This is despite the fact that there is more variation in quality of school and affluence of surrounding neighborhood than many are aware of.  As someone who lives in a community where bilingual education is present given the racial-ethnic make-up of our community, it was fascinating to hear Danish education professionals discuss how the lowering of the age at which kids learn English (now 1st grade) was touted as necessary by some researchers and politicians for effective brain development and language acquisition, but the same argument was never made when Turkish children started entering Danish schools. This is a country with the challenge of a small increasingly multicultural society that is trying to preserve its long heritage while upping its status on a global educational scale.
  3.  No shoes required on the playground, even if it’s 55 degrees. Smart scholars who’ve written lots about Scandinavian childhoods portray images of freedom, independence, and play as necessary ingredients for a good childhood. This is how democracy works — the kids create their own democracies first, without hovering adults.  Add to that a society with universal health care and a pretty large safety net, and people are also less concerned with safety rules, fear of unlikely dangers, and kidnapping.  All of this helps explain the fact that I saw a third grade child on a school playground, during school, swing from the monkey bars and run around with no shoes on, with no teacher watching, and with no fear or concern that he’d get in trouble.

If only I could have been that Danish kid wearing no shoes on the playground when I was in third grade.  He didn’t have to worry about feeling self-conscious. Or about getting in trouble. Or what other kids thought.  But then again, because I had to wear weird shoes, I grew enamored with the lifelong quest for keen observation of objects and spaces, and a thirst for understanding the complexities of oversimplified renditions of difference.

Between Now and Then

Below is the Baccalaureate Address I gave at Whitman College. With advice for students, and perhaps for us all.

It’s time for celebration! It’s time for family and friends! It’s time to fit in as many fun activities as we can before we leave!  It’s time to give a speech that I’ve been told cannot last more than ten minutes!

It’s time…to graduate!

When we think of time, it is easy to think of special events such as graduations because they occur infrequently, they signify change in a life stage, they bring people together from different time zones, and they contain numerous schedules and timetables for the accompanying festivities. Sociologists think of time in terms of how it has been constructed as meaningful in varying cultural and historical settings, and in terms of how it is used to signify boundaries in our social lives. For example, as Phyllis Moen suggests in her book It’s About Time: Couples and Careers, in order to figure out ways to accommodate our changing work and family roles in contemporary U.S. society, we must first look at how taken-for-granted rules about work time and non-work time are enacted – how time is part of the infrastructure and culture of our work and family lives. What is a work day? What is a holiday? Can I spend time checking personal email at work if I do so surreptitiously on my smartphone? How does our understanding of work matter if we have discretion over our time or need to punch a clock? And, as many people may wonder, is there really such a thing as a weekend when I am accessible by email 24/7?

One of my favorite sociologists, Eviatar Zerubavel, has said that we live our lives in “social territories” along a continuum that consists of different kinds of time – namely, we live in public time and private time. I will add that we define certain times of day as more about close friendships or intimate relationships, others more about formal tasks. Some of us may use time set aside for spiritual growth, community involvement, or taking care of our bodies. Some of us take on a little too much and end up sacrificing activities or trying to do too many things in the same time period. We multitask within our social territories.

How we understand the use of time depends on what activities, people, and spaces we think are attached to certain times.  For students, if you want to participate in a sociology exercise, think back to your few years here at Whitman and count the number of instances where you have been speaking, dressing, and socializing differently depending on whether you were in my class at 11 a.m. on a Monday in Maxey Hall or in an off-campus apartment at 11 p.m. on a Friday. This example signifies that how we organize time parallels how we organize what we do, the people we are with, and the locations of both.  We use time to signify territories of our selves. Territories that are sometimes separate, and that sometimes overlap.

Life transitions do not move in a linear way. Anyone who is a parent here knows this, especially if they can think of stories when their children grew and then regressed and then grew and then regressed, sometimes in the timespan of a couple weeks. For the students here, this weekend may feel like a big transition with a huge directional arrow pointing from the past toward the future. But the way life actually works is that we always circle back and the directional arrow is not necessarily one that points from the past to the future in a straight line. We use the memories of who we were to construct who we are.  Our present selves are always made up of what we perceive has already shaped us.  Norwegian family scholar Marianne Gullestad has said that certainly what actually happens to us as children affects our adult lives, but our subjective understandings of our childhoods as adults have tremendous power in shaping how we act and think in our adult lives. How we think about what happened may affect how we grow just as much as what actually happened does. We go and grow through life transitions always building on our past selves, never completely starting over, and rarely in a straight line.

For students, you have spent these last four years using images of your future selves to craft what you opted to do here at Whitman.  Your present experience as students has been impacted and inspired by your vision of your future selves.

So what does this all mean? If we think about the word “then,” it is really not just about the past. It is also about the future. “When you were little, what were you like then?” reads just as easily as “Think about the future…what will you be like then?”  During a weekend like this, it is easy to think about how time flies – the “now” quickly becomes the “then.”  But it is also easy to see how this life stage transition signifies a jumping off point for present “now” becoming future “then.”  If now and then were on a continuum, I do not see a straight line. I see multiple axes, three dimensions, circles, satellites, and the location of “now” and “then” in multiple simultaneous places.

Anaïs Nin said, “We do not grow absolutely, chronologically. We grow sometimes in one dimension, and not in another; unevenly. We grow partially. We are relative. We are mature in one realm, childish in another. The past, present, and future mingle and pull us backward, forward, or fix us in the present. We are made up of layers, cells, constellations.”

I am so grateful to have been part of your lives, dear students, for these last few years. I hope you agree with me when I say that it is a gift to be part of the layers, cells, and constellations that make up Whitman College. Looking at you now, during this celebratory time, makes me very happy. I wish the world for you. As time goes by, I will think of you and how you’ve been these last few years. I’ll imagine where you’ll go in the future.

And while I thank you now, I’ll see you then.

Between Mother and Other

For this post, I offer the wisdom and idiocy that I have realized as a result of my entry into motherhood over a decade ago.  The “between” part of this post lies in the fact that everything listed below occurs as a set of paradoxical experiences that happen simultaneously. To be a mother is to be between everything.

A paradox of motherhood is that it is simultaneously celebrated as sacred and often attacked as the cause of bad things.

Matching mother-daughter outfits are simultaneously underrated and overrated.

Celebrating motherhood simultaneously makes us think about fatherhood as part of the caregiving experience while treating it as a totally different experience from motherhood.

Mother’s Day makes me simultaneously happy to be the center of attention and stressed that I can never think of the right way to attend to myself.

Motherhood is simultaneously about teaching our children that we need to build community at the same time we end up breaking down any possibility of a community of mothers and families because of inequalities and prejudices surrounding race, class, physical ability, mental health, and sexual orientation.

Motherhood is simultaneously thirsting for physical contact with my child no matter how big or dirt-ridden he is, and trying to stay clean and uninjured.

Working mothers must simultaneously prove effectiveness for multiple audiences, often with competing needs, and sometimes at the same time.

Mothers who have the ability to stay at home with kids are simultaneously labeled as “stay-at-home” while they are most often out in the world advocating on behalf of their children and modeling patience and logic and love so that their children will enact that same patience, logic, and love in their own adult lives and communities.

My representation of motherhood as a sociology professor simultaneously reinforces that having children is the default for women, and that we ought to strive to uncover how default paths serve to lessen the value of alternatives.

Motherhood is a location that is simultaneously filled with complete confidence in what feels right and a lack of confidence that we’ve done the right thing.

Motherhood is simultaneously feeling the most valued for selflessly meeting the needs of the next generation, while living in a world where value is attributed primarily in terms of self-interested economic gain.

The celebration of caregiving by mothers simultaneously makes visible the efforts of all mothers, and renders invisible the efforts of other caregivers who are not mothers, and of mothers who must leave their own children to care for other mothers’ children.

Messages that mothers get to take care of themselves simultaneously offer good ideas to create balance, yet often define the ultimate goal of self-care as meeting the needs of others.

The innovations of reproductive science have simultaneously made motherhood possible for many who otherwise would not have been able to conceive, carry, or care for children at the same time other scientific innovations have made us wonder whether technology is the right tool to use in our raising of children.

Celebrating motherhood for those of us mothers who have miscarried what would have been our other children is simultaneously painful and wonderful.

Being a mother makes me feel like a superhero and a villain simultaneously.

Having a child allows us to teach and learn simultaneously.

Mother’s Day simultaneously asks us to feel special and reminds us that we are part of a very large demographic.

Motherhood simultaneously has made me feel like a slow old mammal whose primary function is the provision of milk and a high-functioning brainiac who can process complex thoughts and emotions.

Motherhood has made me feel simultaneously ugly and beautiful as a result of housing, in my body, a giant baby for 41 weeks who forever changed the topography of my midsection, and of bringing into my world an additional person who complements my outfits.

Making a list of motherhood paradoxes simultaneously calls attention to the complexity of motherhood and essentializes it to be a category that can be written about as if it were monolithic.

Having a child is simultaneously everything.

Between Band Camper and Bad Counselor

A recent issue of a sociology magazine noted that among the topics sociologists least desired to see undergo sociological research were band camps, which, necessarily, made me want to write about them. Also included among the unpopular topics was “small yippy dogs,” but I have no experience with this topic and find small yippy dogs to be among God’s least useful creatures, just ahead of cats and mosquitoes.

Leave it to sociologists to craft a study asking people what they should and should not study. I suspect next will come an analysis of the inequalities between the types of people who study the unpopular topics and those who study the ones that get lots of votes.

I write this, then, for the unpopular underdog. Which leads me to my experience with band camp…

I went to band camp (well, really, it was a more well-rounded music camp)  in high school, and then, just after graduating from the college that hosted the camp, I served as the “Dean of Girls” for that year’s camp. These two times, at Band Camp, were strikingly different.

The first time was when I went as a camper just before my senior year in high school. And let me tell you, up until that point, I hated camps with a passion. Put a bunch of people in one geographic location away from the usual rules and norms of everyday life, add hormones and popularity contests and forced Bible study, mix and stir, and out comes miserable Michelle.

Anyway, this camp was different. I went with a good friend who was my roommate during the camp and a great clarinetist.  It was held at a darn good college 100 miles from home that had a well-deserved reputation for excellent music programs. Plus, my parents and older brothers went there, so I’m sure I was champing at the bit to see what all the fuss was about that was recounted in their Nostalgic Tales of College that I had heard at the dinner table since birth.

When I arrived, I noticed the alpha campers with their in-depth understanding of hemidemisemiquavers (those are 64th notes) and correct embouchures, walking around telling people, often in falsetto voice to show off, that this was their fourth time at band camp. But I noticed more people like me — nerdy, a bit envious of what I was sure was immense talent among everyone except me, and wearing colorful rubber bands on my braces to distract people from looking at my face acne.

Turns out, it was wonderful. I sang. I learned to play the harp. I learned stuff that helped me understand why playing the oboe was so excruciating (with and without braces). I made friends who showed up again when we started our first year at that college a year later.  I gained some good cultural capital by navigating the cafeteria, interacting with college professors, living in a dorm, and budgeting my money so I could buy the perfect college sweatshirt as a souvenir.

After high school, I went to that college and I sang in their choirs and I befriended other music nerds and we sang in the hallways with our falsetto voices and we held contests to see if we could recognize different time signatures in Peter Gabriel songs. And then, in my senior year, I was asked to serve not just as a counselor for this camp, but as the Dean of Girls — the counselor who oversaw the other counselors in the girls’ sections of the dorm.

Important point: I had never served as a camp counselor before this.  And I received no training. Clearly my musical prowess as demonstrated in my collegiate music activities revealed, to the camp leadership, that I would be able to handle a hundred high school kids with bad embouchures, braces, acne, hormones, and egos. And that I would be able to handle the counselors who were handling the hundred high school kids. Important point #2: I have never served as a camp counselor since this.

I won’t go into too many details, but you’ll be able to paint the picture well if I offer you these key phrases that represent the week: tie-dye gone wrong, yelling (while channeling my best awkward authoritarian voice) at the counselor who was supposed to prevent two high school lovebirds from participating in a midnight make-out excursion, and countless tears in my dorm room over the realization that I should never have a job where I need to discipline young people or yell at colleagues who didn’t do their jobs right. I realized this after juxtaposing the camp leadership position with my experience serving as the alto section leader in my college choir — helping my fellow singers with the songs, not worrying about whether they were following school rules. Needless to say, the camper and counselor experiences varied greatly. I went from happy and comfortable and confident to angry and sheepish and unnecessarily authoritarian to deal with kids who were doing things that I had only dreamed of doing when I was a camper.

And so, I teach college students about the value of cultural capital in my sociology classes, and I sing and play piano when I can. I tried the oboe again and learned that I still find it excruciating. I wish I could play a harp again someday. I don’t spend much time disciplining students, for fear that my inner unnecessarily authoritarian voice will come out again. I tried my hand at administration, which I see as the grown-up version of Dean of Girls, counselor of the counselors, and didn’t particularly enjoy it.

In a weird way, I see teaching as the happy middle ground between student and administrator, between camper and band camp dean. I get to facilitate the building of community, teach and learn the content with eager people, and leave the disciplinary stuff to people far better equipped than I. Like when I was alto section leader.

I’m grateful for these two times, at band camp, because they showed me that I’m at my best when I am in a community of fellow awkward people making music, rather than trying to corral the chorale of the unruly.

Between Minecraft and Spine Graft

I lost part of my parental backbone when I acquiesced to downloading Minecraft to a fourth device in our household. I suspect it’s because my heart, at times yearning for the years when my now 11-year-old son was smaller, was warmed by a yet-unsurpassed display of cuddly tendencies from him as he cared for his Minecraft plushies that serve as compadres in his online Minecraft adventures.

(For those unfamiliar, imagine a fairly non-violent video game that is like movable building blocks in an endless world where you create whatever you want [with friends who are sitting next to you in real life and running around next to you on the screen], and you run into characters in the game that are now sold as stuffed animals at Target). (And for others of you who understand the word plushy to mean something that is more appropriate for an adult audience, that’s not what this post is about).

My son is a kid. Indeed, he throws the plushies around and adds crash noises when they collide.  But he also gently tucks his plushies into bed.  He covers them in tiny chenille blankets that used to flank his crib bumpers. He hugs them. He sets them on the desk when he plays his game online. He gives them names like Bomby and Mr. Stamps. He pushes them into my cheek and says they’re giving me a kiss. He asks me to mend them when a friend’s dog bites a hole suspiciously near the place where bodily waste would come out if plushies could poop.

He cuddles with them. Like a little kid. Like a nurturing parent.

And so is born in our household a bridge between childhood and teenagehood. And as each day goes by and he asks if he can go explore the Minecraft world behind what seems like an increasing number of closed doors, I realize that he is soon going to be the teenage explorer, followed by the adult explorer. And there’ll be less cuddliness and innocence. At least for me to see.

When my son was a baby he was not tremendously cuddly.  He was born huge and energized and really just wanted to spend his time looking around, moving, and exploring, since he had spent a lot of time cooped up in utero.  I think it just felt good to stretch. He looked out into the world more than he looked at me. But now that he’s got long legs and feet almost as big as mine and more strength, he sits on my lap and hugs me and leans into my neck when we sit and talk about his pixelated world.  Even more than when I could hold him in my arms. He is turning into a man and I see nurturing and softness and cuddliness and kindness. And I get to see this now that he has gangly arms and legs and a voice that’s lower.

I did not know that the soft little cuddly creatures that are supposed to represent hard-edged pixelated creatures in a video game would serve as symbols of the bridge between childhood and teenagehood for my son. Or that they’d make me see a softness in him that I saw less of when he was tiny. They are making me look backward and forward at the same time, and I have decided to notice it and love it. Not as someone who looks at Freudian mother-son stuff, though that could be interesting. But as a sociologist who loves to look at how life stages can tell us more about ourselves than we often notice. Transitions are often the times in life when we see things more clearly, or notice things that we didn’t see before.  When we change jobs, we see our vocational strengths and weaknesses more clearly. When we move, we see our living preferences more clearly. When we are sick and then hopefully recover, we notice our bodies more. When we lose someone, we understand ourselves better.

For boys, sometimes we fail to notice the softness, a failure that often gets more pronounced over time.  Unless we don’t fail to notice.  Unless we don’t fail to recognize that transitions show how someone can be both soft and strong, both pixelated and plushy. And that they have always been this way.

If we construct our “giving in” to our kids’ whims as losing our spine or missing a backbone, then I can best classify my back bones as made of part rock, part cartilage, and part pudding. In the case of Minecraft, which by some accounts is a pretty great way for him to spend his time and learn to create worlds with his friends and get inspired to maybe someday be an engineer, and by other accounts means he’s turning his brain and social skills into mush, I have decided to turn this “giving in” into a set of moments in time to consciously notice his transition into a new stage of life, and to frame it in a way that, perhaps, dispels how we think about boyhood and manhood.

These are the moments when I get to see hard-edges of a pixelated late kidhood meshed with soft edges of plushies nestled in his arms. Isn’t that what all of our lives are like?  Hard edges and soft edges both. I choose to work harder at noticing the soft edges of boyhood.


Between Smörgåsbord and Polyamory

I decided to break up with a restaurant this week, but they don’t know it. They may never know it, because I am only one of their many partners (note to people reading this who are my friends who have restaurants — it isn’t you).  We — this restaurant and I — are in a polyamorous relationship, which means that we are both okay with having multiple partners, and if we break up with each other we still have others to love. I refer to my adoration of the multiple eating establishments within a three-mile radius of my house as Big Restaurant Love.  The ones I love the most are ones where I like what they cook, I like how it feels to be in their presence, and they like having me there (or they pretend really well).  Setting aside the problematic analogy between an economic exchange and a relational one, the big picture here reveals that they are in more relationships than with just me, and I am in more relationships than with just them.

But I can’t quite say that our relationship roles are the same. Because I am mortal, I participate in Restaurant Serial Monogamy, which is an academic term that means I am physically present in at most one restaurant at a time. But they get to have multiple partners in their presence at the same time.

In the case of my recent restaurant break-up, I thought about leaving a post-it note on the table with the phrase “It’s not me, it’s you. We’re through.”  But instead I left a tip, I left a smattering of chicken bits, and I left the premises. Besides, I didn’t have any post-its.

So what went wrong? Because I am a card-carrying member of the Clean Plate Club, I did eat most of my food. It cost more than my previous two lunches combined, after all. Plus, I was hungry and the server never returned to ask me how it was until I had tried most of it.  Despite eating it, I found the chicken to have the texture of rubber, I wondered whether gray was the new hip color for pico de gallo, and every time I looked at the guacamole, it made me think of a little pile of boogers.  And I’ve never even seen a pile of boogers.

Anyway, more than the food texture and aesthetic, the problem at this restaurant was that the food was overpriced, the server seemed mostly annoyed that I was there, and there was no appropriate response to my eventual reasonable complaint (which did not include reference to boogers, by the way — that was just in my head).  And it was evidently enough to make me decide I am not going back.

Sometimes when I teach about relationships, I talk about the principle of least interest with my students. This concept can be loosely defined to mean that whoever has less interest in a relationship actually has more power because they can afford to leave.  While this grossly oversimplifies relationship struggles, removes all semblance of romance from intimate relationships, and is technically grammatically incorrect (it should be the principle of less interest, at least if there are two partners), there is something to this.

Let’s pretend for a minute that breaking up with a restaurant requires the same consideration as breaking up with a person. In the deliberation of either, you can wonder what criteria should be used to decide a break-up is in order.  If it’s a short-term relationship, it’s easier — the other party hardly knew ye.  If you’ve become a loyal customer and then figure out you hate the place, you may turn that seething hatred that grows over time into a decision to give a long passionate speech at an opportune moment and storm out the door. “This is the last time you forget the paprika on my Eggs Benedict! I’m through with you!”

The moment of truth for a restaurant comes when there are too many people with the least interest. The moment of truth for the patron is when she realizes that her angry departure will have little to no effect on their well-being.

I was a waitress for half of one shift at a steak-and-eggs joint in 1991, so I know how hard it is to work in a restaurant and establish high-quality relationships with customers. Okay, I don’t, but I do leave generous tips. It only takes me about 90 minutes to establish empathy for any given occupation if I try it. (Note to self: write blog post about how mosquito bites make it particularly difficult to have a job that consists of evening neighborhood canvassing for an environmental organization in 99% humidity, a job which I held for 90 minutes.)

In the town of 5,000 people where I grew up, we customers had a lot to lose by disliking a restaurant. Since there were so few restaurants, disliking one meant removing huge degrees of freedom from our weekend nights. But where I live now, and with the resources I now have, I live the luxurious life of restaurant choice, which means I have the privilege to pick and choose. And I can go to multiple places in one week and they don’t even question my loyalty!

But then again, if I never come back to some of these “choice” places, they may not notice.  So, who has more or less interest now?

I’ll tell you who has the least interest in my well-being — the one who decides that a pile of boogers can pass for guacamole.

 

Between Coffee Pod and Coffee Pot

Despite my being the sleepiest human in my household, I am the only one who drinks coffee. Not swayed by the dangers of coffee stains on my teeth from viewing 1970s Topol toothpaste commercials (for cigarette stains too!), I succumbed to the warm caffeinated elixir sometime in my first years before tenure. I figured that coffee was a more sophisticated choice than Mountain Dew with my breakfast. Since then, coffee has been a part of all of my days. Like a tiny bitter friend who beckons to me when I wake, whispering “I know I’m not great for you. But you love me anyway. And I make you wake up, which is a requirement for you to function as a productive human being who does not intentionally stab people with a ballpoint pen.”

It was just a few years ago that I gave up the 12-cup pot coffee maker for a single-serving one, made by the Pod People. I did this so that I would stop drinking 12 cups of coffee in the morning, or, in more glamorous days, continually soaking ladyfingers in cold coffee for what I will label “Michelle’s failed tiramisu adventures of autumn 2009.”

Beyond the subject of the production of coffee, which I’ve discussed at length with sociology students in terms of labor, migration, inequality, and terrain and temperatures closer to the equator, I have always found the consumption of coffee to be more sociologically intriguing, at least as it plays out in the lives of people I see regularly. And I’m not just saying this because I live in the Pacific Northwest, birthplace of complicated and pretentious coffee orders. I couldn’t quite put my (lady)finger on what was intriguing about it until a handful of friends recently started lamenting their problematic attempt at upgrading the office coffee station with single-serving cups full o’ coffee grounds made by the Pod People. The process has gone something like this:

Office folks: “Let’s use this new-fangled Pod coffee machine, and then we can each have different flavors and do less dishes. And then Susan in accounting will be able to have tea pods, too.”

Other office folks who are politically minded but also like hot chocolate: “I dunno. It’s kind of wasteful. But maybe we’ll end up using less water. Okay. And then I can also have hot chocolate.”

Office manager: “Okay, great, seems like less mess and will make people happy. Plus, adding the word ‘pod’ to our office vernacular must mean we’re technologically savvy. Pods also seem better for capitalism and the illusion of options for my employees. That, along with more caffeine, may increase productivity.”

And then,

Office manager: “Hey, somebody keeps using all of the vanilla hazelnut, which is my favorite, and our monthly coffee budget has been used up and it’s only the 5th day of the month.  From now on everyone has to buy and bring their own Pods.”

And then,

All, in their minds: “Why don’t any of us get along anymore?”

You see, the communal coffee pot in a workplace, wherein people who finish the last drop participate in the social contract of replenishing the next pot, is a dying breed in the list of “normative office activities that yield community-building” breeds. Now that Pods have replaced it, we no longer need to think of the next person as we pour ourselves a cup of Joe.

My office building still has the communal pot. Used to the motion of grab-pod-then-lift-then-push-then-immediately-get-coffee, however, I am not a daily consumer of the office coffee. For me, the reluctance to frequent the pot comes precisely because it makes coffee about other people.

It’s not that I don’t like other people. It’s that I define coffee consumption as a fairly solitary action. (Give me a beer, and that’s a different story, and usually a different time of day). Oh, and I’m also self-centered and lazy.

Coffee is my morning solitude. My transition into a day that is filled with people, not a way to commune with said people. I am the anti-social coffee drinker, using coffee as a lubricant to become social. And by social, I do not mean chatty and outgoing, as when I drink beer, but rather able to complete a sentence and open both eyes while finishing that sentence. I include my family in this, by the way, as in “Don’t ask me a question about Minecraft, your report card, the broken toilet, or that pile of doggie vomit in the corner until I’ve had a cup of this magic.” This is quite different from what I remember seeing in adults as I was growing up – church-goers sitting after a wedding or funeral drinking cups of light brown Folgers and eating jello salads; teachers gathered over coffee with my dad in the teachers’ lounge with stained mugs that said things like “World’s Best Paid Teacher” and “My Dog Ate Your Kid’s Homework” and “You Can Count On Math Teachers.”

I wrote my PhD dissertation on home-to-work boundaries, and based part of it on the work of my friend Christena, who actually studied, among other things, how people used beverages to signify the transition between home and work (warm bevvies with caffeine in the a.m., cold ones with alcohol in the p.m.). This was a great way for my brain to understand the material manifestation of time and social boundaries, and to understand the role coffee plays in my life. My home self sits alone in a big rocking chair in the quiet with coffee in the morning. My work self meanders in the hallways, pleasantly running into colleagues and students and chatting, usually without a coffee cup in hand. (My coffee shop self is the subject of a different post, but suffice it to say here that coffee shop social interactions can be both isolated and social, and it’s always fun to watch the isolated folks get their alone moments burst open by that friend who just wants to chat for 30 seconds but then keeps talking because their latte is not ready yet.)

Anyway, what I wonder is whether the infiltration of Pod People coffee machines will change the nature of collective behavior in workplaces. It may be much easier to mandate individual accountability and reinforce self-centeredness and the illusion of choice with Pods than it is with a machine that requires thinking of the next set of mugs the 12 cups will fill.  However, it may be that this vision idealizes the generosity of actual people who drink from a communal pot. After all, there are at least three coffee-spattered signs hanging above the communal pot in my office building saying things like “Hey, you! Yes you! Fill the coffee pot!” or “There is no ‘I’ in coffee – fill up the next pot for your friends.”

Add a Pod People machine, and those signs (and their social contract implications) go away. “Coffee for one” replaces “All for coffee, and coffee for all.”

As for me, the consumption end of the coffee bean is more about my mental health and transitions during my day than it is about a social contract.

I wonder, though, on a larger level, what ramifications the individual Pods may have on the collective coffee pot among friends and colleagues.  If we all turn into Pod People, might it stop occurring to us to think about when someone else’s coffee cup is empty?