8/25/2008

from poetry to poop in 150 words

Evolutionary biologists explain that the large frontal cortex in humans, which gives us an exceptional capacity for pattern matching, was originally a survival mechanism. If we're not stronger or faster than our predators, then we need to be able to interpret their signals and make smart choices. Our brains are designed to read meaning into our environment -- whether we're reading gestures, words, sounds, or objects, we are constantly interpreting things, but usually at a level far below our conscious awareness. The top-level interpretations we are aware of are built on hundreds of subtler patterns we've already decoded. This inherent capacity of the brain explains a lot of the understanding that comes to us through intuition, or hunches -- things our conscious mind can't really explain but other levels of our brain have already grasped.

This skill also means that we start to see connections and cause-effect relationships among items that don't necessarily have them. It's a small step from "which one of these things is not like the other" to "when I see round objects I feel happy." Put five items in front of a person and sooner or later you'll start to make judgments about them -- and see connections, comparisons, and relationships that may or may not "be there" at some other level of evaluation. The most beautiful connections we call poetry, metaphor being its most condensed form. The seemingly most obvious connections we might call taste or preference: what seems appealing or disgusting to me seems self-evidently so, yet with a little perspective one realizes that it's just subjective perception. From another angle, we get superstition or mysticism -- human beings are biologically prone to making connections between unrelated things (hence the "lucky socks" students wear on test day) and therefore also prone to misjudging the probability of certain events (any day's news reports give you plenty of examples).

I've been reading a lot of different material lately about how these pattern-matching and judgment skills in the brain operate (from Blindsight to Fooled by Randomness to Spark). As a literary critic, I rely upon these skills all the time, and part of my job as a teacher is to help students learn how to become more reflective and critical about the patterns they notice and enjoy. But I've been enjoying trying to observe how often that patterning kicks in -- how frequently I find myself with preferences or assumptions that really don't belong there.

Our oldest dog, known here as Old Girl, likes to eat poop when we're out on a walk. She prefers cat droppings, but will eat other dogs' poop too (never our own family dogs', though, and never in our own yard -- there's undoubtedly some territorial as well as gustatory element at work in her desire). This is a disgusting habit, but it is almost impossible to completely prevent her from getting some. She's crafty enough to squat to pee -- which is, after all, one reason for going on the walk -- and then suddenly duck her nose into a pile of cat poop. Old Girl would be approximately 97 years old in human terms, so at some level it's kind of like getting great-granny to give up bacon and cigarettes (that is, if I had such a great-granny, but I know people who do). So my walks with Old Girl nearly always involve some struggles over her access to certain stinky items.

Poop is, I think, unequivocally disgusting. Possibly less disgusting than vomit, but I don't really want to put that to the test (and there are lots of digestive variables involved). But then this morning I realized my brain has actually made some more subtle distinctions. Because, I found myself telling Old Girl, cat poop rolled in sandy dirt is MORE disgusting and shouldn't be ingested, even by a poop- and dirt-loving old dog.

Ah, the brain at work. (or something)

8/22/2008

laugh when you can

The past couple of weeks with Elderly Parent have been kind of difficult. I finally got her to understand that I wanted her to see a neurologist -- and why (for an Alzheimer's/dementia evaluation). Since then, she's been understandably agitated, angry, sad, and afraid. But given EP's upbringing and personality, her response has been to cut herself off from her friends and to demonize me. She phones me every day or two to go through the same litany of how terrible a person I am, how I've betrayed her, and how she's doing just fine and doesn't need to see a doctor. I was prepared for this, since it's actually fairly familiar behavior from her, but that doesn't make it any easier when I see her name pop up on my phone.

Getting the appointment scheduled has been a lengthy process, since a referral was required from her primary care doctor, whose staff are bad about returning phone calls. But then yesterday EP tells me that an appointment has been scheduled for her with the neurologist. That's great news! But she doesn't know what day or time it's for.

Because scheduling an Alzheimer's screening on the phone with the patient is such a great idea. Right?

I've got to laugh about these things because otherwise it is just too irritating. (And I called that office today and they were happy to tell me when the appointment was for. Thank goodness for frequent flyer miles, since now I'll be making two trips out there this month.)

8/16/2008

faculty leaves

This story in yesterday's Chronicle, about how the University of Georgia has suddenly canceled some faculty research leaves for this Fall semester, has really been on my mind. First, of course, is my sense of sympathetic outrage on behalf of those faculty who thought they were on leave this semester and have now suddenly been told that they will be teaching next week. And on behalf of the administrators who have to find/create courses for them to teach, and the students in hastily put-together courses taught by someone who thought she'd be writing her book. That's really not an ideal learning situation.

I understand about budgetary cuts, and how research leaves might seem like something extraneous that could be sliced. But although the Chronicle article didn't specify which colleges/departments were being affected, such cuts typically affect faculty in the humanities much more than they do faculty in the grant-supported sciences, since we have fewer opportunities to take time away from teaching and put it towards research. So there's that concern. And what's even more frustrating (in my sympathetic response) is that the cancellations are being unevenly applied. According to the article, some faculty who had planned to be doing research in other locations for the semester are still being granted their leaves under a "hardship" clause. So if you already have your research completed and need writing time, or if your research materials are available to you through interlibrary loan, your leave is taken away.

I'm identifying very strongly with the Georgia faculty affected by these cuts because I have a one-semester faculty research leave for Fall 2008. My university, like Georgia, doesn't have sabbaticals. And in fact our leave program is only for our college and is thus contingent upon the good sense and good graces of our Dean. It's something that could be suddenly taken away from us at any point.

And I'm spending my leave semester here. I don't need to go do archival research anywhere else for this project. I'm presenting at a conference in the fall, but otherwise I'll be here -- I need uninterrupted reading and writing time in order to get this book written. As is also true at Georgia, we get a one-semester leave at full salary or a full year at half salary. Economically and logistically it really wouldn't be feasible for me to go somewhere else. And besides, I don't want or need to for my research. So I've very strongly identified with those faculty mentioned in the article who suddenly find themselves teaching while their colleagues who planned to travel are still on leave. In my department, it's usually only faculty who are married to high-paid professionals who can afford to take the year at half salary and spend it abroad. Those with family responsibilities or smaller incomes usually spend their leaves at home.

I've been maintaining a regular gratitude practice for several months -- expressing my gratitude for the blessings in my life each morning and evening. And the luxury of my research leave has definitely been on that list. But unlike my yoga practice, for instance, which is also on that list, I also feel a sense of responsibility and some anxiety about my leave. Will I have enough to show for it? Am I using my time in the best way possible? And now I kind of feel like not only am I writing for my own reasons, but also on behalf of those whose leaves were canceled.

8/14/2008

relative femininity

Dr Crazy's been writing some thoughtful posts recently about (among other things) the genre of personal academic blogging and gender normativity. The concept of the "personal" is inevitably coded differently for men and women -- and often assumed to have a connection to the domestic and/or feminine.

I'd think it's quite safe to say that Dr C experiences her gender identity (whether on her blog or IRL) very differently from me -- the way each of us inhabits her female identity is very different. (I'm much older, less girly, and half of a short-haired, comfy-shoe-wearing lesbian couple -- among other things.) But I've also been struck by how cushioned I am in my day-to-day life from ever even contemplating my gender identity these days. No one is likely to comment on my childlessness or my appearance, and most of the time I don't really think about it.

What brought all this particularly to my attention was that GF and I recently attended a funeral and reception involving her extended family and several generations' worth of family friends. I rarely see so much plastic surgery up close (kind of shocking when the 60-year-olds look younger than I do), so much cleavage, or so many high-heeled shoes. And I work in an English department with its share of stylish women. But they're stylish in the way that academics are stylish, feminine, and attractive -- which is to say, very, very different from the rest of this city's inhabitants.

I've always thought of femininity as a relative construct. Next to my partner, I'm definitely the femme. In my department, I think I'm about a 5 on a 10-point femininity scale (makeup, but no skirts; boots, but no pumps or sandals; earrings, but no necklaces). I think for most of the women at the funeral, I'm not even on the same scale they are.

Then, after the funeral, we went to the mall for one of our twice-yearly expeditions. Another version of the same thing. What are these women wearing and why? What does a woman like me have to do to get noticed at the Clinique counter to buy a refill of my sunscreen? (Sometimes it's mighty difficult. I left one department store where the clerks refused to pay attention to me and went down to the next one where they were happy to take my money.) In my usual routine of university, yoga, gym, grocery store, post office, and library no one cares what I look like. So it's a bit of a shock to be reminded of what mainstream culture thinks.

I should be clear: I don't want to look like scary plastic surgery lady or a girly girl. I'm mostly content with how I look, and I'm certainly content with how I perform and experience my femininity. But I guess in losing some of my youthful self-consciousness I've also just stopped thinking about these things so much. What that means, I'm not sure I've yet considered.

8/11/2008

the google rabbit hole, with a twist

Over the past couple of weeks I've fallen down what I think of as the online rabbit hole a few times -- you know, the process by which you think of an old acquaintance and google them, and then look up somebody else, and then, and then... it's 45 minutes later and you've done nothing except experience a little Schadenfreude and a little envy. I haven't yet figured out what this process is a symptom of, for me -- I don't do it all that often, but every once in a while I get curious about people from the past. Two things converged on this latest rabbit hole adventure: a friend of mine was recently extolling the virtues of Facebook to me in a way that actually made me kind of curious (I have completely resisted it thus far, since I hardly need new ways to waste time online, to which the above will attest), and Facebook now displays public profiles (name and picture is all) when you google somebody (only if you have your Facebook profile set to do this -- it's not for everyone), which means I found some more people this time around. And I'm contemplating a limited foray through Facebook (especially since my 20-year college reunion is coming up next spring).

But then last night GF and I were rabbit hole adventuring together -- it started out with a reasonable research project, to find an obituary for a family member, and then it turned into full-out googling of various relatives, etc. And then, of course, ourselves and our googlegangers. And here's the twist: since the last time I googled myself, my university page has fallen substantially in the results (which is fine with me; it's not like anyone else is actually looking me up that way) because there is now an adult film actress with my name. Too funny. (Especially if I could print the titles here, but I'm not going to.)

8/08/2008

the ideal and the real

I have this fantasy picture in my head of what working at the office should be like. I guess it's an amalgam of several real faculty offices I've known -- none of them mine -- seasoned with a few fictional or film versions. Key elements include a window (preferably with trees or a hipster urban cityscape outside), bookshelves, a large desk surface, a comfy reading chair, elegant lamps for light, edgy artwork/wall hangings, and a coffee pot. Such a fantasy office is of course located in a charming Gothic-style academic building (but completely fitted with modern amenities), filled with genial colleagues, brilliant students, and an atmosphere of intellectual discovery.

The reality is something a bit different. My office (which actually I'm quite pleased with, having only moved into it a couple of months ago) is small, windowless, and contains two metal bookshelves circa 1969. The six-year-old carpet is the newest thing in the office (I found papers in the filing cabinet dating back to 1977, and the metal desk was, I think, designed to act as a temporary shelter from nuclear attack) -- I know the carpet's age because it had to be replaced when a fire broke out in that office, thanks to the wiring on that side of the hall. You see, someone had plugged in a coffee pot. We were scolded by physical plant and told that the building really couldn't handle coffee pots along with computers (and how are you supposed to have an English department then?). So now there's a circuit breaker for that side of the hall -- but it's located in the men's room. (Again -- we're the English department -- we have more female faculty than any other department on campus -- and they put the circuit box by the urinals.) I have a couple of posters in my office that I really like, and I'm hoping to add a few more. But the room's base look suggests the office of a bookkeeper in an industrial tubing factory (thanks to the yellow walls and perforated ceiling tiles), and it's a challenge to disguise it as anything more rarefied.

The (not Gothic, not stylish, not historic) building smells of mold, rather than discovery, and every single person I spoke with on my last trip to campus was in the throes of depression. I know it could be a lot worse -- I'm very fortunate not to have to share an office. But it's so far from my mental picture of how working at my campus office could be pleasant and supportive rather than draining. It's not just the physical environment, of course -- we're a commuter university, both in terms of our student population and our faculty. Most of us come in only on our teaching days and do our research from home. It's a tricky circular effect -- the building is bleak, so few of us are there, so then it seems gloomy and disheartening.

But in any case, the space is still mine to use as long as the state sees fit. To make sure I get to keep it during my leave semester, I have to show up every few days for a couple of hours and make sure that our office manager sees me visibly using my office. It's kind of grim but at least it's mine. And every day allows me opportunities for cultivating gratitude -- that I don't have to share it, and that I can close the door, now that I'm out of administration.

8/03/2008

middleagey

I picked up a novel last week at the library and turned, as I often do, to the back flap to learn about the author, whose name was unfamiliar to me. This was The Story of Forgetting by Stefan Merrill Block, which I read and liked (and although it may appear this blog is quickly sliding down into some morass of Alzheimer's-only commentary, it was pure coincidence that I pulled the book off the new fiction shelf) (although I suppose maybe not a coincidence that I read it). But jeez oh pete, he was born in 1982.

I was startled, but decided to read the book anyway -- it's not the first novel I've read by someone younger than me (although possibly the first by someone that much younger) -- and as I thought about it, I realized I inevitably see actors and listen to musicians younger than myself. But there's still something embedded in my head that goes way back to, I think, the 3rd grade -- the sense that cool people are inevitably older than me. For so long all the musicians, novelists, and actors were older than me that it came to seem inevitable.

And although I hear music on the radio from new young bands, I'm entrenched enough in my habits that most of what I still listen to is from people of my own generation or older: Robert Smith, Bono, Madonna. I just checked, and as I thought, John Digweed and Tiesto are about my age. Most of the contemporary fiction I read is from people around my own age or older.

I guess all that's getting ready to change...