November 26, 2012 · Categories: Musings

My family set up the Christmas tree over the long weekend, next to the fireplace as usual, while the kids were home from college. It’s an old artificial tree that we have had since they were little. This year, instead of putting on plain strands of basic lights, we bought new LED lights shaped like evergreen cones. I found their soft colors and decorative shape to be a pleasant change from the old-style Christmas tree lights.

So far, so good. But we have two sconces over the mantel, with bright incandescent bulbs. Turning them on at the same time as the new Christmas lights produced a horrid glare. The problem was easily solved, of course, by turning off the sconces and leaving the room dark except for the tree lights.

But I wonder what might have happened if we’d had incandescent bulbs only on the other side of the room, farther away from the tree. We might not have consciously noticed that the two different kinds of lights clashed. Maybe we’d have spent the entire holiday season feeling that there was something not right, but never knowing what had put our nerves on edge.

Modern technology can irritate people’s senses in ways that are below the threshold of conscious perception. We’re all born with senses that evolved over many millennia to process natural inputs: forests, grasslands, and other open vistas. But instead we find ourselves in cities that look very different from the surroundings our ancestors knew. Humans are a very adaptable species; but although we can learn to function in many different environments, it’s rather like installing new software on a computer with an older operating system. There are bound to be incompatibilities and unexpected glitches, such as my new LED lights refusing to coexist peacefully with the old incandescent bulbs.

As science advances our knowledge of the human brain and how it processes sensory inputs, I expect researchers will learn how to design more comfortable environments. That should go a long way toward reducing our stress levels. Perhaps the homes and workspaces of the future will be creatively designed to give us feelings of serenity, confidence, and joy.

November 1, 2012 · Categories: Musings

When I was five years old and thought I knew everything, I put together a construction-paper traffic light with green at the top. Then I told my kindergarten teacher that all traffic lights ought to be made that way because the green light was the most important. After all, green meant that you could get to places faster, while red meant that you just had to sit and wait. Red lights were boring. Who needed them, anyway?

It took me a few more years to grasp the concept of danger well enough to understand why red lights had to be on top. Green lights, although useful for showing where intersections are, don’t require that we do anything differently. A red light grabs our attention, shouting: Danger! Stop! Right now!

I’ve sometimes wondered whether humans evolved to react to praise and criticism in much the same way. No matter how many compliments we get, we’re likely to take them for granted. As with green lights, they just go by as part of the social landscape, confirming in general that we’re on the right road. We don’t give them much weight in our minds. Criticism, however, weighs much more heavily; it can sting for many years. Even if we consciously know that an old mistake doesn’t matter, it still bothers us long after the fact.

Why do we find criticism so troubling? I suspect we may be hard-wired to process it as a danger signal, which would have made sense in the small villages of the past. Because primitive humans’ daily tasks were very simple and repetitive, there wouldn’t have been much reason for either compliments or gripes about the quality of a person’s work. As long as it got done, it was probably good enough. Criticism would have consisted of pointing out dangerous errors, such as picking a poisonous mushroom or not noticing a lion’s tracks near the river. The message would have been: Danger! Pay attention! Don’t ever make that mistake again, or somebody is going to get killed!

In today’s world, criticism usually involves trivial oversights or harmless differences in appearance and social behavior. Instead of the rare and memorable event that it might have been in our ancestors’ villages, it has become commonplace. Remembering criticism for years no longer has any significant survival value; on the contrary, it’s much more likely to shorten our lives by making us susceptible to depression and anxiety.

What’s to be done about it? Many people take medications to cope with depression and anxiety. Others self-medicate with alcohol, street drugs, cigarettes, coffee, et cetera. Finding comfort in food also is common. Some of us seek to modify our reactions to distressing situations by way of traditional psychotherapy or behavioral therapy. Another approach is to distract ourselves from our worries, such as with yoga, meditation, art and music, fiction, video games, gambling, sports, and hobbies in general—or perhaps by compulsively working long hours.

Of course, however effective they may be, all of these approaches to dealing with depression and anxiety are simply coping mechanisms. They don’t solve, or even acknowledge, the underlying problem of living in a highly stressful environment that bears little resemblance to the conditions under which humans evolved. As individuals, there’s not much we can do to avoid criticism and all the other stresses of modern society unless we choose to live as hermits or otherwise drastically isolate ourselves, which, needless to say, would have major drawbacks.

So the question to be asked is this: How can we change our culture to bring about a healthier social environment? On the specific issue of criticism and its effects, I believe we need to end the bullying and casual insults that pervade our social sphere. Today’s politics has degenerated into a lot more name-calling than substance. The Internet is full of nastiness. Although school officials and employers are starting to recognize that bullying is a serious problem, much more still needs to be done. It’s no wonder that so many of us struggle with depression and anxiety. But when we do, we should keep in mind that it’s chiefly the culture, rather than ourselves, that is broken and needs to be fixed.

As we go through our days dutifully checking off the various tasks on our calendars, we may look around and notice that a few things have fallen by the wayside. Perhaps we haven’t written any blog posts for months, or the supplies we bought for a project we planned last year are still sitting at the back of the closet. Whatever it is, we start wondering where all the time went. We’re likely to tell ourselves, in a familiar modern lament, that our lives have gotten too busy and need to be brought back into balance.

Sometimes we really do get overscheduled to such an extent that we can barely function. But more often, I believe, the actual issue isn’t one of time management at all; it has more to do with all those nagging anxieties at the back of our minds, which accumulate until we can’t turn our mental focus to anything else.

We can make checklists for every imaginable daily task ’til the cows come home—but although that may help to manage the distraction and lack of focus often described as executive-functioning issues, I suspect there’s much more to the underlying problem than simply needing to organize our schedules more efficiently. We live in a hugely complex pressure-cooker society that has caused many of us to become, in the literal sense of the word, unbalanced. That is to say, we don’t feel confident in our ability to balance all the demands our society expects us to satisfy. And so our thoughts start to run in anxious, frightened circles that distract us from getting our tasks done, causing us to worry even more—and the vicious cycle spirals downward.

In a bygone era, the natural rhythms of the days and seasons kept our ancestors’ lives in balance. Physically, they worked much harder than most of us can imagine. Their days were filled with strenuous, time-consuming chores as they struggled to bring in enough food to survive the winter. Their fears were much more immediate and concrete than ours: starvation, plague, tribal warfare, being attacked by wolves and bears. But although they experienced miseries that most of us thankfully will never have to face, their tasks were simple and predictable enough so that they didn’t have our modern-day anxieties. Their subconscious minds weren’t filled with worries about what they ought to be doing differently, how well they could measure up to society’s demands, et cetera. Whether they ate or got eaten on any particular day was up to Fate; they made whatever sacrifices they believed would keep the gods happy, and left it at that.

How can we cultivate our ancestors’ untroubled mindset in a world that has become vastly more complicated? I would say it begins with centering ourselves in the moment, so that our thoughts don’t habitually wander along negative paths. Meditation, exercise, and mindfulness can be helpful approaches to banishing persistent worries. They don’t necessarily require large amounts of time; it’s more a matter of arranging our daily routines in ways that provide for moments of peaceful reflection.

This morning, before I sat down to write this post, I got myself a cup of raspberry-flavored coffee and a whole wheat English muffin with raspberry jam. I thought about what good fortune it was to have these small comforts, how pleasant the coffee smelled, and how pretty the raspberry jam looked—bright sparkling red in the morning sunlight, with little seeds all throughout. One can’t simultaneously contemplate a raspberry seed and worry obsessively about some upcoming task or other. That simple fact seemed to be enough, at least for the moment, to bring my entire world into balance.

When I first read Little Women as a child, I had no appreciation for the scene where Jo March burns all her creepy stories about crime and monsters, which she wrote for a tabloid called the Weekly Volcano. I thought it was ridiculously old-fashioned to say that such stories harmed the public morals; and I felt sure that if I had been in Jo’s place, I wouldn’t have meekly burned up my own creations, no matter who disapproved of them.

It wasn’t until many years later that I began to understand what the scene was about. The Volcano isn’t just a clever name for a fictional tabloid; it’s a metaphor that represents all the anger, fear, and other molten-lava emotions bubbling away under the surface of the human consciousness. Because stories first create a dramatic conflict and then resolve it, they can’t be effective without touching the reader’s emotions in one way or another. Thus, an author has to consider what sort of emotional response a story is likely to get. Will the story take its audience for a harrowing stroll on the volcano’s edge? If we choose to dwell on sordid or gruesome material, then we bear some responsibility for the unhealthy feelings we stir up in our readers.

Of course, that doesn’t mean stories should have nothing in them but sunshine and joy; nor are we obligated to preach sermons to our readers. Rather, reading a good book can be like making a good friend. Fictional characters can give us comfort, inspiration, and helpful advice, just as our real-life friends do. Like real people, our fictional friends may have to deal with crime, death, and other less than pleasant aspects of the real world. As authors, we wouldn’t be honest with our readers if we pretended such things didn’t exist. And although monsters and the paranormal may not be literally real, they give us an opportunity to exercise our imagination and gain insight into our society’s collective psyche. Plus, they’re just fun to read.

So I wouldn’t say that any particular genre of fiction is harmful, in itself. What makes the difference is how the characters and images affect the readers’ emotions—and to some extent, the author’s as well. In Little Women, Jo’s quest to produce thrills by “harrowing up the souls of the readers” left her feeling disturbed by morbid thoughts because she spent so much time focused on the world’s grimmer aspects: “She was living in bad society, and imaginary though it was, its influence affected her…”

If we’re honest with ourselves, both as authors and readers, we know what sort of emotions a story stirs up. If the main characters were real people, would we invite them into our homes for a visit, or would we nervously close the blinds and make sure all the doors were locked? If the latter, then we may find that we would benefit from choosing our fictional acquaintances more carefully.

Even with this summer’s extreme heat and drought, I still had to spray thistles in my yard; the heat doesn’t bother them. Not much bothers them. Thistles are very persistent weeds. Pulling them out by hand is useless because their root system is so thick and deep that they simply send up two new sprouts for each one you pull. Spraying them works much better because the herbicide gets carried down into the roots and prevents any new growth from coming up.

Much the same can be said about ridding our society of its prickly old prejudices and stereotypes. They’ve been around long enough to have a strong root system—that is to say, a large set of cultural assumptions or myths from which they grow. Trying to attack a prejudice without also going after its roots has little effect. Many people put huge amounts of time and effort into arguing, on the Internet and elsewhere, about how ignorant someone else’s beliefs are. But without addressing the cultural context of the beliefs, attacking those who hold them is about the same as trying to pull up thistles one at a time. Some may decide that they’ve had enough of arguing; but they still have no clue what the opposing view is about, and others who share their beliefs get even more vocal as a result of feeling threatened.

To dispel a prejudice effectively, one first has to consider: What is its history? What other beliefs are associated with it? What social structures reinforce it? What role does it play in the cultural drama in which it appears? If our social world is made up of the stories we tell ourselves about it, as has sometimes been said, then we have to understand these narratives before we can rewrite them.

That doesn’t necessarily mean social change begins in the library with a stack of books about history, folklore, politics, rhetoric, and so forth. Much of what’s involved in changing the world—“radical” change, in the original Latin sense of the word, from the root—is an intuitive process. We know what kinds of stories resonate with our culture because we’ve grown up with them and incorporated them into our own lives. When we feel the earth quivering under our feet, we know there’s a fault line close by. As Bob Dylan’s classic song puts it, we don’t need a weatherman to know which way the wind blows.

Today’s world may be more amenable to change than the world our ancestors knew, simply because the pace of change has become so rapid. We are witnessing cultural transformation on an unprecedented scale and, as a result, we don’t have strong expectations that our lives will stay the same. We’re more willing to consider ideas that would have been dismissed out of hand by past generations. But we may also feel so unsettled by the lack of constancy that we cling to old ideas long after they have outlived their usefulness, just because we can’t deal with any more revisions to our mental maps. I’ve sometimes thought that reworking our cultural narratives is much like composing social stories to help an anxious child get used to new places and events. For both, what’s needed is a reassuring storyline and enough repetition to make it familiar and comfortable.

Because our society has become so competitive, we’re often advised to focus on doing our best, rather than worrying about whether we have accomplished as much as others. Many people find this advice helpful because it frees them from the stress of always comparing themselves to others and falling short in some way. In a world of more than seven billion human beings, interconnected by modern technology, we are bound to find others who have accomplished more in almost any endeavor. Striving to outdo everyone is likely to be an impossible goal. Even those who manage to set a world record, through great effort, often find that someone else surpasses it in a matter of months.

In general, I agree that it makes good sense not to be overly concerned about measuring up to others’ accomplishments. But there is also a perfectionist trap in “do your best” because no matter what we do, there is probably something we could have done better. By definition, our best can’t be sustained as a long-term steady state. When we’re having a good day, we can do our best; but there will be other days when we’re distracted, or we didn’t get enough sleep, or we haven’t adequately processed a complex task and don’t feel able to deal with everything it involves. That happens to all of us, and we shouldn’t feel guilty because we’re not doing our best at a particular moment.

My preference, rather than always striving to do my best, is simply to do what needs to be done. By nature I’m picky anyway; in my writing, I often change a lot of words until it’s flowing just the way I want it. I don’t need the additional stress of always worrying about whether I’m doing the best possible work. Instead of obsessing all day about the small details, it’s generally more useful just to get the task finished and move on to something else.

Of course, that does not mean rushing through things with only the bare minimum of effort. Doing what needs to be done requires allowing enough time to do it properly. That way, if a task isn’t going well for whatever reason, we can just set it aside for a while and come back to it later, when we’re feeling more focused. And then if we’re still having problems with it, we have enough time to ask for help. Our hyper-competitive modern culture has left some people feeling that they always have to do everything by themselves, or else they’ll be incompetent failures; but in fact, there’s no shame in asking for help when we need it, and we may discover that those we ask are glad to help.

I’ve learned from participating in online creative writing groups that everyone has different perspectives on what makes good work, and usually they’re not shy about sharing their opinions. Of course, what one person prefers is not necessarily going to suit someone else; but if we can get past our defensiveness about being told that there’s room for improvement in our work, we are likely to find others’ views at least somewhat useful.

A corollary of the observation that everyone has different perspectives is that when we try to do our best, it’s not a clearly defined goal. What is our best, anyway? Do we really know? All of us have had the experience of being proud of an accomplishment, only to realize later that we made an embarrassing mistake. As we go through life and learn more about our world, we see many things differently. What we consider our best work at age 50 is not what we thought at age 25, for instance. And by that I don’t just mean we develop more skills; we also gain more insight into the social context and consequences of our acts. Are we at our best when we outperform our coworkers, or when we take a little time that might have gone into our work and help them to improve theirs? When we put huge amounts of effort into accomplishing a very ambitious task, at the cost of stressing ourselves out and spending very little time with family and friends, is that our best? Does the highest salary automatically equate to the best career choice, and if not, what other factors are important to consider?

Although striving to put our best efforts into everything we do may sound like a noble goal, in practice it’s highly likely to cause us to suboptimize—that is, to accomplish things that look good in themselves, but that actually detract from our well-being because we haven’t fully understood how they fit into the big picture. Instead of feeling obligated to work as hard as we can on each particular task, we should consider how the task fits into our long-term goals, and then set our priorities accordingly.

Modern life can be so hectic that people often get stuck in routines that have outgrown their usefulness, without even thinking about it. Routines have a calming effect because they reduce the number of decision-making points we encounter in an increasingly complex world. They’re essential to protect us from the paralyzing anxiety that would otherwise result from having too many choices to make. But if we’re not careful, we can miss out on a lot of things we would have enjoyed, just because we didn’t take the time to reflect on how our routines might be improved and updated.

Here’s a simple example of how that happens. I routinely buy the bagged salad mix at the supermarket because it saves the time and effort of assembling the individual items, while also ensuring that I won’t find myself short of any particular salad vegetable. For many years, I always topped the salads with shredded cheese and bacon bits, without any dressing, which is how my husband prefers them. That seemed fine, and I didn’t give it much thought. When we ate out, however, I enjoyed the house salad at a restaurant that prepares it with a vinegary dressing, dried cranberries, and walnuts.

It never occurred to me that I could do something similar at home until a recent grocery shopping trip. I was in the condiments aisle buying more bacon bits when I noticed a new salad topping on the shelf—a mix of dried cranberries and almonds. That was like a moment of revelation. I just wanted to shout “Yay!” and jump for joy right there in the supermarket aisle. Cultural expectations about proper behavior for middle-aged women kept me from actually doing that (alas), but I put the new topping in my cart and had a smile on my face for the rest of the day.

Of course, I could have bought dried cranberries and nuts separately even before the supermarket began selling the new salad topping mix; but the thought never crossed my mind. I had gotten so much in the habit of making my salads the same way as my husband’s that I just did it by rote.

By definition, routines are things that we do as a matter of course, without needing to ponder the details. Our conscious minds pay very little attention to such familiar actions. So it takes a deliberate effort to consider what’s involved with a particular routine and how it might work better if done a different way. Improvements that other people may find obvious are likely to elude us, just because our habitual acts always seem normal and reasonable in our own minds. We also tend to exaggerate how difficult change would be. It’s the big things that come to mind when we think about change, such as buying a new car or house, rather than little variations in our daily routines. Change seems difficult, expensive, and far away.

I believe it helps to set aside a few minutes every day to consider the question: What can I do differently, in the here and now, to make myself happier? Often the answer is something that can be done easily and for little or no cost. We can, for instance, tidy up those cluttered areas that give us the subconscious feeling our lives have gotten out of control. Last week I cleaned out my desk drawer, which (I am embarrassed to admit) had been accumulating junk for over a decade. Now every time I open the drawer, it feels peaceful and orderly, instead of the horror-movie adventure of the Junk Drawer from the Black Lagoon invading my workspace.

Although small changes like this may not seem to make much difference in themselves, the cumulative effects can be very powerful.

As part of my work yesterday, I read a California appellate court case that discussed how the courts distinguish between libelous falsehoods and constitutionally protected opinions. A court looks at the totality of the circumstances—both the language of the statement and the context in which it was made. Whether it is an assertion of fact or a statement of opinion depends on how the average reader would interpret it. Statements made on blogs and Internet message boards often are seen as opinions, even if they might be regarded as factual assertions in another context.

For example, if a mainstream news organization published an article falsely stating that a company’s management had defrauded the shareholders, the article would be libelous. But if a news website published an accurate news story about a company’s financial performance, and then a disgruntled investor posted a comment calling the management crooks, the comment wouldn’t be libelous because the average reader would not take it seriously. The culture of Internet posting is one in which readers expect to find exaggeration and name-calling. As a consequence, most of what’s posted on blogs and message boards is not actionable, even when the character of the statement is such that it would clearly be libelous if published in a more respectable venue.

I’m not among those who lament the supposed passing of a golden age of civility. On the contrary, I believe we’re much better off in a society where most people confine their expressions of hate to yelling at each other on the Internet, as opposed to throwing bricks or forming lynch mobs. The past century, with all its ugly prejudices, was very far from being an age of grand public civility. Still, in light of the Internet’s potential to bring us together, it seems a pity we haven’t made better use of it.

In the dark ages before the Internet, creative writing was a very personal and often disorganized hobby. When inspiration struck, writers scribbled their stories in diaries or notebooks with a ballpoint pen, maybe sharing them with a best friend or two. An occasional article might be thought worthy of the time, paper, typewriter ribbon, envelope, and postage required to type it up and nervously send it off to an editor of a big-city magazine, who would likely reject it (by way of the obligatory self-addressed stamped envelope) because there was so much competition. Writers daydreamed of being published and gaining worldwide acclaim, but most didn’t even have a small circle of friends regularly reading their work.

Now anyone can put together a blog or join an online writers’ group and share stories with readers around the world—it’s instant gratification. The old constraints of scarce publishing resources are no longer a problem. One would naturally think that creative writing ought to be easier, more fun, and less stressful than in the past. But in line with the human penchant for complicating just about everything, it often doesn’t feel that way. Instead, writing has become another sad entry in the long list of modern social pressures.

When we’re not posting new material to our lists and blogs regularly, we’re left feeling guilty and embarrassed. We compare ourselves to the most prolific writers we know, and then we beat ourselves up for being so lazy and inadequate. Like yo-yo dieters obsessing over their meal plans, we devise schedules for when and how much we should write; and inevitably we don’t stick to them. (True confession here: I meant to write this post last weekend, but instead I ended up reading a goofy novel about ghostbusting witches.) We imagine our neglected blogs as virtual vacant real estate, foreclosed upon and boarded up, with a few spambot tumbleweeds rolling down the dusty street.

How did we do this to ourselves? I’m reminded of Mark Twain’s classic observation on human nature in The Adventures of Tom Sawyer, where Tom has to whitewash a fence while the other boys are free to play. He pretends that he’s having great fun, and soon his friends are lining up to pay him for the privilege of helping. Tom has discovered “that Work consists of whatever a body is OBLIGED to do, and that Play consists of whatever a body is not obliged to do.”

We have, in effect, turned our writing into work—even when we’re not being paid. We feel obliged to do it because otherwise we’ll lose face with our online acquaintances and plummet to insignificance in the Google rankings. Sometimes the pressure gets to be too much for us, and then we close our blogs and quit our lists, slinking away in shame and despair—only to start all over again in a year or two.

Of course, there’s no reason it has to be this way. Like all cultural constructs, the notion that prolific writing determines our social worth has only as much power over us as we allow it to have. No stone tablet has been handed down from above commanding “Thou shalt not fail to update thy blog.” We can shift our mindset to change our stories back into the playful hobby that they originally were, once upon a time.

May 11, 2012 · Categories: Musings

Last weekend I moved two hostas that I had planted in my front garden almost a decade ago. They were a gift from a neighbor who found that she had extras while she was doing her spring planting. Because I already had a few hostas of a different variety, I assumed that the new ones would be about the same size. Unfortunately, that proved not to be the case, as often happens with assumptions.

For the first few years, I admired the big glossy leaves of the new hostas, which were noticeably larger than the leaves of the other variety. After a while they grew together to form a big clump, and I thought that was okay because the older hostas also had grown close to each other. My husband mentioned that he liked the big ones. They looked very impressive, robust and healthy.

But they just kept on growing. I realized that I had a problem when they started overgrowing the front walkway. Because hostas are round plants with leaves growing out from the center, they can’t be trimmed along one side without ending up lopsided; so cutting them back was out of the question. Last summer their leaves stretched halfway across the concrete next to my porch steps. Visitors and pizza delivery people had to tread carefully to avoid stepping on them. Now that they had become so enormous, I was left with an embarrassing display of gardening foolishness in full view of all the neighbors. There was no doubt those hostas would have to be moved farther back in the garden to give them more room to grow.

I wasn’t looking forward to that chore, though, and I kept finding reasons to put it off. The heat of the summer wouldn’t be a good time to move plants; and once we got into the cooler autumn weather, there was always something going on that made a convenient excuse. Then it was winter and they dropped their leaves, allowing me to ignore them until the spring. I finally got around to moving them last weekend.

Relying on assumptions when we don’t have enough facts is, of course, human nature. It served our ancestors well for most of our history, when people often had to make immediate decisions on which their lives depended. Was it a hungry wolf in those rustling bushes, or was it a deer? Did that group of men from another tribe, coming into view over the hill, have plans to attack the village? Making snap judgments was a very useful survival skill in those days.

Now we have easy access to information, and most of us aren’t likely to find predators (human or otherwise) lurking near our homes when we step outside. Still, both our decision-making processes and the structure of our society took shape when life was much more precarious. We make assumptions all the time, just as our ancestors did; and when they are challenged, our first reaction is fear. We’re afraid of what might happen if we let ourselves get distracted thinking about other possibilities, only to find out that there really was a wolf in the bushes after all.

So when we’re told about a group of people who need more room to grow in our collective cultural garden, we don’t want to hear it. We react with denial: those big leaves can’t be taking up that much space, can they? Maybe we step on them sometimes, but hey, there’s got to be some way to shove them back where they belong and make sure they stay there. After all, they weren’t so much in the way before. And just think of the nuisance it would be to dig new holes!

Then after a while, our society grudgingly decides it’s time to stop putting off the chore, just as I did with my hostas last weekend. Even though I’d been dreading it and making excuses for the better part of a year, it wasn’t really that hard after all.