What does it mean to deserve?

Today’s culture is always telling us that we deserve more. Advertisers deluge us with images of shiny new products, declaring that we should indulge because we’ve earned it. Self-help authors say that we can attract great success by repeating affirmations along the lines of “I deserve to be happy” or “Next year I’m going to earn X amount of money because I am worth it.”

While that’s better than going around with our heads full of negative messages about not being good enough, it still leaves us measuring our worth against what other people have. Because deserving has to do with merit, if one person deserves something and gets it, then by implication others who don’t have it are not as deserving. Maybe they didn’t work as hard or couldn’t stay focused on those happy thoughts. From there it’s just a short step to believing that if someone is poor, unhappy, or sick, it must be their own fault.

Nobody ever wins that blame game, though. It doesn’t matter how many new cars we have in the garage, how well our investments are performing, or how healthy and happy we feel at the moment. Simply put, there is no way anyone can go through life always having more health, wealth, and happiness than the other seven billion people in the world. So if we’ve got the attitude that those who have less are to blame for their own misfortunes, then we naturally end up blaming ourselves for not being as rich and famous as those who have more—and there are always plenty of billionaires and celebrities in the news to make us feel undeserving, if we’re so inclined.

Who needs all that judgmental drama? We’d do better to take the concept of deserving back to its roots—to the Latin deservire, meaning “to serve well or zealously.” Historically, a deserving person was a good servant. Earning money had nothing to do with it—most servants earned little more than their keep, and many were slaves. Deserving, in its original root meaning, was about being loyal to one’s master and devoted to one’s work.
 

[Image: Roots of trees in a forest (Creative Commons image via flickr)]
 

Although we no longer live in a world of masters and servants, we still spend much of our time serving others. Whether it’s by working for wages, owning a small business, caring for our family members, creating beautiful art, or volunteering with a charity after retirement, there are many ways to be a good servant. Giving our work to others is in our nature as human beings, as members of a social species. That’s how we create meaningful accomplishments and leave the world a better place for having been part of it.

Money measures something else entirely. In a capitalist system, money is supposed to be a means of efficiently allocating resources. When sales of a product or service increase, more people invest in it. Some of the profits go toward developing more advanced technologies; then new industries emerge and create jobs, more people can afford to buy products and invest, and the economy keeps on expanding. Of course, it’s not always as efficient as it could be; but that is generally how it functions.

Most investors, with the notable exception of socially conscious funds, couldn’t care less about whether a company’s goods provide a benefit to humanity. They just want quick profits. A company may have a wonderfully innovative product that would solve many of the world’s problems; but unless enough buyers can be found at a high enough profit margin, nobody’s going to invest in it just because it is deserving in the abstract. The free market is not about making moral judgments and rewarding those who have faith in their products; it is only about getting bottom-line results.

Doing our work with passion and love—being good servants—is not measurable in terms of money or fame. All other things being equal, a passionate worker would make a better impression on people and would be more successful in the conventional sense. But of course, all other things are never equal. We live in a very complicated world where unexpected stuff happens all the time, so why blame ourselves or anyone else for not reaching some arbitrary level of success? When we stay focused on doing our work of service, other things will fall into place in due course.

It’s in the air. We can taste it on every breath—all that restless energy flowing through the culture, dancing wildly into the future, calling us to expand into infinite possibilities and create all the beautiful things in our minds that are just waiting for us to make them real. Life is supposed to be joyful, it tells us. There is so much more we can create and become. We only have to imagine it.

So we meditate, we visualize, we say affirmations, and we believe it’s within our grasp. We try to harness that wonderful creative energy to run before us like the chariot horses of the Roman circus, hooves pounding in a glorious cloud of dust as the finish line nears. We read self-help books that encourage us to follow our passions, whatever they may be, and trust that the Universe will reward our devotion—by sending a big fat bank account our way.

And I’m left wondering just how the passionate desire to create beautiful things got tangled up in our minds with Wall Street-style greed. Why are we so tempted to visualize ourselves quitting our jobs and effortlessly raking in a gazillion dollars from the books we haven’t written yet, the songs we haven’t composed yet, or the movies we haven’t produced yet? Why does everything have to be monetized on a grand scale? Seriously, are those sparkling joyful surges of creative energy telling us to grab the most toys? What’s going on here?

In general, whenever a narrative spreads so widely through the culture, it reflects something that already exists in present-day reality. I don’t mean to say that we’re all ruled by greed, but that we live in a time of vastly expanding possibilities. Modern technologies and cultural changes have empowered us to pursue careers we couldn’t have imagined in the past. So naturally we’re not willing to resign ourselves to a lifetime of soul-numbing drudgery just because people once thought there was nothing better. That, I believe, is what’s at the root of the “financial freedom” narrative—it’s not so much about old-fashioned greed, but about seizing those shiny new potential-filled moments and making the most of them. Money represents possibility.

And that’s all very well in itself; but the downside of equating money with possibility is that when the gazillion dollars never actually show up, it can feel like a personal failure, rather than simply reflecting the fact of a sputtering global economy that still needs time to expand. All that amazing creative energy takes a backseat to obsessively visualizing the big fat bank account and wondering what went wrong.

There is also a subtler trap, which is that we have been culturally conditioned to use the word “dream” to describe doing what we enjoy on a regular basis. That language puts it into the realm of distant fantasy, rather than within reach in everyday life. As a result, we can’t just be happy writing a blog or a novel or whatever because that’s what the creative energy calls us to do in this moment—no, that’s not good enough, we’ve got to have the glittering fantasy of being a super-wealthy celebrity. Otherwise, the culture might always dismiss our efforts as insignificant and leave us stuck forever in a dismal wage-slave existence.

Of course, we don’t really need that big fat bank account before we can take control of our lives. Nor is there any requirement to be a glitzy celebrity before anyone appreciates our creative projects. In fact, there are many celebrities and rich people who are notorious for their totally messed-up lives and always get laughed at in the tabloids. So if wealth and fame aren’t really where control, possibility, and respect come from—then how do we go about getting them?

Simply put, it’s all in the details. Not so much in those bright gleaming visualized details of future grand accomplishments, but in the words and images we use to frame our everyday acts—the details in present tense. I don’t, for example, dream of being a writer. I write stories. Fact. I write a blog. Fact. I control what I write and what I publish. Fact. These aren’t dreams—they are options I have chosen from among the many possibilities open to me in the here and now.

And though I can’t control what other people think of my writing, I have a fairly good idea of how to find readers who appreciate my work. First of all, I need to be consistently kind and courteous, showing others the same respect I’d like to get from them. It’s also important to take the time to put together quality work, free of careless errors. I need to be open to learning from constructive criticism and improving myself and my writing. And last but not least, I just need to get into the flow, relax, and have fun! If I became rich and famous, I still would need to do these things to earn genuine respect. There are no shortcuts to be bought.

I set up my blog on Kindle two weeks ago—not because I had any expectation of making money from it, but just because it seemed like a fun thing to do, as well as a convenience for any readers who might prefer following blogs on Kindle. Here’s a partial screenshot of my page on Amazon, which itself has a screenshot of my blog. I decided to put it into this post because the recursive effect looks cool.
 

[Image: Screenshot of my blog subscription page on Amazon.com]

So far nobody has subscribed to it, but that’s okay because it is just a fun present-tense detail of the intentional life I’m in the process of creating. If I had been seriously planning to monetize my blog, I probably would’ve felt disappointed about not becoming an overnight gazillionaire. And we all know what we attract when we feel disappointed and unsuccessful—yup, more of the same. Simply enjoying the small details may not be as glamorous as the rich celebrity fantasy, but I believe it works out better as time goes by and more of those details fall into place.

Although one wouldn’t know it from the sensational news headlines, both war and violent crime are at historically low rates and are still falling across the globe. For the first time since our ancestors emerged from caves and got organized enough to raise armies, most of the world’s population has never seen the horrors of war firsthand.

Yet battlefield metaphors and imagery are commonplace in modern life. Our cultural narratives lag behind our current realities. The stories we tell ourselves to make sense of our world are drawn largely from our ancestors’ everyday lives, as well as their customary word choices. We still have many folk sayings that refer to horses, for instance, even though people have been driving cars for more than a century. Our history shapes our thoughts much more than we realize.

Put another way, the world’s long history of war has left us with the cultural expectation of going to war. Subconsciously, we think of ourselves as soldiers, even though most of us haven’t actually served. We watch popular movies full of epic battles, read sword-and-sorcery novels, and play war games on our computers. Public policy decisions often are characterized as going to war, such as the “war on drugs.” If we have a medical condition (or a family member does), we’re likely to think of it as an evil monster we must bravely fight to slay. Today’s political factions are always battling over one thing or another. Social advocacy is commonly described as fighting for a cause.

Last week I visited the blog Rambling Woods, an amateur naturalist’s site, and read a post about monarch butterflies. Monarchs migrate annually, and they lay their eggs only on milkweed, which no longer grows in cornfields because herbicide-tolerant genetically modified corn has made heavy herbicide spraying possible. An article referenced in the blog post encouraged people to fight to save the monarch migration by planting milkweed.

There’s an area in my backyard where I would like to plant native wildflowers, and I commented on the blog that I’ll make sure to include milkweed. It’s certainly a worthwhile project, helping to restore the monarch population while also planting attractive landscaping. But if I had to pick one word to describe this image, “fight” would not be the first one that came to mind.

 

[Image: Monarch butterfly on milkweed (photo credit: publicdomainpictures.net)]
 

Six years ago, I began doing volunteer work for the Autistic Self Advocacy Network (ASAN), a nonprofit organization that teaches leadership and self-advocacy skills, publishes educational materials, and addresses public policy issues relating to autism from a disability rights perspective. I serve as the board secretary, a role that consists of preparing agendas for board meetings, keeping the minutes, and generally keeping the board’s documents organized. It’s a mundane job, but every corporate board needs a secretary; and I see it as a way to do some good in the world.

Although one might think an educational charity wouldn’t be controversial, ASAN, like many other groups, has had to deal with the unpleasant reality of today’s battle-primed social environment. People who fight for causes naturally expect to find enemies; it’s implied in the metaphor. There are many causes relating to autism—research, services, education, disability rights, and more. So it’s not surprising that people would have strong feelings about particular issues and that there would be arguments.

But there was some major ugliness online a few years ago that went way beyond ordinary arguments, turning into virtual scorched-earth warfare among the supporters of various factions. Rambling conspiracy theories, nasty gossip from the gutter, et cetera. No doubt the people responsible for that stuff saw it as perfectly justified—after all, they were at war and fighting to destroy their enemies, and war isn’t supposed to be pretty.

The situation calmed down after a while. I’m still reflecting on the broader issues, though—all the negativity we take for granted in society, and the addictive nature of the resulting drama. War is exciting; that’s why it plays a central role in so many of our stories. But when we constantly feel that we’re at war, it becomes exhausting and harmful. After all, war is scary and people get killed; so having one’s thoughts full of battlefield images naturally leads to feeling that one’s life is in danger, with all the resulting anxiety.

At first, it’s empowering to imagine ourselves as righteous soldiers fighting valiantly for our causes. We feel strong and motivated. We pour our energy into the fight, and we get things done. It may take years before we realize how depleted we’ve become—both mentally and physically. Even when we’re not actively battling against our perceived enemies, we still have those old arguments replaying themselves in our heads, uselessly sucking up even more energy. Then we’re left with chronic run-down feelings, and possibly more serious health problems besides. When we reach that point, we end up not getting much done at all, either for our causes or in our personal lives. Joy becomes a distant memory.

 

[Word-art image courtesy of Bits of Positivity]
 

When I was nine years old, my grandma gave me a set of old Christian novels she had bought at a garage sale. Presumably she meant to instill good old-fashioned moral values in my impressionable young mind. I have to admit, I was more interested in Nancy Drew mysteries at that age; but I did read the books after a while. One of them, White Banners by Lloyd C. Douglas (1936), must have made more of an impression than I realized at the time. I’ve had it in my thoughts recently because it explored the practical benefits of avoiding battles in everyday life.

The author’s premise was that when we choose to walk away from disputes, we usually gain more than we lose. In addition to building character, it gives us more time and energy to put toward useful work. The title comes from a passage describing this approach to life as flying white banners, not white flags of surrender. Even though winning a dispute may feel like a great victory, chances are it’s not as productive as the work that might otherwise have been accomplished.

I would also say that avoiding unnecessary conflict promotes self-awareness. Often we don’t even notice all the battle metaphors in our thoughts. Because they’re everywhere in our society, they seem like the normal way to look at things. It takes a conscious effort to consider other perspectives and to shift our thoughts in more positive directions. Instead of fighting for our causes on an imagined gory battlefield, we can simply choose to put on our gardening gloves and get busy planting those seedlings. The work will get done just as effectively (and perhaps more so) without the drama; and it’s a much healthier way to go through life.

It’s in our nature as a storytelling species to filter our experiences through the narratives we create to explain them. As humans, we go through life full of self-talk, whether or not we do it consciously. When we plan an event we know will likely be stressful, such as traveling to a place we’ve never seen before, we rehearse it in our minds and tell ourselves why it will be okay.

Our culture goes through much the same process of creating new stories to explain advances in technology, changes to our traditional social structures, and greater diversity in our communities. Having to deal with so many unexpected changes can make us very anxious, just because life feels so unpredictable. We need simple, calming explanations that fit reasonably well within our existing mental maps and leave us confident of being able to manage the changes.

Now that we’re a half-century into the modern civil rights era, our culture has mostly gotten used to the idea that we shouldn’t expect everyone in our communities to look and behave exactly the same. Although we still have much work to do on clearing away old prejudices, our society has made much progress toward the goal of accepting diversity.

But many of us find it harder to accept ourselves for what we are. Mass-market advertising preys on our insecurities by suggesting that we won’t have any friends unless we wear the latest trendy fashion or drink the right brand of beer. Whatever our physical traits may be, there are cosmetic products or treatments aimed at improving them, along with ads that proclaim how embarrassing it is to look like our natural selves. If we don’t fit in with some clique at school or in the workplace, we could get bullied for being “weird.”

It’s not always easy to recognize such manipulation and bullying for what they really are. Often we blame ourselves, thinking that we’d have more friends and get along better if only we could be more like other people. Then we blame ourselves again for not doing a better job of dealing with our gloomy feelings and our anxiety. We don’t take enough time to consider all the factors involved.

Defining one’s personal identity and finding self-acceptance can be even trickier in the context of disabilities, mainly because our culture hasn’t yet fully accepted them as part of human diversity. Instead, our culture has created narratives about normality and what might happen to anyone who doesn’t fit neatly within its boundaries. As a result, anything outside those boundaries—wherever they may be at any particular time—can be hard to accept as part of one’s own identity.

Well-meaning people sometimes offer advice along the lines of “accept the condition, but don’t let it define you.” Such advice generally means not letting one’s potential be limited by low expectations. As with person-first language, the aim is to put less emphasis on the condition, in hopes of avoiding the negativity often associated with it. Put more simply, this advice is: Don’t settle for being defined by all the bad stuff our culture says.

Some may see this as acceptance—but it has the drawback of leaving all that bad stuff out there, unchallenged. And when we don’t actively challenge prejudices, we often end up internalizing them. That is why pride campaigns work toward reclaiming words and asserting control over their definitions. Whether we’re talking about disabilities or any other human characteristics, leading an authentic life requires acknowledging their place in defining our identity. We can’t truly accept ourselves as long as there is something we keep tucked away at the back of the closet, never mentioned above a whisper.

When we put acceptance into action we’re telling new stories, both to ourselves and to the world. We’re creating new definitions that embrace all of who we are, rather than just the parts that fit someone else’s idea of who we should be. This is how our culture grows and evolves. Seen in this light, the telling of authentic narratives is a gift to the world, broadening its boundaries and strengthening its diversity. No one should ever have to feel afraid or ashamed to speak a personal truth.
 

This article has been published on the Autism Acceptance Month site, which posts new articles and resources every April, with a focus on “sharing positive, respectful, and accurate information.”

There’s a lot more to changing the world than just pointing to a problem and saying “This is wrong—fix it now!” Yes, identifying the problem is necessary; but it’s generally not sufficient. That is because the existing situation, however unjust or illogical, has (or had) some degree of social utility—otherwise, it never would have happened. So when a particular way of doing things isn’t working well in today’s society, we should first examine how it was meant to work, and then consider how the problem might be solved while still accomplishing the intended goal.

Several years ago, I had a conversation on a forum with a woman who complained about her husband’s inconsiderate behavior. She was a short woman with a mobility impairment, and she couldn’t access the higher shelves in her kitchen cabinets without great difficulty. When she needed something from one of those shelves, she generally asked her husband or one of her children to get it down for her. Of course, it would have been much easier if all the items she regularly used were on the lower shelves; but when her husband did the grocery shopping, he often put some of them on the higher shelves without thinking about it. Although she had reminded him many times, he never paid enough attention to get it right, and there was always something she wanted that was out of reach.

The husband evidently had good intentions—he wanted to take care of his family by bringing home the groceries and putting them away. He probably felt that he was being unfairly criticized because the grocery shopping was enough of a chore in itself, without also being expected to remember what shelves his wife had in mind for everything. He wasn’t trying to be a jerk, but simply couldn’t keep track of all the details of what items she wanted where. Nagging him was counterproductive because it wasn’t likely to improve his memory and would only make him resentful.

I suggested that she reorganize the kitchen, with her children’s help, one day when her husband wasn’t at home. To the extent possible, everything would be moved to the lower shelves. Then the upper shelves could be filled with bulky extra items, such as multiple packs of paper towels and toilet paper bought on sale. That would ensure her husband couldn’t put any groceries there. She would also save money by stocking up on paper products while they were on sale. And because her husband paid so little attention to detail, he probably wouldn’t even notice that anything in the kitchen looked different. From then on, he would always put the groceries on the lower shelves, without even thinking about it, because that’s where all the free space would be.

In the context of changing the behavior of societies, rather than individuals, filling the available space also works well. Prejudiced assumptions and insensitive attitudes can be dealt with by ensuring that the public discourse reflects many different perspectives. This approach often results in more success than yelling at the majority group that they’re a bunch of bigoted jerks who don’t understand how privileged they are. Even if it’s true, they are not going to want to hear it, and they’ll dismiss the criticism as unfair and unreasonable.

But if people going about their everyday business just happen to find other viewpoints taking up the cultural space where the prejudices used to go—well, then it’s not so easy to stuff those big awkward prejudices into a space where they don’t fit anymore. And when there are a lot of diverse perspectives occupying society’s cultural-narrative shelves, there’s probably going to be something that looks more useful. So those outdated prejudices simply end up being set aside, like worn-out clothing or obsolete technologies, because they no longer have a place in today’s world.

I came across the phrase “empathy in development” while reading an article by nonprofit leader Molly Melching entitled “To change society, first change minds.” Although the article is mainly about efforts to achieve sustainable development and social change in Africa, the author’s wise observations can be applied much more generally to the process of bringing about systemic change. She describes empathy in development as involving four key elements, as set out below:

First, begin with human rights — empower people to claim their rights to health and well-being with confidence. Two, start where people are — have empathy and respect while you understand their history, their language and culture and their priorities. Three, do not try to force change — lay the groundwork for dialogue, introduce people to ideas, identify shared values and allow them to decide what the change will be and when they will make it. If you start by just fighting what they are doing, you’re going to get resistance. Finally, and perhaps most crucially, remember the solutions already exist within the communities with which you work.

I believe these insights are equally valuable in the context of bringing about social change in one’s own country. It’s not just those working abroad who need to take a respectful approach when dealing with differences of culture and perspective. Especially in today’s rancorous political climate, it has become all too easy to dismiss other points of view. When we treat our own beliefs as obvious facts, then surely there must be something wrong with anyone who disagrees. Maybe they’re ignorant, corrupt, lazy, immoral, or otherwise deficient in some way. Wholly oblivious to the irony, we may claim they lack empathy or suffer from black-and-white thinking. After all, there’s got to be some reason why they aren’t behaving like sensible people.

How we respond to others’ differences is itself culturally determined in many ways. Even when we see ourselves as respectful and open-minded, we may not be fully aware of the underlying narratives that frame our worldviews. In any society, the words commonly used to express an idea tend to shape how people think about that idea. I’m not just referring to political buzzwords aimed at provoking emotional responses. On a much more basic level, our vocabulary reflects the structure of our society, whether or not there is any conscious intent involved. As we go through life, we routinely make assumptions and take actions based on this familiar structure.

In modern-day Western culture, development connotes a distinction between the complete and the incomplete. Nations are either developed (suggesting a past-tense, finished process) or developing (they’re not yet like us, but they’re working on it). Although today’s development narrative avoids the obvious biases of language used in the past to describe other cultures, such as “backward” and “savage,” it still describes a linear scale that puts us at the top, with others striving to reach our level. It further suggests that development should proceed along the same trajectory, rather than having many possible paths.

A similar dichotomy can be found in the language our culture uses to describe individual development. People are seen as either normal (that is, fully and properly developed) or struggling to overcome their challenges (not like the normal folks, but trying to be). And of course, it goes much deeper than just choosing one word over another. Changing an occasional word or phrase doesn’t do much if the underlying narrative stays the same. Our language is full of terms that started out as politically correct euphemisms and ended up being used as insults, just like the words they replaced.

I believe this cultural framework is a large part of why empathy in development can be so hard to attain, whether we’re aiming to change other nations or to change the behavior of people within our own society. As soon as we decide they need to be changed, we subconsciously start to think of ourselves as superior. We’re developed, they’re not. We’re normal, they’re not. We’re enlightened, they’re not. So naturally we should tell them what to do, since we know and they don’t. Eventually they’ll come to understand it was for their own good…

This mindset has become so ingrained in our culture that sometimes we’re not even aware of it. Thoughtful reminders such as Molly Melching’s four principles of empathy in development are much needed. And I suggest another point on which reflection would be helpful: Remember that we, ourselves, are still developing. There is nothing shameful about acknowledging this simple fact. The opposite of development is not perfection—it is stagnation. As we interact with others and explore our world, we continue to learn, both on a collective and an individual level. This necessarily means that when we seek to change others, we are also being changed. So when we find our views in conflict with those of other people and cultures, it may be useful to consider not only how we can change their minds, but also what we can learn from the situation.

What are you?

The answer most of us would give, according to the customary social script, is an occupation: truck driver, teacher, sales clerk, or whatever we might happen to be doing for our paycheck. If we’re older and no longer working, then the answer changes to retired, perhaps with our previous career description tacked on. As students who haven’t yet entered the workforce, we might talk about our particular course of study and a career plan based on it. When we’re married and spending our days taking care of small children, we occupy a traditional niche in society as stay-at-home parents.

But there are no good answers in this script for those without jobs who don’t fit the categories of retiree, student, or homemaker. Unemployment doesn’t just leave a person with no money—to a large extent, it also strips away his or her identity. Our society has plenty of words to describe the jobless, but that lexicon is viciously pejorative: bums, slackers, moochers, takers, lazy, useless, and a burden to others. So when we’re unemployed, that means we’re not only faced with the stress of looking for a job, not finding one right away, and going through our savings (if we’re lucky enough to have some). We also have to deal with the perception that anyone who doesn’t have a job is a worthless social failure.

And right now, although things are slowly improving, there are a lot of people who don’t have a job. We live in a society that is struggling to adjust to the massive impacts of globalization and modern technology. At present, the world’s economy is fragile and all too easily disrupted. There are many more people looking for work than there are jobs available.

It’s not always going to be like this. As the world becomes fully industrialized and birthrates continue to fall, we can expect that many industries will face chronic labor shortages. People who are looking for jobs will have no problem finding them. But we’re not there yet; and in the meanwhile, we have to ask ourselves—on both a collective and an individual level—how we’re going to deal with today’s difficult job market.

Without getting into the political debate about whether the government ought to focus on job-creation programs or tax cuts, I’ll simply note that both sides recognize there is more involved than money. Politicians, whatever their party affiliation, commonly talk about work in terms of a person’s dignity and ability to contribute to society. Work is generally understood to make up a large part of our identity.

Before the modern era, when there was very little social mobility, defining people’s identity in terms of their occupations made a lot of sense. If your father was a blacksmith or a carter, you probably were too, if you were male; and you would never do anything else, unless you had the bad luck to get conscripted by a passing army. Everyone in your town would refer to you as John the smith or Tom the carter. A man’s occupation was a quick and easy way to distinguish him from others who had the same given name, back when common folks didn’t have surnames.

Now we live in a complex, unpredictable society where most workers will change jobs many times. Career retraining has become commonplace as old industries shrink and new ones emerge. It’s not unusual to get a college or university degree in one field and then end up employed in another. Modern workers are more likely to migrate to another city or country, and we have more diversity in our personal characteristics and interests. As a consequence, a person’s job says less about his or her identity than at any time in history.

And yet, we still ask children what they’re going to be when they grow up. The dominant cultural narrative is much the same as it was centuries ago, defining our personal identity and value in terms of how we earn our pay. If we get laid off and can’t find another job, or if we’re stuck in a low-paying job and have had no luck applying elsewhere, it’s hard to look at the situation objectively and not feel like we’ve been rejected by society in general. Our culture takes it for granted that a person’s dignity and value depend on employment status.

There are many variables that go into determining that status, however, and often they’re not under our control. We can’t reasonably be expected to predict an economic downturn that causes our company to go bankrupt or a technological advance that makes our work experience obsolete. Prejudice or nepotism can cause a less qualified applicant to get hired instead of us. Maybe our employer decides to cut costs by moving production overseas. We can’t prevent any of these things from happening, so why do we allow them to change how we feel about ourselves and about other people in our community? We might do better to think about redefining our values, in more ways than one.

Researchers have found that people who often complain about being old or fat have more health problems than others of the same age or weight. And when older people leave their usual environment and go somewhere that they associate with youth and physical activity, their health improves. For example, blood pressure might be significantly lower after spending a few weeks at a hotel in the mountains, surrounded by hiking trails and furnished with dated décor reminiscent of one’s younger years.

When articles describing these studies appear on news websites, readers often post skeptical comments downplaying the effects of attitude. People complain more because they’re in worse health, not the other way around, the commenters suggest. And they argue that when someone’s health improves during a vacation, it has nothing to do with feeling younger—it’s simply because of a better diet and more exercise.

Some readers gripe that the scientists are being unethical by conducting studies that have the effect of encouraging people to lie to themselves. After all, if someone is old or fat, that’s the truth. It’s nonsensical to pretend otherwise, they say; and it gives people false hope that magical thinking will cure serious medical problems.

My take on it is that categories like “old” and “fat” are chiefly matters of opinion. Their boundaries can and do change as our cultural expectations shift over time. A century ago, when the average lifespan was much shorter than it is today, people thought of themselves as growing old earlier in their lives. And before the modern era, when food was much harder to get, a substantial waistline often was thought desirable—both because it was a sign of prosperity and because it improved survival odds in times of famine.

We also differ in how we sort ourselves into categories based on our life experiences. For instance, I would call myself middle-aged because both of my children are grown and are close to getting their university degrees. To my mind, it wouldn’t make any sense to describe myself as a young adult when my kids are now young adults. But nowadays, because of second marriages and fertility treatments, there are plenty of people my age who started their families just recently. They are likely to spend much of their time associating with young parents of toddlers and, as a result, to think of themselves as being nowhere near middle age.

Another factor in how we classify ourselves, which is even more individual, has to do with the connotations that we attach to the words. One person might despair upon approaching middle age, believing that it means the best part of life is over. A more optimistic person might view it as having many more years of a long and happy life remaining. Although they’re both using the same term to describe themselves, what they mean by it is totally different.

As to the health effects of what we say about ourselves, I believe the skeptics have a valid point that there’s more to it than positive or negative thinking. When someone is in better health after a vacation, it probably has to do with being more active than usual. The person isn’t just sitting around the whole time repeating affirmations, visualizing a younger and healthier self, and so forth.

That said, however, it’s all interrelated. When we think of ourselves as healthy people in the prime of our lives, we’re likely to act accordingly, getting regular exercise and taking better care of ourselves. To a large extent, humans are creatures of habit. What we say about ourselves is a strong factor influencing what habits we form, which in turn goes a long way toward shaping our circumstances.

Sometimes I see articles discussing how far the birthrate has fallen in many countries and pointing out that this is a worldwide trend. Families are much smaller than they were in the past, and many young adults are opting not to have children at all. The authors often make dire predictions as to what will happen if this trend continues for another millennium or so, leaving a tiny human population on the brink of extinction.

There aren’t enough reasons to want children in today’s society, they say. In past generations, large families had economic value because children worked on the family farm. As they grew older, they took care of their aging parents, who might otherwise be left destitute upon becoming unable to work. But nowadays, a child is just an expensive, time-consuming luxury item. Even in countries where the government provides good child care and pays generous stipends to parents, birthrates remain low. Simply put, modern-day humans have many other things they’d rather be doing than raising families.

While I agree with the short-term prediction that the world’s population will soon reach its peak and then begin falling, I don’t see this as a cause for alarm. As I see it, the resulting labor shortage and high salaries will be very good for wage-earners. Raising a family on one parent’s salary, while the other parent stays home with the children, will be an affordable choice. Lost career opportunities won’t be as much of a concern because the average lifespan will continue to increase. A parent who stays home raising a large family until age 50 might reasonably expect to have a productive career until age 100, or perhaps even longer. Employment discrimination will be much less of a problem because of the labor shortage. Because the young adults of the future will not have to face today’s social and economic constraints with regard to families, their choices may turn out to be very different from what we’re seeing now.

When I wrote this post, my main purpose wasn’t to reassure worried readers that humans are not heading toward extinction. Nor am I suggesting that all children would be better off with a parent who does not work outside the home. Rather, this post is meant to illustrate how current trends often become absurd when they’re extrapolated too far. We lack a sufficient frame of reference to predict what will happen in the long term because our baseline assumptions soon become outdated. Thus, although a calm, well-reasoned focus on solving present-day problems may not get as much attention as shrieking about a coming apocalypse, the former approach generally results in wiser policy decisions.

Even with this summer’s extreme heat and drought, I still had to spray thistles in my yard; the heat doesn’t bother them. Not much bothers them. Thistles are very persistent weeds. Pulling them out by hand is useless because their root system is so thick and deep, they can just send up two new sprouts for each one you pull. Spraying them works much better because the herbicide gets carried down into the roots and prevents any new growth from coming up.

Much the same can be said about ridding our society of its prickly old prejudices and stereotypes. They’ve been around long enough to have a strong root system—that is to say, a large set of cultural assumptions or myths from which they grow. Trying to attack a prejudice without also going after its roots has little effect. Many people put huge amounts of time and effort into arguing, on the Internet and elsewhere, about how ignorant someone else’s beliefs are. But without addressing the cultural context of the beliefs, attacking those who hold them is about the same as trying to pull up thistles one at a time. Some may decide that they’ve had enough of arguing; but they still have no clue what the opposing view is about, and others who share their beliefs get even more vocal as a result of feeling threatened.

To dispel a prejudice effectively, one first has to consider: What is its history? What other beliefs are associated with it? What social structures reinforce it? What role does it play in the cultural drama in which it appears? If our social world is made up of the stories we tell ourselves about it, as has sometimes been said, then we have to understand these narratives before we can rewrite them.

That doesn’t necessarily mean social change begins in the library with a stack of books about history, folklore, politics, rhetoric, and so forth. Much of what’s involved in changing the world—“radical” change, in the original sense of the Latin radix, “root”—is an intuitive process. We know what kinds of stories resonate with our culture because we’ve grown up with them and incorporated them into our own lives. When we feel the earth quivering under our feet, we know there’s a fault line close by. As Bob Dylan’s classic “Subterranean Homesick Blues” puts it, we don’t need a weatherman to know which way the wind blows.

Today’s world may be more amenable to change than the world our ancestors knew, simply because the pace of change has become so rapid. We are witnessing cultural transformation on an unprecedented scale and, as a result, we don’t have strong expectations that our lives will stay the same. We’re more willing to consider ideas that would have been dismissed out of hand by past generations. But we may also feel so unsettled by the lack of constancy that we cling to old ideas long after they have outlived their usefulness, just because we can’t deal with any more revisions to our mental maps. I’ve sometimes thought that reworking our cultural narratives is much like composing social stories to help an anxious child get used to new places and events. For both, what’s needed is a reassuring storyline and enough repetition to make it familiar and comfortable.