Beyond Belief

(c) 2001-2010 Esther Derby

Let me tell you a little story, a true story, about how our beliefs influence what we see in the world and affect our ability to solve problems.

Two years ago my friend Julia, who was forty-four and a bit portly at the time, started experiencing troubling physical symptoms. She was fatigued, depressed, and generally uncomfortable. After several weeks, she went to the doctor. The doctor didn’t find anything specifically wrong.

Julia was sent home with a vague diagnosis and a prescription for Prozac. After a while her mood lifted and she felt less tired, but the discomfort continued. Finally, after several months and several more visits, her doctor determined she had a fibroid tumor that was increasing in size. He decided to remove the tumor.

Julia wasn’t happy to be facing surgery, but was relieved that after seven months of discomfort there was a diagnosis and a concrete plan. Two days before surgery, Julia went in for an ultrasound to precisely locate the tumor.

Based on the ultrasound results, Julia’s surgery was cancelled. Julia was sent home to prepare for the birth of her daughter—who arrived, full-term, two months later.

Now I’m willing to bet that you guessed the end of this story by the middle of the second paragraph. It’s obvious…if you don’t already have any particular beliefs about Julia.

Julia and her doctor, however, did have a belief, built up over years, that Julia would never become pregnant. And over the course of six months of office visits and medical exams, no one ever suggested pregnancy as the cause of Julia’s symptoms.

We could say that the medical staff were incompetent, but I would say they suffered from a belief problem. Their belief caused them to overlook information that was readily available—and also limited their application of the information they were using as they diagnosed the cause of Julia’s symptoms.

What does this have to do with software?

We all have beliefs about the world and other filters that affect what information we take in. Our beliefs, built up through education and experience, form the internal maps that help us navigate the world we live in. Our internal maps enable us to recognize and categorize the vast flood of sensory inputs and think quickly. And often they are very helpful as general models of how the world works.

Other times, our beliefs keep us from seeing what is blindingly obvious to someone with a different set of eyes. It’s “as plain as the nose on your face” to someone looking at it without our particular set of blinders.

Take Tom the test manager, for instance, assigned to a team that had always operated on participative and consensus-based decision making. Tom’s framework for managing relied on his belief that, as a manager, he should entertain input from the group but make all final decisions on his own.

Soon after Tom was assigned to the group, the team was assigned to finish an evaluation of testing tools. Tom read the reports and listened to the group discussion, then closed his office door and decided which tool he favored. At the next team meeting, as he discussed his decision, he reminded the group that “we decided this at our last meeting.” Tom didn’t notice that most of the other heads in the room were shaking back and forth, indicating “no, we didn’t.”

Was Tom a bad manager? Maybe, but it’s hard to say based on one incident. What we can see is that because of his belief about how decisions should be made, Tom didn’t ask questions that might have given him direct information about how the group operated, and he also filtered out valuable non-verbal information that would have given him additional clues. As a result, he was far less than effective in working with the team…at least until he became aware that his map didn’t match the territory.

We often don’t consciously account for the existence of our internal maps, which makes them more likely to trip us up—just as Julia and her doctor, and Tom, our test manager, stumbled when their maps didn’t show all the ups and downs of the territory.

Our thinking process happens so fast that it’s extremely difficult to pause the process in the middle and ask, “What unconscious beliefs, filters or maps are influencing me right now?” The challenge is to pause between the time we reach an initial conclusion and the time we act on that conclusion…kind of like how we test a piece of software before we ship it to understand the quality of the product and the risk associated with releasing it in its current state. I have four questions I use for this pause in the mental process:

1. What have I seen or heard that led me to this belief?

This question reminds me to really look at what data my response is based on. If I hear myself saying something like “because it’s always been like that…” I send up a tiny little internal red flag.

2. Am I willing to consider that my belief or conclusion may be mistaken?

If I’m not willing to consider that I might be wrong, it’s a sign that I’m reacting out of a belief I’m pretty attached to…and it’s a clear sign I need to go to the next question.

3. What are three other possible interpretations of this situation?

If I can’t think of any other interpretations, it’s time to get some help shaking up my assumptions. I find a colleague I trust and we brainstorm as many different interpretations as we can.

4. What would I do differently if one of these other interpretations were true?

This gives me a wider range of responses to choose from, and increases the chance I’ll choose one that will help solve the problem.

When I start to test my conclusions, I can surface and examine my beliefs—my assumptions—about the situation. If I’m willing to admit that my initial interpretation might be inadequate, I can gather more information and represent the situation more accurately. And when I do that I open up the possibility of making better decisions, working more effectively with people, and—coincidentally—building better software.

This column originally appeared in STQE magazine in 2001.

Stuck in Neutral

When action grows unprofitable, gather information; when information grows unprofitable, sleep. Ursula K. Le Guin, The Left Hand of Darkness

A few months ago, I began writing a “Technically Speaking” column for this issue and just couldn’t make myself finish it. I started over on another topic, but couldn’t work up much energy for it. I started yet a third time, and found myself easily distracted.

The week my column was due, I was still starting over and over and over. I began to panic: “I’ve been writing this column for three years! I should be able to do this!”

I was stuck.

As it happened, I was scheduled to have lunch with my friend, Bob King, a software architect and a writer. Eventually, our conversation turned to writing.

I explained my problem to Bob. “What should I write about?” I asked.

“Why don’t you write about not knowing what to write about?” he suggested.

“Yeah, right,” I said, dismissively.

After lunch, I thought about Bob’s suggestion a bit more, wondering if “un-sticking” could make an interesting column. Then I remembered that I’d recently received an email from a colleague who needed some un-sticking with a performance issue. And I remembered another colleague who had spent an entire day stepping through every line of code, determined to root out a bug. By the end of that day, she was completely out of ideas and still hadn’t found the bug—she was stuck.

“Oh,” I thought. “I’m not the only one who feels stuck. Perhaps Bob was on to something.”

So I compiled some strategies that get me moving again when I am at a dead end.

Notice what is happening. Am I making no progress despite several attempts? Am I repeatedly telling myself, “I should be able to do this?”

Take a break. Work on something else unrelated to the problem at hand, especially something physical. Tasks that occupy your body and require concentration occupy the conscious portion of your mind and leave the unconscious parts to work on the problem.

Verbalize the problem. I can’t tell you the number of times the light bulb has switched on before I was even halfway through explaining a problem. When no one else is around, I explain the problem to my dog. He’s a good listener, and remarkably astute.

Ask for help. Sometimes another person with a different point of view or a fresh set of eyes will see what I cannot.

Sleep on it. Sleeping on a problem can allow the unconscious mind to go to work. This strategy is not suitable in all work environments.

Make a list. Write down all the known facts about the problem. You may find that you need more information; or discover new patterns in the data.

Look where the problem isn’t. Quite often, a problem is exactly where I’m sure it isn’t. It couldn’t be in the subroutine I wrote, could it? Check those places, too.

Brainstorm. Think of twenty ways to solve the problem. Don’t dismiss anything until you’ve considered it carefully.

Sometimes it takes more than one of these jiggles to get going. Looking back on my most recent stuck-experience, I see that I applied four strategies. I noticed that I wasn’t making progress, but was repeatedly telling myself I should be. I took a break by going to lunch with Bob. (And eating is sort of physical, though in general I can’t endorse eating as a problem-solving strategy for all the obvious reasons.) I asked for help. At first it didn’t seem like Bob was giving me the help I wanted. Help is often like that. Finally, I considered an idea that seemed way-out long enough to see its merit.

And I arrived here: at the end of another column and unstuck.

This column originally appeared in STQE, May/June 2003.