Preface:  In Jean Ramsey’s and my book, Reframing Change, we explain how to test your assumptions at the interpersonal level.  Bill Brenneman, today’s guest blogger, specializes in helping work teams identify and test assumptions that may cause severe, even life-threatening, situations.  In this post, Bill provides an introduction to his rigorous field of work.

———————————————————————————————————————

As Jean and Jean point out in Reframing Change, Chapter 2, assumptions can lead us astray without our knowing it is happening. This is a problem, and an opportunity, that comes up every time my colleagues and I try to find the cause(s) of complex problems or failures in industrial or organizational settings. This is why:

It is a widely held assumption (there’s that word already) that professional and technical people are fact-based and logical. In fact, it is just as common for engineers and other technical folks to find themselves “in the answer” (rather than “in the question”) as it is for non-technical folks. Often, people with long years of experience in both technical and non-technical fields are seen as (are referred to as, and think of themselves as) “experts.”

What do we generally go to experts for? Questions? Generally not. We go to experts for answers. Uh oh. Looks like exactly the sort of trouble Jean and Jean observe in interpersonal interactions among coworkers.

I mention this so that we can lighten up about the fact that no matter how hard any of us try, we will, as human beings, necessarily make assumptions, no matter how smart or “expert” we may be. Assumptions keep things running by allowing us to move quickly in performing activities that are familiar and more or less routine. So is there a problem?

Actually, making assumptions is much less of a problem than not knowing we are making assumptions. In other words, when we make the assumption that we are not making assumptions, we open ourselves to serious errors. Assumptions ‘squared’ turns out to be the big problem.

Antidotes to the Assumption of No Assumptions

There are two main ways to guard against the danger of running on assumptions for too long before checking. The first way is the path Jean and Jean call “Putting yourself in another’s shoes,” that is, empathy. I have found that my willingness and ability to do this have come from the hard reality of having too often found myself to be wrong when I knew I was right. Experience is a good teacher when we pay attention with humility.

The second path Jean and Jean share is the one I would like to expand on. It is “Choosing to Test Assumptions.” This is first cousin to another technique they suggest of generating alternate hypotheses. This approach is useful in interpersonal issues, but it is essential when dealing with nonhuman problems. The distinction is the increased rigor required when the facts we are checking are hidden deep under the surface of a complex occurrence.

How might we use rigorous fact-based analysis to test our assumptions when, for example, a machine stops working? When all of the obvious (generally assumed) failure causes have been checked out and have been ruled out by physical data and tight logic, but the equipment still doesn’t function, what do we do?

The first thing we need to do is clearly define the problem. This is the point where, for example, Ford people say, “There’s a problem because it’s a Chevy,” and Chevy people say, “There’s a problem because it’s a Ford,” etc. Or both might say, “There’s a problem because Jimmy doesn’t have the right attitude.”

In any serious inquiry into “What is causing what?” such obvious bias-based assumptions have to be rejected, but this isn’t always easy. People often hold on to their assumptions about machines as strongly as they hold on to their assumptions about their home team. In that situation, we need a strong discipline to get everyone working together to find true cause and effect.

The discipline I describe here is called Fact-Based Causal Analysis, a fancy name that means “getting to the root cause of the problem without fooling myself with my assumptions along the way.”

So how do we define a problem? The way to do this is to ask, very specifically, “What is the ‘object’ that is not working?” and “What is the standard it is not meeting?”

In the case of a car the “Problem Statement” could be:

“The car — Isn’t starting.”

Or, more specifically, “When the engine turns over – There is no ignition.”

Or, “The starter —Isn’t engaging.”

Or, “When I turn the key – Nothing happens.”

Or, “The report – Is not complete on time.”

Or, “He – turned away when I started to wave [instead of waving back as I expected].”
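If it helps to see that structure spelled out, here is a minimal sketch in Python (purely illustrative; the names are my own and not part of any formal tool) of a problem statement captured as an “object” plus the standard it is not meeting:

```python
from dataclasses import dataclass


@dataclass
class ProblemStatement:
    """A problem is an 'object' plus the standard it is not meeting."""
    obj: str        # the thing that is not working
    deviation: str  # the specific way it falls short of the standard

    def __str__(self) -> str:
        return f"{self.obj} - {self.deviation}"


# The examples above, written as structured statements:
problems = [
    ProblemStatement("The car", "isn't starting"),
    ProblemStatement("The starter", "isn't engaging"),
    ProblemStatement("The report", "is not complete on time"),
]

for p in problems:
    print(p)
```

The value of the structure is simply that it forces us to name one object and one specific deviation, rather than a vague complaint.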

Once we have a statement of “The Problem” we can take the next step in the rigorous process.

In the second step, we ask the big question, “Why?” This is when we can all feel the assumptions coming on:

“It’s a Ford” [or a Chevy]

Or, “He screwed up.”

Or, “He’s mad at me.”

We could be charitable and call these all hypotheses. But a real hypothesis is not about having the answer; it is about being in the question so that we can be accurate by searching for facts that confirm or contradict the hypothesis.

In a rigorous process this is just the beginning, as we look at the available data and brainstorm the possible hypotheses that might explain the occurrence of the problem. That is why the problem statement has to be as narrowly focused as possible or our hypotheses will be all over the map.
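To make “being in the question” concrete, here is another illustrative sketch (again in Python, with invented hypotheses and facts, not a real diagnostic tool). The point is that each hypothesis stays open until the facts either support it or rule it out:

```python
from dataclasses import dataclass, field


@dataclass
class Hypothesis:
    """A candidate explanation, held open until the facts decide it."""
    statement: str
    supporting_facts: list = field(default_factory=list)
    contradicting_facts: list = field(default_factory=list)

    def status(self) -> str:
        if self.contradicting_facts:
            return "ruled out by the facts"
        if self.supporting_facts:
            return "supported so far (keep testing)"
        return "open (no facts gathered yet)"


# Problem statement: "When I turn the key - nothing happens."
hypotheses = [
    Hypothesis("The battery is dead",
               contradicting_facts=["Headlights shine at full brightness"]),
    Hypothesis("The starter solenoid is not receiving current",
               supporting_facts=["No click is heard when the key is turned"]),
    Hypothesis("Jimmy has the wrong attitude"),  # a bias, not a testable cause
]

for h in hypotheses:
    print(f"{h.statement}: {h.status()}")
```

Notice that the biased “explanation” never accumulates facts either way; it simply stays untestable, which is a clue that it is an assumption rather than a hypothesis.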

Obviously, these two steps are just the tip of the iceberg of a truly rigorous process. My aim here is to introduce the existence of such processes so you can look for and access a good process and the resources to make it work. That is the only way to have hope of finding true cause in complex problems, most of them far more complex than these quick examples.

In future posts, I will explain how to use some of the truly useful technology for testing assumptions at the systems level using Fact-Based Causal Analysis.

————————————————————————————————

Bio: Bill Brenneman consults in two major areas:  (a) organization effectiveness, structure, and strategy and (b) fact-based deep cause failure analysis.

 
