Study after study has shown that we are not rational creatures—although that doesn’t stop us from thinking of ourselves as being very rational. By rational, I simply mean acting in accordance with the facts of reality. We typically see ourselves as making rational judgments and decisions by evaluating information objectively and arriving at logical conclusions. In truth, our judgments and decisions are prone to error.
It’s not that we are always trying to spin reality and manipulate others. It’s not that we are always making mistakes in our logical reasoning or that we think too much with our emotions. We do all of those things, of course, but there is a more fundamental reason for our irrationality—our cognitive biases.
A cognitive bias, according to the CIA (no less), is “a mental error that is consistent and predictable.” Our cognitive limitations cause us to rely on various simplifying strategies and rules of thumb (heuristics). These rules of thumb—which rarely rise to the level of consciousness—help us to handle complexity and ambiguity, but can lead to errors of judgment. An example the CIA gives is as follows:
“The apparent distance of an object is determined in part by its clarity. The more sharply the object is seen, the closer it appears to be . . . the reliance on this rule leads to systematic errors in estimation of distance.”
—Psychology of Intelligence Analysis by Richard J. Heuer, Jr., Center for the Study of Intelligence, 2007
When visibility is poor, distances are often overestimated. When visibility is good, distances are often underestimated. Such biases shape how we think and act in the world, including how we evaluate evidence, perceive cause and effect, and estimate probabilities and risks. Some cognitive biases are:
- Attentional bias: Tendency of our perceptions to be influenced by our recurring thoughts.
- Backfire effect: Tendency to respond to evidence in opposition to our beliefs by strengthening our beliefs.
- Confirmation bias: Tendency to agree with people/information that agrees with our already existing views.
- False consensus effect: Tendency to overestimate how much others agree with us.
- Gambler’s fallacy: Tendency to believe previous events will influence future outcomes, e.g., a flipped coin comes up heads five times in a row, so we are inclined to predict an increase in the chances that the next coin toss will be tails (statistically, each toss is independent, and the probability of the outcome is still 50/50).
- Negativity bias: Tendency to have a greater recall of unpleasant—rather than pleasant—memories.
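The gambler’s fallacy is easy to check empirically. The short simulation below (my own sketch, not from the article) flips a fair coin many times and measures how often the flip that follows five straight heads comes up tails; independence predicts the streak has no effect, so the answer stays near 50%:

```python
import random

def tails_after_heads_streak(trials=100_000, streak=5, seed=42):
    """Among trials that open with `streak` heads in a row,
    return the fraction whose next flip is tails."""
    rng = random.Random(seed)
    streaks_seen = 0
    tails_next = 0
    for _ in range(trials):
        flips = [rng.random() < 0.5 for _ in range(streak + 1)]  # True = heads
        if all(flips[:streak]):        # the first five flips were all heads
            streaks_seen += 1
            if not flips[streak]:      # ...and the sixth flip was tails
                tails_next += 1
    return tails_next / streaks_seen

print(tails_after_heads_streak())  # hovers around 0.5
```

Only about 1 in 32 trials opens with five heads, but across those trials the next flip is tails roughly half the time—the coin has no memory of the streak.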
Biases and Borders
As someone who writes about the borderless workplace, I’ve become intrigued with how our cognitive biases in relation to borders influence our thinking and actions. Two psychologists at the University of Utah (Arul and Himanshu Mishra) have conducted some fascinating experiments about how our mental maps influence the way we think about risk.
They asked a large number of volunteers from 32 U.S. states to imagine building a mountain home in the Pacific Northwest—either in North Mountain Resort in Washington State, or in West Mountain Resort in Oregon. While they were thinking about their choices, the volunteers were given news about an earthquake, but the information differed. Some were told that the earthquake had hit Wells, WA, 200 miles from both vacation home sites. Others were told that the earthquake had struck Wells, OR, also 200 miles from both locations. Even though the volunteers knew that both home locations were exactly 200 miles from the disaster, they perceived in-state home locations to be significantly riskier than out-of-state locations. In short, they disregarded actual distance and made their risk assessment based on arbitrary political borders. An interesting border bias!
The researchers decided to explore the issue from a different perspective. They recruited volunteers from Salt Lake City, UT, and gave them the news that a radioactive waste facility was being built 165 miles away. If not managed properly, the radioactive waste would contaminate soil, water, and air within a radius of hundreds of miles. Some of the volunteers were told the facility was going to be built in Sevier Lake, UT, while others were told it was to be built in Spring Creek, NV—both 165 miles from Salt Lake City. In this experiment, the researchers gave the volunteers different maps. In some, the UT-NV border was drawn as a thick, dark line; others received a map with the border drawn as a light, dotted line. When the waste was to be stored in Spring Creek, NV, residents of Salt Lake City perceived a much greater risk if the UT-NV border was drawn as a light, dotted line. When the border was drawn as a thick dark line, it reinforced the bias that borders are impermeable, and that somehow the border offered a greater degree of protection.
Changing Perceptions of Borders
How we perceive borders shapes our thinking about them, and consequently our behavior toward them. We tend to think, of course, that others perceive the boundaries in the same way we do.
As our workplaces transform into ones that are virtual and technologically without borders, how will our perceptions of borders change, and consequently how will our thinking and behaviors change toward them? In the past, organizations relied heavily on formal vertical and lateral borderlines to run efficiently. New communication and collaboration technologies are severely testing traditional borders between, for example, company and customers, products and services, function and function, and national borders and digital communities.
It used to be that our organizations would establish borderlines for us in organization charts and strictly demarcated job descriptions. In organizations seeking to be competitive in a much more complex and dynamic business environment, we each are called upon to construct, deconstruct, and reconstruct boundaries that serve the organization’s purpose at a particular time.
Ironically, while the workplace becomes borderless, we must become increasingly border savvy.
Terence Brake is the director of Learning & Innovation at TMA World (http://www.tmaworld.com/training-solutions/), which provides blended learning solutions for developing talent with borderless working capabilities. Brake specializes in the globalization process and organizational design, cross-cultural management, global leadership, transnational teamwork, and the borderless workplace. He has designed, developed, and delivered training programs for numerous Fortune 500 clients in the United States, Europe, and Asia. Brake is the author of six books on international management, including “Where in the World Is My Team?” (Wiley, 2009) and the e-book “The Borderless Workplace.”