My six-year-old nephew is fascinated by optical illusions. We recently spent over an hour happily chatting about them, and I was able to introduce him to some he had not seen before – the Müller-Lyer illusion and the Ames room. He was then intrigued to hear that any one of us can misjudge things at times because we miss certain information, even when it is right in front of us. Unless we've seen such illusions before, it's usually not until we take the time to measure the Müller-Lyer lines, or understand the technical construction of the Ames room, that we have that 'aha!' moment and our perception changes.


Heuristics and Cognitive Bias

Optical illusions can be perceived differently depending on the individual. They give us an insight into how our minds work and remind us that, at times, we need to expend more conscious effort to see and understand the information available to us in a new way. However, because consciously processing data is so cognitively demanding, doing this continually would be exhausting. The human brain has therefore developed a way to tackle this problem, helping us make more automatic decisions and thereby conserve mental energy.

As humans, we have the ability to use 'heuristics'. Heuristics are cognitive shortcuts that draw on the knowledge and experience we have committed to memory. They are unconscious rules of thumb that reflect our personal view of the world and help us navigate our lives in a way that makes most sense to us. If we have seen a similar situation before and a fast response is needed, it is reasonable to base our judgment on what worked at that particular time. Heuristics thus give us an automatic response that helps us solve difficult problems very quickly, albeit imperfectly.

In an emergency, being able to judge what's happening and make a fast decision is obviously a very handy skill to have. There are, however, times at work when we use heuristics to make judgments where it might be better to slow our thinking down and consciously examine the new data at our disposal (see also Stephanie Garforth's article on this subject). As situations are rarely identical, continually relying on what has worked before may not serve us well when we are confronted with a brand-new issue. The context may have changed, and new information may be available that could alter our view.

Unless we take the time to explore data more thoroughly (and perhaps check our thinking with others), our use of heuristics can leave our thinking out of date and our perception biased. It can then come as quite a shock to discover that we have become cognitively biased because we have missed some key pieces of information.

An obvious example of where we might fall foul of an automatic response in the workplace is recruitment. We may apply a stereotype heuristic to someone we are meeting for the first time and make an automatic assessment of them. Should a candidate (or indeed a potential boss) remind us of someone we either did or didn't like in the past, this can start to inform our judgment even before the interview begins. It's important to pay attention to intuition, but if we do not attend to more objective information too, we risk missing the best person for the job, or hiring similar people into a team where a diverse mix could be more effective. It's an issue even experienced managers can encounter.

In his book 'Thinking, Fast and Slow', Daniel Kahneman¹ cites research by Alex Todorov suggesting that we use a judgment heuristic to sum up the facial features of strangers and assess how likeable or competent someone might be. In Todorov's study, participants rated the faces of politicians; 70% of the politicians rated as looking more competent had previously won their elections, and results from a range of elections in other countries showed similar patterns. When informed and less informed voters were compared, those who felt they lacked political information were seemingly three times more likely to rely on their automatic response and choose a politician simply on the basis of how competent the politician looked.


Institutionalised Heuristics?

When making more informed choices, we are likely to base our decisions on a combination of our subjective beliefs and the objective data available to us; we use our capacity for reason and logic in addition to our mental rules of thumb. However, as our cognition adapts to our environment (e.g. Brunswik, 1943, 1955)²,³ it is highly possible that we also create institutionalised heuristics around our workplace culture, i.e. the values, beliefs and practices exhibited at work (see Gary Ashton's article for more detail). As these rules of thumb differ between organisations, navigating change and overcoming resistance can be particularly difficult unless time is taken to understand and work with them. When seeking to motivate staff, for example, a common heuristic may be to change the bonus plan, and this may be adopted without question as the single best way forward. Although earning more money may increase motivation for some, in other organisations it may be flexible working, direct contact with clients or understanding where personal value is added that motivates staff. Our own view could be no more than an illusion if we fail to understand alternative perspectives.


Leadership and Management Implications

Becoming over-reliant on, or overconfident in, our automatic responses does at times lead to faulty judgment and impaired decision making. If an automatic response is not essential, slowing our thinking down to examine a broader set of data could enable a more fitting decision. Allowing time for this lets us see the issue differently and consider whether an alternative solution could be more effective. Consciously attending to alternative views and data helps to update an old heuristic and reveals other options. If we don't do this whenever possible, we fail to keep pace with the reality of others and get stuck in a single perspective that is no longer useful. In an organisational context, the dangers of outdated thinking on a broad leadership and management scale are all too obvious: it can become pervasive, leading a previously good organisation to become less attractive to new talent, fall out of step with the needs of its customer base, and become less competitive and efficient.


How Can we Challenge our Thinking?

A highly experienced lawyer friend of mine told me recently that a junior member of her team had asked why she continued to read broadly when she was so busy, as she seemingly 'knew everything'. As a mentor, my friend kindly reminded her young colleague that being a good lawyer depends largely on the best possible synthesis of a range of relevant information: some of it entirely new, some of it updated since it was last referenced. Whether we are managers, mentors or coaches, we can help ourselves and others see the importance of updating our knowledge and thinking.

One aspect of OE Cam's work is executive coaching and development. As a coach, rather than being directive and suggesting what could be done, it is usual to ask open questions. These help the person being coached (the coachee) to identify what options they could pursue using their own insights, knowledge and experience, e.g. 'what could you do?' or 'what makes you think that?'. However, to help coachees expand their thinking further and explore a different range of possibilities, it can also be useful to ask what someone else might do. If the coachee particularly admires someone as a role model, this challenges their thinking by drawing their attention to a different perspective.

In a recent workshop, we ran a light-hearted exercise and discussion about the different perspectives that exist within a team, to illustrate how powerful it can be to bring together different viewpoints, see something from a new angle and potentially solve an issue in a new way. We asked delegates to think of an issue they had and then consider the perspectives of Lord Alan Sugar, Jessica Ennis-Hill, Albert Einstein and Yoda (from Star Wars). Although this provoked some laughter, our bright delegates quickly realised we were asking what others with different business, scientific, sports-psychology or spiritual perspectives might say. A little off the wall for some, perhaps, but the objective was to brainstorm within the group and encourage thought around a greater range of options and possibilities. What kind of data might be important to these people? How might they view the world(s)? Challenging our thinking in different ways may seem bizarre at first, but even asking ourselves impossible questions (e.g. 'what would you do if you could not fail?') can spark an idea that had not previously been considered a possibility.

Our ability to think quickly and use heuristics saves cognitive effort and valuable time when making decisions. Although highly efficient in an emergency, over-reliance on an automatic response when it is unnecessary impairs our judgment and decision making. The risk of not taking the time to examine new information or alternative viewpoints is that our thinking becomes outdated and unintentionally biased; the individual, the team and ultimately the organisation could then lose their competitive edge. Being consciously aware of how we might favour a certain approach when making a decision, and having the courage to challenge our own thinking at times, can serve us well. Retaining an interest in involving others with different thinking styles, and asking ourselves some different questions when solving a problem, can bring a fresh perspective and help us see a greater range of options. Who knows – challenging ourselves to think a little differently at times might help us tune in to our inner Yoda or Einstein?


  1. Kahneman, D. (2011) 'Thinking, Fast and Slow'.
  2. Brunswik, E. (1943) 'Organismic Achievement and Environmental Probability', Psychological Review, 50, pp. 255-272.
  3. Brunswik, E. (1955) 'Representative Design and Probabilistic Theory in a Functional Psychology', Psychological Review, 62, pp. 193-217.

susan.carroll@oecam.com