Summary of Wilful Blindness – Why we ignore the obvious at our peril by Margaret Heffernan
(Summarised by Paul Arnold – Facilitator and Trainer – If you would like to be put on the free monthly mailing list, email me at email@example.com)
THE BOOK IN A NUTSHELL
In a time of massive information explosion and increased complexity, we run a growing risk of becoming blind to events (such as Enron, Bernard Madoff, the BP Deepwater Horizon oil spill, etc.).
This is caused by a number of factors: our brains' limited capacity to assimilate all information consciously, our own self-perception of the world, our innate need to belong, an over-focus on financial returns versus people, fear of change, and our obedience to authority.
It takes courage, perseverance and an outsider’s perspective to dare to challenge.
Organizations need to wake up to the threat of blindness and put in place safeguards to help them see more clearly.
‘Go, go, go, said the bird: human kind cannot bear very much reality’
– T.S. Eliot, Four Quartets
The book explores a wide range of different factors that contribute to our often unconscious drive to be blind to certain events.
The limits of our minds
We are naturally biased and cannot help it.
We are living in an over-stimulated environment. Our brains are unable to cope with all the inputs, so they filter out irrelevant information, focusing on the things we feel are important. Attention is a zero-sum game – when we focus on one thing, we must lose focus on other things.
Information may be exploding but our brains are not – the limited capacity of our conscious brain means we cannot process all the relevant information, thus often compromising the quality of decision making.
The sheer complexity of many businesses (especially the very large multinationals) means they are impossible to fully manage (or police) – no one can know everything that is happening in an organization.
Furthermore, when we get tired our brains slow down. After 24 hours of sleep deprivation, glucose levels in the part of the brain used for making sense of information drop by 12–14%. Our cognitive capacity becomes the same as that of someone over the legal limit for alcohol. Thus a tired, experienced worker can start performing like an inexperienced one.
The abuse at Abu Ghraib was partly caused by tiredness that made the soldiers lose their moral compass. They were working 12-hour shifts, seven days a week, for 40 consecutive days. They were ‘drunk’ with exhaustion.
A deadline-driven culture also takes its toll. Adrenaline shuts down parts of the brain, reducing our openness to new ideas. Hence our culture of overworking is counter-productive and leads to greater errors.
Our mindsets/Maps of reality
We all have a ‘map’ of reality that helps guide our decisions and behaviour. We naturally (and unconsciously) cluster together with other like-minded people.
Likewise we choose to read newspapers and magazines that reflect our point of view – which further validates and strengthens our position.
When Cass Sunstein brought together a group of like-minded people, they made each other's views more extreme. Likewise, when mixed with people of the opposite persuasion, each side's position became further polarized. Sunstein also demonstrated that when people read a balanced article, they are twice as likely to seek out information that supports their current opinion. Thus even when presented with wider information, we choose to ignore it. A shared perspective breeds increased certainty, comfort, and risk-taking around that viewpoint.
Likewise, the Internet was supposed to widen our knowledge base, but instead it tends to cement our current point of view – 85% of blogs link to other blogs that share the same political inclination. The Internet's greatest strength actually seems to be its ability to connect people who share the same attitudes, irrespective of geography. Whether it's a love of orchids or an extreme religious sect, the Internet allows people with similar views to connect with other like-minded people globally.
Organizations naturally ‘self-select’ people who fit their map of reality (people with a different point of view get weeded out), leading to a consensual (and closed) view – they see what they want to see and do not see what they do not want to see.
Blind to the facts
Love is blind – when we love someone, we see them as smarter, wittier, etc. than other people do, and we choose to ignore or discount their real faults.
fMRI scanning has revealed that parts of the brain used in critical judgment are switched off when we are deeply in love.
And when presented with ‘facts’, we will often choose to ignore them rather than sway from our point of view. Alan Greenspan was so seduced by his own model of the economic market that, even when it was shown to be wrong, he blamed other causes rather than his theories. Likewise Tony Blair and WMD.
The denial by the Catholic Church (and the Irish people) of child abuse was evidence of the power of belief. No one wanted to believe that such atrocities were possible. This is a recurring theme throughout history.
Contradictory information creates imbalance in the body. Neurochemicals are released that help delete or distort the divergent data. Information that supports the current belief is conversely rewarded via dopamine release.
Asch ran an experiment in which people in a group had to choose which of the lines on the right was the same length as the line on the left. Everyone except one person was a stooge. When the stooges started giving the wrong answer, the real respondents followed suit – they stopped trusting their own senses and relied instead on the ‘wisdom’ of the group. Interestingly, fMRI scans of brain activity found the respondents actually started seeing the wrong-length line as the right answer – i.e. they became blind to the truth. Thus people see what they believe rather than believe what they see.
Industries (such as the tobacco industry) get so ‘locked in’ to their point of view that they cannot see any other perspective. Alice Stewart battled for 25 years to get the medical institutions to accept the carcinogenic effects of X-rays – even though there was voluminous, irrefutable evidence in support of the case. (X-rays were new, and the health world had fallen in love with their potential – emotions drown out rational facts.)
Having an outsider's point of view
It often takes a person who is outside the system to see what others are blind to.
For example, Steve Bolsin was the doctor who transferred to Bristol Royal Infirmary and uncovered the high child mortality rates in its paediatric cardiac unit.
Whistleblowers are often loners or tend to be on the fringe of groups. This is one reason why they tend to be Cassandras (cf. Greek myth) – they see things but are not listened to, as they lack the authority or credibility to be heard.
The power of groups
We have an innate drive to be in groups. Physiologically, we are programmed to belong. When we are rejected, the same chemicals associated with pain are released in our bodies; when we are accepted into groups, opiates (‘pleasure’ chemicals) are released. Being in a team helps validate our self-image and our views (which we will go to great lengths to protect).
So it is no surprise that we try to stay inside groups and not be rejected. This leads to high levels of compliance (sometimes even when it crosses our own moral boundaries).
The famous Zimbardo prisoner/warden role-playing experiment run amongst Stanford University students demonstrates how quickly we can let go of our own moral compass to remain inside a group.
The more we identify with a group, the more influence its norms have on us. We become more like the people we spend the most time with – we unconsciously take on the group's values and beliefs (especially in groups with rituals and ceremonies that celebrate and reinforce their culture – e.g. West Point, Sandhurst, churches, private schools, and many organizations).
The military creates situations of regular extreme exhaustion; when tired, the body's ability to resist drops, so it is much easier to ‘infiltrate’ the mind under such circumstances. This is further exacerbated by regular assessments and the threat of being expunged from that society.
Furthermore, people with low self-esteem, ‘pleasers’ and people on the edge of teams (especially those previously rejected) are much more likely to be heavily compliant.
Organizations also enhance conformity through behaviours that threaten a person's membership. For example, in one HBOS sales department there was a weekly celebration or humiliation depending on whether one hit one's sales targets – cash for those who did, a cabbage for those who did not.
One of the issues is that the greater a team's ‘esprit de corps’, the less likely its members will be prepared to challenge the status quo (e.g. the closing of ranks in the Abu Ghraib incident).
People will often ‘turn a blind eye’ because everyone else is also turning a blind eye. When Kitty Genovese was stabbed to death in the middle of a street in New York, 38 people witnessed the attack but no one did anything about it, each assuming someone else must have already called the police.
In 1998, Larry Froistad confessed to 200 people in a chat room that he had burnt his house down to murder his 5-year-old daughter. Only 3 people reported him.
In an experiment, volunteers were asked to fill in a questionnaire. As they did, the room started to fill with smoke. When a person was alone, they left the room to seek help within 2 minutes. But when two stooges in the room did nothing, only 1 in 24 participants acted within 4 minutes. Thus the ‘wisdom’ of crowds can heavily influence our behaviour, as we blindly assume others know more than we do.
However, if people think they are the only one who could help, then 85% do something. When it is seen as a shared problem, only 33% take action.
The more people who witness an improper situation, the less likely anything will be done about it. Thus, the Enron collapse was not caused so much by a few corrupt people in the organization as by the thousands of silent bystanders who did nothing.
Bandura identified that we learn by modeling others. Thus we learn from an early age (often from our parents' reactions to events) that it's best not to ‘get involved’.
Obedience to authority
Our compliance (and hence willingness to turn a blind eye to things) is partly driven by our learned response to obey authority figures (cf the famous Milgram experiments).
In one experiment, 22 nurses in Ohio were instructed to administer an obviously excessive dose of medicine by a physician they did not know. 21 of the 22 gave the dose – with no real resistance.
Up to 25% of all plane crashes have been caused by ‘destructive obedience’ – e.g. crew not being prepared to challenge the authority of the Captain.
Part of this is caused by the way some leaders react when challenged. For example, at the BBC the author had a boss who threw phones against the wall when he was given adverse news. GM’s CEO Roger Smith was notorious for getting rid of people who dared to counter his point of view.
Organizations also apply a range of overt and covert strategies to ensure compliance – from dismissal, protocol and bribery (cf. the Universal Zonolite Company in Montana over the asbestos issue) through to humiliation.
Furthermore, many leaders' past personal success makes them more certain of their decisions. Research has shown that they feel more optimistic than others.
Fear of change
We all fear change. It is common for people to adopt an ostrich-like approach to issues in the hope they may go away.
In relationships, the betrayed partner or the mother of an abused child will often ‘hide’ from the facts, fearing an uncertain future. Abused children in the Irish Catholic schools scandal were reluctant to reveal the situation for fear of losing contact with their families – the most important thing in their lives. ‘Survival’ means people will cling to a less-than-perfect situation. Research has shown that people tend to downplay the consequences of the current situation and overplay the consequences of a new one. The irony is that the blindness meant to keep us safe often pushes us into greater danger.
Fear is a powerful inhibitor of action. The perceived cost (time, effort, reputation, stress, job, etc.) is often too great for many people to push through the innate resistance any system puts up against whistleblowers. For example, it took 5 years to expose the MPs' expenses scandal. Steve Bolsin, the whistleblower at Bristol Royal Infirmary, lost his job and eventually had to move abroad to find re-employment.
In one study, 85% of executives admitted that at some point in their careers they had failed to raise an issue with their boss. That is because people do not want to provoke conflict, be labeled a troublemaker, or lose their jobs; do not know how to resolve the issue; or think it would make no difference anyway.
The worry is that in poor economic climates, concern over job security rises, which means people become more compliant.
The increasing complexity and interconnectedness of large organizations means no one sees the whole – so the organization is blind to the consequences of its decisions. This is further compounded by layers of management who filter out non-conforming information from senior management.
In 1998, John Browne, the CEO of BP, ordered a blanket 25% cut in fixed costs across all refineries – regardless of their condition or the consequences. Such was the force of the directive that it blocked out all other considerations (including ethics, legality and safety). This eventually led to a large explosion at BP's Texas City refinery in March 2005 that killed 15 people.
The Vioxx arthritis drug killed over 2,000 people due to departmental failure at the FDA. The vast power of the Office of New Drugs dwarfed the concerns expressed by the Office of Drug Safety.
The increasing tendency to bring in contractors and consultants exacerbates the situation, as they are less inclined to raise issues or lack the power to influence. The tragic Challenger space shuttle accident was eventually traced to the inability of the ‘O’ rings to cope with extremely low temperatures. These rings were subcontracted out to Morton Thiokol. However, when Thiokol raised the issue, their concerns were brushed aside due to the political pressure on NASA to deliver.
Likewise, Primark (like Nike before it) turned a blind eye to the use of child labour in its ambition to drive costs down.
Finally, we know that time and distance also increase the opportunity for blindness.
In the Milgram experiments, only 40% of respondents were prepared to shock the stooge with the full voltage when he was in the room, versus 65% when he was in another room. It seems that when people are out of sight, they are literally out of mind. Face-to-face contact significantly changes our behaviour. The generals in the First World War never visited the hospitals, and Albert Speer never visited the concentration camps he oversaw.
Blinded by money
What you bonus/reward a person for drives focused behaviour towards it. And this can lead some people to lose sight of what's right or wrong. Evidence shows the bigger the bonus, the blinder you become.
Doctors who own a stake in testing labs order more tests. Likewise, private doctors are more likely to recommend expensive arthroscopy for arthritis than two cheaper procedures – even though there is no difference in effectiveness between the three treatments.
In the sub-prime mortgage meltdown, everyone was aware of the potential problems, but no one wanted to raise the issue because the status quo served everyone's interests at the time.
White-collar fraud causes far more economic harm than armed robbery, car theft, etc. Workers defraud approximately 5% of business revenues, dwarfing the amount lost through street crime.
When we care more about money, we care less about people.
Possible solutions to blindness
-Be aware that we are all naturally biased
-See it personally e.g. ‘What if it was my Grandfather in hospital?’
-Stay in touch with ordinary people. Money buys isolation from the masses – e.g. traveling first class, living in a gated detached property, going to exclusive destinations. Many business leaders are divorced from the consequences of their actions by their privileged, isolated world. Ratan Tata (the largest employer in India) rides in the front of his car and converses daily with his driver. He lives in a simple flat. Tata's motto: ‘Question the unquestionable.’
-Demand transparency (the internet is also ensuring this is happening).
-Institutionalize dissension. Debate a problem from a number of different angles. The Vatican appointed a ‘Devil's Advocate’, and BA's Colin Marshall appointed Paul Birch as his corporate ‘Fool’.
-Run anonymous audits – so the truth can be heard.
-Always ask the opposite question: ‘What evidence do we have that this is not true?’
-Demand alternate solutions – set different teams exploring the same problem. It sends a clear message that there is not just one way of doing things.
-Set metrics and benchmark standards (cf. the Bristol Royal Infirmary, where the problems came to light through tracking standards across different hospitals, surgeons and operation types).
-Give people permission to change their minds.
-Simply being prepared to ask questions can often stop organizational blindness.
This book is full of anecdotes that support Heffernan's thesis of organizational (and personal) blindness. Whilst an entertaining and interesting read, I found it repetitive; it could have been more succinct.
For me, the book is too heavy on the problem (which we knew anyway) and too light on the solutions.