Three Tips For Overcoming Your Blind Spots
by John Dame and Jeffrey Gedmin | 12:00 PM October 2, 2013
Ernst Cramer, the late, great editor-in-chief of the German daily Die Welt, once recounted how, as a college student in the American Midwest just after World War II, he asked in a math class whether the textbook might be mistaken in a particular instance. The lecturer reflexively, and rather sternly, dismissed the possibility. Several months later, Cramer was working on a farm during summer vacation when he looked up to see his professor jogging across the field from a car parked in the distance. "Cramer," a repentant voice yelled, "you were right – the book was wrong and they've changed that section!"
Cramer related the tale as an endearing anecdote from his early experience in America. We're similarly charmed by the graciousness and integrity of Cramer's math professor. But we're retelling the story here because the professor's first response reveals two failings all too common in managers. One is the reflex to place uncritical faith in authorities (including one's own superiors) and handed-down rules; the other is the quick dismissal of seemingly irreverent assertions.
We all have such blind spots, and they are weaknesses we should combat. Even that idea is unfashionable in an era when we are urged to focus on polishing strengths. In the world of professional music, it's often the opposite. Most conductors, for instance, begin rehearsals by directing the orchestra immediately to the most difficult passages in a given piece, and spend most time on them, because they are areas of weakness.
But how do managers work actively to fight weaknesses of which, by definition, they are insufficiently aware? We'll offer a few tactics we have used deliberately to counter the effects of three infamous cognitive biases.
To fight confirmation bias, have a devil's advocate.
Confirmation bias refers to our tendency, when receiving new information, to process it so that it fits our pre-existing narrative about a situation or problem. Simply put, if you're already inclined to believe that the French are rude, you will find examples on your trip to Paris to validate your thesis. Disconfirming evidence – the friendly waiter, the helpful bellman – gets pushed aside. They're just "the exception." Warren Buffett says, "What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact." He knows he is prone to it himself.
Attorneys, debaters, and politicians engage in a kind of confirmation bias when, in order to make a case, they select certain data while deliberately neglecting or deemphasizing other data. But confirmation bias can cause disaster in business and policy when it leads a decision-maker to jump to conclusions, fall prey to misguided analogies, or simply exclude information that inconveniently disturbs a desired plan of action.
What to do? The only remedy is to make sure you have a full and accurate picture available when making important decisions. When you have a theory about someone or something, test it. When you smell a contradiction – a thorny issue, an inconsistency or problem – go after it. Like the orchestral conductor, isolate it, drill deeper. When someone says – or you yourself intuit – "that's just an exception," be sure it's just that. Thoroughly examine the claim.
Dealing with confirmation bias is about reining in your impulses and challenging your own assumptions. It's difficult to stick to that day in and day out. That's why it's important to have in your circle of advisers a brainy, tough-as-nails devil's advocate who – perhaps annoyingly, but valuably – checks you constantly.
To cure hindsight bias, keep a diary.
As we move through life, we all keep a running record, at least at some level in our memory banks, of what worked, what didn't, and why. The trouble is, most of us tend to have selective memories. Hindsight bias is confirmation bias's equally problematic sibling. Again, we're cherry-picking from a body of data, in this instance to confirm a theory about why something that has already happened (the 2008 financial crash, the re-election of Barack Obama, the decision to hire a senior executive or implement a business strategy) played out as it did.
There's nothing wrong with having theories, mental models, and frameworks of analysis. On the contrary. The problem begins when critical, independent thinking ends and we fail to keep testing our templates. Hindsight bias impairs our ability to draw the right conclusions, as we imagine after the fact that a situation in the past was avoidable, or that a decision was simpler than it actually was at the time. This is a point made compellingly by the Swiss businessman and novelist Rolf Dobelli in his new book The Art of Thinking Clearly – a fascinating examination of 99 cognitive inclinations that most of us carry around, generally unaware.
Here's one way to check hindsight bias: Keep a diary. And record minutes from important meetings. We have a friend who just for fun asks dinner guests in his Capitol Hill home in Washington – he entertains some pretty heady gatherings – to scribble on a piece of paper their predictions about politics, business, and world events. He tucks the scraps in a drawer, lets them settle for a year or so, and then pulls them out for a reading over coffee and dessert. It's pretty funny stuff. What becomes painfully clear is that we failed to predict much of anything – claims after the fact notwithstanding.
To overcome groupthink, start with hiring.
In his 2008 book Outliers: The Story of Success, Malcolm Gladwell shares a cultural theory of plane crashes. He notes that for a period at the end of the 1990s, Korean Air had more crashes than virtually any other airline in the world. Why? It seems likely that Korean traditions of hierarchy created a tendency to defer to superiors – including in the cockpit when something seemed out of place or not quite right.
Companies like to develop their own cultures, and that's important. Yet a culture that binds too tightly suffocates, chokes off independent thought, and can create a Stepford-like environment. If you find yourself feeling exhilarated because everyone around you is thinking just like you, consider that a huge red flag. It may well be that people are self-censoring for fear of exclusion or retribution. There's also ample research – psychologist Irving Janis is the pioneer in this area – showing that when groups become too close-knit they fall prey to illusions of invincibility.
Fighting groupthink should start at the hiring stage. Look for people who share your basic values and purpose, but who are also tough, independent, and able to tell you what they think. Moreover, check that decisions at all levels in the company are being made on the basis of rationality, not merely flowing from authority or a tendency (however subconscious) to conform.
Which brings us back to editor Ernst Cramer, who also liked to tell the story of how he was first hired by legendary German publisher Axel Springer. The two men had a meeting at Springer's Berlin office that, in Cramer's view, did not go very well at all. It seems there was a serious bone of contention, a rather vehement disagreement on a political issue that went back and forth between the two for some time. Neither was willing to relent.
Later that day, Cramer received a call asking him to return to meet with Springer again. The publisher greeted the young editor with the announcement, "Cramer, you're hired." The somewhat stunned Cramer reminded Springer that the two had spent half their time that morning in very spirited debate, to which Springer replied: "Exactly – that's why I need you on the team!"
That's self-awareness. That's taking the blinders off for full vision. And these things lie at the very core of growth and great leadership.
JOHN DAME AND JEFFREY GEDMIN
John Dame is CEO of Dame Management Strategies (DMS). Jeffrey Gedmin is CEO of the Legatum Institute.