I am Right, You are Wrong
It's election season, and every time I log on to Facebook I am inundated with messages, notifications and newsfeeds (what's the difference between those last two, by the way?) about the elections and the presidential candidates. Some postings are benign, but many are pretty inflammatory and a few have little basis in reality.
Some are personal expressions, but many are cut-and-paste items or links to sites with a distinct point of view. As is to be expected, most of the posts are not critical debates on issues, but charges and countercharges against the opposing "team." I use the word team deliberately because it is similar to sports fans who want victory at any cost, even if only on a technicality.
Since I have an interest in the "science" of decision making, I have been thinking about the "how" and "why" of this phenomenon quite a bit. Most people fall into a decision-making trap called confirmation bias, in which individuals tend to favor information (even misinformation) that supports their beliefs or hypotheses. This is also called a "confirmation trap" because it is so easy to fall into and can be very difficult to extricate oneself from. Even when confronted, individuals will often deny they have fallen into a trap by simply pointing to even more "evidence" that affirms their opinion, rather than critically examining facts to the contrary.
Oddly, as scientists, we are no less prone to decision-making biases, including the confirmation trap.
We all "know" that the night shift is lazy, that Gloria in Hematology is incompetent and that Charles will always make up an excuse not to work on the weekend. We can probably recite many examples to "prove" our point while ignoring information to the contrary. It might go as far as finding ample proof that people of a certain age group or race or ethnic group are not a good fit for our lab, so they are rarely given a fair shot during an interview. The few who slip through the hiring process are heavily scrutinized and never given the benefit of an objective review.
How can you avoid or reduce confirmation bias? The principle is simple, but the practice is not. The first step is admitting that we are all prone to this sort of fallacious reasoning.
Faced with a "fact," consider the opposite for just a second. Deliberately seek out an opposing view from a colleague whom you know will not agree with you. Ask questions like "What could I have done differently?" rather than "How did I do?" The latter tends to elicit universal agreement, while the former does not.
Try putting yourself in someone else's shoes and consider whether you could even conceive of ever thinking or acting the way they do. If it's possible, then maybe you just have a difference of opinion. You do not have to be totally right or wrong. Is the "truth" somewhere in between, or even dependent on situation and perspective?
As a scientist and manager, always concentrate on facts over personality (the act over the actor). Would an act or opinion be viewed the same way if it were attached to someone you felt differently about? Consider whether a faulty process might have contributed to the adverse result, rather than concluding that the person is "just as bad as I always thought."