I asked a friend of mine why she was a good boss. “I was nurturing,” she said. A big study of managers reached essentially the same conclusion: Good managers don’t try to make employees fit a pre-established box, the manager’s preconception about how to do the job. A good manager tries to encourage, to bring out, whatever strengths the employee already has. This wasn’t a philosophy or value judgment; it was what the data showed. The “good” managers were defined as the more productive ones, or something like that. (My post about this.)
The reason for the study, as Veblen might say, was the need for it. Most managers failed to act this way. I posted a few days ago about a similar tendency among scientists: when faced with new data, they focus on what’s wrong with it and ignore what’s right about it; they pay far more attention to limitations than strengths. Here are two examples:
1. Everyone’s heard “correlation does not imply causation.” I’ve never heard a parallel saying about what correlation does imply. It would be along the lines of “something is better than nothing.”
2. Recently I attended a research group meeting in which a postdoc talked about new data she had gathered. The entire discussion was about the problems with it — what she couldn’t infer from it. There could have been a long discussion about how it added to what we already know, but there wasn’t a word about this.
Some of the comments considered this behavior a kind of Bayesian resistance to change in beliefs. But it occurs regardless of whether the new data support or contradict prior beliefs. There’s nothing about prior beliefs in “correlation does not imply causation.” The postdoc wasn’t presenting data that contradicted what anyone knew. Also, similar behavior occurs in areas other than science (e.g., how managers manage), where the Bayesian explanation doesn’t fit so well.
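To make that point concrete, here is a minimal sketch of Bayesian updating (the numbers are made up for illustration, not from the post or the comments). Under Bayes’ rule, data that fit a prior belief should strengthen it, so a purely Bayesian reasoner has no reason to dismiss supporting evidence; the dismissiveness must come from somewhere else.

```python
# Minimal sketch of Bayesian updating, with hypothetical numbers.
# It illustrates why "Bayesian resistance" can't explain dismissing
# data that *support* what everyone already believes.

def posterior(prior, p_data_if_true, p_data_if_false):
    """P(hypothesis | data) via Bayes' rule."""
    numerator = p_data_if_true * prior
    return numerator / (numerator + p_data_if_false * (1 - prior))

prior = 0.70  # the group already leans toward the hypothesis

# Data that fit the hypothesis: belief should strengthen, not be ignored.
print(posterior(prior, 0.8, 0.3))  # ~0.86

# Data that contradict it: belief should weaken.
print(posterior(prior, 0.2, 0.7))  # ~0.40
```

Either way, a Bayesian would use the new data, not spend the whole meeting on what it cannot show.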
I think this tendency is really strong. I was guilty of it myself when discussing it! I made very clear how the tendency is a problem, giving the analogy of a car that could turn left but not right. Obviously bad. I said nothing about the opportunities this tendency creates for everyone. My self-experimentation is an example. The more that others reject useful data, the more likely it is that useful data is lying around and doesn’t require much effort to find. I have called this behavior dismissive; I could have called it generous. It’s like leaving money lying on the ground.
A related discussion at Overcoming Bias.
Addendum. Barry Goldwater weighs in: “I’m frankly sick and tired of the political preachers across this country telling me as a citizen that if I want to be a moral person, I must believe in ‘A,’ ‘B,’ ‘C,’ and ‘D.’” Indeed, preachers spend far more time on what we are doing wrong (and should do less of) than on what we are doing right (and should do more of). The preacher Joel Osteen has taken great advantage of this tendency. “I think most people already know what they are doing wrong,” he told 60 Minutes.