Here's an article on a topic that's come up around here before: psychological and cognitive barriers to discovering a new drug. These include confirmation bias, poor risk assessment, an over-reliance on recent experience, etc. The tricky part is that some of these cognitive mistakes might actually be reasonable adaptations to the problems of drug research itself:
The history of science and medicine is full of wrong ideas that prevailed for many years, despite mounting evidence to the contrary: phlogiston, the four humours, spontaneous generation of life and inheritance of acquired traits. These are examples of ‘confirmation bias’, which means that ‘we tend to subconsciously decide what to do before figuring out why we want to do it’ and seek evidence that tends to confirm rather than refute our initial judgment. . . In medicine, such ‘bad science’ can cost many lives; therefore, major institutions and professions have procedures and rules – notably peer review – that seek to protect against the pernicious effects of excessive self-confidence (setting aside, here, the issue of blatant fraud).
In our direct experience, discovery scientists admit that false optimism helps keep them functioning despite the recognized reality that most of their projects fail. This seems an essential trait of scientific heroes of the past yet, paradoxically, might count as a cognitive error in a business setting.
Now there's one that I hadn't considered, although I've thought a lot over the years about the differences between the business side of the industry and the discovery side. I'm not sure that "false optimism" is what keeps me going, though - I try to keep in mind that most projects fail, while making sure that they don't fail because of something that I did (or didn't do) myself. The authors, though, quote from another study of the same phenomenon, which raises an interesting question:
Given the high cost of mistakes, it might appear obvious that a rational organization should want to base its decisions on unbiased odds, rather than on predictions painted in shades of rose. However. . .optimistic self-delusion is a diagnostic indication of mental health and well-being. . .The benefits of unrealistic optimism in increasing persistence in the face of difficulty have been documented. . . The observation that realism can be pathological and self-defeating raises troubling questions for the management of information and risk in organizations. Surely, no one would want to be governed entirely by wishful fantasies, but is there a point at which truth becomes destructive and doubt self-fulfilling?’
And that brings up a phrase that I use often: it's easy to sit in the back of a conference room and tell people that their ideas aren't going to work. You'll be right well over 90% of the time if you do that, but to what end? I'm going to have to think about this idea of "destructive truth" a bit more, but I wanted to put it out there for comments. I'll return to the whole cognitive bias problem as well, because there's more to it than just this. . .