This is a somewhat reworked version of an early post from my old site, Lagniappe, which I doubt if many of my current readers have ever seen:
I came across a quote from V. S. Naipaul which set me to thinking. It's from Among the Believers, his famous (infamous, for some) book on the Islamic world. About the slow post-independence rot of Pakistan, he wrote:
"The state withered. But faith didn't. Failure only led back to the faith. . . If the state failed, it wasn't because the dream was flawed, or the faith flawed; it could only be because men had failed the faith. A purer and purer faith began to be called for."
This seems to me not only to be true, but to be true of many more things than Islamic politics. What it reminded me of was a passage in a David Foster Wallace essay in A Supposedly Fun Thing I'll Never Do Again, where he defines a harmful addiction as something that presents itself as the cure for the very problems it causes. This applies immediately to physical dependencies like alcoholism ("If you had my problems, you'd drink too"), but Wallace goes on to show how it fits the habit of, say, watching five hours of TV a night.
I'm not making Islam = addiction connections here (or TV watching = religion ones, either!). No, what occurred to me is the general problem of systems whose only remedy for failure is to cycle back around again. And if that doesn't work, the answer is to do it again, preferably longer, louder, and harder the next time.
These systems have a blind spot built into them that allows them to reach the harmful stage. No failure can ever be the fault of the system itself - if that were possible, then other courses of action would be possible, too. And that can't be right, can it? But without that choice, you're on a circular highway without any exit ramps.
Look around, and you'll see plenty of these. At your workplace, is there some policy that does nothing but worsen the problem it addresses? And is there any mechanism at all for the policy itself to ever be at fault? Everyone's encountered folks who are so convinced of their own correctness that they just get crazier and crazier. Lots of terrible managers work the same way.
One of the things that has made science, as a system, work so well for so long is that it doesn't rule explanations out very readily. The possibility that a whole system might be at fault is always there; what's more, there are usually some eager researchers ready to try to tear it down. Individuals will make the circular-problem mistake, holding on to untenable theories by making them more and more complicated rather than abandoning them. But it's harder for a whole field of research to get bogged down in this way, and we're the better for it.
The connection between this line of thinking and the pharmaceutical slump of recent years has not escaped me. Is the answer to make even bigger screening libraries that are then run through even faster? To dig around even more thoroughly in the genome? Or do we need something completely new - in other words, have we failed our ideas, or have they failed us? The fact that we can even ask the questions is the first step toward being able to answer them.