A new paper in PLoS ONE goes over the existing studies that have tried to put a number on how many scientists falsify data (or have done so at least once) or commit other scientific offenses (ranging from the quite grave to the pretty questionable).
For what it's worth, the meta-analysis comes out with a figure of about 2% of scientists admitting that they've fabricated, falsified, or modified data. Of course, that group itself is a wide one, and deserves to be broken into various levels (which is just what Dante ended up doing, come to think of it, for similar reasons). To my mind, people who are modifying data want to make the numbers look better than they are, and people who are falsifying data want to make the numbers just flat-out say things that they don't say. And the far end of that process is fabrication, where you give up on tweaking and bending and processing, and just make the stuff up. As the PLoS paper says, you slide along from what could be explained as carelessness all the way to what can only be described as blatant fraud.
There are, of course, a lot of difficulties in getting good numbers on this sort of thing, and the whole purpose of this meta-analysis was to try to set a lower bound. There are limits to what people will admit, and limits in how objectively they see their own behavior:
The grey area between licit, questionable, and fraudulent practices is fertile ground for the “Mohammed Ali effect”, in which people perceive themselves as more honest than their peers. This effect was empirically proven in academic economists and in a large sample of biomedical researchers (in a survey assessing their adherence to Mertonian norms), and may help to explain the lower frequency with which misconduct is admitted in self-reports: researchers might be overindulgent with their behaviour and overzealous in judging their colleagues. In support of this, one study found that 24% of cases observed by respondents did not meet the US federal definition of research misconduct.
There's another interesting possibility raised:
Once methodological differences were controlled for, cross-study comparisons indicated that samples drawn exclusively from medical (including clinical and pharmacological) research reported misconduct more frequently than respondents in other fields or in mixed samples. To the author's knowledge, this is the first cross-disciplinary evidence of this kind, and it suggests that misconduct in clinical, pharmacological and medical research is more widespread than in other fields.
He goes on to speculate whether this is due to financial pressures, or different levels of self-awareness or self-reporting. And this brings up another reaction I had to the whole paper, for which I'll have to go back to its introduction:
The image of scientists as objective seekers of truth is periodically jeopardized by the discovery of a major scientific fraud. . .A popular view propagated by the media and by many scientists sees fraudsters as just a “few bad apples”. This pristine image of science is based on the theory that the scientific community is guided by norms including disinterestedness and organized scepticism, which are incompatible with misconduct. Increasing evidence, however, suggests that known frauds are just the “tip of the iceberg”, and that many cases are never discovered. The debate, therefore, has moved on to defining the forms, causes and frequency of scientific misconduct.
I wonder about some of that. Is the image of science really as pristine as all that, at this date? And does the media really help to propagate such a view? I think that the real world is quite a bit messier. I would guess that you'd have to go back to the 1950s (or perhaps before the Second World War) to find a solid majority of people thinking that scientists were pretty much all pristine truth-seekers, and perhaps not even then. And as for media depictions of scientists, those have been mixed for a long time now.
I think that you'll definitely find more objective truth-seeking in the physical sciences than you'll find in most other human endeavors, but science is done by humans with all the failings that humans come equipped with (and it's quite a list). One should always be open to some possibility of misconduct in any field and any situation; lying is one of the things that people do. That's not to condone it, of course - but being shocked by it doesn't seem to be too useful, either.