Corante

About this Author
[Author photos: college chemistry, 1983; the 2002 model; after 10 years of blogging.]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly at derekb.lowe@gmail.com or find him on Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

August 31, 2005

More Fun With Impact Factors

Posted by Derek

Yesterday's post introduced journal Impact Factors to those who haven't had the honor of meeting them yet. Everyone whose livelihood depends on scientific publication, though, already knows them well, since anything that can be measured will be used at performance evaluation time. IFs are a particular obsession in academic research, since publishing papers is one of those things that an aspiring tenure-seeking associate professor is expected to do. (On the priority list, it comes right after hauling down the grant money.)

But that's not what we value in industry. We know about the pecking order of journals, but we just don't get a chance to publish in them as often as academics do. I'd much rather have a paper in Angewandte Chemie than in Synthetic Communications (to pick the top and near-bottom of the reasonable organic chemistry journals), but it won't make or break my raise or promotion hopes. Now, having zero patents might do the trick, but that's because patents are a fairly good surrogate for the number of potentially lucrative drug projects you've worked on.

Nope, it's academia that has to live by these things, and there are complaints. On one level, people have pointed out that impact factors may not be measuring what they're supposed to. Here's a broadside in the British Medical Journal, pointing out (among other things) that citations to the individual papers inside a given journal follow a power-law distribution of their own. That skew is glossed over by the assignment of a single impact factor to each journal: the most-cited 50% of the papers in a given journal can be cited ten times as much as the lesser 50%.
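To see how much a single journal-level number can hide, here's a toy back-of-the-envelope sketch. The citation counts are invented, chosen only to match the roughly ten-to-one split described above; this is not data from any real journal.

```python
# Toy illustration only: invented citation counts for a hypothetical journal,
# split roughly ten-to-one between its most-cited and least-cited halves.

top_half = [20] * 50      # most-cited 50% of papers: 20 citations each
bottom_half = [2] * 50    # lesser 50%: 2 citations each
papers = top_half + bottom_half

journal_average = sum(papers) / len(papers)

print(sum(top_half) / len(top_half))        # 20.0 citations per paper
print(sum(bottom_half) / len(bottom_half))  #  2.0 citations per paper
print(journal_average)                      # 11.0 -- the single number every paper gets credit for
```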

The less interesting papers are getting a free impact ride, while the better ones could presumably be playing in a super-impact league of their own, if such a journal existed. The authors also point out that journals covering new fields with a rapidly expanding literature - much of which is also ephemeral - have necessarily inflated IFs. Does that really indicate their quality? (Well now, say the pro-impact people, isn't this just the sort of carping you'd expect from the BMJ, who live in the shadow of the more-prestigious Lancet?)

But there's also the problem of self-citation. As ISI's own data make clear, lousy journals tend to have more of it. (The text of that article seems to spend most of its time trying to deny what its graphs are saying, as far as I can see.) So if you think that the Journal of Pellucidarian Materials Science has an unimpressive impact factor, wait until you see it corrected by stripping out all the citations from the other papers in J. Pelluc. Mat. Sci. If you accept what IFs are supposed to be measuring, you have to conclude that the huge majority of journals are simply not worth bothering with.
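For the sake of illustration, here's a minimal sketch of what that self-citation correction amounts to. The journal names, citation counts, and two-year window below are hypothetical stand-ins; this is not ISI's actual calculation pipeline.

```python
# Rough sketch of an impact-factor calculation with and without self-citations.
# All journal names and numbers are made up for illustration.

# Each citation counted in 2004: (citing_journal, cited_journal, year_of_cited_paper)
citations = [
    ("J. Pelluc. Mat. Sci.", "J. Pelluc. Mat. Sci.", 2003),  # self-citation
    ("J. Pelluc. Mat. Sci.", "J. Pelluc. Mat. Sci.", 2002),  # self-citation
    ("Some Other Journal", "J. Pelluc. Mat. Sci.", 2003),
    ("Yet Another Journal", "J. Pelluc. Mat. Sci.", 2002),
]

# Citable items the journal published in the two preceding years
citable_items = {2002: 40, 2003: 45}

def impact_factor(journal, cites, citable, drop_self_citations=False):
    """Citations this year to items from the two prior years, divided by
    the number of citable items published in those two years."""
    numerator = sum(
        1
        for citing, cited, year in cites
        if cited == journal
        and year in citable
        and not (drop_self_citations and citing == journal)
    )
    return numerator / sum(citable.values())

j = "J. Pelluc. Mat. Sci."
print(impact_factor(j, citations, citable_items))                            # raw IF
print(impact_factor(j, citations, citable_items, drop_self_citations=True))  # self-cites stripped
```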

On a different level, there's plenty of room to hate the whole idea, regardless of how it's implemented. The number of citations, say such critics, is not necessarily the only (or best) measure of a paper's worth, or the worth of the journal it appears in. (As that link shows, the original papers from both Salk and Sabin on their polio vaccines are on no one's list of high citation rates.)

It is no coincidence, they go on to point out, that the promulgators of this idea make their living by selling journal citation counts. And by conducting interviews with the authors of highly cited papers and with the editors of journals whose impact factors are moving up, and God only knows what else. The whole thing starts to remind one of the Franklin Mint.

Comments (11) + TrackBacks (0) | Category: The Scientific Literature


COMMENTS

1. SRC on August 31, 2005 10:54 PM writes...

Only accountants believe that statistics can substitute for judgment.

Of course, statistics are so much easier to come by...

2. Spitshine on September 1, 2005 3:38 AM writes...

All impact factors are stupid, silly and only show the over-inflated egos of scientists in general.
Now, excuse me while I reload the statistics for my blog...

3. Tom on September 1, 2005 9:20 AM writes...

As a publisher (please, don't throw sticks and stones), I get to hear presentations by the folks at Thomson/ISI, and there is some good news. Journals with extensive self-citation run the risk of losing their impact factor altogether. There was a certain journal mentioned that required each paper to cite the journal a minimum of X times. Sure enough, the journal's impact factor jumped 3 full points in one year. Thomson/ISI investigated, uncovered the self-citation requirement, and subsequently pulled the journal's impact factor.

4. jsinger on September 1, 2005 12:17 PM writes...

As happens with a lot of metrics, I come away from reading attacks on citation measures with more sympathy for them, not less.

Obviously, a paper in a journal with a higher IF isn't more "true" than one in a lesser journal. Obviously, all sorts of factors skew the numbers. (As a geneticist, I get cited by people who publish infrequent large papers, not a constant stream of JBC two-pagers.) Obviously, you can't rank scientists strictly by such a measure.

But is there any doubt that IF is a good, if imperfect, proxy for a journal's prestige level? Really, if a tenure committee in a science department can't grasp how to use IF and citation counts in an appropriate way, the university has bigger problems to worry about.

5. Derek Lowe on September 1, 2005 1:11 PM writes...

I think one thing that offends me about impact factors is the whole business that ISI has made out of them. Those interview pages I link to give me the creeps, for some reason. They seem to have a whole department that thinks of nothing but line extensions for the IF business.

I agree, though, that IFs do track journal prestige pretty accurately, just based on how they rank the journals in my own field.

6. paul jones on September 1, 2005 1:40 PM writes...

The problem I see with them is that some folks translate journal prestige into research quality. So if you publish in a journal that has an IF of 2.312, your research is clearly mediocre. Heck, I know a guy who will not write letters of rec. for people if they are not published in JACS. Silly, if you ask me.

7. SRC on September 1, 2005 2:12 PM writes...

Really, if a tenure committee in a science department can't grasp how to use IF and citation counts in an appropriate way, the university has bigger problems to worry about.

How about reading the ~!@#$%^& papers? Now there's a concept!

If members of a department can't read the papers and arrive at their own judgment of their significance, then the university should fire the lot of them and start over.

All simplistic numerical measures are trivial to game.

Focus on number of papers? The LPU (least publishable unit) drops, and people squirt out pathetic little notes, effectively serializing their research. (Maybe academics should start blogging their research; imagine the publication lists that would result!)

Focus on number of citations? The first sentence of every paper will read: "A lot of research has been done in my area by me (refs. 1-20), by my friends (refs. 21-50), and by likely reviewers of my papers and proposals (refs. 51-100)."

Focus on the journals in which the work is published? This is less invalid than the other measures, but still an invitation to rampant cronyism and logrolling.

In my view, anyone resorting to numerical measures to judge research lacks the intellect to be suitable for an academic position. They might as well use a magic 8 ball. Such people should repair forthwith to Burger King for work as trainees.

(Full disclosure: I got tenure, so my asperity is not driven by sour grapes.)

The deathblow to the mindless accounting approach is Gregor Mendel, whose work was published in the world's most obscure journal, and wasn't cited for generations (no pun intended). Yet he had some small impact, yes?

Judgment, not numbers.

8. jsinger on September 1, 2005 2:50 PM writes...

If members of a department can't read the papers and arrive at their own judgment of their significance, then the university should fire the lot of them and start over.

Sure, that's more or less the point I was trying to make. I don't think there's anything inappropriate about a committee giving some extra weight to a candidate's publishing a particularly highly-cited paper, or to his having the publication management skills to get a given piece of work into Cell instead of PNAS. But if they can't make their own determination of a candidate's body of work, one has to wonder why they're in a position to judge him.

Incidentally, isn't it the case that Mendel's work had _no_ impact, and had been entirely recapitulated by the time it came to attention? I was gratified, though, to learn from one of the links here that I've had a paper that's been cited more than the Salk polio vaccine article...

9. LNT on September 1, 2005 3:30 PM writes...

I have to disagree, Derek. Impact factors do matter at some pharma companies. Wyeth has a "point" system that awards authors based on the quality of the journal that their scientists publish in. Top quality journals get more "points" than lower quality journals. PhD scientists need to attain a certain number of "points" in order to be promoted and in order to earn the highest level on their yearly performance eval.
Here's the caveat: the higher quality journals and lower quality journals are not "directly" linked to the "impact factor" -- but you can definitely see a correlation...

(the big point of contention with the chemists at Wyeth is that BMCL and Tetrahedron Letters are in the "low quality" rank -- even though most of the industry publishes in those two journals)

10. Derek Lowe on September 1, 2005 3:43 PM writes...

Oy. This seems like a truly misguided system to me, but I've heard various odd stories out of Wyeth over the years.


As far as I'm concerned, publication is a very good thing for the person involved - you get a better-looking resume and a higher profile in case you ever have to look for a position somewhere else. But I think it's of minor benefit for the company. And it can turn into an outright liability if you end up with your scientists spending their time getting elemental analyses for their J. Med. Chem. papers rather than making more new analogs.

11. JPB on September 9, 2005 11:15 AM writes...

Speaking as someone in the academic medical community, I can relate to the PhDs referenced in the posting by LNT. As someone who devours each issue of the Journal of Shoulder and Elbow Surgery, since that is the seminal journal for my subspecialty, it pains me to see the low IF of that journal. Consequently, when promotion time comes, it would be better to have published one article in Nature than ten in my subspecialty journal. You tell me how many orthopaedic surgeons read Nature (or even JAMA or Lancet for that matter).
My point is that JIF misses the point by trying to equate wide-ranging interest (and therefore citation) with impact of the article. A seminal article in a smaller subspecialty will necessarily have a lower "circulation" than one for a larger field and thereby a lower JIF.
