Corante

About this Author
[Photo: College chemistry, 1983]

[Photo: Derek Lowe, the 2002 model]

[Photo: After 10 years of blogging. . .]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly (derekb.lowe@gmail.com) or find him on Twitter (Dereklowe).

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

Monthly Archives

March 31, 2009

A DPP-IV Compound Makes It Through

Posted by Derek

After talking the other week about the problems that Takeda has had with their DPP-IV inhibitor for diabetes, it now appears that AstraZeneca and Bristol-Myers Squibb have made it through the same narrows with their own drug. Saxagliptin has met the FDA's latest guidelines for cardiovascular safety, which (you'd think) will remove the biggest potential barrier to approval. The advisory committee is meeting today, so we'll see how their vote goes.

Comments (10) + TrackBacks (0) | Category: Diabetes and Obesity | Regulatory Affairs

Another Obesity Drug? Not Likely.

Posted by Derek

One of the drug targets for obesity that’s been kicking around for years now is a serotonin-receptor-based idea, a 5-HT2c agonist. There are several lines of evidence that make this a plausible way to affect appetite – well, as plausible as any of the appetite-based obesity targets are. I’ve long been wary of these, since we’ve found (over and over) that human feeding behavior is protected by multiple, overlapping, redundant pathways. We are the descendants of a long line of creatures that have made eating and reproducing their absolute priorities in life, and neither of those behaviors is going to be altered lightly. The animals that can be convinced to voluntarily eat so little that they actually lose weight, just through modifying a single biochemical pathway, are all dead. Our ancestors were the other guys.

Arena Pharmaceuticals is the latest company to give us more evidence for this point of view. Many drug discovery organizations have taken a crack at 5-HT2c compounds, as a look at the patent literature will make clear. But Arena got theirs, Lorcaserin, well into the clinic, and yesterday they announced the results. And. . .well, it depends on how you spin it. If you’re a glass-half-full sort of person, you could say that twice as many people in the drug-treatment group lost at least the FDA’s target of 5% of their body weight, as compared to placebo.

Unfortunately, the glass-half-empty people are probably going to win this one. The FDA wants to see 5% weight loss (versus placebo) with a drug therapy, arguing (correctly, I think) that showing less than that really doesn’t give you much risk/benefit over just plain old diet and exercise. Arena’s compound averages out at 3.6%, and I don’t see how that’s going to cut it, especially with a new central nervous system mechanism. By “new”, I don’t mean “new to science” – as mentioned above, this idea has been around for years. But it would be a new thing to try out in millions of patients if you let a drug through, that’s for sure. I think it’s safe to say that a certain fraction of those are going to react in ways that you didn’t expect. 5-HT2 receptors are involved in a lot of different things, and there's bound to be a lot about any agent in this class that we don't know. Lorcaserin seems to have been well tolerated in trials, but I personally would be jumpy if I were taking something like this out into the broad population.

That’s not why I think this compound won’t make it, though. The FDA doesn’t even have to talk safety; they can reject it just on the grounds of efficacy. And it’s hard to imagine a lot of insurance plans picking up the tab for something with only those levels of clinical support, too. Arena's CEO says that he's pleased with the results of the trial. No, he isn't. Of course, he also says that he's convinced that the company will get Lorcaserin approved and find a partner to market it with, too. But then, that's his job.
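
To put the efficacy numbers above in context, here's a back-of-the-envelope sketch (the absolute weight-loss percentages are invented for illustration; only their 3.6% difference comes from the results as described above):

```python
# Placebo-adjusted weight loss: the metric behind the FDA's 5% benchmark.
# The absolute percentages below are illustrative assumptions; only their
# 3.6% difference matches the figure quoted in the post.

drug_loss_pct = 5.8     # hypothetical mean weight loss on drug (%)
placebo_loss_pct = 2.2  # hypothetical mean weight loss on placebo (%)

adjusted = drug_loss_pct - placebo_loss_pct
print(f"Placebo-adjusted loss: {adjusted:.1f}%")      # 3.6%
print(f"Clears the FDA's 5% bar? {adjusted >= 5.0}")  # False
```

The subtraction is the whole point: diet, exercise, and trial-participation effects show up in both arms, so only the difference over placebo counts as drug effect.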

Comments (34) + TrackBacks (0) | Category: Clinical Trials | Diabetes and Obesity

March 30, 2009

Lilly's Latest Loses (This Time)

Posted by Derek

Over the years of this blog, I’ve occasionally made comments about how no one really knows much about how drugs for the major central nervous system diseases work. Well, actually, I’ve stated things more forcefully than that, but you get the idea. And although many people who work in the area have written in to say that they agree, I’ve had questions from people completely outside it (journalists and others) about whether I’m serious when I say these things.

Oh, I am. For the latest piece of evidence, see what’s just happened to LY2140023, Eli Lilly’s new drug candidate for schizophrenia. The company was running a three-armed Phase II trial: placebo vs. their existing drug Zyprexa vs. the new one, which is a metabotropic glutamate ligand. And what happens? The placebo group performs about twice as well as the usual average in such trials, for some reason. And that not only swamped the investigational drug, but Zyprexa as well, which has been on the market for years.

Now, there's been a lot of argument about whether the current generation of antipsychotic drugs is really better than the older ones. But I believe that they're all supposed to come in better than a placebo. As Lilly points out, though, "inconclusive trials are common in neuroscience", and they're going to run another one and hope that the patients don't all start improving again on powdered sucrose or whatever the placebo was. But this is especially surprising (and disappointing) because an earlier Phase II trial, run in a very similar design to the latest one, showed the compound working very well indeed. How do you go from such impressive results to no better than placebo in the same sort of trial design? Easy - just make sure that you're developing a drug for schizophrenia. Or depression. Or chronic pain, or Alzheimer's. Stick with the central nervous system, and your drug discovery career will never be boring.
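
Here's a quick numerical sketch of why a hot placebo arm is so destructive (all the numbers are invented for illustration; think of them as improvements on a symptom-rating scale):

```python
# When the placebo response doubles, a real drug effect can vanish.
# Every number here is an invented illustration, not trial data.

drug_improvement = 15    # assumed improvement on the active drug
usual_placebo = 8        # assumed typical placebo improvement in such trials
this_trial_placebo = 16  # a placebo arm running ~2x the usual average

print("Usual trial: drug minus placebo =", drug_improvement - usual_placebo)       # 7
print("This trial:  drug minus placebo =", drug_improvement - this_trial_placebo)  # -1
```

With a placebo arm like that, even a genuinely effective comparator can come out looking like nothing at all.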

Oh, and one last note: after all the recent stories about buried clinical results, I'm glad to see a company fall completely flat with one of its most promising drugs - and then get up at a large scientific meeting and tell everyone about it in detail. It's not that it's so unusual, but it's good to show people that it happens, and how it's handled when it does.

Comments (16) + TrackBacks (0) | Category: Clinical Trials | The Central Nervous System

March 27, 2009

Layoffs At Merck

Posted by Derek

I’ve been hearing for a little while about impending layoffs at Merck. I decided, though, that this isn’t the environment to be putting up posts about rumors of job cuts – everyone’s jumpy enough already. But unfortunately, they aren’t rumors any more.

What I’m hearing about, in person and via e-mail, is what sounds like across-the-board R&D shrinkage. For what it’s worth, the damage seems heavier (on a percentage basis) at West Point and in Montreal, but I haven’t heard of any R&D area yet that’s completely missed out. More details are welcome from those closer to the sites affected.

You’d have to think that these cuts have been in the works for a while, but that the Merck/Schering-Plough merger is what’s turned them into reality right now. Still, that’s a bit unusual – most of the time, with these mergers, the job cuts come from the new organization after the merger goes through. With one partner in the deal swinging the ax before that even happens, you wonder what’s going to go on once the two companies merge. Fewer cuts overall than people were estimating? (Or fewer on the Schering-Plough end? That would be a switch.) Or is this just a head start on something that needed deeper cuts for it to make any financial sense at all?

Either way, if anyone out there knows of some organizations that are in a hiring mood, please feel free to post those details in the comments section. One thing’s for sure – anyone who is trying to fill positions these days will see some good candidates.

Comments (178) + TrackBacks (0) | Category: Business and Markets | Current Events

March 26, 2009

Fan Mail

Posted by Derek

For those who haven't seen it and might be interested, I wanted to point out this excellent profile of Freeman Dyson in the New York Times Magazine. He's a particular scientific hero of mine, and I'm very glad indeed that he's still around.

And here's some more recent Dyson for those who wish.

Comments (42) + TrackBacks (0) | Category: General Scientific News

The Motions of a Protein

Posted by Derek

So, people like me spend their time trying to make small molecules that will bind to some target protein. So what happens, anyway, when a small molecule binds to a target protein? Right, right, it interacts with some site on the thing, hydrogen bonds, hydrophobic interactions, all that – but what really happens?

That’s surprisingly hard to work out. The tools we have to look at such things are powerful, but they have limitations. X-ray crystal structures are great, but can lead you astray if you’re not careful. The biggest problem with them, though (in my opinion) is that you see this beautiful frozen picture of your drug candidate in the protein, and you start to think of the binding as. . .well, as this beautiful frozen picture. Which is the last thing it really is.

Proteins are dynamic, to a degree that many medicinal chemists have trouble keeping in mind. Looking at binding events in solution is more realistic than looking at them in the crystal, but it’s harder to do. There are various NMR methods (here's a recent review), some of which require specially labeled protein to work well, but they have to be interpreted in the context of NMR’s time scale limitations. “Normal” NMR experiments give you time-averaged spectra – if you want to see things happening quickly, or if you want to catch snapshots of the intermediate states along the way, you have a lot more work to do.
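
For a rough sense of where those time-scale limits sit, recall the textbook two-site exchange result (general NMR background, not anything specific to the paper below): for two equally populated states whose signals sit Δν apart, the separate peaks coalesce into one when the exchange rate reaches

$$k_c = \frac{\pi \, \Delta\nu}{\sqrt{2}} \approx 2.22 \, \Delta\nu$$

Exchange much faster than that gives a single averaged peak; much slower gives two resolved ones. With chemical-shift differences of tens to hundreds of Hz, the crossover lands in the millisecond range - which is why catching faster conformational events takes specialized relaxation-dispersion experiments rather than a routine spectrum.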

Here’s a recent paper that’s done some of that work. They’re looking at a well-known enzyme, dihydrofolate reductase (DHFR). It’s the target of methotrexate, a classic chemotherapy drug, and of the antibiotic trimethoprim. (As a side note, that points out the connections that sometimes exist between oncology and anti-infectives. DHFR produces tetrahydrofolate, which is necessary for a host of key biosynthetic pathways. Inhibiting it is especially hard on cells that are spending a lot of their metabolic energy on dividing – such as tumor cells and invasive bacteria.)

What they found was that both inhibitors do something similar, and it affects the whole conformational ensemble of the protein:

". . .residues lining the drugs retain their μs-ms switching, whereas distal loops stop switching altogether. Thus, as a whole, the inhibited protein is dynamically dysfunctional. Drug-bound DHFR appears to be on the brink of a global transition, but its restricted loops prevent the transition from occurring, leaving a “half-switching” enzyme. Changes in pico- to nanosecond (ps-ns) backbone amide and side-chain methyl dynamics indicate drug binding is “felt” throughout the protein.

There are implications, though, for apparently similar compounds having rather different effects out in the other loops:

". . .motion across a wide range of timescales can be regulated by the specific nature of ligands bound. Occupation of the active site by small ligands of different shapes and physical characteristics places differential stresses on the enzyme, resulting in differential thermal fluctuations that propagate through the structure. In this view, enzymes, through evolution, develop sensitivities to ligand properties from which mechanisms for organizing and building such fluctuations into useful work can arise. . .Because the affected loop structures are primarily not in contact with drug, it is reasonable to envision inhibitory small-molecule drugs that act by allosterically modulating dynamic motions."

There are plenty of references in the paper to other investigations of this kind, so if this is your sort of thing, you'll find plenty of material there. One thing to take home, though, is to remember not only that proteins are mobile beasts (with and without ligand bound to them), but that this mobility is quite different in each state. And keep in mind that the ligand-bound state can be quite odd compared to anything else the protein experiences otherwise. . .

Comments (3) + TrackBacks (0) | Category: Biological News | Cancer | Chemical News | In Silico

March 25, 2009

Two! Two! Two Drugs in One!

Posted by Derek

There's an idea that shows up in the antibiotic field that seems a bit crazy by the standards of other therapeutic areas. Since bacteria develop resistance to single agents, why not take two different classes of antibiotic molecule and, y'know, string 'em together somehow? How about that, eh?

Well, it's the sort of thought that occurs either to people who don't know much about drug discovery, or to those who know an awful lot. In between, you're probably going to dismiss that one as something of an eye-roller. But while it's got some problems, it's not quite as much of a bozo move as it appears. Here's an example that just showed up in J. Med. Chem., where a group tied Cipro (ciprofloxacin) to neomycin.

The first objection is "Why don't you just give people two pills, instead of trying to make them all into one molecule?" (Here's a review that talks about both options). Well, one answer is that two different agents are going to have different absorption and PK, whereas a conjugate drug will be coming on all at the same time, which could be an advantage. But a more compelling answer is that the new conjugate is going to be a different creature at both of its drug targets, and might well be different enough at both to qualify as a new agent to the resistant strains.

The molecules described in that paper above are, depending on your point of view, fluoroquinolones with a lot of sugars hanging off of them - most unusual as far as traditional quinolone SAR goes - or neomycin oligosaccharides with some odd heterocycles hanging off of them in turn, which is also not the sort of thing that's usually tried on that scaffold. So if you can still hit both targets, you may well be able to hit them with something they haven't seen before (and may not yet know how to deal with). Importantly, in the case of those quinolone/neomycin thingies, some evidence is shown in the paper that bacteria have a harder time developing resistance to the new compounds. (In order to completely evade them, the bacteria will have to mutate out of both targets, too, but that advantage mostly holds with two separate pills as well.)
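
The resistance argument is easy to put rough numbers on (a sketch with assumed per-target mutation frequencies; real rates vary widely by drug and organism):

```python
# If evading the conjugate requires independent resistance mutations at
# both targets, the single-step frequencies multiply. The rates below
# are illustrative assumptions, not measured values for these drugs.

freq_resist_quinolone = 1e-8  # assumed frequency of quinolone-target resistance
freq_resist_neomycin = 1e-7   # assumed frequency of neomycin-target resistance

double_resist = freq_resist_quinolone * freq_resist_neomycin
print(f"Single-step double resistance: ~{double_resist:.0e}")  # ~1e-15
```

At frequencies like that, no plausible bacterial population carries a pre-existing double mutant - the same logic that underlies combination therapy in general.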

But all this brings up the second objection: how do you think you're going to get away with hanging all that stuff off an active compound? Well, that's why this trick is usually done with known antibiotics. The SAR of these things has been well worked out by now, and that includes the parts of the molecule that don't seem to have much effect on things. Those will be the preferred positions to attach your linking groups; they're the nonessential region(s) of the molecule that can be messed with.

There's a potential show-stopper in all this, though, and it can be seen on display in the J. Med. Chem. paper. Sticking two drug molecules together, no matter how you do it, is going to make a rather large entity. Neomycin, for its part, didn't start out very small, and the linkers used in this paper aren't the tiniest things on the shelf, either (although I do like the use of the triazole click reaction, mentioned yesterday as well). It turns out that the resulting double-barreled compounds are better than plain neomycin, but worse than plain Cipro. And this happens in spite of the fact that when you assay them against the fluoroquinolone target enzymes (DNA gyrase and topoisomerase IV), the new compounds are actually more potent than the original drug. So what's the problem?

Well, the problem, almost certainly, is that these things are just too huge. The disconnect between enzyme and bacterial potency here may well reflect trouble getting into the bacteria (although that doesn't seem to be hurting the neomycin end of the activity so much). Larger molecules are trouble when dosed orally, too, and I'd expect compounds like the ones shown to be difficult to develop as traditional pills. (That said, there's a real need for IV-based antibiotics for nasty hospital-derived infections, so something like this could still fly, as long as it showed activity against real bacteria.)

So this idea is hard to realize, but it's not necessarily crazy. It keeps showing up in the antibiotic world, and here's an account of the same concept being applied to malaria therapy. Eventually someone's going to get this to work.

Comments (19) + TrackBacks (0) | Category: Drug Development | Infectious Diseases

March 24, 2009

Grabbing Onto A Protein's Surface

Posted by Derek

I’ve written here before about the "click" triazole chemistry that Barry Sharpless’s group has pioneered out at Scripps. This reaction has been finding a lot of uses over the last few years (try this category for a few, and look for the word "click"). One of the facets I find most interesting is the way that they’ve been able to use this Huisgen acetylene/azide cycloaddition reaction to form inhibitors of several enzymes in situ, just by combining suitable coupling partners in the presence of the protein. Normally you have to heat that reaction up quite a bit to get it to go, but when the two reactants are forced into proximity inside the protein, the rate speeds up enough to detect a product.

Note that I said “inside the protein”. My mental picture of these things has involved binding-site cavities where the compounds are pretty well tied down. But a new paper from Jim Heath’s group at Caltech, collaborating with Sharpless and his team, demonstrates something new. They’re now getting this reaction to work out on protein surfaces, and in the process making what are basically artificial antibody-type binding agents.

To start with, they prepared a large library of hexapeptides out of the unnatural D-amino acids, in a one-bead-one-compound format. (Heath’s group has been working in this area for a while, and has experience dealing with these - see this PDF presentation for an overview of their research). Each peptide had an acetylene-containing amino acid at one end, for later use. They exposed these to a protein target: carbonic anhydrase II, the friend of every chemist who’s trying to make proteins do unusual things. The oligopeptide that showed the best binding to the protein’s surface was then incubated with the target CA II protein and another library of diverse hexapeptides. These had azide-containing amino acids at both ends, and the hope was that some of these would come close enough, in the presence of the protein, to react with the anchor acetylene peptide.

Startlingly, this actually worked. A few of the azide oligopeptides did do the click triazole-forming reaction. And the ones that worked all had related sequences, strongly suggesting that this was no fluke. What impresses me here is that (1) these things were lying on top of the protein, picking up what interactions they could, not buried inside a more restrictive binding site, and (2) the click reaction worked even though the binding constants of the two partners must not have been all that impressive. The original acetylene hexapeptide, in fact, bound at only 500 micromolar, and the azide-containing hexapeptides that reacted with it were surely in the same ballpark.

The combined beast (hexapeptide-triazole-hexapeptide), though, was a 3 micromolar compound. And then they took the thing through another round of the same process, decorating the end with a reactive acetylene and exposing it to the same azide oligopeptide library in the presence of the carbonic anhydrase target. The process worked again, generating a new three-oligopeptide structure which now showed 50 nanomolar binding. This increase in affinity over the whole process is impressive, but it’s just what you’d expect as you start combining pieces that have some affinity on their own. Importantly, when they made a library on beads by coupling the whole list of azide-containing hexapeptides with the biligand (through the now-standard copper-catalyzed reaction), the target CA II protein picked out the same sequences that were generated by the in situ experiment.
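
Those affinity jumps are striking, but they line up with simple thermodynamic bookkeeping. Here's a sketch using the standard conversion ΔG = RT ln Kd at 298 K (my own back-of-the-envelope, and the clean additivity glosses over linker strain and entropic costs):

```python
import math

RT = 0.593  # kcal/mol at 298 K

def binding_dG(kd_molar):
    """Binding free energy from a dissociation constant, in kcal/mol."""
    return RT * math.log(kd_molar)

for label, kd in [("anchor peptide", 500e-6),
                  ("biligand", 3e-6),
                  ("triligand", 50e-9)]:
    print(f"{label:14s} Kd = {kd:.0e} M -> dG = {binding_dG(kd):+.1f} kcal/mol")

# Roughly -4.5, -7.5, and -10.0 kcal/mol: each added fragment contributes
# about 3 kcal/mol, plausible for a handful of new surface contacts.
```

Seen that way, going from 500 micromolar to 50 nanomolar over two rounds of linking is less magical than it first appears.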

So what you have, in the end, is a short protein-like thing (actually three small peptides held together by triazole linkers) that has been specifically raised to bind a protein target – thus the comparison to antibodies above. What we don't know yet, of course, is just how this beast is binding to the carbonic anhydrase protein. It would appear to be stretched across some non-functional surface, though, because the triligand didn't seem to interfere with the enzyme's activity once it was bound. I'd be very interested in seeing if an X-ray structure could be generated for the triligand complex or any of the others. Heath's group is now apparently trying to generate such agents for other proteins and to develop assays based on them. I look forward to seeing how general the technique is.

This result makes a person wonder if the whole in situ triazole reaction could be used to generate inhibitors of protein-protein interactions. Doing that with small molecules is quite a bit different than doing it with hexapeptide chains, of course, but there may well be some hope. And there's another paper I need to talk about that bears on the topic; I'll bring that one up shortly. . .

Comments (7) + TrackBacks (0) | Category: Biological News | Chemical News

March 23, 2009

And While We're Talking About Industry-Sponsored Studies. . .

Posted by Derek

Last week's discussions around here about the merits (and demerits) of pharma-industry research seem to be coming at what's either a really good or a really bad time. Take a look at this Washington Post article on the handling of clinical data at AstraZeneca.

These details have come up during a large array of lawsuits over Seroquel (quetiapine). And if they're as represented in this article, it doesn't make AZ's marketing folks look very good, and (by extension) the rest of the industry's. We shouldn't be doing this sort of thing, on general principle. But if that's not enough, and it probably isn't, here's a more practical concern: does it take much imagination or vision to think that, with all kinds of health care reform ideas in the air, this sort of behavior might just make Congress want to reform our industry really good and hard?

Comments (6) + TrackBacks (0) | Category: Clinical Trials | Press Coverage | Regulatory Affairs | The Central Nervous System | Why Everyone Loves Us

(Don't) Trust And (Don't) Verify

Posted by Derek

When a medicinal chemist starts digging around through the literature for help on some chemistry, there are several levels of results. The most welcome are recent papers that solve the exact problem you’re looking for, of course. We’re not in business to come up with new reactions (unless we have to), so if someone else has done the work, that’s great.

Almost as good are similar reactions from the reliable literature. Different people have different borders drawn for that territory, but everyone would include solid publications like the Journal of Organic Chemistry, particularly if it’s a full paper. Interestingly, that’s because JOC is generally not the place (these days) for the hottest work to appear, which takes that out of the equation. As has been proven several times over the last few years, journals can stumble when they try to publish stuff that’s a bit too avant-garde: some of that work is so cutting-edge that it hasn’t even been done yet.

The communications-only journals vary widely in quality, so they’re generally a step down from the better full-paper publications. One big reason is that the communications often don’t include much in the way of full experimental details. One way to tell how useful a journal is would be to measure how surprised you are when you can’t repeat chemistry from it. If you can’t get some reaction from Tetrahedron Letters to work, you just say “Oh, well”, whereas a dud reaction from a long JOC paper gives you more of a feeling of betrayal.

Then there are patents. When I was a grad student or post-doc, I tended to just ignore patent references, but that was partly because I couldn’t get access to them very easily. Now there’s less of an excuse, and anyone who bypasses them is missing out on a lot of useful preparations. They don’t always work quite as wonderfully as advertised, but there are a lot of very interesting intermediate compounds that are described nowhere else. And when the patent goes on to prepare seventy-five enabled compounds from said intermediate, you can be reasonably sure that you’ll be able to make enough for your own needs.

But there are patents, and then there are patents. If there’s no spectroscopic data associated with a compound, you’d better step lightly. Similarly, there’s journal literature and there’s. . .well, there are an awful lot of journals out there. And some of ‘em are, in fact, awful. SciFinder and other such tools are perfectly capable of tossing out references, in the same list as everything else, to the Bulletin of Some Obscure Country’s Academy of Sciences, 1953, communicated from Unpronounceable University in Everyone Leaves Province. You’re on your own if you track these things down, and good luck to you.

In a category all its own is the Soviet-era Russian literature. There are a large number of compounds (particularly heterocycles) that are described nowhere else, and a wide range of these things are available in English translation. But (as with patents) you have to be careful. Some of this material is really worthwhile and unique, and some of it is. . .well, my theory has always been that people in the Soviet era were willing to do a lot to remain "academicians", considering what some of the other options were like. Can't say I blame them, either, but it means that the more obscure Communist-era references need to be approached cautiously. If you're depending on a reference from J. Siberian Oil Chemist's Soc. (a real journal, at one point), then you may need to start looking for some backup.

Update: On the other hand, here's a new mathematics journal that's made up, explicitly, of papers that have been rejected by peer review. Each includes a summary of why it was rejected, and why the original author thinks it's still important anyway. . .thanks to Marginal Revolution for the tip.

Comments (11) + TrackBacks (0) | Category: The Scientific Literature

March 22, 2009

Blogs and Journalism

Posted by Derek

Nature is out with a piece on the state of science journalism, and I'm quoted several times as a representative science blogger. They've overstated my blog traffic, though, which is gratifying but inaccurate. Instead of 200,000 page views per week, that's more like my traffic per month. Give it time, I guess! I also am an occasional contributor to an Atlantic web site, not a regular columnist for them.

Update: And in a response to the Nature article by science writer Francis Sedgemore, there's this:

One successful science blog identified by Brumfiel is that of pharmaceutical industry researcher Derek Lowe. “In the Pipeline” is a very well-written blog, but here we have a classic example of a blogospheric closed ecosystem. Lowe’s writing is not journalism, and can never be so given the author’s declared affiliation. More genuinely independent sources of online science news and comment include the web magazines Wired and Seed.

I guess it depends on whether opinion journalism is journalism - since that's what I write much of the time. And how about when I'm writing about something that has no connection to the pharmaceutical industry; do I slip back over the line then? To tell you the truth, I'm not necessarily sold on the idea of journalism as a particular professional category - as far as I know, the whole idea of the dispassionate truth-seeking journalist is a pretty recent one.

But that said, I don't consider myself a journalist, either, under almost anyone's definition. I'll take "writer", although that should really be "part-time writer": I'm a scientist by trade; the writing is something I do on my train rides or in the evenings. If I had to support my family on earnings from my written work, we'd all be eating weeds out of the back yard.

Comments (25) + TrackBacks (0) | Category: Blog Housekeeping

March 20, 2009

What Results Did You Have In Mind?

Posted by Derek

Of course, no sooner do I come out defending drug company research than we have this to think about:

"An influential Harvard child psychiatrist told the drug giant Johnson & Johnson that planned studies of its medicines in children would yield results benefiting the company, according to court documents dating over several years that the psychiatrist wants sealed. . .much of (Dr. Joseph Biederman's) work has been underwritten by drug makers for whom he privately consults. An inquiry by Senator Charles E. Grassley, Republican of Iowa, revealed last year that Dr. Biederman earned at least $1.6 million in consulting fees from drug makers from 2000 to 2007 but failed to report all but about $200,000 of this income to university officials.

". . .One set of slides in the documents referred to “Key Projects for 2004” and listed a planned trial to compare Risperdal, also known as risperidone, with competitors in managing pediatric bipolar disorder. The trial “will clarify the competitive advantages of risperidone vs. other neuroleptics,” the slide stated. All of the slides were prepared by Dr. Biederman, according to his sworn statement."

There are other examples. Some of this is marketing-speak, to be sure. But mixing up the marketing stuff with the inner workings of the clinical trials is a very bad idea. For sales and marketing people, it's always onward and upward, positive attitude, create-your-own-successful-reality. You most definitely do not want that worldview in a clinician: "Just the facts, ma'am" is more like it. And that doesn't sound like what we're seeing here.

Comments (10) + TrackBacks (0) | Category: Clinical Trials | The Dark Side

Drug Industry Research: Reliable or Not?

Posted by Derek

So, in light of the Reuben scandal of forged data about pain management in surgery patients, the question naturally comes to mind: how much of a role did industry play? I’ve seen articles (and had comments here) to the effect that industry-sponsored research is worthless: discount it, can't trust it, bought and paid for, and so on.

The problem is, you can't completely shake that accusation. Industries (and not just the drug industry, by any means) are willing to pay for results that tell them what they want to hear. And while at times that's crossed over into outright fraud, many times it's just that you can set up all kinds of studies, in all kinds of ways, and get all kinds of answers. Run enough of them, and you can choose the ones you like and pretend the others aren't there.

The whole idea of scientific research is that you don't operate like this, of course, and eventually these things do get settled out. If the drug industry really did make sure that only happy results came out, we'd never have catastrophic clinical trial failures, and never have any drugs recalled from the market. And things like the (Nobel-worthy) H. pylori story behind stomach ulcer formation never would have seen the light of day if the industry were capable (on the other hand) of burying everything it didn't want to hear about.

But there are biases, real and potential, and they always have to be looked out for. One error, though, is to assume that these biases can be eliminated by turning to academic research instead. That's the point of a recent Op-Ed in the Washington Post by David Shaywitz, who's worked both sides of the business:

Part of the problem is that we've been conditioned to trust university research. It is based, after all, on the presumably lofty motives of its practitioners. What's not to like about science carried out by academics who have nobly dedicated their lives to understanding the unknown, furthering knowledge and serving humanity?

. . .University researchers are in a constant battle for recognition and the rewards associated with success: research space, speaking engagements, funding and autonomy. Consequently, while academic research is often described as "curiosity-driven," the reality is messier, as (curiously) many researchers tend to pursue the trendiest technologies and explore topics that happen to be associated with the most generous levels of research support.

Moreover, since academic success is determined almost exclusively by the number and prestige of research publications, the incentives to generate results are exceedingly powerful and can encourage investigators to see patterns that may not exist, to disregard contradictory observations that might be important, to overvalue data that might be preliminary or unreliable, and to embrace conclusions that deserve to be viewed with far greater skepticism.

Shaywitz goes on to make the same point I did above - that the system is ultimately self-correcting - but is calling for people to recognize that academic research is also done by human beings, with all that entails. John Tierney at the New York Times had taken up this topic last fall, and wondered about what would happen if enough researchers decided to stop taking industry funding because they were tired of having their integrity questioned.

Tierney's responded to the Shaywitz piece now as well. The comments from his readers are all over the place each time. Some of them are (correctly, to my mind) going along with the idea that research always comes in with various potential biases and agendas, and should be judged case-by-case no matter the source. There are, naturally, some who aren't buying anything that might get industrial research off the hook.

"In industry sponsored comparative studies of medical treatments, the sponsor’s product always comes out on top," says one commenter there. But that's not true. I can give you plenty of examples right off the top of my head. For sure, we try to run studies that will show a benefit for our therapies - but we also have to pin these down to the real world for people (and the FDA) to have a better chance of trusting the results. We're not going to set up a trial that we have good reason to think will fail: life is too short, and the supply of funds is not infinite. You target the diseases (and the patients) that you think will benefit the most (and show the most impressive results, naturally).

And that's a bias to consider right there: we don't set up our trials randomly, so keep that in mind. But no one sets up drug trials randomly, anywhere. There's always a reason to do something so expensive and time-consuming - you should always keep that in mind, weigh it in your calculations, and decide from there.

Comments (16) + TrackBacks (0) | Category: Clinical Trials | Press Coverage | The Dark Side | Why Everyone Loves Us

March 19, 2009

Fraud: How, and Why, and How Again

Posted by Derek

Readers may have seen the recent stories of an academic anaesthesiologist, Dr. Scott Reuben, who published an entire string of fraudulent papers in the pain management field. Various rabble-rousing sources have used this as a chance to run “Big Pharma Pays For Deception” stories, but I’m not going to give that angle much time at all. I’m sure that the companies involved (Pfizer, prominently) were glad to see studies that showed that their compounds worked and were glad to cite them, but the idea of some bigwig picking up the phone and saying “Fake me up some clinical data” is too much for me.

The biggest problem is that the physician involved seems to have decided that he could make a good living by telling people what they wanted to hear. That’s always a danger, and it works the same way in all sorts of fields. It isn’t always money that drives this kind of thing, either, although that’s a good place to start. Prestige is often a big part of it, too. And there's a problem on the receiving end, too - when someone brings a company news that its compounds perform well and should be prescribed more often, the first impulse isn't necessarily to ask "Gosh, are you sure?" (It's worth keeping in mind, though, that asking just that question is a key part of making scientific research work - but if someone is going to fake numbers from top to bottom, it's not going to be enough to catch them at it, either.)

There’s another factor at work that I think about every time a major fraud or plagiarism case comes up. The minor ones I can understand, actually – someone at an obscure school rewrites an equally obscure paper, slaps their name on it and sends it off to a third-rate journal, keeping their publication rate up so as to keep their better-than-the-alternatives academic position propped up for a while longer. It’s shabby and sad, but it makes a dingy sort of sense. The major cases, though, puzzle me.

I think this topic last came up around here during the Korean stem cell fiasco a few years ago. That one set off a lot of sniping among journal editors, and a lot of speculation about how someone can think that they'll get away with fraud in an area that's so hotly contested.

Now, it's not like post-operative pain management is the cutting edge of medical science - no Nobels are likely to be on the line - so the question of how Reuben thought he could keep doing this doesn't apply as much. He seems, in fact, to have gotten away with doing it for many years, with no apparent problems until recently. (How, in fact, was he caught? The only details I've been able to find were that it was an internal reviewer at his medical center who noticed something.) But how someone can do this sort of thing is what baffles me: not the mechanics of it, but the mental aspect. How do you look at yourself after turning out fake results of any kind? Especially, how do you do it when you're affecting how people are treated for pain after surgery? And year after year. . .no, I just can't get a handle on this. There are aspects of human behavior which apparently are closed off to me, and I hope that they stay that way.

Comments (18) + TrackBacks (0) | Category: The Dark Side

March 18, 2009

Things I Won't Work With: Chalcogen Polyazides

Posted by Derek

The Klapötke group at Munich are some of the masters of alarming chemical structures, and they basically seem to own the field of chalcogen azides. Perhaps the competition for this class of compounds is not as intense as it might be - the other labs doing this sort of thing are collaborations between USC and various military research wings. But they're still interesting beasts.

A few years ago, both groups reported the synthesis of tellurium azides, with the Munich group sending in their paper a few days before the USC/Air Force team sent in theirs. The parent tetra-azide was explosive, to be sure, but could be kept at room temperature without necessarily blowing up. Klapötke's group and the USC group (led by Karl Christe) then teamed up to tackle the corresponding selenium analogs, which were reported in 2007.

And they're a livelier bunch. The selenium tetra-azide is another yellow solid, like the tellurium compound, but it's rather harder to keep it down on the farm. Taking some selenium tetrafluoride (see below) and condensing it with trimethylsilyl azide at -196 °C did the trick. After warming things up (you'll note the relative use of that term "warming"), they saw that:

"Within minutes, the mixture turned yellow, the color intensified, and a lemon-yellow solid precipitated while the reaction proceeded. Keeping the reaction mixture for about 15 min at -64 °C resulted in a violent explosion that destroyed the sample container and the surrounding stainless-steel Dewar flask."

Did I mention that this prep was performed on less than one millimole? Spirited stuff, that tetra-azide. The experimental section of the paper enjoins the reader to wear a face shield, leather suit, and ear plugs, to work behind all sorts of blast shields, and to use Teflon and stainless steel apparatus so as to minimize shrapnel. Hmm. Ranking my equipment in terms of its shrapneliferousness is not something that's ever occurred to me, I have to say. It's safe to assume that any procedure which involves considering which parts of the apparatus I'd prefer to have flying past me will not get much business in my lab, no matter how dashing I might look in a leather suit.

That procedure deserves a closer look, though. You can't just crack open a can of selenium tetrafluoride whenever you feel the urge, you know. That stuff has to be made fresh, as far as I can see, and the way these hearty sons of toil make it is by reacting selenium dioxide with chlorine trifluoride. Yep, that stuff, the delightful compound that sets sand on fire and eats through asbestos firebrick.

So if you're going to make selenium polyazides, your day starts with chlorine trifluoride and I'm sure that it just rolls along from there. Before you know it, you've gone from viciously reactive halogens, paused to prepare some disgusting selenium fluorides, made some violently unstable azides that explode if you stick your tongue out at them and hey, it's dinnertime already. . .

Comments (24) + TrackBacks (0) | Category: Things I Won't Work With

March 17, 2009

Takeda Gets A Surprise

Posted by Derek

DPP-IV is short for “dipeptidylpeptidase IV”, understandably, and we need a good abbreviation for it. It’s an important enzyme target for diabetes therapy, since under normal conditions it breaks down glucagon-like-peptide 1. Longer-circulating GLP-1 would do a lot of diabetics good, and people have actually made such proteins as separate drugs, so inhibiting an enzyme that clears it out looks like a good bet. Of such reasoning are drug targets made.

A lot of companies have bought into this reasoning, for sure. For quite a while, Novartis looked like the leader in the area, with the most advanced clinical candidate and a lot of publications in the literature from their development work. But Merck turned out to be running a big effort of their own, and actually got to market first with Januvia (sitagliptin). Novartis’s drug (Galvus, vildagliptin) looks as if it will never make it at all here in the US.

They had to slow down development due to some troubling side effects, giving Merck the edge. There are several DPP subtypes, and you need to be pretty selective, as it turns out – at least some of the problems stem from that consideration. This wasn’t fully appreciated in the first wave of development in this area – the pioneers had to figure it out the hard and expensive way. But a number of companies have come up behind, trying to get a piece of the market, and they now have a clearer idea of what they need to accomplish.

Or do they? Takeda recently heard from the FDA that their DPP-IV inhibitor alogliptin has been turned down for now. What’s more, the agency wants more cardiovascular safety data from them and from anyone else who comes in with a drug in that category. Cardiovascular problems have always been the weak point for Type II diabetes drugs, to be sure. The patient population tends to be older and overweight, often with elevated blood pressure, so you really don’t have much room to work in when it comes to side effects. That’s led to a lot of attempts to come up with therapies that address the CV side of things at the same time as glucose levels (such as the ill-fated PPAR alpha-gamma compounds, all of which went most expensively down in flames). DPP-IV inhibitors wouldn’t be expected to have any direct CV benefits, but they do have to avoid making things any worse.

So Merck looks to have the market to itself for a while longer, but as the only DPP-IV drug on the market, they’re going to be under a good deal of scrutiny. The company has already had its share of post-launch cardiovascular nightmares; you’d think that they’re going to work hard to avoid any more. And now all we have to do is assure ourselves that the actions of the DPP-IV inhibitors are all through making GLP-1 last longer. Because even if you're selective for that one enzyme, it has a lot of other substrates. So the story may well swing back to the biochemical mechanism again before we're through.

Comments (19) + TrackBacks (0) | Category: Cardiovascular Disease | Diabetes and Obesity | Regulatory Affairs

March 16, 2009

The Equipment Graveyard

Posted by Derek

The comment that showed up recently about unearthing an "original Cable and Wireless dephilostagenator" in a lab reminded me of the huge lab moving job I was in on some years ago. We were packing up the entire company's research site and moving it to another spot in New Jersey (Bloomfield to Kenilworth), and this was supposedly the biggest moving job in the US that year. I do know that the Garden State Parkway was used for the parade of 18-wheel trucks at like 3 AM several times, by special arrangement with the state. (You normally can't take trucks on the thing; that's for the Jersey Turnpike, which doesn't go anywhere real close to Kenilworth).

At any rate, as we started clearing things out, there were several layers of equipment. First were the things that we'd either ordered or had used fairly recently - fine. Behind that, or in the less traveled cabinets, were things that we recognized, but (in many cases) didn't even know that we had. Finally, we began to unearth things that we hardly even knew the names of. I remember finding a dropping mercury electrode apparatus down our way; it's still the only one I've ever seen. It had that solid, black-enameled 1952 look to it, with the name of the company written in silver script lettering on the side, "Dyno-Electromat" or something of the sort. It reminded me somehow of those solid old electromechanical adding machines.

That one was only going to find a home in a museum or in a hazardous waste collection dumpster, and you can guess which alternative won out. But when a site shuts down or moves, there are generally large piles of perfectly usable equipment left sitting around, and it finds its way out into the market one way or another. Courtesy of another commentator, here are some folks from Yale digging through stuff that I might have leaned up against at some point. . .

Comments (13) + TrackBacks (0) | Category: Drug Industry History | Life in the Drug Labs

March 13, 2009

"I’d Like to Structure This Transaction So That My Lunch Buys Me"

Posted by Derek

That's from this post in InVivoBlog. Enjoy!

Comments (3) + TrackBacks (0) | Category:

And That Is That

Posted by Derek

I’d heard of Neose Pharmaceuticals on and off over the years. They’d been trying to make a go of it in carbohydrate-based drug discovery, an underappreciated and underserved niche. But late last year the company announced that it had run out of money and out of time. And if you really want to see what the end of the line looks like, well, this is it.

Comments (15) + TrackBacks (0) | Category: Business and Markets

Drugs For Bacteria: Really That Hard, Or Not?

Posted by Derek

A few readers have told me that I’m being too hard on antibacterial drug discovery, at least on target-based efforts in the field. The other day I asked if anyone could name a single antibacterial drug on the market that had been developed from a target, rather than by screening or modification of existing drugs and natural products, and the consensus was that there’s nothing to point to yet.

The objections are that antibacterials are an old field, and that for many years these natural products (and variations thereof) were pretty much all that anyone needed. Even when target-based drug discovery got going in earnest (gathering momentum from the 1970s through the 1980s), the antibacterial field was in general thought to be pretty well taken care of, so correspondingly less effort was put into it. Even now, there’s still a lot of potential in modifying older compounds to evade resistance, which is not something that a lot of other drug discovery areas have the option of doing.

And I have to say, these points have something to them. It’s true that antibacterials are something of a world apart; this was the first field of modern pharmaceutical discovery, and the struggle against living, adapting organisms makes it different than most other therapeutic areas even today. The lack of target-driven successes is surely due in part to historical factors. (The relative success of the later-blooming antiviral therapeutic targets is evidence in favor of this, too).

That said, I think that it’s not generally realized how few target-based drugs there are in the field (approximately none), so I did want to highlight that. And it does seem to be the case that working up from targets in the area is a hard row to hoe. There’s a rather disturbing review from GlaxoSmithKline that makes that case:

"From the 70 HTS campaigns run between 1995–2001 (67 target based, 3 whole cell), only 5 leads were delivered, so that, on average, it took 14 HTS runs to discover one lead. Based on GSK screening metrics, the success rate from antibacterial HTS was four- to five-fold lower than for targets from other therapeutic areas at this time. To be sure, this was a disappointing and financially unsustainable outcome, especially in view of the length of time devoted to this experiment and considering that costs per HTS campaign were around US$1 million. Furthermore, multiple high-quality leads are needed given the attrition involved in the lead optimization and clinical development processes required to create a novel antibiotic.

"GSK was not the only company that had difficulty finding antibacterial leads from HTS. A review of the literature between 1996 and 2004 shows that >125 antibacterial screens on 60 different antibacterial targets were run by 34 different companies [25]. That none of these screens resulted in credible development candidates is clear from the lack of novel mechanism molecules in the industrial antibacterial pipeline. We are only aware of two compounds targeting a novel antibacterial enzyme (PDF) that have actually progressed as far as Phase I clinical trials, and technically speaking PDF was identified as an antibacterial target well before the genome era."
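
Just to spell out the arithmetic in that passage (the campaign count, lead count, and per-campaign cost are all from the quote; the cost-per-lead division is my own rough extension):

```python
# Back-of-the-envelope on the GSK numbers quoted above.

campaigns = 70       # HTS campaigns run, 1995-2001
leads = 5            # leads delivered
cost_per_run = 1e6   # ~US$1 million per campaign, per the review

print(f"Campaigns per lead: {campaigns / leads:.0f}")                                # 14
print(f"Total screening spend: ~${campaigns * cost_per_run / 1e6:.0f}M")             # ~$70M
print(f"Screening spend per lead: ~${campaigns * cost_per_run / leads / 1e6:.0f}M")  # ~$14M
```

And that figure buys only a lead, before any of the lead-optimization and clinical attrition the authors mention.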

So although the history is a mitigating factor, the field does seem to have its. . .special character. The GSK authors discuss some of the possible reasons for this, but those can be the topic of another post or two; they're worth it.

Comments (3) + TrackBacks (0) | Category: Drug Assays | Drug Industry History | Infectious Diseases

March 12, 2009

Greedy Biotechs?

Posted by Derek

Smack in the middle of the biotech district of Cambridge, at one of the busy intersections, is a whopping billboard. It’s one of those that rotate vertical segments between three faces, and for some weeks now, all three of them have proclaimed loudly “Stop Biotech Greed!” Variations on the theme include how much money biotech companies make, how the state should stop trying to encourage the industry and spend its money somewhere else, and so on. I’m sure the folks at Biogen enjoy seeing this thing switching between messages all day long; it’s right across from one of their buildings.

I wasn't at all sure who was funding this, because that billboard would presumably take more cash to lease than many activist groups have on hand. I do see occasional hand-made flyers against a proposed biological lab that Boston University wants to build, an issue that’s been fermenting around here for some time, but this was the first blast of anti-industry sentiment that I’d noticed. A quick look around provided the answer, though: the message is from the International Brotherhood of Electrical Workers, and the bottom of the dispute seems to be that several building projects are going on that employ non-union electricians. And since a significant amount of the new construction in this area has to do with biotech and associated fields, well. . .

I suppose that they figured that attacking "biotech greed" will play better than a billboard saying "Hire Our Members Or We'll Insult You Again".

Comments (25) + TrackBacks (0) | Category: Current Events | Why Everyone Loves Us

Roche / Genentech: The Chase Is Over

Posted by Derek

So Roche and Genentech have come to terms: $95 per share. They'd offered more last fall, but, well, it isn't last fall any more. And this was still well above Roche's recent offers, although they'd come up to $93 in public before this was announced this morning. Genentech shares had been climbing much closer to Roche's revised offer, so the deal was starting to become clearer in the last couple of days.

What's this going to mean? The main encouraging thing I can take out of it is that Roche is saying that they want to keep Genentech's R&D operation separate, and to keep their talent and their approach to discovery. It's nice to at least hear lip service to that idea - it's a start - but now we'll have to see if they follow through.

Overall, though, I don't like big mergers, as has been a repeating theme around here. And now we've had three whoppers in just the last few months: Pfizer/Wyeth, Merck/Schering-Plough (I know, I know, I'm supposed to have those names the other way around, but come on), and now Roche/Genentech. So I can't say that the industry is moving in a way that makes me really happy. But at the same time, I can see why all this is happening, so perhaps it's the underlying trends which lead to these things that should be making me unhappy - I should be upset about the causes, not the symptoms. (Mind you, I think that the decreased research productivity that accompanies some of these mergers tends to blur the whole cause-and-effect relationship a bit.)

And it's important not to confuse these moves with the current financial mess. The drug industry has problems totally outside the turmoil in the credit and equity markets. If anything, some of these conditions are making it harder to do the deals that the companies themselves feel like they have to do (look, for example, at how Pfizer had to work to get the financing together for the Wyeth takeover). No, if the markets were in better shape, we'd be seeing the same sort of thing - maybe a bit faster, or a bit slower, but different only in degree, not in kind.

We aren't producing enough good new drugs quickly enough. Collateralized debt obligations and credit default swaps have nothing to do with that. And we're either going to have to find ways to increase our research productivity, or batten down the industry for survival under the conditions we have now. Mergers, right- or wrong-headed, are part of the latter process. If we could find a way to do the former, we wouldn't be in the shape we're in now.

Comments (16) + TrackBacks (0) | Category: Business and Markets | Drug Industry History

March 11, 2009

A Quick Quiz (Re: Antibacterials)

Email This Entry

Posted by Derek

Just as an addendum to this morning's post, a quick question: can anyone name an antibiotic that was brought up through a target-driven approach? That is, not one that's a variation on an existing class, or has its origins in a hey-that-killed-bugs assay. I mean, one that started off with people saying "OK, XYZase looks to be essential in bacteria, and higher organisms don't have it. Nothing's on the market that works that way, but it looks to be a good target, so let's go after it".

Off the top of my head, I can't think of one. There may be an example somewhere, but just the fact that I'm having to rack my brain about it says something. Doesn't it?

Comments (31) + TrackBacks (0) | Category: Infectious Diseases

Bacteria: Respect Must Be Paid

Email This Entry

Posted by Derek

I’ve had the opportunity to learn more about antibacterial drug discovery in the last year or so – that was one of the few therapeutic areas I hadn’t worked in, actually. And although I already knew that it was no picnic in the park (I’d heard the complaints), doing it myself has given me a new respect for the nasty resilience of bacteria.

I’ve been used to having my compounds go into cell assays, where a good number of them fail. That’s expected – every medicinal chemist knows that some of the potent compounds in the primary assay (against the purified protein target) are going to wipe out when they go up against cells. Cells have membranes, for one thing, and they have bailing pumps built into them to spit out molecules that they don’t recognize. I’ve seen compounds, as has everyone who does drug discovery, that bounce right off the cell assay while closely related analogs work just fine. That’s why you run the assay, to weed those guys out – you may not ever understand what specifically went wrong, but you at least get a chance to try to avoid it as you go on.

But bacteria are different beasts. Their independent, free-living nature makes them nastier than even tumor cell lines. Cancer cells, aggressive creatures though they are, still expect to get their food delivered (and their garbage hauled) by the bloodstream. (That’s what makes angiogenesis a drug target in oncology). But bacteria have to search out their own meals, fighting it out with every other bacterium in the area while doing so. Their membranes are like armor plate compared to a lot of higher-organism cell lines, with the gram-negative organisms taking the trophy. Or is it the mycobacteria? (They're both awful, and the proportion of compounds that fail when you move past the pure-protein stage is thus far higher). They react to threats, communicate with each other, and reproduce like crazy. It’s like dealing with a swarm of tiny, self-replicating attack submarines.

So yes, finding an effective new antibacterial drug is a real triumph, and it’s not a triumph that’s been happening very often in recent years. This gets mentioned a lot in the popular press, when they feel like running a Coming of the Superbugs piece, and one of the usual explanations is that drug companies got out of the area years ago because they thought the problem wasn’t big enough to worry about. That’s part of the explanation – or was, quite a while ago. And the finances are different in this space, true. You’re never going to have a multibillion dollar blockbuster, because a new agent is going to be reserved just for infections that are unresponsive to the older drugs. But it’ll still sell.

No, there are plenty of companies working in the area now, and many that never left. And the need for new agents is clear, and has been for quite a while now. The real reason that we don’t have lots of new antibacterial drugs is that it’s really hard to find them, for one thing, and the bacteria are more than capable of fighting back when we try.

Update: for more on the topic, see here.

Comments (25) + TrackBacks (0) | Category: Infectious Diseases

March 10, 2009

Don't Like It? Well, Just Don't Cite It!

Email This Entry

Posted by Derek

You know, this is a comparatively minor sin, but it's an irritating one. I was browsing through my Google Reader list of chemistry journals, and this paper caught my eye. It's from a group in Hyderabad, and describes a preparation of propargylamines using indium bromide. "I've seen that reaction before," I thought.

And sure enough, here's a 2005 paper from a group in Tokyo which describes the preparation of (among other things) propargyl amines using. . .indium bromide. The details are slightly different, but it seems clear that these reactions are proceeding through the same sorts of intermediates. The earlier Tokyo reaction uses N,O acetals, while the Hyderabad one starts from the amine and aldehydes (which, to give them their due, is more convenient).

So fine, the new paper is a reasonable way to make these compounds. But what gets on my nerves is that its abstract reads:

"Indium(III) bromide has been used for the first time for the synthesis of propargyl amines in a one-pot operation from aldehydes, amines and alkynes. This is the first example on the use of InBr3 for the activation of both alkyne and aryl imine."

And what's more, it doesn't even reference the earlier Japanese work. Worse, the Japanese authors actually published a preliminary communication of their work in 2003, in Tetrahedron Letters, the same journal that the new paper appears in. As I say, I realize that this is a (comparatively) minor sin. And I realize that it goes on all, all, all the time. But it's still wrong. And someone should have called the authors on it when the paper was reviewed.

Comments (45) + TrackBacks (0) | Category: The Scientific Literature

Merck/Schering-Plough: Waiting for J&J To Raise Their Hand

Email This Entry

Posted by Derek

One issue in the Merck/Schering-Plough deal that's come up since it was announced is Johnson & Johnson's role in it. They have a deal on Remicade and its follow-up golimumab, which provides significant revenue to S-P (who have the non-US rights). J&J is no doubt weighing their options today, because Merck and Schering-Plough structured their deal, by all appearances, specifically to avoid triggering some provisions that would make these rights revert to J&J.

Over at the S-P watchdog site Shearlings Got Plowed, we find this, referring to this partnership agreement which you can find at the SEC. Scrolling down to section 8.2, one finds (emphasis added):

(c) Change in Control. If either party is acquired by a third party or otherwise comes under Control (as defined in Section 1.4 above) of a third party, it will promptly notify the other party not subject to such change of control. The party not subject to such change of control will have the right, however not later than thirty (30) days from such notification, to notify in writing the party subject to the change of Control of the termination of the Agreement taking effect immediately. As used herein "Change of Control" shall mean (i) any merger, reorganization, consolidation or combination in which a party to this Agreement is not the surviving corporation; or

(ii) any "person" (within the meaning of Section 13(d) and Section 14(d)(2) of the Securities Exchange Act of 1934), excluding a party's Affiliates, is or becomes the beneficial owner, directly or indirectly, of securities of the party representing more than fifty percent (50%) of either (A) the then-outstanding shares of common stock of the party or (B) the combined voting power of the party's then-outstanding voting securities; or

(iii) if individuals who as of the Effective Date constitute the Board of Directors of the party (the "Incumbent Board") cease for any reason to constitute at least a majority of the Board of Directors of the party; provided, however, that any individual becoming a director subsequent to the Effective Date whose election, or nomination for election by the party's shareholders, was approved by a vote of at least a majority of the directors then comprising the Incumbent Board shall be considered as though such individual were a member of the Incumbent Board, but excluding, for this purpose, any such individual whose initial assumption of office occurs as a result of an actual or threatened election contest with respect to the election or removal of directors or other actual or threatened solicitation of proxies or consents by or on behalf of a person other than the Board; or

(iv) approval by the shareholders of a party of a complete liquidation or the complete dissolution of such party.

That's why the deal is, on paper, Schering-Plough acquiring Merck (of all things). S-P will be the "surviving corporation", you see, so J&J can just go away and keep splitting all that anti-TNF antibody revenue. Somehow I don't think that this is going to go that smoothly. Like "Condor" over at the Shearlings site, I don't see how that language about the board of directors can apply to the new board of the merged company, since Merck was clearly not a party to this agreement. It has to apply, I'd think, to the current Schering-Plough board, which will cease to exist in its present form. No, we're surely going to see some response from J&J, and very soon. They won't walk away from this one.

Comments (5) + TrackBacks (0) | Category: Business and Markets

March 9, 2009

The Merck Deal and the SEC: Not a Joke

Email This Entry

Posted by Derek

And I thought that I was kidding, at least a bit, in my post where I warned some of the folks buying into Schering-Plough last week that they might be hearing from the SEC. Well, maybe not - whenever a deal like this goes through, the first place they look is in the options market:

Some lucky option players appeared to have reaped a windfall with Schering-Plough call options rocketing after Merck on Monday announced a proposed $41.1 billion takeover of the drugmaker.

. . .a burst of activity in the stock's call options last Tuesday and again on Friday may be too much of a coincidence to overlook and prompted some option traders to ask if inside word of the pending deal reached some investors.

"Our examination of the data suggests a high degree of likelihood that someone did indeed place what I will be politically correct and call nicely timed trades," said Jon Najarian, a founder of Web information site optionmonster.com, in an email to Reuters.

Good luck explaining these, is all I can say. Telling them how lucky you felt that day won't make the folks from the enforcement division go away. As a lawyer in this business once said to me at a meeting, "I have to make sure that no one in this company trades our stock on material information. And material information is defined as something that makes you think about trading the stock."
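
(For the curious: the sort of "examination of the data" Najarian mentions generally starts with something statistically simple, like asking whether a day's call-option volume is wildly out of line with its own recent history. Here's a minimal sketch in Python - the function name, window, and cutoff are all my own invented parameters, and real surveillance is far more elaborate than this:

    import numpy as np

    def flag_unusual_volume(daily_call_volume, window=60, z_cut=4.0):
        # Flag days whose call volume sits several standard deviations
        # above the trailing-window average. A toy screen only, not
        # anyone's actual enforcement method.
        v = np.asarray(daily_call_volume, dtype=float)
        flagged = []
        for i in range(window, len(v)):
            mu = v[i - window:i].mean()
            sigma = v[i - window:i].std()
            if sigma > 0 and (v[i] - mu) / sigma > z_cut:
                flagged.append(i)
        return flagged

Anything a crude screen like that flags is just a starting point, of course - then the enforcement folks get to ask who placed the trades, and when.)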

Comments (3) + TrackBacks (0) | Category: Business and Markets | The Dark Side

Merck Actually Does It

Email This Entry

Posted by Derek

So Merck wasn’t kidding about making a large deal, were they? When I used to work for Schering-Plough back in the 1990s, there were constant rumors, the whole eight years I spent there, about the company merging or being taken over. And as far as I know, those have never really ceased – until now, that is. And Merck has always resisted the big merger route – until now.

This deal would seem to have made more sense a few years ago, when Vytorin looked ready to make huge amounts of money. (Of course, Schering-Plough was more expensive then, and perhaps Merck had a bit more confidence in its own pipeline back then, too). But it’s the deal we have today, so does it make sense now?

Well, up to a point. Like many of these, it works best on paper when you talk about shedding head count and realizing all those cost savings that are supposed to be hiding in the numbers. We’ll have to see how that actually shakes out – the main research sites for the two companies are very close to each other in New Jersey, and I’m not sure if that’s a bug or a feature here. One interesting question is what happens to Fred Hassan, Schering-Plough’s wily and ambitious CEO. Back when the two companies first partnered up, the rumor was that Hassan was trying to use this as a springboard into Merck’s upper management, so we’ll see if that’s coming true.

As for the portfolio fit, S-P has been very upbeat about its drug pipeline, and they seem to have convinced the folks in Rahway. The biggest attraction seems to be the thrombin receptor antagonist program, which I wrote about here from slight personal experience. But there have been stories over the years that Pfizer was never very happy about what they ended up buying when they purchased Pharmacia / Upjohn, Hassan’s previous company. (They were primarily buying Celebrex, of course, and we all know how well that worked out, but the story is that Pfizer believed that they were getting rather more besides).

In a way, though, this deal saddens me, and that’s not because I used to work for Schering-Plough. It’s not like I’m worried about the fate of its corporate culture – to be honest, I can’t say that I much cared for a lot of that culture while I was there. But what strikes me is that Merck has been a symbol of a company that’s done well by going it alone, ruling out these kinds of deals for many years. It’s as if they’re breaking down and giving in, and since I’ve never cared much for mergers in this business anyway, seeing them do one is doubly disturbing.

And for those folks who drove Schering-Plough's stock price up 10% last Friday - hey, nice work. The SEC will be in touch with some of you shortly, I should think.

Comments (41) + TrackBacks (0) | Category: Business and Markets | Drug Industry History

March 6, 2009

Tie Me Molecule Down, Sport

Email This Entry

Posted by Derek

There are a huge number of techniques in the protein world that rely on tying down some binding partner onto some kind of solid support. When you’re talking about immobilizing proteins, that’s one thing – they’re large beasts, and presumably there’s some tether that can be bonded to them to string off to a solid bead or chip. It’s certainly not always easy, but generally can be done, often after some experimentation with the length of the linker, its composition, and the chemistry used to attach it.

But there are also plenty of ideas out there that call for doing the same sort of thing to small molecules. The first thing that comes to mind is affinity chromatography – take some small molecule that you know binds to a given protein or class of proteins well, attach it to some solid resin or the like, and then pour a bunch of mixed proteins over it. In theory, the binding partner will stick to its ligand as it finds it, everything else will wash off, and now you’ve got pure protein (or a pure group of related proteins) isolated and ready to be analyzed. Well, maybe after you find a way to get them off the solid support as well.

That illustrates one experimental consideration with these ideas. You want the association between the binding partners to be strong enough to be useful, but (in many cases) not so incredibly strong that it can never be broken up again. There are a lot of biomolecule purification methods that rely on just these sorts of interactions, but those often use some well-worked-out binding pair that you introduce into the proteins artificially. Doing it on native proteins, with small molecules that you just dreamed up, is quite another thing.
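
(To put a rough number on that tradeoff: for a simple 1:1 equilibrium, the fraction of protein captured on the support is [L]/([L] + Kd), where [L] is the effective concentration of immobilized ligand. A quick back-of-the-envelope sketch in Python - all the concentrations below are invented for illustration:

    # theta = L / (L + Kd): weak binders wash off, very tight ones never elute
    ligand_nM = 100.0                      # assumed ligand density on the resin
    for kd_nM in (1000.0, 10.0, 0.01):     # weak, useful, nearly irreversible
        theta = ligand_nM / (ligand_nM + kd_nM)
        print(f"Kd = {kd_nM:g} nM -> fraction bound ~ {theta:.3f}")

At a micromolar Kd most of your protein washes away with everything else, while down at low picomolar you may never get it back off the resin at all.)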

But that would be very useful indeed, if you could get it to work reliably. There are techniques available like surface plasmon resonance, which can tell with great sensitivity if something is sticking close to a solid surface. At least one whole company (Graffinity) has been trying to make a living by (among other things) attaching screening libraries of small molecules to SPR chips, and flowing proteins of interest over them to look for structural lead ideas.

And Stuart Schreiber and his collaborators at the Broad Institute have been working on the immobilized-small-molecule idea as well, trying different methods of attaching compound libraries to various solid supports. They’re looking for molecules that disrupt some very tough (but very interesting) biological processes, and have reported some successes in protein-protein interactions, a notoriously tempting (and notoriously hard) area for small-molecule drug discovery.

The big problem that people tend to have with all these ideas – and I’m one of those people, in the end – is that it’s hard to see how you can rope small molecules to a solid support without changing their character. After all, we don’t have anything smaller than atoms to make the ropes out of. It’s one thing to do this to a protein – that’ll look like a tangle of yarn with a small length of it stretching out to the side. But on the small molecule scale, it’s a bit like putting a hamster on a collar and leash designed for a Doberman. Mr. Hamster is not going to be able to enjoy his former freedom of movement, and a blindfolded person might, on picking him up, have difficulty recognizing his essential hamsterhood.

There's also the problem of how you attach that leash and collar, even if you decide that you can put up with it once it's on. Making an array of peptides on a solid support is all well and good - peptides have convenient handles at both ends, and there are a lot of well-worked-out reactions to attach things to them. But small molecules come in all sorts of shapes, sizes, and combinations of functional groups (at least, they'd better if you're hoping to see some screening hits with them). Trying to attach such a heterogeneous lot of stuff through a defined chemical ligation is challenging, and I think that the challenge is too often met by making the compound set less diverse. And after seeing how much my molecules can be affected by adding just one methyl group in the right (or wrong) place, I’m not so sure that I understand the best way to attach them to beads.

So I’m going to keep reading the tethered-small-molecule-library literature, and keep an eye on its progress. But I worry that I’m just reading about the successes, and not hearing as much about the dead ends. (That’s how the rest of the literature tends to work, anyway). For those who want to catch up with this area, here's a Royal Society review from Angela Koehler and co-workers at the Broad that'll get you up to speed. It's a high-risk, high-reward research area, for sure, so I'll always have some sympathy for it.

Comments (12) + TrackBacks (0) | Category: Analytical Chemistry | Drug Assays | General Scientific News

March 5, 2009

More on Wyeth v. Levine and Preemption

Email This Entry

Posted by Derek

For those who want it, I have more thoughts on the Wyeth v. Levine pre-emption decision over at The Atlantic's business site. Reading the decision, it looks less like a loss for the drug companies than a loss for the FDA, but see what you think.

Comments (9) + TrackBacks (0) | Category: Regulatory Affairs

Your Temperamental Diva Reactions

Email This Entry

Posted by Derek

Since I was talking the other day about getting published procedures to work (or not!), I thought I should mention that most chemists have, at one time or another, had reactions of their own that not even they can get to work right every time. Most chemical reactions are reasonably robust, within limits (see here for a proposal to establish some!). But every so often, you come across one that has a narrow tolerance, sometimes for things that you can’t even put your finger on.

I’ve seen this happen particularly in low-temperature carbanion reactions. Some of these anions don’t particularly want to form in the first place, and they can be quite sensitive to concentration, the presence of different amounts of salts and counterions, variations in temperature, and so on. The rates and efficiencies of cooling and stirring can affect some of these factors, as can the age and handling of the reagents, and the rates at which they’re added into the reaction mixture. If you’ve got a system that just barely works, a lot of things can push it over the edge.

My personal experience with this first came in grad school, when I had a cyanocuprate reagent opening an epoxide. As I mentioned on the blog a few years ago, I tried that system out, after several other reagents had given not-so-great yields, and it worked really well. So I tried it again – same results! I scaled it up (at the time, “scaled it up” meant running it on about a gram), and it worked again. Problem solved! Little did I realize that the reaction would never work again. It failed the next time, and the next, and the next. I tried everything I could think of. I made everything cleaner, I made everything fresh: no product. I made everything sloppy, with no particular care, the way I’d done it in the beginning. No product. Nothing ever worked. I never did sort out what was going wrong; it was easier, in the end, to find another reaction.

Scaling up such a reaction is especially difficult – even relatively laid-back reactions have to be looked at closely when moved up to larger scales, much less a jumpy, skittish one that gets the vapours and passes out at the first sign of trouble. It’s the job of the process chemists to avoid such narrow-window chemistry whenever possible. The ideal process reaction is one that provides the same yield, with the same purity profile, under a wide range of conditions: foolproof, in other words. Naturally, nothing is really foolproof (fools are too tricky), but you do what you can.

Comments (30) + TrackBacks (0) | Category: Life in the Drug Labs

March 4, 2009

Wyeth v. Levine: Pre-emption Goes Away

Email This Entry

Posted by Derek

The idea of preemption in drug liability cases has been coming up a lot in recent years. If the FDA approves a drug, does that Federal-level approval stop liability suits at the state level, or not?

The Supreme Court has ruled today in the Wyeth v. Levine case, which directly addresses this issue. And pre-emption now appears to be a dead issue, at least on my first reading:

". . .State tort suits uncover unknown drug hazards and provide incentives for drug manufacturers to disclose safety risks promptly. They also serve a distinct compensatory function that may motivate injured persons to come forward with information. . .

. . .Wyeth has not persuaded us that failure-to-warn claims like Levine’s obstruct the federal regulation of drug labeling. Congress has repeatedly declined to pre-empt state law, and the FDA’s recently adopted position that state tort suits interfere with its statutory mandate is entitled to no weight. Although we recognize that some state-law claims might well frustrate the achievement of congressional objectives, this is not such a case.

We conclude that it is not impossible for Wyeth to comply with its state and federal law obligations and that Levine’s common-law claims do not stand as an obstacle to the accomplishment of Congress’ purposes in the FDCA. Accordingly, the judgment of the Vermont Supreme Court is affirmed."

And that, I would say, is that.

Comments (12) + TrackBacks (0) | Category: Current Events | Regulatory Affairs

Gene Expression: You Haven't Been Thinking Big Enough?

Email This Entry

Posted by Derek

Well, here’s another crack at open-source science. Stephen Friend, the previous head of Rosetta (before and after being bought by Merck), is heading out on his own to form a venture in Seattle called Sage. The idea is to bring together genomic studies from all sorts of laboratories into a common format and database, with the expectation that interesting results will emerge that couldn’t be found from just one lab’s data.

I’ll be interested to see if this does yield something worthwhile – in fact, I’ll be interested to see if it gets off the ground at all. As I’ve discussed before, the analogy with open-source software doesn’t hold up so well with most scientific research these days, since the entry barriers (facilities, equipment, and money) are significantly higher than they are in coding. Look at genomics – the cost of sequencing has been dropping, for sure, but it’s still very expensive to get into the game. That lowered cost is measured per base sequenced – today’s technology means that you sequence more bases, which means that the absolute cost hasn’t come down as much as you might think. I’m sure you can get ten-year-old equipment cheap, but it won’t let you do the kind of experiments you might want to do, at least not in the time you’ll be expected to do them in.
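
(A toy calculation makes the point - every number here is invented, just to show the shape of the problem:

    # Per-base cost falls 20-fold, but a "standard" study now reads 15x more bases
    old_cost_per_base = 1e-3                       # dollars per base, hypothetical
    new_cost_per_base = old_cost_per_base / 20
    old_bases, new_bases = 1e9, 15e9
    print(old_cost_per_base * old_bases)           # 1,000,000 dollars
    print(new_cost_per_base * new_bases)           # 750,000 dollars - a 25% drop

A twenty-fold improvement per base, and the bill for the whole study barely budges.)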

But even past that issue, once you get down to the many labs that can do high-level genomics (or to the even larger number that can do less extensive sequencing), the problems will be many. Sage is also going to look at gene expression levels, something that's easier to do (although we're still not in weekend-garage territory yet). Some people would say that it's a bit too easy to do: there are a lot of different techniques in this field, not all of which always yield comparable data, to put it mildly. There have been several attempts to standardize things, along with calls for more control experiments, but getting all these numbers together into a useful form will still not be trivial.
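
(For a flavor of what "getting all these numbers together" means at the most basic level, here's a minimal sketch of quantile normalization - one standard trick for forcing samples from different platforms onto a comparable scale. Python and NumPy, with invented data; real pipelines involve a great deal more than this:

    import numpy as np

    def quantile_normalize(x):
        # Map every sample (column) onto the same empirical distribution.
        # Ties are handled crudely here; this is illustration, not production.
        ranks = np.argsort(np.argsort(x, axis=0), axis=0)
        mean_of_sorted = np.sort(x, axis=0).mean(axis=1)
        return mean_of_sorted[ranks]

    # Four hypothetical genes measured in three samples:
    data = np.array([[5.0, 5.0, 3.0],
                     [2.0, 1.0, 4.0],
                     [3.0, 4.0, 6.0],
                     [4.0, 2.0, 8.0]])
    print(quantile_normalize(data))

And that's before you even get to the question of whether the probes on two different platforms are measuring the same transcript in the first place.)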

Then you've got the really hard issues: intellectual property, for one. If you do discover something by comparing all these tissues from different disease states, who gets to profit from it? Someone will want to, that's for sure, and if Sage itself isn't getting a cut, how will they keep their operation going? Once past that question (which is a whopper), and past all the operational questions, there's an even bigger one: is this approach going to tell us anything we can use at all?

At first thought, you'd figure that it has to. Gene sequences and gene expression are indeed linked to disease states, and if we're ever going to have a complete understanding of human biology, we're going to have to know how. But. . .we're an awful long way from that. Look at the money that's been poured into biomarker development by the drug industry. A reasonable amount of that has gone into gene expression studies, trying to find clear signs and correlations with disease, and it's been rough sledding.

So you can look at this two ways: you can say fine, that means that the correlations may well be there, but they're going to be hard to find, so we're going to have to pool as much data as possible to do it. Thus Sage, and good luck to them. Or the systems may be so complex that useful correlations may not even be apparent at all, at least at our current level of understanding. I'm not sure which camp I fall into, but we'll have to keep making the effort in order to find out who's right.

Comments (16) + TrackBacks (0) | Category: Biological News | Drug Development

March 3, 2009

How Good (or Bad?) Are Patent Procedures, Anyway?

Email This Entry

Posted by Derek

All the comments on the Lundbeck / Dr. Reddy's imbroglio got me to thinking: how good are patent procedures, anyway? I said in that earlier post that I didn't think that they were that much different from procedures in the open literature, but I'd like to throw the issue open for comment.

You might think that patent procedures would be better, actually. There are potential legal implications to bad patent writeups that don't apply to lousy procedures published in a journal. You're supposed to teach how to make the new chemical matter (or how to do the new process) that you're claiming, and if your patent's details really are insufficient to fulfill that requirement, you have a problem. Patents have been invalidated over such disputes. If you thought your invention worth the trouble of patenting, you'd presumably be motivated to provide sufficient detail to make sure the patent is granted, and that it holds up if challenged.

That said, not all that many patents get seriously challenged over such issues. It takes a lot of time and a lot of money, and the number of cases where it's worth the trouble is limited. And a patent has to be pretty lousy (or pretty deceptive) to truly fail to teach what its procedures outline. I guess what I'm asking about is the wide middle ground - the various procedures that aren't necessarily make-or-break for the validity of the patent, but are in there as parts of synthetic schemes. What's your success rate following these? And is it better or worse than your success rate trying to reproduce things out of, say, The Journal of Organic Chemistry?

Comments (21) + TrackBacks (0) | Category: Life in the Drug Labs | Patents and IP

March 2, 2009

Hot Chemistry, Low Tech to High

Email This Entry

Posted by Derek

Time for some lab talk. There are usually a number of different ways to attack a given problem in organic chemistry. You go with what you know, or what looks most likely to work, or what you actually have the equipment (or funds) to realize. This range of choices goes all the way down to what you’d think would be pretty trivial questions, such as: how do I heat up my reaction?

The standard way to do this is to take the usual flask you’d run the thing in at room temperature and dunk it into something hot. That can be an oil bath with a heating coil in it (good temperature control, but messy), a solid heating mantle of ceramic or metal (clean, but doesn’t change temperature so readily), a woven glass heating mantle, a sand bath on a hot plate, what have you.

Then you can go a bit higher-tech, and heat up your reaction with microwaves. I talked about this here a few years ago (and I note that somehow that stretch of blog time has never been archived on this site; I'll have to work that in some time). The early days of the technology featured (first) kitchen models hauled directly into the lab, then carousel devices built to go inside their cooking spaces. But over the years things have settled down to custom-built chemistry microwave setups, walk-up instruments that let you drop a sample tube in, set the temperature and time that you want, and walk away to pick things up later. Microwave heating has become a preferred way to run a lot of palladium-catalyzed reactions.

Does the microwave do anything special other than heat things up, though? That’s been an arguing point for several years. Various “microwave effects” have been proposed, with mechanisms ranging from the unlikely to the pretty believable. In that last category is the thought that, since powdered metal catalysts absorb microwaves strongly, they might give some sort of local micro-heating effect that drives the reactions forward.

Could be – but apparently isn’t. A recent paper from Oliver Kappe's lab in Graz, Austria, looks at Heck reactions done that way. Kappe is a recognized pioneer and expert in microwave synthesis (see his latest book, linked below), and if you're interested in the field he's well worth reading. In this case, careful experimentation established that the microwave reactions work well because of their heating profile: they get up to temperature very quickly, which seems to be beneficial. But they found no evidence of a specific microwave effect when they ran the reactions under similar heat gradients but with different energy sources.

They also tried this reaction via yet another heating technique, flow chemistry, which I last spoke about here. That turned out to be pretty interesting, too. They were pumping their two starting materials hot over a cartridge of supported palladium-on-carbon catalyst, but found a couple of odd effects. For one thing, the first flow runs tended to give a lot of side reactions, which was surprising considering how clean the conventional runs were. Looking over the system carefully, the team found that the two reactants were separating from each other as they went down the catalyst tube. They couldn’t couple as efficiently because they were pulling away from each other – the alkene coupling partner came out first, while the aryl halide dragged behind, presumably slowed down by interactions with the powdered carbon support.

The other unexpected effect was that even after partially fixing that problem, after a dozen runs or so the reactions weren't working so well. Then the earlier fractions, collected and left to sit, turned out to be depositing shiny mirrors of palladium metal on the insides of the glass tubes, and all became clear. The Heck reaction was leaching the palladium metal off the solid support! This had been a mechanistic proposal before, but the flow apparatus provides some real evidence to back it up. When you do this in batch mode, via microwave or whatever, the palladium species get a chance to re-adsorb onto the carbon as the reaction cools down, and you're none the wiser, but the flow system just washes 'em on through.

What finally did the trick was to add a very small amount of palladium to the starting system, pump that through a hot tube reactor, and use a scavenger column to clean out the metal. You can get away with that in a Heck reaction, since those can run at ridiculously low catalyst loads. I have to say, I hadn't thought so much about this possibility; that's somewhere in between my Type I and Type II flow reactions in my own scheme.

I mentioned that Kappe has a new book, titled Practical Microwave Synthesis for Organic Chemists: Strategies, Instruments, and Protocols. I haven't seen it personally, but if you're interested in microwave work, it looks worthwhile.

Comments (22) + TrackBacks (0) | Category: Life in the Drug Labs