Corante

About this Author
[Photo: College chemistry, 1983]

Derek Lowe The 2002 Model

[Photo: After 10 years of blogging. . .]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly: derekb.lowe@gmail.com Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

Monthly Archives

July 29, 2011

Merck Announces More Big Cutbacks

Posted by Derek

This is not good, not good at all: Merck is out this morning with earnings, and they're saying that they're going to cut at least 12% of their work force over the next four years. That's up to 13,000 jobs, and the word is that 35 to 40% of those cuts will be in the US.

This is after they'd already done a fair amount of restructuring after the Schering-Plough deal. And it makes a person wonder: was that deal such a good idea? Has Merck really gotten their money's worth out of it, or have they just brought on a big upheaval that could have been avoided? Going down the list of Schering-Plough assets that were advanced at the time of the acquisition, and the shape that they're in now, I really don't think it looks like something that just had to be done. Hindsight?

Comments (87) + TrackBacks (0) | Category: Business and Markets

2011 Drug Approvals Are Up: We Rule, Right?

Posted by Derek

I've been meaning to comment on this article from the Wall Street Journal - the authors take a look at the drug approval numbers so far this year, and speculate that the industry is turning around.

Well, put me in the "not so fast" category. And I have plenty of company there. Neither Bruce Booth (from the venture capital end), John LaMattina (ex-Pfizer R&D head) nor Matthew Herper at Forbes are buying it either.

One of the biggest problems with the WSJ thesis is that most of these drugs have been in development for longer than the authors seem to think. Bruce Booth's post goes over this in detail, and he's surely correct that these drugs were basically all born in the 1990s. Nothing that's changed in the research labs in the last 5 to 10 years is likely to have significantly affected their course; we're going to have to wait several more years to see any effects. (And even then it's unlikely that we're going to get any unambiguous signals; there are too many variables in play). That, as many people have pointed out over the years, is one of the trickiest parts about drug R&D: the timelines are so long and complex that it's very hard to assign cause and effect to any big changes that you make. If your car only responds to the brake pedal and steering wheel a half hour after you touch them, how can you tell if that fancy new GPS you bought is doing you any good?

Comments (8) + TrackBacks (0) | Category: Drug Development | Drug Industry History | Press Coverage | Regulatory Affairs

July 28, 2011

The Secret History of Pfizer

Posted by Derek

Here's a fascinating account at Fortune of the departure of Jeff Kindler as Pfizer's CEO. The magazine says that they interviewed over 100 people to round up the details, but some of these meetings only feature four or five people in a room, so that narrows things down a bit. It's also a back-room history of Pfizer over the last ten or fifteen years, and there's a lot of high-level political stuff that wasn't widely known at the time:

McKinnell kept boosting R&D budgets, maintaining Pfizer's "shots on goal" approach -- the more compounds you explored, in theory, the more drugs you'd generate. But drugs can take a full decade to be developed and approved, and nothing big would be ready for years.

So McKinnell fell back on the refuge of the desperate pharma CEO: In July 2002 he announced the acquisition of Pharmacia, the industry's seventh-largest company, for $60 billion in stock. But even as Pfizer struggled to digest this latest meal, McKinnell seemed to spend less and less time at headquarters, becoming head of industry trade groups, funding an institute in Africa to combat AIDS, even writing a book about reforming health care.

That left a power vacuum, and Bill Steere, the former CEO, seemed more than willing to fill it. . ."He says almost nothing," says a person familiar with Pfizer's board. "But people look to him to see how he nods and how he moves, because he knows the company better than anyone."

With Pfizer no longer soaring, internal squabbling intensified. Vexed by what he viewed as Steere's meddling, McKinnell even tried to terminate his consulting contract. Steere fended off that move. Support for him ran deep on the board: Later, when Steere turned 72, the mandatory retirement age for directors, the board raised it to 73 so he could stick around, then amended the provision again when he hit that limit.

Steere and McKinnell, former friends and colleagues, became mortal enemies. . .

Read the whole thing, if you're interested in either Pfizer or the way that human beings behave at this level of a large corporation: anonymous letters, secret meetings, all varieties of intrigue. 14th-century Florence can offer little more in the way of power politics. There are those who swim in such waters like fish, but I've devoted time and effort trying to stay away.

Comments (25) + TrackBacks (0) | Category: Business and Markets | Drug Industry History

Massive Piles of Faked Data - But Right On Time

Posted by Derek

Here's every outsourcing manager's nightmare: you contract out for research, and your CRO turns around the studies you want right on schedule. They send back the complete data package, and everything's in place. But they faked it.

Gives you the shivers, doesn't it? Well, unfortunately, it seems to be the case with an outfit called Cetero Research out of Houston. The FDA has been investigating them, and has found enough warning signs to believe that none of the data generated there can be relied on. The companies that used their services now have to decide if they need to re-run these studies themselves, which I'm sure excites them no end.

FDA is taking this action as a result of two inspections of Cetero's bioanalytical facility in Houston, Texas conducted in 2010, as well as the company's own investigation and third party audit. The inspections and audit identified significant instances of misconduct and violations of federal regulations, including falsification of documents and manipulation of samples.

The pattern of misconduct was serious enough to raise concerns about the integrity of the data Cetero generated during the five-year time frame. FDA concurs with the assessment of Cetero's independent auditor who stated, "This misconduct appears to be significant enough to cast doubt on the data generated...If the foundation of the laboratory is corrupt, then the data generated will be also."

The investigation appears to have started with an internal whistleblower, according to this letter from the FDA. The company conducted its own investigation, but the agency is slamming them both for the violations and for the inadequacy of that internal review:

According to your internal investigation, electronic records of key card building entry times demonstrated approximately 1900 instances of blood/plasma samples allegedly extracted on weekends and holidays between April 15, 2005, and June 30, 2009, where the arrival times of laboratory chemists were greater than one hour after the documented start time of the sample extraction. In addition, there were approximately 875 instances where the laboratory chemists were not present in the facility at the documented sample extraction date.

The company's original theory was that its scientists were falsifying the weekend hours in order to get paid more, but that the numbers themselves were OK. But as the FDA notes, they didn't go on to see if times were being faked during the working week (where there was no overtime incentive), and it turns out that they were. Falsus in uno, falsus in omnibus, as the lawyers say, and that appears to be the case here. A third-party investigation found many other irregularities - faked standard concentration curves and back-filling of internal standards, for example. One of the worst was the use of not-yet-analyzed samples as preliminary runs, apparently as a quick look to see if the numbers were going to come out looking good or not. That's not quite how you want to be conducting your research, and most especially if you're going to follow that up by "fixing" the numbers that you don't like. The whistleblower alleged just that. The FDA letter says that the company's conduct makes that a real possibility, and that the documentation that's available sure isn't sufficient to rule it out.
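The cross-check the FDA describes is, at bottom, a simple join between two logs: documented extraction start times on one side, key card entries on the other. Here's a rough sketch of that kind of analysis in Python/pandas - the file names, column names, and layout are all invented for illustration, and this is emphatically not Cetero's or the FDA's actual code:

```python
# Sketch of the kind of log cross-check described above. All file and column
# names are hypothetical; a real analysis would depend on the actual records.
import pandas as pd

extractions = pd.read_csv("extraction_log.csv", parse_dates=["documented_start"])
badge_ins = pd.read_csv("keycard_log.csv", parse_dates=["entry_time"])

# Earliest badge-in per chemist per calendar day
first_entry = (badge_ins
               .assign(day=badge_ins["entry_time"].dt.date)
               .groupby(["chemist", "day"], as_index=False)["entry_time"].min())

merged = (extractions
          .assign(day=extractions["documented_start"].dt.date)
          .merge(first_entry, on=["chemist", "day"], how="left"))

# Extractions documented to have started more than an hour before the chemist
# arrived, or on days with no badge-in at all
late = merged["entry_time"] > merged["documented_start"] + pd.Timedelta(hours=1)
absent = merged["entry_time"].isna()

print(f"{late.sum()} extractions started >1 hour before the chemist badged in")
print(f"{absent.sum()} extractions on days with no badge-in recorded")
```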

You hear a lot of talk (in the comments on this blog and in other venues) about the alleged unreliability of offshore CRO work. But you don't have to go to China to get shaky data. Houston is far enough.

Update: Here's Cetero's response to the FDA.

Comments (35) + TrackBacks (0) | Category: Regulatory Affairs | The Dark Side

July 27, 2011

Bait And Switch For Type B GPCRs

Posted by Derek

You hear often about how many marketed drugs target G-protein coupled receptors (GPCRs). And it's true, but not all GPCRs are created equal. There's a family of them (the Class B receptors) that has a number of important drug targets in it, but getting small-molecule drugs to hit them has been a real chore. There's Glucagon, CRF, GHRH, GLP-1, PACAP and plenty more, but they all recognize good-sized peptides as ligands, not friendly little small molecules. Drug-sized things have been found that affect a few of these receptors, but it has not been easy, and pretty much all of them have been antagonists. (That makes sense, because it's almost always easier to block some binding event rather than hitting the switch just the right way to turn a receptor on).

That peptide-to-receptor binding also means that we don't know nearly as much about what's going on in the receptor as we do for the small-molecule GPCRs, either (and there are still plenty of mysteries around even those). The generally accepted model is a two-step process: there's an extra section of the receptor protein that sticks out and recognizes the C-terminal end of the peptide ligand first. Once that's bound, the N-terminal part of the peptide ligand binds into the seven-transmembrane-domain part of the receptor. The first part of that process is a lot more well-worked-out than the second.

Now a German team has reported an interesting approach that might help to clear some things up. They synthesized a C-terminal peptide that was expected to bind to the extracellular domain of the CRF receptor, and made it with an azide coming off its N-terminal end. (Many of you will now have guessed where this is going!) Then they took a weak peptide agonist piece and decorated its end with an acetylene. Doing the triazole-forming "click" reaction between the two gave a nanomolar agonist for the receptor, revving up the activity of the second peptide by at least 10,000x.

This confirms the general feeling that the middle parts of the peptide ligands in this class are just spacers to hold the two business ends together in the right places. But it's a lot easier to run the "click" reaction than it is to make long peptides, so you can mix and match pieces more quickly. That's what this group did next, settling on a 12-amino-acid sequence as their starting point for the agonist peptide and running variations on it.

Out of 89 successful couplings to the carrier protein, 70 of the new combinations lowered the activity (or got rid of it completely). 15 were about the same as the original sequence, but 11 of them were actually more potent. Combining those single-point changes into "greatest-hit" sequences led to some really potent compounds, down to picomolar levels. And by that time, they found that they could get rid of the tethered carrier protein part, ending up with a nanomolar agonist peptide that only does the GPCR-binding part and bypasses the extracellular domain completely. (Interestingly, this one had five non-natural amino acid substitutions).

Now that's a surprise. Part of the generally accepted model for binding had the receptor changing shape during that first extracellular binding event, but in the case of these new peptides, that's clearly not happening. These things are acting more like the small-molecule GPCR agonists and just going directly into the receptor to do their thing. The authors suggest that this "carrier-conjugate" approach should speed up screening of new ligands for the other receptors in this category, and should be adaptable to molecules that aren't peptides at all. That would be quite interesting indeed: leave the carrier on until you have enough potency to get rid of it.

Comments (3) + TrackBacks (0) | Category: Biological News | Chemical News | Drug Assays

July 26, 2011

Data Handling in Collaborations

Posted by Derek

I wanted to mention a book I've received, courtesy of the editors: Collaborative Computational Technologies for Biomedical Research. It's a multi-author look at various ways to handle data in all sorts of research partnerships - precompetitive consortia, academia-industrial collaborations, open-source discovery, and so on. Several levels of information are dealt with - patentable IP, raw data, notebook-sharing, etc.

Different readers will find different chapters of use - there's a lot of material covered here, with some unavoidable overlap - but anyone who's having to deal with these issues should definitely have a look.

Obligatory semi-regular note: that's an Amazon link, and this blog is an Amazon affiliate. Any purchases will send a small fee my way, which comes out of Amazon's hide, not yours.

Comments (5) + TrackBacks (0) | Category: Book Recommendations

Precious Metal Time

Posted by Derek

What do you want to bet that Huw Davies and co-workers were partly interested in making dihydrofurans here, and mostly interested in having a synthetic sequence that used rhodium, silver, and then gold? Not that I blame them - personally, I'd have gone ahead and done a palladium coupling, a copper-catalyzed Ullmann of some sort, and then found something to reduce with platinum oxide. Go for the record! What is the record, I wonder?

Comments (3) + TrackBacks (0) | Category: Chemical News

Alzheimer's: The News Is Not Getting Better

Posted by Derek

Is there something going on with patients in Alzheimer's trials that we didn't expect? There have been reports of an unexpected side effect (vasogenic edema) in several trials, for drugs that work through completely different mechanisms.

It makes some sense in the case of antibody-based therapies like bapineuzumab (where this problem first got attention) and solanezumab. After all, the immune system is pretty powerful stuff, and you could certainly imagine these sorts of side effects (either directly or from some effect of clearing out amyloid debris). As those reports indicate, the problem may lessen with time, and may be more severe in patients with the APOE4 allele, a known (but not understood) risk factor for Alzheimer's.

But this latest report is for the Bristol-Myers Squibb gamma-secretase inhibitor avagacestat (BMS-708163). That shouldn't be involved with any inflammatory/immune mechanisms, nor, really, with amyloid clearance. A secretase inhibitor should just keep new amyloid from being formed and deposited, which should be beneficial if the beta-amyloid theory of Alzheimer's is correct, which is what we're all still in the middle of deciding these days. Expensively and excruciatingly deciding.

Meanwhile, the most recent big clinical failure in this area continues to reverberate. Lilly's gamma-secretase inhibitor semagacestat, the first that went deep into the clinic, imploded when the company found that patients in the treatment group were deteriorating faster than those in the control group. Seven months on, they're still worse. What does this mean for the BMS compound targeting the same mechanism? That is the big, important, unanswerable question - well, unanswerable except by taking the thing deep into big clinical trials, which is what BMS is still committed to doing.

For more on Alzheimer's drug development - and it hasn't been pretty - scroll back in this category.

Comments (21) + TrackBacks (0) | Category: Alzheimer's Disease | Toxicology

July 25, 2011

Broader Impacts Indeed

Posted by Derek

Since I don't have to write NSF grants, I haven't had to wrestle with "Criterion 2". But ask anyone in academic science about it. The first criterion is intellectual merit, as it darn well should be. Here's the NSF's own description (in full):

How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields? How well qualified is the proposer (individual or team) to conduct the project? (If appropriate, the reviewer will comment on the quality of prior work.) To what extent does the proposed activity suggest and explore creative, original, or potentially transformative concepts? How well conceived and organized is the proposed activity? Is there sufficient access to resources?

But the second criterion, while initially worthy-sounding, invites trouble. It's "What are the broader impacts of the proposed activity?" Here's more description:

How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, disability, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?

To me, that puts the most important question last, and even that one can be hard to answer. As for the rest, this would seem to be an open invitation to insert all sorts of nice-sounding boilerplate, or to just start making things up. The NSF itself seems to have realized this, and has been working on a revised version of this language, but here's a column from Dan Sarewitz that says that "Criterion 2.1" isn't a bit better than the old one:

At the heart of the new approach is "a broad set of important national goals". Some address education, training and diversity; others highlight institutional factors ("partnerships between academia and industry"); yet others focus on the particular goals of "economic competitiveness" and "national security". The new Criterion 2 would require that all proposals provide "a compelling description of how the project or the [principal investigator] will advance" one or more of the goals.

The nine goals seem at best arbitrary, and at worst an exercise in political triangulation. . .Yet, more troubling than the goals themselves is the problem of democratic legitimacy. In applying Criterion 2, peer-review panels will often need to choose between projects of equal intellectual merit that serve different national goals. Who gave such panels the authority to decide, for example, whether a claim to advance participation of minorities is more or less important than one to advance national security?

. . .Motivating researchers to reflect on their role in society and their claim to public support is a worthy goal. But to do so in the brutal competition for grant money will yield not serious analysis, but hype, cynicism and hypocrisy.

One of the comments to that article points out that this isn't the NSF's fault, in a way, because this exact language was mandated by Congress. And so it is - take a look at Section 526 of Title V of H.R. 5116, the "America Creating Opportunities to Meaningfully Promote Excellence in Technology, Education, and Science Reauthorization Act of 2010". There's all the same language. Not only that, but the Act directs the NSF to assign people and funds to evaluating how well all these "Broader Impact" measurements are going. The director, within six months, is supposed to have implemented a policy that:

. . .requires principal investigators applying for Foundation research grants to provide evidence of institutional support for the portion of the investigator's proposal designed to satisfy the Broader Impacts Review Criterion, including evidence of relevant training, programs, and other institutional resources available to the investigator from either their home institution or organization or another institution or organization with relevant expertise.

So, in case you've lost track, the NSF is supposed to train people to implement a policy that requires grant applicants to show that their institutions are training people to implement a policy that requires grant applicants to show evidence that their work involves training people to implement a policy. I think I've got that right. A greater invitation to bullshit I cannot picture.

Comments (21) + TrackBacks (0) | Category: Academia (vs. Industry)

July 22, 2011

A Few More Victories Like This, And We Will Be Undone

Posted by Derek

AstraZeneca has a lot of problems these days, so you'd think that approval of their new anticoagulant Brilinta would be reason for the company to celebrate. Not much, though - see this post at InVivoBlog for more.

A lot of companies have piled into this space over the last ten years, seeking some of those huge, huge Plavix-style revenues. But blood thinning is a tricky business. One step over the line and you're causing more problems than you're helping. And given the heterogeneity of the patient population, you never quite know where that line is going to be. By this point, too, any new therapy is going to have to compete with the generic of tried-and-tested Plavix pretty soon. No, anticoagulants of all sorts don't seem to be making anyone as rich as they were supposed to. King Pyrrhus would understand perfectly.

Comments (7) + TrackBacks (0) | Category: Cardiovascular Disease

Right Up Next to Academia

Posted by Derek

Here's one of Pfizer's get-close-to-academia research centers, being established near UCSF. The idea is that you not only want to do deals with academic research centers (and associated small biotechs), you also want to be physically present with them:

"Proximity leads to progress; this promises to be a very strong liaison," said Dr. Warner Greene, director of virology and immunology research at Gladstone Institutes, a basic-science research nonprofit at Mission Bay that will sublease space to Pfizer. "There is a valley of death for many basic-science discoveries that have significant promise because they are not far enough advanced to be of interest to a biotech or pharmaceutical company. By forming closer relationships between Pfizer and biotech companies, I think more creative solutions can be had for moving research down the pipeline."

Now, I would like to believe that this is true, but what I'd like to believe doesn't necessarily correspond to reality. I do think that (for various reasons) it will hurt your small biopharma company's chances if you establish it in, say, Sioux Falls, Yakima, or Louisville. So being "out of the loop" can hurt, but does it follow that being ever more tightly in it helps? Does anyone have evidence that speaks to this?

Comments (28) + TrackBacks (0) | Category: Who Discovers and Why

July 21, 2011

The Public Perception of Chemistry

Posted by Derek

I wanted to call attention to another blog roundtable, on several subjects related to how nonchemists see us and our business. The first post (at ScienceGeist) is on chemical safety (industrial chemicals = bad?). Day 2, at ChemJobber, is on whether the general public has any good idea of not only what chemists do (we work with chemicals, right?) but why and how we do it. Day 3, at ChemBark, takes things to a practical level, showing how lack of understanding can confuse people about energy policy (does growing corn to make ethanol make any sense?) And Day 4, at The Bunsen Boerner, is on a topic I've been known to go off on myself, the use (and mostly the misuse) of the word "organic".

Comments (24) + TrackBacks (0) | Category: Chemical News | Press Coverage

Drugs for Multiple Sclerosis: Worth the Price, Or Not?

Posted by Derek

Now this is an uncomfortable study, if you're in the business of treating multiple sclerosis. An article in Neurology looks at the cost-effectiveness of several disease-modifying therapies: the two interferon-beta-1as (Avonex and Rebif), interferon-beta-1b (Betaseron), and the immune modulator glatiramer acetate (Copaxone). The authors tracked ten-year quality of life, including lost time at work, overall time without relapses, and so on, and compared that with the cost of treatment.

The final figure is in dollars per quality-adjusted life year (QALY). That's not the most exact calculation in the world, but if you're going to try to rank cost-effectiveness, no measure is going to be without controversy. There's been a lot of debate about this in the UK, where the NICE explicitly uses these figures in its recommendations, and if you haven't heard much about the concept over here, well, you're definitely going to. What's considered a good figure? To give you an idea, the NICE starts raising an eyebrow at about $40K to $50K (based on 1.62 dollars to the pound). Here, we'll stand for $100K to $150K.
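For anyone who hasn't run into the metric, the arithmetic itself is nothing fancy: extra cost divided by extra quality-adjusted life years, relative to the comparator. A toy calculation (every number below is invented to show the math; none of them comes from the Neurology study):

```python
# Toy cost-per-QALY calculation; all numbers are invented for illustration.
def cost_per_qaly(extra_cost, extra_qalys):
    """Incremental cost-effectiveness ratio versus the comparator therapy."""
    return extra_cost / extra_qalys

# Say a therapy costs an extra $40,000 a year over ten years and buys an
# extra 0.05 quality-adjusted life years per year over that span.
extra_cost = 40_000 * 10    # $400,000
extra_qalys = 0.05 * 10     # 0.5 QALY
print(f"${cost_per_qaly(extra_cost, extra_qalys):,.0f} per QALY")   # $800,000 per QALY
```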

And how do the MS drugs compare? Closer to $1 million/QALY than any of those figures. All of them were above $800,000/QALY. In other words, the benefits of these drugs are real (although Copaxone's were less impressive compared to the interferons), but are they real enough to justify their prices? It'll be quite interesting to see where Gilenya (fingolimod) will land once it gets more of a track record in the real world. Note that the price of all these drugs has gone up since the study's calculations (while their effectiveness has presumably not budged) and that Gilenya, I believe, costs even more than the rest of them.

This naturally brings up all the usual questions about drug pricing. In no particular order, and with no priority given to those that I agree with, we have: Who says you can put a price on quality of life? Well, if you can't, then why can't drug companies just charge whatever the market will bear? What market - drug pricing is about the worst example of a free market you could ask for! Well, what if people want to pay out of their own pockets - shouldn't they be free to? Right, sure, who does that, and how much would sales fall if everyone had to? But still, shouldn't people be able to get what therapies are available - who's the person who gets to tell patients that they can't have what's out there? OK, but since a lot of this is Medicare and the like, are we supposed to pay for everything to be done to everyone forever? And why should these drugs cost so much, anyway? Well, because insurance companies apparently will pay for them - why don't you go complain to them? And so on. I honestly have no idea what the end to these arguments might be, but studies like this one are going to force us to have them again. Here's more from the New York Times and from Bloomberg, with a hat tip to FiercePharma.

Comments (25) + TrackBacks (0) | Category: Drug Prices | The Central Nervous System

July 20, 2011

Will Macrocycles Get It Done?

Posted by Derek

Here's an article from Xconomy on Ensemble Therapeutics, a company that spun off from work in David Liu's lab at Harvard. Their focus these days is on a huge library of macrocyclic compounds (prepared by using DNA tags to bring the reactants together, which is a topic for a whole different post). They're screening against several targets, and with several partners. Why macrocycles?

Well, there's been a persistent belief, with some evidence behind it, that medium- and large-ring compounds are somehow different. Cyclic peptides certainly can be distinguished from their linear counterparts - some of that can be explained by their being unnatural (and poor) substrates for some of the proteases that would normally clear them out, but there can be differences in distribution and cell penetration as well. The great majority of non-peptidic macrocycles that have been studied in biological systems are natural products - plenty of classic antibiotics and the like are large rings. I worked on one for my PhD, although I never quite closed the ring on the sucker.

You can look at that natural product distribution in two ways: one view might be that we have an exaggerated idea of the hit rate of macrocycles, because we've been looking at a bunch of evolutionarily optimized compounds. But the other argument is that macrocycles aren't all that easy to make, therefore evolutionary pressures must have led to so many of them for some good reasons, and we should try to take advantage of the evidence that's in front of us.

What's for sure is that macrocyclic compounds are under-represented in drug industry screening collections, so there's an argument to be made just on that basis. (You do see them once in a while). And the chemical space that they cover is probably not something that other compounds can easily pick up. Large rings are a bit peculiar - they have some conformational flexibility, in most cases, but only within a limited range. So if you're broadly in the right space for hitting a drug target, you probably won't pay as big an entropic penalty when a macrocycle binds. It already had its wings clipped to start with. And as mentioned above, there's evidence that these compounds can do a better job of crossing membranes than you'd guess from their size and functionality. One hope is that these properties will allow molecular weight ranges to be safely pushed up a bit, allowing a better chance for hitting nontraditional targets such as protein-protein interactions.

All this has led to a revival of med-chem interest in the field, so Ensemble is selling their wares at just the right time. One reason that there haven't been so many macrocycles in the screening decks is that they haven't been all that easy to make. But besides Liu's DNA templating, some other interesting synthetic methods have been coming along - the Nobel-worthy olefin metathesis reaction has been recognized for some time as a good entry into the area, and Keith James out at Scripps has been publishing on macrocyclic triazoles via the copper-catalyzed click reaction. Here's a recent review in J. Med. Chem., and here's another. It's going to be interesting to see how this all works out - and it's also a safe bet that this won't be the only neglected and tricky area that we're going to find ourselves paying more attention to. . .

Comments (31) + TrackBacks (0) | Category: Chemical News | Drug Development

July 19, 2011

Sezen / Sames: What Does it Say About Grad School?

Posted by Derek

If you haven't seen it, Chembark has Part III of the series on the Sezen/Sames research scandal. And it's another good one, focusing this time on Prof. Sames and his responsibilities in the whole affair. Everyone who's interested should go over to Paul's blog to read what he has to say about things. He's not keeping things bottled up:

Apparently, there is a double standard when it comes to judging students and professors. I guess that shouldn’t surprise anyone. Apparently, students should be fired for failure to replicate fictitious results, but professors are to be rewarded with tenure for being so grossly negligent as to oversee the greatest case of scientific misconduct in the history of organic chemistry.

But that quote shouldn't give you the idea that his post is all invective - there's a lot to back up those statements as well. I'll add that I'm not surprised by a double standard, either - after all, tenured professors are around for years. They bring in grant money (and overhead), while students. . .well, they're transient, and there are always more of them where the last bunch came from.

And while I think it would be a good thing if some of that were to change, I'm not optimistic about that happening. Unstacking that deck would be very, very hard. What would help a bit, though, would be for graduate students (and prospective graduate students) to realize that the deck is stacked, or in some of the more extreme cases of cluelessness, to realize that the deck exists in the first place. Forewarned is forearmed. You are in a very unequal and potentially precarious position as a graduate student, which is one of the reasons my standard grad-school advice is to get a PhD as quickly as is consistent with honor and propriety. Don't hang around one day longer than you have to. My own university educated me in that regard: whenever it was more advantageous for them to consider us students, well, that's what we were. Did it then, five minutes later, cost them less money and trouble with respect to some other issue to consider us staff? Then we were staff. Whatever put the university in a better relative position or allowed them to save a nickel.

That's not to say the world beyond graduate school is fair, because it isn't, of course. Wide-ranging hopes in that line will not serve you well. Fred Schwed put it well, quoting what he called "the falsest text in the language" (from Sterne), to the effect that the Lord tempereth the wind to the shorn lamb. "He doesn't, you know," said Schwed. "Look around you." But at least in some other spheres there are usually more options, more means of redress, than are available to any graduate student. Those problems with university administration are small compared to the potential for trouble with your own professor, and in many cases there's not a damn thing you can do about it - even being in the right may help least of all. The students dismissed from the Sames group over Sezen's work appear, from this vantage point, to have been quite correct about her conduct and the quality of her work. In vain.

Comments (68) + TrackBacks (0) | Category: Graduate School | The Dark Side

Back to Blogging

Posted by Derek

Just wanted to let everyone know that I'm back, and back to blogging. As usual, I'm spending some of my morning working out again what it is that I do for a living - although, again as usual, many people have been coming by my office trying to remind me. And I'll be putting up a post at lunchtime.

But in the meantime, since I've been totally out of most any loop you could name, what have I missed in the last week? Anything interesting?

Comments (16) + TrackBacks (0) | Category: Blog Housekeeping

July 13, 2011

Book Review: The Quest for the Cure

Posted by Derek

I wanted to mention that I have a review up at Cell for a new book by Brent Stockwell (at Columbia): The Quest for the Cure: The Science and Stories Behind the Next Generation of Medicines. I found it a good summary of recent drug discovery, and a look at the attempts to attack "undruggable" targets like protein-protein interactions, transcription factors, and so on. It's written for an educated general readership, and one of the things I wondered about was how books like this find an audience.

Comments (45) + TrackBacks (0) | Category: Book Recommendations

July 11, 2011

On and Off

Posted by Derek

I'm going to be traveling this week, so posting will be intermittent. (It's summer, after all). I'll surface now and then, but for the most part, things will be quiet around here. Enjoy the weather, if you have weather to enjoy!

Comments (3) + TrackBacks (0) | Category: Blog Housekeeping

July 8, 2011

The Sames / Sezen Fraud Case: Holy Cow

Posted by Derek

C&E News has an extraordinary piece on the long-running Bengü Sezen case at Columbia U. They've obtained two detailed reports from the federal government on the matter, and, well, pretty much all one's worst suspicions are confirmed:

By the time Sezen received a Ph.D. degree in chemistry in 2005, under the supervision of Sames, her fraudulent activity had reached a crescendo, according to the reports. Specifically, the reports detail how Sezen logged into NMR spectrometry equipment under the name of at least one former Sames group member, then merged NMR data and used correction fluid to create fake spectra showing her desired reaction products.

The documents paint a picture of Sezen as a master of deception, a woman very much at ease with manipulating colleagues and supervisors alike to hide her fraudulent activity; a practiced liar who would defend the integrity of her research results in the face of all evidence to the contrary. Columbia has moved to revoke her Ph.D.

. . .the reports echo sources from inside the Sames lab who spoke with C&EN under conditions of anonymity when the case first became public in 2006. These sources described Sezen as Sames’ “golden child,” a brilliant student favored by a mentor who believed that her intellect and laboratory acumen provoked the envy of others in his research group. They said it was hard to avoid the conclusion that Sames retaliated when other members of his group questioned the validity of Sezen’s work.

For more on this, see ChemBark, where the same documents have been obtained (via FOIA requests). Here's Part One, and Part Two. The site has been on this case for a long time now, and that's the place to go for the details - and if you're a chemist, or are interested in what human beings are capable of getting up to, you'll want to read them. Years of faked NMR spectra, faked reaction products, faked logbooks - this is surely one of the longest-running and most thorough frauds in modern organic chemistry history. I await Part Three!

Comments (126) + TrackBacks (0) | Category: The Dark Side

The Duke Cancer Scandal and Personalized Medicine

Posted by Derek

Here's a good overview from the New York Times of the Duke scandal. Basically, a team there spent several years publishing high-profile papers, and getting high-profile funding, and treating cancer patients based on their own tumor-profiling biomarker work. Which was shoddy, as it turns out, and useless, and wasted everyone's time, money, and (in some cases) the last weeks or months of people's lives. I think that about sums it up. It was Keith Baggerly at M. D. Anderson who really helped catch what was going on, and Retraction Watch has a good link to his presentation on the whole subject.

The lead investigator in this sordid business, Anil Potti, ended up retracting four papers on the work and left Duke last fall (although he's since resurfaced at a cancer treatment center in South Carolina). That's an interesting hiring decision. Looking over the case (and such details of it as Potti lying about having a Rhodes Scholarship), I don't think I'd consider hiring him to mow my yard. Perhaps that statement will be something for his online reputation management outfit to deal with.

But enough about Dr. Potti himself; I hope I never hear about him again. What this case illustrates are several very important problems with the whole field of personalized medicine, and with its public perception. First off, for some years now, everyone has been hearing about the stuff: the coming age of individual cancer treatment, biomarkers, zeroing in on the right drugs for the right patient, and so on. You'd almost get the impression that this age is already here. But it isn't, not yet. It's just barely, barely begun. By one estimate, no major new cancer biomarker has been approved for clinical use in 25 years. (Update: changed the language here to reflect differences of opinion!)

Why is that? What's holding things up? We can read off DNA so quickly these days - what's to stop us from just ripping through every cancer sample there is, matching those up with who responded to which treatment regime and which cancer targets are (over)expressed, and there you have it. That's what all these computers are for, right?

Well, that sort of protocol has, in fact, occurred to many researchers. And it's been tried, over and over, without a whole lot of success. Now, there are some good correlations, here and there - but the best ones tend to be in relatively rare tumor types. There's nowhere near as much overlap as we'd like between the cancers that present the most serious public health problems and the ones that we have good biomarker-driven treatment data for. Breast cancer may be one of the fields where things have moved along the most - treatment really is affected by checking for things like Her-2. But it's not enough, nowhere near enough.

So why, then, is that the case? Several reasons - for one, tumor biology is clearly a lot more complex than we'd like it to be. Many common forms of cancer present as a host of mutated cells, each with a host of mutations (see this breast cancer work for an example). And they're genetically unstable, constantly changing. That's why so many cancers relapse after initially successful treatment - you kill off the tumor cells that can be killed off, but that may just give the ones that are left a free field.

Given this state of affairs, and the huge need (and demand) for something that works, the field is primed for just the sort of trouble that occurred at Duke. Someone unscrupulous would have no problem convincing people that a hot new biomarker was worthwhile - any patients that survived would praise it to the skies, while the ones that didn't would not be around to add their perspective. And even without criminal behavior, it's all too easy for researchers to honestly believe that they're on to something, even when that isn't true. The statistical workup needed to go through data sets like these is not trivial; you really have to know what you're doing. Adding to the problem, a number of judgment calls can be made along the way about what to allow, what to emphasize, and what to ignore.
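On that "you have to know what you're doing" point: one classic trap in these signature studies is picking the genes on the full data set and then testing the resulting model on those same samples. Here's a minimal sketch of the honest alternative, with feature selection kept inside the cross-validation loop - scikit-learn on random stand-in data, not any real expression profiles:

```python
# Minimal sketch: cross-validate the whole pipeline (gene selection + model)
# so the selection step never sees the samples it is later scored on.
# Data are random stand-ins, not real expression profiles.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5000))      # 100 "patients", 5000 "genes" of pure noise
y = rng.integers(0, 2, size=100)      # random "responder" labels

pipeline = make_pipeline(SelectKBest(f_classif, k=50),
                         LogisticRegression(max_iter=1000))

# Selection happens inside each fold, so accuracy hovers near chance (~0.5),
# as it should for noise. Selecting the 50 "best" genes on the full data
# first would report a flattering, and completely spurious, accuracy.
print(cross_val_score(pipeline, X, y, cv=5).mean())
```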

The other problem is that cancer is such an emotional issue. It's very easy for anyone with a drum to beat to join in at full volume. Do you think that the FDA is letting all sorts of toxic junk through? Or do you think that the FDA is killing people by being stupidly cautious? Are drug companies ignoring dying patients, or ruthlessly profiteering off them? Are there too few good ideas for people to work on, or too many? Come to oncology; you can find plenty of support for whatever position you like. They can't all be right, but when did that ever slow anyone down? Besides, that means that there will invariably be Wrong-Thinking Evil People on the other side of any topic, and that's always stimulating, too.

It is, in fact, a mess. Nor are we out of it. But our only hope is to keep hacking away. Wish us luck!

Comments (22) + TrackBacks (0) | Category: Cancer | Clinical Trials | Regulatory Affairs | The Dark Side

July 7, 2011

Phenotypic Screening For the Win

Posted by Derek

Here's another new article in Nature Reviews Drug Discovery that (for once) isn't titled something like "The Productivity Crisis in Drug Research: Hire Us And We'll Consult Your Problems Away". This one is a look back at where drugs have come from.

Looking over drug approvals (259 of them) between 1999 and 2008, the authors find that phenotypic screens account for a surprising number of the winners. (For those not in the business, a phenotypic screen is one where you give compounds to some cell- or animal-based assay and look for effects. That's in contrast to the target-based approach, where you identify some sort of target as being likely important in a given disease state and set out to find a molecule to affect it. Phenotypic screens were the only kinds around in the old days (before, say, the mid-1970s or thereabouts), but they've been making a comeback - see below!)

Out of the 259 approvals, there were 75 first-in-class drugs and 164 followers (the rest were imaging agents and the like). 100 of the total were discovered using target-based approaches, 58 through phenotypic approaches, and 18 through modifying natural substances. There were also 56 biologics, which were all assigned to the target-based category. But out of the first-in-class small molecules, 28 of them could be assigned to phenotypic assays and only 17 to target-based approaches. Considering how strongly tilted the industry has been toward target-based drug discovery, that's really disproportionate. CNS and infectious disease were the therapeutic areas that benefited the most from phenotypic screening, which makes sense. We really don't understand the targets and mechanisms in the former, and the latter provide what are probably the most straightforward and meaningful phenotypic assays in the whole business. The authors' conclusion:

(this) leads us to propose that a focus on target-based drug discovery, without accounting sufficiently for the MMOA (molecular mechanism of action) of small-molecule first-in-class medicines, could be a technical reason contributing to high attrition rates. Our reasoning for this proposal is that the MMOA is a key factor for the success of all approaches, but is addressed in different ways and at different points in the various approaches. . .

. . .The increased reliance on hypothesis-driven target-based approaches in drug discovery has coincided with the sequencing of the human genome and an apparent belief by some that every target can provide the basis for a drug. As such, research across the pharmaceutical industry as well as academic institutions has increasingly focused on targets, arguably at the expense of the development of preclinical assays that translate more effectively into clinical effects in patients with a specific disease.

I have to say, I agree (and have said so here on the blog before). It's good to see some numbers put to that belief, though. This, in fact, was the reason why I thought that the NIH funding for translational research might be partly spent on new phenotypic approaches. Will we look back on the late 20th century/early 21st as a target-based detour in drug discovery?

Comments (36) + TrackBacks (0) | Category: Drug Assays | Drug Development | Drug Industry History

July 6, 2011

A First Step Toward A New Form of Life

Posted by Derek

There's been a real advance in the field of engineered "unnatural life", but it hasn't produced one-hundredth the headlines that the arsenic bacteria story did. This work is a lot more solid, although it's hard to summarize in a snappy way.

Everyone knows about the four bases of DNA (A, T, C, G). What this team has done is force bacteria to use a substitute for the T, thymine - 5-chlorouracil, which has a chlorine atom where thymine's methyl group is. From a med-chem perspective, that's a good switch. The two groups are about the same size, but they're different enough that the resulting compounds can have varying properties. And thymine is a good candidate for a swap, since it's not used in RNA, thus limiting the number of systems that have to change to accommodate the new base. (RNA, of course, uses uracil instead, the unsubstituted parent compound of both thymine and the 5-chloro derivative used here).

Over the years, chlorouracil has been studied in DNA for just that reason, and it's been found to make the proper base-pair hydrogen bonds, among other things. So incorporating it into living bacteria looks like an experiment in just the right spot - different enough to be a real challenge, but similar enough to be (probably) doable. People have taken a crack at similar experiments before, with mixed success. In the 1970s, mutant hamster cells were grown in the presence of the bromo analog, and apparently generated DNA which was strongly enriched with that unnatural base. But there were a number of other variables that complicated the experiment, and molecular biology techniques were in their infancy at the time. Then in 1992, a group tried replacing the thymine in E. coli with uracil, with multiple mutations that shut down the T-handling pathways. They got up to about 90% uracil in the DNA, but this stopped the bacteria from growing - they just seemed to be hanging on under those T-deprived conditions, but couldn't do much else. (In general, withholding thymine from bacterial cultures and other cells is a good way to kill them off).

This time, things were done in a more controlled manner. The feat was accomplished by good old evolutionary selection pressure, using an ingenious automated system. An E. coli strain was produced with several mutations in its thymine pathways to allow it to survive under near-thymine-starvation conditions. These bacteria were then grown in a chamber where their population density was being constantly measured (by turbidity). Every ten minutes a nutrient pulse went in: if the population density was above a set limit, the cells were given a fixed amount of chlorouracil solution to use. If the population had fallen below a set level, the cells received a dose of thymine-containing solution to keep them alive. A key feature of the device was the use of two culture chambers, with the bacteria being periodically swapped from one to the other (after which the first chamber undergoes sterilization with 5M sodium hydroxide!) That's to keep biofilm formation from giving the bacteria an escape route from the selection pressure, which is apparently just what they'll do, given the chance. One "culture machine" was set for a generation time of about two hours, and another for a 4-hour cycle (by cutting in half the nutrient amounts). This cycle selected for mutations that allowed the use of chlorouracil throughout the bacteria's biochemistry.
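The control logic, as described, is a plain feedback loop. Here's a schematic sketch of it - the threshold, interval, and function names are placeholders, not values or code from the paper:

```python
# Schematic of the selection loop described above. Threshold, interval, and
# the hardware-facing functions are all placeholders, not from the paper.
import time

DENSITY_THRESHOLD = 0.4    # arbitrary turbidity cutoff
PULSE_INTERVAL = 600       # seconds: a nutrient pulse every ten minutes

def selection_cycle(read_turbidity, add_chlorouracil_medium, add_thymine_medium):
    while True:
        if read_turbidity() >= DENSITY_THRESHOLD:
            add_chlorouracil_medium()   # population healthy: keep the pressure on
        else:
            add_thymine_medium()        # population crashing: rescue it
        time.sleep(PULSE_INTERVAL)
        # (Periodically, the culture is also swapped into the other, freshly
        # sterilized chamber to keep biofilms from dodging the selection.)
```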

And that's what happened - the proportion of the chlorouracil solution that went in went up with time. The bacterial population had plenty of dramatic rises and dips, but the trend was clear. After 23 days, the experimenters cranked up the pressure - now the "rescue" solution was a lower concentration of thymine, mixed 1:1 with chlorouracil, and the other solution was a lower concentration of chlorouracil only. The proportion of the latter solution used still kept going up under these conditions as well. Both groups (the 2-hour cycle and the 4-hour cycle ones) were consuming only chlorouracil solution by the time the experiment went past 140 days or so.

Analysis of their DNA showed that it had incorporated about 90% chlorouracil in the place of thymine. The group identified a previously unknown pathway (U54 tRNA methyltransferase) that was bringing thymine back into the pathway, and disrupting this gene knocked the thymine content down to just above detection level (1.5%). Mass spec analysis of the DNA from these strains clearly showed the chlorouracil present in DNA fractions.

The resulting bacteria from each group, it turned out, could still grow on thymine, albeit with a lag time in their culture. If they were switched to thymine media and grown there, though, they could immediately make the transition back to growing on chlorouracil, which shows that their ability to do so was now coded in their genomes. (The re-thymined bacteria, by the way, could be assayed by mass spec as well for the disappearance of their chlorouracil).

These re-thymined bacteria were sequenced (since the chlorouracil mutants wouldn't have matched up too well with sequencing technology!) and they showed over 1500 base substitutions. Interestingly, there were twice as many in the A-T to G-C direction as the opposite, which suggests that chlorouracil tends to mispair a bit with guanine. The four-hour-cycle strain had not only these sorts of base swaps, but also some whole chromosome rearrangements. As the authors put it, and boy are they right, "It would have been impossible to predict the genetic alterations underlying these adaptations from current biological knowledge. . ."
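Tallying those substitution directions is simple enough once you have the reference and the evolved sequence lined up. A toy sketch (it assumes pre-aligned, equal-length sequences, which glosses over the real alignment and rearrangement headaches just mentioned):

```python
# Toy tally of substitution directions between aligned sequences; real genome
# comparisons would need proper alignment and handling of rearrangements.
from collections import Counter

def substitution_spectrum(reference, evolved):
    counts = Counter()
    for ref_base, new_base in zip(reference, evolved):
        if ref_base != new_base:
            counts[f"{ref_base}->{new_base}"] += 1
    return counts

# Made-up example sequences
print(substitution_spectrum("ATGCATGCATGC", "GTGCACGCGTGC"))
# A->G and T->C changes together make up the "A-T to G-C" direction
```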

These bacteria are already way over to the side of all the life on Earth. But the next step would be to produce bacteria that have to live on chlorouracil and just ignore thymine. If that can be realized, the resulting organisms will be the first representatives of a new biology - no cellular life form has ever been discovered that completely switches out one of the DNA bases. These sorts of experiments open the door to organisms with expanded genetic codes, new and unnatural proteins and enzymes, and who knows what else besides. And they'll be essentially firewalled from all other living creatures.

Postscript: and yes, it's occurred to me as well that this sort of system would be a good way to evolve arsenate-using bacteria, if they do really exist. The problem (as it is with the current work) is getting truly phosphate-free media. But if you had such, and ran the experiment, I'd suggest isolating small samples along the way and starting them fresh in new apparatus, in order to keep the culture from living off the phosphate from previous generations. Trying to get rid of one organic molecule is hard enough; trying to clear out a whole element is a much harder proposition.

Comments (17) + TrackBacks (0) | Category: Biological News | Chemical Biology | Life As We (Don't) Know It

July 5, 2011

Fakery, As Revealed By Figures

Posted by Derek

I note via Retraction Watch that the Journal of Biological Chemistry has issued retraction notices for four papers published from the group of the late Maria Diverse-Pierluissi, at the Mt. Sinai School of Medicine. One of their readers looked over the papers (which had been cited a few times, without making any particular huge impact, it seems), and found that some of the figures (Western blots and so on) repeat, even though they're supposed to represent different things (e.g., Figure 3A and 3C here).

Mt. Sinai told the Retraction Watch people that an internal investigation turned up the evidence of misconduct, and that the matter has been referred to the NIH, which funded the work. What those duplicate figures make me wonder, though, is how long it'll be before we have a plagiarized-figure search tool, in the same way that we have plagiarized-text tools running? There's already something similar out there - TinEye - and I'm sure that much nicer systems are available for a fee. Have any scientific journals implemented something like this?
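For what it's worth, the core of such a tool wouldn't have to be exotic. Here's a rough sketch of a first-pass screen using perceptual hashes (Pillow and the imagehash library; the folder name is a placeholder, and a real tool would also need to segment multi-panel figures and cope with crops, flips, and contrast tweaks):

```python
# Rough first-pass screen for near-duplicate figure images via perceptual
# hashing. The folder is a placeholder; a serious tool would also segment
# multi-panel figures and handle crops, rotations, and contrast changes.
from itertools import combinations
from pathlib import Path

import imagehash
from PIL import Image

def find_near_duplicates(folder, max_distance=5):
    hashes = {p: imagehash.phash(Image.open(p)) for p in Path(folder).glob("*.png")}
    for (path_a, hash_a), (path_b, hash_b) in combinations(hashes.items(), 2):
        if hash_a - hash_b <= max_distance:    # Hamming distance between hashes
            yield path_a, path_b

for a, b in find_near_duplicates("extracted_figures"):
    print(f"Possible duplicate: {a.name} vs. {b.name}")
```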

Comments (18) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

July 2, 2011

Innovation and Return (Europe vs. the US)

Posted by Derek

Here's another look at the productivity problems in drug R&D. The authors are looking at attrition rates, development timelines, targets and therapeutic areas, and trying to find some trends to explain (or at least illuminate) what's been going on.

Their take? Attrition rates have been rising at all phases of drug development, and most steeply in Phase III. (This sounds right to me). Here are their charts:
[Chart: attrition rates by phase of development]
And when they look at where the drug R&D efforts have been going, they find that comparatively more time and money has been spent on targets with lower probability of success. That means (among other things) more oncology, Alzheimer's, arthritis, Parkinson's et al. and less cardiovascular and anti-HIV.

That makes sense, too, in a paradoxical way. If we were to get drugs in those areas, the expected returns would be higher than if we found them in the well-established ones. The regulatory barriers would be smaller, the competition thinner, the potential markets are enthusiastic about new therapies - everything's lined up. If you can find a drug, that is. The problem is the higher failure rates. We knew that going in, of course, but the expectation was that the greater rewards would cancel that out. But what if they don't? What if, for a protracted period, there are no rewards at all?
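The tradeoff is just expected-value arithmetic, and it's worth seeing how quickly it can turn against you. A back-of-the-envelope sketch (all the probabilities and dollar figures below are invented for illustration and are not taken from the paper):

```python
# Back-of-the-envelope program expected value; every number here is invented
# purely to illustrate the tradeoff, none comes from the paper discussed above.
def expected_value(phase_success_rates, payoff_musd, cost_musd):
    p_overall = 1.0
    for p in phase_success_rates:
        p_overall *= p
    return p_overall * payoff_musd - cost_musd

# "Established" mechanism: decent odds through the clinic, modest payoff
print(expected_value([0.6, 0.4, 0.6], payoff_musd=2000, cost_musd=120))   # +168

# "Novel" mechanism: double the payoff, but Phase II/III odds cut sharply
print(expected_value([0.6, 0.2, 0.3], payoff_musd=4000, cost_musd=120))   # +24
```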

The paper also has a very interesting analysis of European firms versus US ones. Instead of looking at where companies might be headquartered, the authors used the addresses of the inventors on patent filings as a better location indicator. Over 18,000 projects started by companies or public research organizations between 1990 and 2007 were examined, and they found:

Although at a first glance, European organizations seem to have higher success rates compared with US organizations, after controlling for the larger share of biotechnology companies and PROs in the United States and for differences in the composition of R&D portfolios, there is no significant gap between European and US organizations in this respect. Unconditional differences (that is, differences arising when no controls are taken into account) are driven by the higher propensity of US organizations to focus on novel R&D methodologies and riskier therapeutic endeavours. . .as an average US organization takes more risk, when successful, they attain higher price premiums than the European organizations.

The other take-home has to do with "me-too" compounds versus first-in-class ones, and is worth considering:

". . .both private and public payers discourage incremental innovation and investments in follow-on drugs in already established therapeutic classes, mostly by the use of reference pricing schemes and bids designed to maximize the intensity of price competition among different molecules. Indeed, in established markets, innovative patented drugs are often reimbursed at the same level as older drugs. As a consequence, R&D investments tend to focus on new therapeutic targets, which are characterized by high uncertainty and difficulty, but lower expected post-launch competition. Our empirical investigation indicates that this reorienting of investments accounts for most of the recent decline in productivity in pharmaceutical R&D, as measured in terms of attrition rates, development times and the number of NMEs launched."

So, rather than being in trouble for not trying to be innovative enough, according to these guys, we're in trouble for innovating too much. . .

Comments (26) + TrackBacks (0) | Category: Business and Markets | Drug Development | Drug Industry History

July 1, 2011

Avastin and Medicare

Posted by Derek

So, the FDA's advisors voted unanimously to remove Avastin's indication for metastatic breast cancer. Fine. But Medicare is saying that they'll continue to cover it? Really?

Now, as opposed to the other day, we're starting to talk about cost. Avastin is (famously) not cheap, and insurance companies often don't want to reimburse for off-label use of drugs. But if Medicare, well, doesn't care, then what? A number of insurance companies take their policies into account for their own coverage recommendations.

So this makes a person wonder what all the arguing over this issue has accomplished. Perhaps fewer oncologists will be willing to write off-label prescriptions after the FDA makes its call - there is that. But (on the one hand) this isn't looking quite like the consigning-people-to-death outcome that patient advocates were warning about. It also gives you an insight into health care costs, doesn't it? The FDA says "We don't recommend you use this. The clinical trial data don't support it." And Medicare says "Well, yeah, sure, but we'll pay for it, so what the hey". What, indeed, the hey?

Comments (13) + TrackBacks (0) | Category: Cancer | Drug Prices | Regulatory Affairs

The Histamine Code, You Say?

Posted by Derek

I've been meaning to link to John LaMattina's blog for some time now. He's a former R&D guy (and author of Drug Truths: Dispelling the Myths About Pharma R & D, which I reviewed here for Nature Chemistry), and he knows what he's talking about when it comes to med-chem and drug development.

Here he takes on the recent "Scientists Crack the Histamine Code" headlines that you may have seen this week. Do we have room, he wonders, for a third-generation antihistamine, or not?

Comments (17) + TrackBacks (0) | Category: Biological News | Drug Industry History