Corante

About this Author
College chemistry, 1983

Derek Lowe The 2002 Model

After 10 years of blogging. . .

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis, and other diseases. To contact Derek, email him directly: derekb.lowe@gmail.com. Twitter: Dereklowe

Chemistry and Drug Data: Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

Monthly Archives

March 29, 2013

Two New Books

Email This Entry

Posted by Derek

A colleague pointed out to me this week that there's a new edition of Copeland's Evaluation of Enzyme Inhibitors in Drug Discovery. I haven't seen this expanded and updated version, but the previous one was excellent. From the new preface:

. . .I have attempted to improve upon the first edition by substantially expanding most of the chapters with two overarching aims: to cover more completely the experimental aspects of the evaluation methods contained in each chapter and to enhance the clarity of the presentation, especially for the newcomer to applied enzymology. Toward these ends, a number of additional appendices have been added to the text, providing ready sources of useful information as they apply to quantitative biochemistry in drug discovery.

There are also two new chapters - one on residence time as a factor in enzyme inhibitor action, and another on the connections between in vitro enzymology and the factors in vivo that have to be considered for drug candidate selection. I have no problem recommending this one just on this basis.

And on a different (but still very useful) level, Erland Stevens of Davidson College has sent along a new textbook on medicinal chemistry that he's written for the advanced undergraduate/grad student market. I've looked it over, and it's a fine intro to the field, covering an impressively wide range of topics. All the classic stuff is there, but you'll also find references up to at least 2010, including things like George Whitesides' paper on linkers in fragment-based drug design, the structure of the P2X4 ion channel, and screening of crystallization conditions for API synthesis. If I were teaching a survey course on medicinal chemistry, I would be glad to use this as a text.

Comments (2) + TrackBacks (0) | Category: Book Recommendations

The Price of Publishing

Email This Entry

Posted by Derek

So, how much does it cost to publish a scientific paper, anyway? I'm not only talking about how much it costs you. That varies from journal to journal, and from type of journal to type of journal. One aspect of most open-access publishing models is that the author defrays editorial costs. (Which model is, of course, open to abuse by the sleazy). How much does it cost the publishers themselves, and how much do they make on it? There's an excellent overview at Nature that tries to put some real numbers on these questions:

Data from the consulting firm Outsell in Burlingame, California, suggest that the science-publishing industry generated $9.4 billion in revenue in 2011 and published around 1.8 million English-language articles — an average revenue per article of roughly $5,000. Analysts estimate profit margins at 20–30% for the industry, so the average cost to the publisher of producing an article is likely to be around $3,500–4,000.
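The Outsell numbers quoted above hang together arithmetically, and it's worth checking. A quick sketch (the figures are from the Nature piece; only the arithmetic is mine):

```python
# Back-of-the-envelope check of the Outsell figures quoted above.

revenue = 9.4e9      # science-publishing industry revenue, 2011 (USD)
articles = 1.8e6     # English-language articles published, 2011

revenue_per_article = revenue / articles   # ~ $5,222, i.e. "roughly $5,000"

# With profit margins of 20-30%, the cost per article is what's left over:
cost_low = revenue_per_article * (1 - 0.30)    # ~ $3,656
cost_high = revenue_per_article * (1 - 0.20)   # ~ $4,178

print(f"Revenue per article: ${revenue_per_article:,.0f}")
print(f"Cost to publisher:   ${cost_low:,.0f} - ${cost_high:,.0f}")
```

That lands right on the article's "$3,500–4,000" cost estimate.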

In case you were wondering why we have so many journals. And if you're still wondering, for some reason, try these numbers on:

Scientists pondering why some publishers run more expensive outfits than others often point to profit margins. Reliable numbers are hard to come by: Wiley, for example, used to report 40% in profits from its science, technology and mathematics (STM) publishing division before tax, but its 2013 accounts noted that allocating to science publishing a proportion of 'shared services' — costs of distribution, technology, building rents and electricity rates — would halve the reported profits. Elsevier's reported margins are 37%, but financial analysts estimate them at 40–50% for the STM publishing division before tax. (Nature says that it will not disclose information on margins.) Profits can be made on the open-access side too: Hindawi made 50% profit on the articles it published last year. . .

Might I add that scientific publishing, for all the upheavals in it, is probably a slightly less risky bet than drug discovery? I keep planning to do a big post on pharmaceutical profit margins - and when I do, it's going to sound like an accounting seminar - and this makes me want to move it closer to the top of the list.

Comments (10) + TrackBacks (0) | Category: The Scientific Literature

Sirtuins Live On at GSK

Email This Entry

Posted by Derek

Well, GSK is shutting down the Sirtris operation in Cambridge, but sirtuins apparently live on. I'm told that the company is advertising for chemists and biologists to come to Pennsylvania to staff the effort, and in this market, they'll have plenty of takers. We'll have the sirtuin drug development saga with us for a while yet. And I'm glad, actually, and no, not just because it gives me something to write about. I'd like to know what sirtuins actually are capable of doing in humans, and I'd like to see a drug or two come out of this. What the odds of that are, though, I couldn't say. . .

Comments (18) + TrackBacks (0) | Category: Drug Development

March 28, 2013

Pfizer Tears It All Down

Email This Entry

Posted by Derek

Yeah, I know, that's a headline that could have been used several times over the years. But this time, they mean it: the company is demolishing its former research headquarters off Eastern Point Road in Groton. And local officials and developers aren't happy at all:

Officials involved in the negotiations said last-minute obstacles thrown up by Pfizer after a major developer had offered to purchase the 750,000-square-foot complex known as Building 118 made it appear as though the pharmaceutical giant never had been serious about finding a buyer.

. . .(Developer Stu) Lichter echoed legislators' suspicions that Pfizer never really intended to sell the building, despite the fact that it will cost more to demolish the structure than it would have to sell it, even for a nominal price.

Lichter said Pfizer initially told him that the company had a schedule for demolition, but if a deal could be worked out within a certain timetable, officials would seriously consider an offer.

But, according to Lichter, Pfizer kept bringing up additional issues that would stall negotiations.

Guys, wrecking balls are what Pfizer does.

Comments (29) + TrackBacks (0) | Category: Business and Markets

X-Ray Structures Of Everything. Without Crystals. Holy Cow.

Email This Entry

Posted by Derek

There's an absolutely startling new paper out from Makoto Fujita and co-workers at the University of Tokyo. I've written a number of times here about X-ray crystallography, which can be the most powerful tool available for solving the structures of both large and small molecules - if you can get a crystal, and if that crystal is good enough. Advances in X-ray source brightness, in detectors, and in sheer computational power have all advanced the field far beyond what Sir Lawrence Bragg could have imagined. But you still need a crystal.

Maybe not any more, you don't. This latest paper demonstrates that if you soak a bit of crystalline porous "molecular sponge" in a solution of some small molecule, you can get the X-ray structure of the whole complex, small molecules and all. If you're not a chemist you might not feel the full effect of that statement, but so far, every chemist I've tried it out on has reacted with raised eyebrows, disbelief, and sometimes a four-letter exclamation for good measure. The idea that you can turn around and get a solid X-ray structure of a compound after having soaked a tiny piece of crystalline stuff in it is going to take some getting used to, but I think we'll manage.
The crystalline stuff in question turns out to be two complexes with tris(4-pyridyl)triazine and either cobalt isothiocyanate or zinc iodide. These form large cage-like structures in the solid state, with rather different forms, but each of them seems to be able to pick up small molecules and hold them in a repeating, defined orientation. Shown is a lattice of santonin molecules in the molecular cage, to give you the idea.

Just as impressive is the scale that this technique works on. They demonstrate that by solving the structure of a marine natural product, miyakosyne A, using a 5-microgram sample. I might add that its structure certainly does not look like something that is likely to crystallize easily on its own, and indeed, no crystal is known. By measuring the amount of absorbed material in other examples and extrapolating down to their X-ray sample size, the authors estimate that they can get a structure on as little as 80 nanograms of actual compound. Holy crap.

Not content with this, the paper goes on to show how this method can be applied to give a completely new form of analysis: LC/SCD. Yes, that means what it says - they show that you can run an HPLC separation on a mixture, dip bits of the molecular sponge in the fractions, and get (if you are so inclined) X-ray structures of everything that comes off your column. Now, this is not going to be a walk-up technique any time soon. You still need a fine source of X-rays, plenty of computational resources, and so on. But just the idea that this is possible makes me feel as if I'm reading science fiction. If this is as robust as it looks, the entire field of natural product structure determination has just ended.

Here's a comment in the same issue of Nature from Pierre Stallforth and Jon Clardy, whose opinions on X-ray crystallography are taken seriously by anyone who knows anything about the field. They describe the new work as "breathtakingly simple", adding that "One can even imagine that, in the near future, researchers will not bother trying to crystallize new molecules". Indeed one can.

I would guess that there are many more refinements to be made in what sorts of host frameworks are used - different ones are likely to be effective for different classes of compounds. A number of very interesting extensions to this idea are occurring to me right now, and I'm sure that'll be true for a lot of the people who will read it. But for now, what's in this paper is plenty. Nobel prizes have been given for less. Sir Lawrence Bragg, were he with us, would stand up and lead the applause himself.

Update: as those of you reading up on this have discovered by now, the literature on metal-organic frameworks (MOFs) is large and growing. But I wanted to highlight this recent report of one with pores large enough for actual proteins to enter. Will they?

And here's more on the story from Nature News.

Comments (40) + TrackBacks (0) | Category: Analytical Chemistry

March 27, 2013

A Therapy Named After You?

Email This Entry

Posted by Derek

Back last fall I wrote about Prof. Magnus Essand and his oncolytic virus research. He's gotten a good amount of press coverage, and has been trying all sorts of approaches to get further work funded. But here's one that I hadn't thought of: Essand and his co-workers are willing to name the therapy after anyone who can pony up the money to get it into a 20-patient human trial.

The more I think about that, the less problem I have with it. This looks at first like a pure angel investor move, and if people want to take a crack at something like this with their own cash, let them do the due diligence and make the call. Actually, Essand believes that his current virus is unpatentable (due to prior publication), so this is less of an angel investment and more sheer philanthropy. But I have no objections at all to that, either.

Update: here's more on the story.

Comments (12) + TrackBacks (0) | Category: Cancer | Clinical Trials | Drug Development

The DNA-Encoded Library Platform Yields A Hit

Email This Entry

Posted by Derek

I wrote here about DNA-barcoding of huge (massively, crazily huge) combichem libraries, a technology that apparently works, although one can think of a lot of reasons why it shouldn't. This is something that GlaxoSmithKline bought by acquiring Praecis some years ago, and there are others working in the same space.

For outsiders, the question has long been "What's come out of this work?" And there is now at least one answer, published in a place where one might not notice it: this paper in Prostaglandins and Other Lipid Mediators. It's not a journal whose contents I regularly scan. But this is a paper from GSK on a soluble epoxide hydrolase inhibitor, and therein one finds:

sEH inhibitors were identified by screening large libraries of drug-like molecules, each attached to a DNA “bar code”, utilizing DNA-encoded library technology [10] developed by Praecis Pharmaceuticals, now part of GlaxoSmithKline. The initial hits were then synthesized off of DNA, and hit-to-lead chemistry was carried out to identify key features of the sEH pharmacophore. The lead series were then optimized for potency at the target, selectivity and developability parameters such as aqueous solubility and oral bioavailability, resulting in GSK2256294A. . .

That's the sum of the med-chem in the article, which certainly compresses things, and I hope that we see a more complete writeup at some point from a chemistry perspective. Looking at the structure, though, this is a triaminotriazine-derived compound (as in the earlier work linked to in the first paragraph), so yes, you apparently can get interesting leads that way. How different this compound is from the screening hit is a good question, but it's noteworthy that a diaminotriazine's worth of its heritage is still present. Perhaps we'll eventually see the results of the later-generation chemistry (non-triazine).

Comments (12) + TrackBacks (0) | Category: Chemical Biology | Chemical News | Drug Assays | Drug Development

The NIH, Pfizer, and Senator Wyden

Email This Entry

Posted by Derek

Senator Ron Wyden (D-Oregon) seems to be the latest champion of the "NIH discovers drugs and Pharma rips them off" viewpoint. Here's a post from John LaMattina on Wyden's recent letter to Francis Collins. The proximate cause of all this seems to be the Pfizer JAK3 inhibitor:

Tofacitinib (Xeljanz), approved last November by the U.S. Food and Drug Administration, is nearing the market as the first oral medication for the treatment of rheumatoid arthritis. Given that the research base provided by the National Institutes of Health (NIH) culminated in the approval of Xeljanz, citizens have the right to be concerned about the determination of its price and what return on investment they can expect. While it is correct that the expenses of drug discovery and preclinical and clinical development were fully undertaken by Pfizer, taxpayer-funded research was foundational to the development of Xeljanz.

I think that this is likely another case where people don't quite realize the steepness of the climb between "X looks like a great disease target" and "We now have an FDA-approved drug targeting X". Here's more from Wyden's letter:

Developing drugs in America remains a challenging business, and NIH plays a critically important role by doing research that might not otherwise get done by the private sector. My bottom line: When taxpayer-funded research is commercialized, the public deserves a real return on its investment. With the price of Xeljanz estimated at about $25,000 a year and annual sales projected by some industry experts as high as $2.5 billion, it is important to consider whether the public investment has assured accessibility and affordability.

This is going to come across as nastier than I intend it to, but my first response is that the taxpayer's return on this was that they got a new drug where there wasn't one before. And via the NIH-funded discoveries, the taxpayers stimulated Pfizer (and many other companies) to spend huge amounts of money and effort to turn the original discoveries in the JAK field into real therapies. I value knowledge greatly, but no human suffering whatsoever was relieved by the knowledge alone that JAK3 appeared to play a role in inflammation. What was there was the potential to affect the lives of patients, and that potential was realized by Pfizer spending its own money.

And not just Pfizer. Let's not forget that the NIH entered into research agreements with many other companies, and that the list of JAK3-related drug discovery projects is a long one. And keep in mind that not all of them, by any means, have ever earned a nickel for the companies involved, and that many of them never will. As for Pfizer, Xeljanz has been on the market for less than six months, so it's too early to say how the drug will do. But it's not a license to print money, and is in a large, extremely competitive market. And should it run into trouble (which I certainly hope doesn't happen), I doubt if Senator Wyden will be writing letters seeking to share some of the expenses.

Comments (35) + TrackBacks (0) | Category: Academia (vs. Industry) | Drug Development | Drug Prices | Regulatory Affairs

March 26, 2013

Automated Med-Chem, At Last?

Email This Entry

Posted by Derek

I've written several times about flow chemistry here, and a new paper in J. Med. Chem. prompts me to return to the subject. This, though, is the next stage in flow chemistry - more like flow med-chem:

Here, we report the application of a flow technology platform integrating the key elements of structure–activity relationship (SAR) generation to the discovery of novel Abl kinase inhibitors. The platform utilizes flow chemistry for rapid in-line synthesis, automated purification, and analysis coupled with bioassay. The combination of activity prediction using Random-Forest regression with chemical space sampling algorithms allows the construction of an activity model that refines itself after every iteration of synthesis and biological result.

Now, this is the point at which people start to get either excited or fearful. (I sometimes have trouble telling the difference, myself). We're talking about the entire early-stage optimization cycle here, and the vision is of someone topping up a bunch of solvent reservoirs, hitting a button, and leaving for the weekend in the expectation of finding a nanomolar compound waiting on Monday. I'll bet you could sell that to AstraZeneca for some serious cash, and to be fair, they're not the only ones who would bite, given a sufficiently impressive demo and slide deck.

But how close to this Lab of the Future does this work get? Digging into the paper, we have this:

Initially, this approach mirrors that of a traditional hit-to-lead program, namely, hit generation activities via, for example, high-throughput screening (HTS), other screening approaches, or prior art review. From this, the virtual chemical space of target molecules is constructed that defines the boundaries of an SAR heat map. An initial activity model is then built using data available from a screening campaign or the literature against the defined biological target. This model is used to decide which analogue is made during each iteration of synthesis and testing, and the model is updated after each individual compound assay to incorporate the new data. Typically the coupled design, synthesis, and assay times are 1–2 h per iteration.

Among the key things that already have to be in place, though, are reliable chemistry (fit to generate a wide range of structures) and some clue about where to start. Those are not givens, but they're certainly not impossible barriers, either. In this case, the team (three UK groups) is looking for BCR-Abl inhibitors, a perfectly reasonable test bed. A look through the literature suggested coupling hinge-binding motifs to DFG-loop binders through an acetylene linker, as in Ariad's ponatinib. This, while not a strategy that will earn you a big raise, is not one that's going to get you fired, either. Virtual screening around the structure, followed by eyeballing by real humans, narrowed down some possibilities for new structures. Further possibilities were suggested by looking at PDB structures of homologous binding sites and seeing what sorts of things bound to them.

So already, what we're looking at is less Automatic Lead Discovery than Automatic Patent Busting. But there's a place for that, too. Ten DFG pieces were synthesized, in Sonogashira-couplable form, and 27 hinge-binding motifs with alkynes on them were readied on the other end. Then they pressed the button and went home for the weekend. Well, not quite. They set things up to try two different optimization routines, once the compounds were synthesized, run through a column, and through the assay (all in flow). One will be familiar to anyone who's been in the drug industry for more than about five minutes, because it's called "Chase Potency". The other one, "Most Active Under Sampled", tries to even out the distributions of reactants by favoring the ones that haven't been used as often. (These strategies can also be mixed). In each case, the model was seeded with binding constants of literature structures, to get things going.
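The design-make-test loop described above can be sketched in a few dozen lines. To be clear about what's mine and what's the paper's: the one-hot featurization, the simulated assay surface, and the 0.5 under-sampling penalty are all inventions for illustration, not the authors' implementation - only the overall scheme (Random Forest model, "Chase Potency" vs. "Most Active Under Sampled" selection, model updated after every assay) comes from the paper.

```python
# A toy sketch of the closed-loop flow med-chem cycle: model -> pick -> "make"
# -> "assay" -> update. The assay here is a made-up pIC50 surface.

import itertools
import random

import numpy as np
from sklearn.ensemble import RandomForestRegressor

random.seed(0)
rng = np.random.default_rng(0)

N_DFG, N_HINGE = 10, 27                     # 10 DFG pieces x 27 hinge binders
space = list(itertools.product(range(N_DFG), range(N_HINGE)))

def featurize(pair):
    """One-hot encode which DFG piece and which hinge binder were coupled."""
    x = np.zeros(N_DFG + N_HINGE)
    x[pair[0]] = 1.0
    x[N_DFG + pair[1]] = 1.0
    return x

def assay(pair):
    """Stand-in for the in-flow bioassay: an arbitrary noisy pIC50 surface."""
    return 5.0 + 0.2 * pair[0] + 0.1 * pair[1] + rng.normal(0, 0.2)

# Seed the model with a handful of "literature" results, as the paper does.
tested = random.sample(space, 5)
results = [assay(p) for p in tested]

model = RandomForestRegressor(n_estimators=100, random_state=0)

def pick_next(strategy):
    """Choose the next compound to synthesize under the given strategy."""
    model.fit(np.array([featurize(p) for p in tested]), np.array(results))
    untested = [p for p in space if p not in tested]
    preds = model.predict(np.array([featurize(p) for p in untested]))
    if strategy == "chase_potency":
        # Greedy: make whatever the model currently predicts is most potent.
        return untested[int(np.argmax(preds))]
    # "Most active under-sampled": down-weight building blocks used often.
    dfg_use = np.zeros(N_DFG)
    hinge_use = np.zeros(N_HINGE)
    for d, h in tested:
        dfg_use[d] += 1
        hinge_use[h] += 1
    scores = [preds[i] - 0.5 * (dfg_use[p[0]] + hinge_use[p[1]])
              for i, p in enumerate(untested)]
    return untested[int(np.argmax(scores))]

# Each loop iteration = design -> flow synthesis -> purification -> assay,
# with the model refit on every new data point.
for strategy in ("most_active_under_sampled", "chase_potency"):
    for _ in range(10):
        pair = pick_next(strategy)
        tested.append(pair)
        results.append(assay(pair))

print(f"Best simulated pIC50 after {len(tested)} compounds: {max(results):.2f}")
```

The design choice worth noticing is that the model refit happens inside every iteration - that's the "refines itself after every iteration" claim from the abstract, and it's what separates this from batch-mode library design.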

The first run, which took about 30 hours, used the "Under Sampled" algorithm to spit out 22 new compounds (there were six chemistry failures) and a corresponding SAR heat map. Another run was done with "Chase Potency" in place, generating 14 more compounds. That was followed by a combined-strategy run, which cranked out 28 more compounds (with 13 failures in synthesis). Overall, there were 90 loops through the process, producing 64 new products. The best of these were nanomolar or below.

But shouldn't they have been? The deck already has to be stacked to some degree for this technique to work at all in the present stage of development. Getting potent inhibitors from these sorts of starting points isn't impressive by itself. I think the main advantage to this is the time needed to generate the compound and the assay data. Having the synthesis, purification, and assay platform all right next to each other, with compound being pumped right from one to the other, is a much tighter loop than the usual drug discovery organization runs. The usual, if you haven't experienced it, is more like "Run the reaction. Work up the reaction. Run it through a column (or have the purification group run it through a column for you). Get your fractions. Evaporate them. Check the compound by LC/MS and NMR. Code it into the system and get it into a vial. Send it over to the assay folks for the weekly run. Wait a couple of days for the batch of data to be processed. Repeat."

The science-fictional extension of this is when we move to a wider variety of possible chemistries, and perhaps incorporate the modeling/docking into the loop as well, when it's trustworthy enough to do so. Now that would be something to see. You come back in a few days and find that the machine has unexpectedly veered off into photochemical 2+2 additions with a range of alkenes, because the Chase Potency module couldn't pass up a great cyclobutane hit that the modeling software predicted. And all while you were doing something else. And that something else, by this point, is. . .what, exactly? Food for thought.

Comments (16) + TrackBacks (0) | Category: Chemical News | Drug Assays | Drug Development

The Wyeth/Elan Insider Trading Case Resolves

Email This Entry

Posted by Derek

You may remember this insider trading scandal from last year, involving a lead investigator for Wyeth/Elan's trials of bapineuzumab for Alzheimer's.

Here's the sequel. The hedge fund involved has agreed to pay $600 million to settle the charges, although this does not get the manager himself off the hook (litigation in his case continues). Dr. Sidney Gilman, the investigator who leaked the information, has already been required to give back all his own gains, with interest and penalties.

Comments (6) + TrackBacks (0) | Category: Business and Markets | Clinical Trials | The Dark Side

A Malfunctioning Spambot

Email This Entry

Posted by Derek

A quick behind-the-scenes note: anyone who has a blog has to contend with comment spam. It seems like something that has no chance of leading to any clicks or any money, but it piles up anyway. Hundreds of junk comments sometimes show up in a single day around here, most all of which automatically go right into the bit-bucket and are never seen (by me or anyone else). As a side effect, that means legitimate comments that end up in the spam folder have to be rescued pretty quickly, or they'll scroll off into the void, pushed along by illiterate come-ons for replica sports jerseys and implausible money-making schemes.

A few of them make it in each day, though, and I noticed this morning while taking out the blog-trash that one spammer seems to have glitched up. Behold, pretty much every vague, oddly worded blogspam comment all at once, complete with Mad-Lib style word lists at every opportunity. I couldn't help but {laugh | cackle | snort} at the {stupidity | incompetence | cluelessness} of this {garbage | crap}.
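Those {option | option | option} blocks are "spintax": the bot is supposed to pick one choice from each set before posting, so every copy of the spam reads slightly differently. For the curious, here's how little machinery the spammer's broken script actually needed (a minimal sketch; the example sentence is my own, not an actual spam comment):

```python
# Expand spintax templates like "{laugh | cackle | snort}" by picking one
# option per block. Innermost blocks are resolved first, so nesting works.

import random
import re

def spin(text, rng=random):
    """Replace each {a | b | c} block with one randomly chosen option."""
    pattern = re.compile(r"\{([^{}]*)\}")   # matches innermost blocks only
    while pattern.search(text):
        text = pattern.sub(
            lambda m: rng.choice([s.strip() for s in m.group(1).split("|")]),
            text,
        )
    return text

print(spin("I couldn't help but {laugh | cackle | snort} at this {garbage | crap}."))
```

The glitched bot, evidently, skipped the `spin()` step and posted the raw template.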

Comments (11) + TrackBacks (0) | Category: Blog Housekeeping

March 25, 2013

The FDA's New Alzheimer's Guidance: Wonder or Blunder?

Email This Entry

Posted by Derek

You can get either answer, depending on whom you ask. Last month, the agency unveiled new guidelines for developing Alzheimer's therapies. They're trying to deal with the difficulty of showing actual cognitive improvement in more advanced patients, while at the same time figuring out how to evaluate therapies in early-stage patients who really aren't cognitively impaired yet. It's a worthy problem, for sure, and a good thing to be thinking about. Two of the agency's scientists laid out the thinking in a NEJM editorial:

The current landscape of research and drug development in Alzheimer's disease offers a study in contrasts. On the positive side, numerous discoveries over the past decade have begun to unmask complex pathophysiological processes that underlie disease progression. Such advances have, in part, resulted from large, well-organized observational studies, such as the Alzheimer's Disease Neuroimaging Initiative (ADNI), that have elucidated various disease biomarkers that reflect, or even predict, the progression of disease. On the negative side, drug discovery has been disappointing. Despite all best efforts to translate mechanistic insights concerning Alzheimer's disease into new drug products, several candidate agents have failed to demonstrate efficacy in large, well-designed, phase 3 clinical trials of late-stage disease.

That they have, and how. Avoiding these, or at least finding out ways to fail more cheaply, is very much on the minds of everyone working in the field. The New York Times, though, had a rather unexpected fit about the whole idea, culminating in this editorial:

. . .The goal is commendable — to find ways to prevent or slow the progression of this terrible disease before it can rob people of their mental capacities. But the proposal raises troubling questions as to whether the agency would end up approving drugs that provide little or no clinical benefit yet cause harmful side effects in people who take the medications for extended periods. . .

. . .F.D.A. officials say they would never approve drugs based on cognitive effects alone unless absolutely convinced that patients with very early-stage Alzheimer’s that is likely to progress to full-blown dementia could be reliably identified. It will be up to the drug companies or other sponsors of clinical trials to do the convincing.

The F.D.A.’s proposal is open for comment for 60 days. Independent analysts need to look hard at whether the F.D.A. should lower the bar for these drugs — or should demand a very high level of proof of safety and effectiveness before exposing still-healthy people to possible harm. Even if drugs are eventually approved under this new approach, it will be imperative to force manufacturers to conduct follow-up studies, as required by law, to see if patients benefit in the long run. . .

That's not a crazy point of view at all, though - I've worried about something similar myself. But I think that the Times is a bit too worked up over the current FDA guidance. On the other hand, as BioCentury has it in their latest issue (March 25), there are people in the biotech industry who have been talking up the new guidelines as some sort of regulatory breakthrough. "Not So Fast", is the response:

. . .The only reasonable conclusion to be drawn from a careful reading of both the guidance and the commentary is that while FDA would be willing to accelerate approval of AD drugs, the science isn’t there to allow it to do so. . .

The guidance states FDA would grant accelerated approval to a treatment for AD “based on the use of a biomarker as a single primary surrogate efficacy measure” — if a biomarker that reliably predicted clinical benefit existed.

As the guidance notes — and as anyone who follows the field knows — “no reliable evidence exists at the present time that any observed treatment effect on such a measure is reasonably likely to predict ultimate clinical benefit (the standard for accelerated approval), despite a great deal of research interest in understanding the role of biomarkers in AD.”

. . .The process of reaching scientific consensus on an appropriate assessment tool will take years. And no one knows how much longer it will take for regulators to generate sufficient confidence in the tool to use it as the sole basis for approving an AD drug.

I think that's the key part of all this. It's fine that the FDA is open to the idea of biomarkers for Alzheimer's; we're probably going to have to do it that way. But no one knows how to do that yet, and it's not like the agency is just going to pick one of the current measures and tell everyone that that's fine. What I would expect this latest initiative to do, actually, is to end up pushing more money into the hunt for such biomarkers, with perhaps less going to direct shots-on-goal. That's disappointing, from one perspective, but the shots on goal will bankrupt us all at the current rate.

Comments (8) + TrackBacks (0) | Category: Alzheimer's Disease | Regulatory Affairs

James Watson Likes Us, Anyway

Email This Entry

Posted by Derek

Around us we see changes in everything, but there are some constants that we can count on. James Watson, for example, is still James Watson:

While noting that genetics is vital, Watson said, "You could sequence 150,000 people with cancer and it's not going to cure anyone. It might give you a few leads, but it's not, to me, the solution. The solution is good chemistry. And that's what's lacking. We have a world of cancer biology trained to think genes. They don't think chemistry at all."

Watson, who had his entire genome sequenced, said the current level of cancer research funding is enough to find a cure. But he added that "most of the experiments we do are irrelevant ... We're not going to cure cancer by doubling the money. We're going to do it by being more intelligent. The money thing is just a red herring of people not thinking."

Read the article for more - he has a number of other opinions that (as usual) he's not shy about sharing with the audience (!).

Comments (10) + TrackBacks (0) | Category: Cancer

Advertising in the Supplementary Information?

Email This Entry

Posted by Derek

Here's a publication concern I'd never come across before. A reader sends word that an ACS journal asked him and his co-authors to remove the names of vendors and manufacturers in their Supporting Information, over concerns that this might be seen as some form of advertising. I think they were specifically thinking of whether the authors might have had academic discounts, etc., that influenced their selection of reagents and equipment.

But while I can see that point, I also think it's important to name suppliers. Any experienced chemist knows that a palladium catalyst from one supplier may well not be the same as one from another supplier, for example (unpaid, unsolicited endorsement: stick with Strem). To pick another issue, HPLC columns come in as many varieties as there are manufacturers - how are you supposed to honestly list your experimental details if you can't say whose columns you used? I don't see how you can have a complete writeup without these details, and I think that this outweighs the concerns about discounts.

My correspondent suggests a compromise: list all the brands, but also state whether any discounts were received. Has anyone else run into this issue?

Comments (19) + TrackBacks (0) | Category: The Scientific Literature

March 23, 2013

Quick E-mail Housekeeping Note

Email This Entry

Posted by Derek

I wanted to mention that several e-mails sent to me last night and this morning (Friday night and Saturday AM, EST) got deleted inadvertently while I was fumbling through them early this morning. There were some that I'd like to respond to, so if you sent me something recently, please feel free to re-send it. Note: this request does not apply to the people who keep asking if I will be a speaker at the First World International Summit Congress of Everything for Everybody, to be held somewhere in southern China, or to anyone who starts off by saying "Dear Purchasing Manager".

Comments (1) + TrackBacks (0) | Category: Blog Housekeeping

March 22, 2013

Trouble for a Whole Class of Diabetes Drugs?

Email This Entry

Posted by Derek

The FDA has been turning its attention to some potential problems with therapies that target the incretin pathways. That includes the DPP-IV inhibitors, such as Januvia (sitagliptin) and GLP-1 peptide drugs like Byetta and Victoza.

There had been reports (and FDA mentions) of elevated pancreatitis risks with GLP-1 drugs, but this latest concern is prompted by a recent paper in JAMA Internal Medicine that uses insurance company data to nail down the effect. Interestingly, the Endocrine Society has come out with a not-so-fast press release of its own, expressing doubts about the statistics of the new paper. I'm not quite sure why they're taking that side of the issue, but there it is.

For what it's worth, this looks to me like one of those low-but-real incidence effects, with consequences that are serious enough to make physicians (and patients) think twice. At the very least, you'd expect diabetic patients on these drugs to stay very alert to early signs of pancreatitis (which is really one of the last things you need to experience, and in fact, may be one of the last things you experience should the case arise). And this just points out how hard the diabetes field really is - there are already major cardiovascular concerns that have to be checked out with any new drug, and now we have pancreatitis cropping up with one of the large mechanistic classes. In general, diabetic patients can have a great deal wrong with their metabolic functions, and they have to take your drugs forever. While that last part might sound appealing from a business point of view, you're also giving every kind of trouble all the time it needs to appear. Worth thinking about. . .

Comments (8) + TrackBacks (0) | Category: Diabetes and Obesity | Toxicology

Good News in Oncology: More Immune Therapy for Leukemia

Email This Entry

Posted by Derek

I've written a couple of times about the work at the University of Pennsylvania on modified T-cell therapy for leukemia (CLL). Now comes word that a different version of this approach seems to be working at Sloan-Kettering. Recurrent B-cell acute lymphoblastic leukemia (B-ALL) has been targeted there, and it's generally a more aggressive disease than CLL.

As with the Penn CLL studies, when this technique works, it can be dramatic:

One of the sickest patients in the study was David Aponte, 58, who works on a sound crew for ABC News. In November 2011, what he thought was a bad case of tennis elbow turned out to be leukemia. He braced himself for a long, grueling regimen of chemotherapy.

Brentjens suggested that before starting the drugs, Aponte might want to have some of his T-cells stored (chemotherapy would deplete them). That way, if he relapsed, he might be able to enter a study using the cells. Aponte agreed.

At first, the chemo worked, but by summer 2012, while he was still being treated, tests showed the disease was back.

“After everything I had gone through, the chemo, losing hair, the sickness, it was absolutely devastating,’’ Aponte recalled.

He joined the T-cell study. For a few days, nothing seemed to be happening. But then his temperature began to rise. He has no memory of what happened for the next week or so, but the journal article — where he is patient 5 — reports that his fever spiked to 105 degrees.

He was in the throes of a ‘‘cytokine storm,’’ meaning that the T-cells, in a furious battle with the cancer, were churning out enormous amounts of hormones called cytokines. Besides fever, the hormonal rush can make a patient’s blood pressure plummet and his heart rate shoot up. Aponte was taken to intensive care and treated with steroids to quell the reaction.

Eight days later, his leukemia was gone.

He and the other patients in the study all received bone marrow transplantations after the treatment, and are considered cured - which is remarkable, since they were all relapsed/refractory, and thus basically at death's door. These stories sound like the ones from the early days of antibiotics, with the important difference that resistance to drug therapy doesn't spread through the world's population of cancer cells. The modified T-cell approach has already gotten a lot of attention, and this is surely going to speed things up even more. I look forward to the first use of it for a non-blood-cell tumor (which appears to be in the works) and to further refinements in generating the cells themselves.

Comments (11) + TrackBacks (0) | Category: Biological News | Cancer | Clinical Trials

Good News in Oncology: Oncolytic Virus Therapy

Email This Entry

Posted by Derek

The last few days have brought some good news on some unusual approaches to cancer therapy. First off was Amgen's report that they'd seen positive results in advanced melanoma using a modified HSV treatment. This is technology that they brought in by buying Biovex in 2011, and as a minor side effect, if it works, it'll be so much the better for Roger Perlmutter (now at Merck), since this was a deal made under his watch.

Specifically, the company says that 16% of patients showed a response (durable response rate, DRR) to the treatment, versus 2% of the control group. That's encouraging, but the big question is overall survival. DRR will get you little or nothing at the FDA, or shouldn't, if people don't actually live longer. We should have those numbers later this year - considering what sort of shape people are in with late-stage melanoma, you can look at the odds two different ways. The disease is so advanced, perhaps, that it'll be difficult for anything to show a benefit. Or, on the other hand, anything that does have an effect will stand out, since the control group's course will be so relentless.

I hope this works, both for the patients and for the idea of using a virus to attack cancerous cells. That one's been kicking around for a long time, with several companies in the chase, and it has a lot of appealing features. But it also has a lot of tricky details, too - targeting the tumor cells over normal ones, finding the appropriate viral platform, delivering it safely to the patient, and more. There's also the question of whether you just want to lyse the tumor cells with a viral load, or also make them express some therapeutically useful protein. The Amgen/Biovex HSV virus in this latest trial, for example, also causes the cells to express GM-CSF for an additional immune response (with the control group getting GM-CSF alone).

So even though this has been actively researched in humans since the mid-1990s, I'd still call it the early days. Here's hoping for more encouraging news, from Amgen and the others in this chase.

Comments (7) + TrackBacks (0) | Category: Cancer | Clinical Trials

March 21, 2013

NeuroSearch's Decline

Email This Entry

Posted by Derek

If you look at the timeline of a clinical trial, you'll notice that there's often a surprisingly long gap between when the trial actually ends and when the results of it are ready to announce. If you've ever been involved in working up all that data, you'll know why, but it's usually not obvious to people outside of medical research why it should take so long. (I know how they'd handle the scene in a movie, were any film to ever take on such a subject - it would look like the Oscars, with someone saying "And the winner is. . ." within the first few seconds after the last patient was worked up).

The Danish company NeuroSearch unfortunately provided everyone with a lesson in why you want to go over your trial data carefully. In February of 2010, they announced positive results in a Phase III trial of a drug (pridopidine, Huntexil) for Huntington's (a rare event, that), but two months later they had to take it back. This move cratered their stock price, and investor confidence in general, as you'd imagine. Further analysis, which I would guess involved someone sitting in front of a computer screen, tapping keys and slowly turning pale and sweaty, showed that the drug actually hadn't reached statistical significance after all.

It came down to the varying genetic background in the patients being studied, specifically, the number of CAG repeats. That's the mutation behind Huntington's - once you get up to too many of those trinucleotide repeats in the middle of the gene sequence, the resulting protein starts to behave abnormally. Fewer than 36 CAGs, and you should be fine, but a good part of the severity of the disease has to do with how many repeats past that a person might have. NeuroSearch's trial design was not predicated on such genetic differences, at least not for modeling the primary endpoints. If you took those into account, they reached statistical significance, but if you didn't, you missed.
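The repeat-count rule described above can be sketched in a few lines of code. This is a toy illustration only: the 36-repeat cutoff is the rule of thumb from the post (real clinical classification has intermediate and reduced-penetrance ranges), and the function names and sequence are made up.

```python
# Toy sketch of the CAG-repeat rule of thumb described above.
# The 36-repeat threshold is from the post; everything else is illustrative.

def count_cag_repeats(seq: str) -> int:
    """Return the longest uninterrupted run of 'CAG' codons in seq."""
    best = run = i = 0
    while i + 3 <= len(seq):
        if seq[i:i + 3] == "CAG":
            run += 1
            best = max(best, run)
            i += 3          # advance a full codon while in a run
        else:
            run = 0
            i += 1          # slide by one base to find the next run start
    return best

def huntington_risk(n_repeats: int) -> str:
    """Classify a repeat count per the rough rule in the post."""
    if n_repeats < 36:
        return "unaffected range"
    return "expanded; severity tends to rise with repeat count"

# A hypothetical gene fragment with 40 CAG repeats:
n = count_cag_repeats("ATG" + "CAG" * 40 + "TAA")
print(n, huntington_risk(n))
```

The point NeuroSearch ran into is that a covariate like this, if it isn't built into the statistical model up front, can make or break the primary endpoint.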

That's unfortunate, but could (in theory) be worse - after all, their efficacy did seem to track with a clinically relevant measure of disease severity. But you'll have noticed that I'm wording all these sentences in the past tense. The company has announced that they're closing. It's all been downhill since that first grim announcement. In early 2011, the FDA rejected their New Drug Application, saying that the company needed to provide more data. By September of that year, they were laying off most of their employees to try to get the resources together for another Phase III trial. In 2012, the company began shopping Huntexil around, as it became clear that they were not going to be able to develop it themselves, and last September, Teva purchased the program.

This is a rough one, because for a few weeks there in 2010, NeuroSearch looked like they had made it. If you want to see the fulcrum, the place about which whole companies pivot, go to clinical trial design. It's hard to overstate just how important it is.

Comments (7) + TrackBacks (0) | Category: Clinical Trials | The Central Nervous System

AstraZeneca Makes a Deal With Moderna. Wait, Who?

Email This Entry

Posted by Derek

AstraZeneca has announced another 2300 job cuts, this time in sales and administration. That's not too much of a surprise, as the cuts announced recently in R&D make it clear that the company is determined to get smaller. But their overall R&D strategy is still unclear, other than "We can't go on like this", which is clear enough.

One interesting item has just come out, though. The company has done a deal with Moderna Therapeutics of Cambridge (US), a relatively new outfit that's trying something that (as far as I know) no one else has had the nerve to try. Moderna is trying to use messenger RNAs as therapies, to stimulate the body's own cells to produce more of some desired protein product. This is the flip side of antisense and RNA interference, where you throw a wrench into the transcription/translation machinery to cut down on some protein. Moderna's trying to make the wheels spin in the other direction.

This is the sort of idea that makes me feel as if there are two people inhabiting my head. One side of me is very excited and interested to see if this approach will work, and the other side is very glad that I'm not one of the people being asked to do it. I've always thought that messing up or blocking some process was an easier task than making it do the right thing (only more so), and in this case, we haven't even reliably shown that blocking such RNA pathways is a good way to a therapy.

I also wonder about the disease areas that such a therapy would treat, and how amenable they are to the approach. The first one that occurs to a person is "Allow Type I diabetics to produce their own insulin", but if your islet cells have been disrupted or killed off, how is that going to work? Will other cell types recognize the mRNA-type molecules you're giving, and make some insulin themselves? If they do, what sort of physiological control will they be under? Beta-cells, after all, are involved in a lot of complicated signaling to tell them when to make insulin and when to lay off. I can also imagine this technique being used for a number of genetic disorders, where we know what the defective protein is and what it's supposed to be. But again, how does the mRNA get to the right tissues at the right time? Protein expression is under so many constraints and controls that it seems almost foolhardy to think that you could step in, dump some mRNA on the process, and get things to work the way that you want them to.

But all that said, there's no substitute for trying it out. And the people behind Moderna are not fools, either, so you can be sure that these questions (and many more) have crossed their minds already. (The company's press materials claim that they've addressed the cellular-specificity problem, for example). They've gotten a very favorable deal from AstraZeneca - admittedly a rather desperate company - but good enough that they must have a rather convincing story to tell with their internal data. This is the very picture of a high-risk, high-reward approach, and I wish them success with it. A lot of people will be watching very closely.

Comments (37) + TrackBacks (0) | Category: Biological News | Business and Markets | Drug Development

March 20, 2013

Thought for the Day, On Interdisciplinary Research

Email This Entry

Posted by Derek

This quote caught my eye from Nature's "Trade Secrets" blog, covering a recent conference. Note that the Prof. Leggett mentioned is a 2003 Nobel physics laureate:

It’s been a recent trend to mix disciplines and hope the results will solve some of science’s stickier problems. But is it possible the pendulum has swung too far? Leggett told the audience the term ‘interdisciplinarity’ is often “abused.”

“I don’t myself feel it is a good thing for government committees and so forth to encourage interdisciplinarity for its own sake. Some of these committees – at least in my experience – seem to be under the impression that interdisciplinarity is a sort of sauce, which you can put on otherwise unpromising ingredients, to improve the whole collection,” Prof. Leggett said. “I don’t really think that is right. The problem with that kind of approach is that sometimes people get the impression that simply to attack a problem in biology for the sake of attacking a problem in biology is itself a virtue.”

It's interesting that Leggett would use biology as an example. There's been a long history of physics/biology crossovers, going back to Schrödinger's What is Life? and George Gamow's interest in DNA. Francis Crick originally studied physics, and Richard Feynman did very good work on sabbatical in Max Delbrück's lab. (Here's a rundown of these and other connections).

But Leggett does indeed have a good point, one that applies to all sorts of other "magic recipes" for inducing creativity. If we knew how to induce that, we'd have a hell of a lot more of it, has always been my opinion. A lot of great things have come out of the borderlands between two sciences, but just the fact that you're going out into those territories doesn't assure you of a thing.

Comments (22) + TrackBacks (0) | Category: Who Discovers and Why

Off Topic: Happy New Year, In March

Email This Entry

Posted by Derek

For those of my readers who are celebrating it, eid-e shoma mubarak! That's "Happy New Year" for Iran, Afghanistan, and a number of nearby areas. It's a Zoroastrian-derived holiday, on the solar calendar, and thanks to my Iranian wife we celebrate it at home. I can fully endorse the emphasis on special holiday sweets (such as sohan, which I'd describe as a dark saffron-flavored almond brittle) and large quantities of other festive foods. We'll be having a traditional meal of spicy fried fish later on, which fits my Arkansan sensibilities just fine. So, happy new year!

Comments (4) + TrackBacks (0) | Category: Blog Housekeeping

Using DNA to Make Your Polymers. No Enzymes Needed.

Email This Entry

Posted by Derek

Here's an ingenious use for DNA that never would have occurred to me. David Liu and co-workers have been using DNA-templated reactions for some time, though, so it's the sort of thing that would have occurred to them: using the information of a DNA sequence to make other kinds of polymers entirely.
Liu%20scheme.png
The schematic above gives you the idea. Each substrate has a peptide nucleic acid (PNA) pentamer, which recognizes a particular DNA codon, and some sort of small-molecule monomer piece for the eventual polymer, with cleavable linkers holding these two domains together. The idea is that when these things line up on the DNA, their reactive ends will be placed in proximity to each other, setting up the bond formation in the order that you want.

Even so, they found that if you use building blocks whose ends can react with each other intramolecularly (A----B), they tend to do that as a side reaction and mess things up. So the most successful runs had an A----A type compound on one codon, with a B----B one on the next, and so on. So what chemical reactions were suitable? Amide formation didn't get very far, and reductive amination failed completely. Hydrazone and oxime formation did work, although you can tell that Liu et al. weren't too excited about pursuing that avenue much further. But the good ol' copper-catalyzed acetylene/azide "click" reaction came through, and appears to have been the most reliable of all.

That platform was used to work out some of the other features of the system. Chain length on the individual pieces turned out not to be too big a factor (Whitesides may have been right again on this one). A nice mix-and-match experiment with various azides and acetylenes on different PNA codon recognition sequences showed that the DNA was indeed templating things in the way that you would expect from molecular recognition. Pushing the system by putting rather densely functionalized spacers (beta-peptide sequences) in the A----A and B----B motifs also worked well, as did pushing things to make 4-, 8-, and even 16-mers. By the end, they'd produced completely defined triazole-linked beta-peptide polymers of 90 residues, with a molecular weight of 26 kD, which pushes things into the realm of biomolecular sizes.

You can, as it turns out, take a sample of such a beast (with the DNA still attached) and subject it to PCR, amplifying your template again. That's important, because it's the sort of thing you could imagine doing with a library of these things, using some sort of in vitro selection criterion for activity, and then identifying the sequence of the best one by using the attached DNA as a bar-code readout. This begins to give access to a number of large and potentially bioactive molecules that otherwise would be basically impossible to synthesize in any defined form. Getting started is not trivial, but once you get things going, it looks like you could generate a lot of unusual stuff. I look forward to seeing people take up the challenge!
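The templating-plus-barcode logic can be caricatured as a simple lookup-and-join: the codon-to-monomer table below stands in for PNA recognition, and concatenating adjacent monomers stands in for the click chemistry. The table, sequence, and names are all illustrative assumptions, not anything from the Liu paper.

```python
# Toy model of DNA-templated polymer assembly. A codon-to-monomer table
# stands in for the PNA recognition step; returning monomers in template
# order stands in for the proximity-driven coupling. All names are made up.

CODON_TO_MONOMER = {   # hypothetical 3-base "codons"
    "AAA": "azide-A",
    "CCC": "alkyne-B",
    "GGG": "azide-C",
    "TTT": "alkyne-D",
}

def template_polymer(dna: str, table=CODON_TO_MONOMER) -> list[str]:
    """Read the DNA template codon by codon; return monomers in order."""
    if len(dna) % 3:
        raise ValueError("template length must be a multiple of 3")
    codons = [dna[i:i + 3] for i in range(0, len(dna), 3)]
    return [table[c] for c in codons]

# The attached DNA doubles as a bar code: the same string that directed
# assembly can identify the product later (after PCR amplification).
template = "AAACCCGGGTTT"
print("-".join(template_polymer(template)))
```

The key property the toy model captures is that the product sequence is fully determined by the template, which is what makes the PCR/selection readout possible.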

Comments (6) + TrackBacks (0) | Category: Chemical Biology

March 19, 2013

Scientific Snobbery

Email This Entry

Posted by Derek

Here's something that you don't see mentioned very often in science, but it's most certainly real: snobbery:

We all do it. Pressed for time at a meeting, you can only scan the presented abstracts and make snap judgements about what you are going to see. Ideally, these judgements would be based purely on what material is of most scientific interest to you. Instead, we often use other criteria, such as the name of the researchers presenting or their institution. I do it too, passing over abstracts that are more relevant to my work in favour of studies from star universities such as Stanford in California or Harvard in Massachusetts because I assume that these places produce the 'best' science.

As someone who is based at a less well-known institution, the University of South Dakota in Vermillion, I see other scientists doing the same to me and my students. In many cases, this is a loss: to my students and their projects, which could have benefited from the input, and to the investigators who might have missed information that could have been useful in their own work.

It's true. This carries over to industry, too, both in the ways that people look at others' academic backgrounds, and even in terms of industrial pedigrees. Working for a biopharma that's been successful, that everyone's heard of, does a lot more for your reputation than working for one that no one knows anything about. The unspoken supposition is that a really small obscure company must have had to reach lower down the ladder to hire people, even though this might not be the case at all.

I have no idea of what could be done about this, because I think it's sheer human nature. The best we can do, I think, is to realize that it happens and to try to consciously correct for it when we can. It's realistic to assume that some small school doesn't have the resources that a larger one has, or that a professor at one has more trouble attracting students. But beyond that, you have to be careful. Some very smart people have come out of some very obscure backgrounds, and you can't - and shouldn't - assume anything in that line.

Comments (30) + TrackBacks (0) | Category: General Scientific News

Affymax In Trouble

Email This Entry

Posted by Derek

Affymax has had a long history, and it's rarely been dull. The company was founded in 1988, back in the very earliest flush of the Combichem era, and in its early years it (along with Pharmacopeia) was what people thought of when they thought of that whole approach. Huge compound libraries produced (as much as possible) by robotics, equally huge screening efforts to deal with all those compounds - this stuff is familiar to us now (all too familiar, in many cases), but it was new then. If you weren't around for it, you'll have to take the word of those who were that it could all be rather exciting and scary at first: what if the answer really was to crank out huge piles of amides, sulfonamides, substituted piperazines, aminotriazines, oligopeptides, and all the other "build-that-compound-count-now!" classes? No one could say for sure that it wasn't. Not yet.

Glaxo bought Affymax back in 1995, about the time they were buying Wellcome, which makes it seem like a long time ago, and perhaps it was. At any rate, they kept the combichem/screening technology and spun a new version of Affymax back out in 2001 to a syndicate of investors. For the past twelve years, that Affymax has been in the drug discovery and development business on its own.

And as this page shows, the story through most of those years has been peginesatide (brand name Omontys, although it was known as Hematide for a while as well). This is a synthetic peptide (with some unnatural amino acids in it, and a polyethylene glycol tail) that mimics erythropoietin. What with its cyclic nature (a couple of disulfide bonds), the unnatural residues, and the PEGylation, it's a perfect example of what you often have to do to make an oligopeptide into a drug.

But for quite a while there, no one was sure whether this one was going to be a drug or not. Affymax had partnered with Takeda along the way, and in 2010 the companies announced some disturbing clinical data in kidney patients. While Omontys did seem to help with anemia, it also seemed to have a worse safety profile than Amgen's EPO, the existing competition. The big worry was cardiovascular trouble (which had also been a problem with EPO itself and all the other attempted competition in that field). A period of wrangling ensued, with a lot of work on the clinical data and a lot of back-and-forthing with the FDA. In the end, the drug was actually approved one year ago, albeit with a black-box warning about cardiovascular safety.

But over the last year, about 25,000 patients got the drug, and unfortunately, 19 of them had serious anaphylactic reactions to it within the first half hour of exposure. Three patients died as a result, and some others nearly did. That is also exactly what one worries about with a synthetic peptide derivative: it's close enough to the real protein to do its job, but it's different enough to set off the occasional immune response, and the immune system can be very serious business indeed. Allergic responses had been noted in the clinical trials, but I think that if you'd taken bets last March, people would have picked the cardiovascular effects as the likely nemesis, not anaphylaxis. But that's not how it's worked out.

Takeda and Affymax voluntarily recalled the drug last month. And that looked like it might be all for the company, because this has been their main chance for some years now. Sure enough, the announcement has come that most of the employees are being let go. And it includes this language, which is the financial correlate of Cheyne-Stokes breathing:

The company also announced that it will retain a bank to evaluate strategic alternatives for the organization, including the sale of the company or its assets, or a corporate merger. The company is considering all possible alternatives, including further restructuring activities, wind-down of operations or even bankruptcy proceedings.

I'm sorry to hear it. Drug development is very hard indeed.

Comments (11) + TrackBacks (0) | Category: Business and Markets | Cardiovascular Disease | Drug Development | Drug Industry History | Toxicology

March 18, 2013

AstraZeneca Site Closings - And Openings

Email This Entry

Posted by Derek

I started hearing word Friday that it looked like some AstraZeneca sites were preparing for some sort of big announcement or meeting, but I didn't want to run with the news in case it turned out to be nothing. Well, it wasn't nothing. The company is restructuring R&D:

. . .Under the plans, AstraZeneca's small molecule and biologics R&D activities will be concentrated in three strategic centres: Cambridge, UK; Gaithersburg, US; and Mölndal, Sweden. The proposals are expected to be fully implemented by 2016.

Cambridge, UK: AstraZeneca will invest around $500 million to establish a new, purpose-built facility in Cambridge, a world-renowned centre for life sciences innovation with strong links to globally important research institutions in London. Consolidating the company's UK-based small molecule and biologics research and development at a new centre will build on AstraZeneca's world-leading protein engineering capabilities already based in the city. Cambridge will also become AstraZeneca's new global corporate headquarters.

Gaithersburg, Maryland, US: The site of MedImmune's headquarters and the primary location for AstraZeneca's biologics activities, Gaithersburg will also become home to much of the company's US-based Global Medicines Development activities for small and large molecules and will accommodate some global marketing and US specialty care commercial functions.

Mölndal, Sweden: AstraZeneca's site in Mölndal, near Gothenburg, will continue to be a global centre for research and development, with a primary focus on small molecules.

The three strategic sites will be supported by other existing AstraZeneca facilities around the world, including Boston, Massachusetts, US which will continue to be a centre for research and development, with a primary focus on small molecules.

But that means that some other sites are getting hit. Specifically, Alderley Park in the UK will no longer be an R&D site. The company says that "1,600 roles" will migrate from the site, but it says nothing about people. Alderley Park, which is just south of Manchester, is a stiff drive from Cambridge; no one could possibly haul 160 miles each way on the M6 every day of the week. AZ's Paddington office in London will also be closing. In the US, 1,200 "roles" will be leaving Wilmington, as the Global Medicines Development group relocates.

So there's a lot that's unclear about this announcement. What happens to the people who are now employed at Alderley Park? How is the company going to staff its new Cambridge (UK) site? And what's the real role of the Waltham (Massachusetts) R&D site in this new arrangement? That one's already gone through a lot of shakeups over the last couple of years. More details as they become known.

Update: FiercePharma says that this comes down to a loss of 650 jobs in the US. No more details on how the UK moves will work, though.

Comments (63) + TrackBacks (0) | Category: Business and Markets

GlaxoSmithKline's CEO on the Price of New Drugs

Email This Entry

Posted by Derek

Well, GlaxoSmithKline CEO Andrew Witty has made things interesting. Here he is at a recent conference in London when the topic of drug pricing came up:

. . . Witty said the $1 billion price tag was "one of the great myths of the industry", since it was an average figure that includes money spent on drugs that ultimately fail.

In the case of GSK, a major revamp in the way research is conducted means the rate of return on R&D investment has increased by about 30 percent in the past three or four years because fewer drugs have flopped in late-stage testing, he said.

"If you stop failing so often you massively reduce the cost of drug development ... it's why we are beginning to be able to price lower," Witty said.

"It's entirely achievable that we can improve the efficiency of the industry and pass that forward in terms of reduced prices."

I have a feeling that I'm going to be hearing "great myths of the industry" in my email for some time, thanks to this speech, so I'd like to thank Andrew Witty for that. But here's what he's trying to get across: if you start research on a new drug, name a clinical candidate, take it to human trials and are lucky enough to have it work, then get it approved by the FDA, you will not have spent one billion dollars to get there. That, though, is the figure for a single run-through when everything works. If, on the other hand, you are actually running a drug company, with many compounds in development, and after a decade or so you total up all the money you've spent, versus the number of drugs you got onto the market, well, then you may well average a billion dollars per drug. That's because so many of them wipe out in the clinic; the money gets spent and you get no return at all.
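The averaging Witty is objecting to is simple arithmetic, and it's worth seeing how quickly the failures pile onto the winners. Here's a back-of-the-envelope sketch with invented numbers (the program counts and costs below are hypothetical, not anyone's actual figures):

```python
# Back-of-the-envelope: why the "cost per drug" average dwarfs the cost
# of any single successful run-through. All numbers are made up.
programs = 10               # candidates taken into the clinic
cost_per_program = 150e6    # spent on each one, win or lose (hypothetical)
approvals = 1               # only one makes it all the way to market

total_spent = programs * cost_per_program
cost_per_approved_drug = total_spent / approvals
print(f"${cost_per_approved_drug / 1e9:.1f} billion per approved drug")
# → $1.5 billion per approved drug, even though the winning program
#   itself only cost $150 million
```

Cut the failure count in half and the average drops accordingly, which is exactly the lever Witty says GSK is pulling.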

That's the analysis that Matthew Herper did here (blogged about here), and that same Reuters article makes reference to a similar study done by Deloitte (and Thomson Reuters!) that found that the average cost of a new drug is indeed about $1.1 billion when you have to pay for the failures.

And believe me, we have to pay for them. A lottery ticket may only cost a dollar, but by the time you've won a million dollars playing the lottery, you will have bought a lot of losing tickets. In fact, you'll have bought far more than a million dollars' worth, or no state would run a lottery, but that's a negative-expectations game, while drug research (like any business) is supposed to be positive-expectations. Is it? Just barely, according to that same Deloitte study:

In effect, the industry is treading water in the fight to deliver better returns on the billions of dollars ploughed into the hunt for new drugs each year.

With an average internal rate of return (IRR) from R&D in 2012 of 7.2 percent - against 7.7 percent and 10.5 percent in the two preceding years - Big Pharma is barely covering its average cost of capital, estimated at around 7 percent.

Keep that in mind next time you hear about how wonderfully profitable the drug business is. And those are still better numbers than Morgan Stanley had a couple of years before, when they estimated that our internal returns probably weren't keeping up with our cost of capital at all. (Mind you, it seems that their analysis may have been a bit off, since they used their figures to recommend an "Overweight" on AstraZeneca shares, a decision that looked smart for a few months, but one that anyone would have deeply regretted by now).
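To see what "barely covering its cost of capital" means in practice, here's a quick hypothetical: invest a sum at the Deloitte 7.2% internal rate of return, then discount the payoff at the roughly 7% cost of capital. (The ten-year horizon and the round invested amount are my own assumptions for illustration; only the two rates come from the article.)

```python
# Treading water: returns just barely above the cost of capital.
# The rates are from the Deloitte figures quoted above; the horizon
# and invested amount are hypothetical.
invested = 1000.0           # R&D outlay today, arbitrary units
irr = 0.072                 # 2012 internal rate of return on R&D
cost_of_capital = 0.07      # Big Pharma's estimated average

years = 10
payoff = invested * (1 + irr) ** years          # what the R&D returns
npv = payoff / (1 + cost_of_capital) ** years - invested
print(f"Net present value on {invested:.0f} invested: {npv:.1f}")
# barely positive — a fraction of a percent of value created per year
```

Nudge the IRR below 7% and the net present value goes negative, which is the Morgan Stanley scenario: an industry spending shareholders' money to destroy value.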

But back to Andrew Witty. What he's trying to say is that it doesn't have to cost a billion dollars per drug, if you don't fail so often, and he's claiming that GSK is starting to fail less often. True, or not? The people I know at the company aren't exactly breaking out the party hats, for what that's worth, and it looks like the company might have to add the entire Sirtris investment to the "sunk cost" pile. Overall, I think it's too soon to call any corners as having been turned, even if GSK does turn out to have been doing better. Companies can have runs of good fortune and bad, and the history of the industry is absolutely littered with the press releases of companies who say that they've Turned A New Page of Success and will now be cranking out the wonder drugs like nobody's business. If they keep it up, GSK will have plenty of chances to tell us all about it.

Now, one last topic. What about Witty's statement that this new trend to success will allow drug prices themselves to come down? That's worth thinking about all by itself, on several levels - here are my thoughts, in no particular order:

(1) To a first approximation, that's true. If you're selling widgets, your costs go down, you can cut prices, and you can presumably sell more widgets. But as mentioned above, I'm not yet convinced that GSK's costs are truly coming down yet. And see point three below, because GSK and the rest of us in this business are not, in fact, selling widgets.

(2) Even if costs are coming down, counterbalancing that are several other long-term trends, such as the low-hanging fruit problem. As we move into harder and harder sorts of targets and disease areas, I would assume that the success rate of drugs in the clinic will be hard pressed to improve. This is partly a portfolio management problem, and can be ameliorated and hedged against to some degree, but it is, I think, a long-term concern, unless we start to make some intellectual headway on these topics, and speed the day. On the other side of this balance are the various efforts to rationalize clinical trials and so on.

(3) A larger factor is that the market for innovative drugs is not very sensitive to price. This is a vast topic, covered at vast length in many places, but it comes down to there being (relatively) few entrants in any new therapeutic space, and to people, and governments, and insurance companies, being willing to spend relatively high amounts of money for human health. (The addition of governments into that list means also that various price-fixing schemes distort the market in all kinds of interesting ways as well). At any rate, price mechanisms don't work like classical econ-textbook widgets in the drug business.

So I'm not sure, really, how this will play out. GSK has only modest incentives to lower the prices of its drugs. Such a move won't, in many markets, allow them to sell more drugs to make up the difference on volume. And actually, the company will probably be able to offset some of the loss via the political capital that comes from talking about any such price changes. We might be seeing just that effect with Witty's speech.

Comments (30) + TrackBacks (0) | Category: Business and Markets | Drug Development | Drug Prices

March 15, 2013

More ENCODE Skepticism

Email This Entry

Posted by Derek

There's another paper out expressing worries about the interpretation of the ENCODE data. (For the last round, see here). The wave of such publications seems to be largely a function of how quickly the various authors could assemble their manuscripts, and how quickly the review process has worked at the various journals. You get the impression that a lot of people opened up new word processor windows and started typing furiously right after all the press releases last fall.

This one, from W. Ford Doolittle at Dalhousie, explicitly raises a thought experiment that I think has occurred to many critics of the ENCODE effort. (In fact, it's the very one that showed up in a comment here to the last post I did on the subject). Here's how it goes: The expensive, toxic, only-from-licensed-sushi-chefs pufferfish (Takifugu rubripes) has about 365 million base pairs, with famously little of it looking like junk. By contrast, the marbled lungfish (Protopterus aethiopicus) has a humungous genome, 133 billion base pairs, which is apparently enough to code for three hundred different puffer fish with room to spare. Needless to say, the lungfish sequence features vast stretches of apparent junk DNA. Or does it need saying? If an ENCODE-style effort had used the marbled lungfish instead of humans as its template, would it have told us that 80% of its genome was functional? If it had done the pufferfish simultaneously, what would it have said about the difference between the two?
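That "three hundred pufferfish with room to spare" figure is easy to check from the two genome sizes:

```python
# Quick check on the genome-size disparity quoted above
pufferfish_bp = 365e6   # Takifugu rubripes, base pairs
lungfish_bp = 133e9     # Protopterus aethiopicus, base pairs

ratio = lungfish_bp / pufferfish_bp
print(round(ratio))     # → 364
```

So the lungfish carries roughly 364 pufferfish genomes' worth of DNA, which is the scale of the problem for any "it's all functional" interpretation.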

I'm glad that the new PNAS paper lays this out, because to my mind, that's a damned good question. One ENCODE-friendly answer is that the marbled lungfish has been under evolutionary pressure that the fugu pufferfish hasn't, and that it needs many more regulatory elements, spacers, and so on. But that, while not impossible, seems to be assuming the conclusion a bit too much. We can't look at a genome, decide that whatever we see is good and useful just because it's there, and then work out what its function must then be. That seems a bit too Panglossian: all is for the best in the best of all possible genomes, and if a lungfish needs one three hundred times larger than the fugu fish, well, it must be three hundred times harder to be a lungfish? Such a disparity between the genomes of two organisms, both of them (to a first approximation) running the "fish program", could also be explained by there being little evolutionary pressure against filling your DNA sequence with old phone books.

Here's an editorial at Nature about this new paper:

There is a valuable and genuine debate here. To define what, if anything, the billions of non-protein-coding base pairs in the human genome do, and how they affect cellular and system-level processes, remains an important, open and debatable question. Ironically, it is a question that the language of the current debate may detract from. As Ewan Birney, co-director of the ENCODE project, noted on his blog: “Hindsight is a cruel and wonderful thing, and probably we could have achieved the same thing without generating this unneeded, confusing discussion on what we meant and how we said it”

He's right - the ENCODE team could have presented their results differently, but doing that would not have made a gigantic splash in the world press. There wouldn't have been dozens of headlines proclaiming the "end of junk DNA" and the news that 80% of the genome is functional. "Scientists unload huge pile of genomic data analysis" doesn't have the same zing. And there wouldn't have been the response inside the industry that has, in fact, occurred. This comment from my first blog post on the subject is still very much worth keeping in mind:

With my science hat on I love this stuff, stepping into the unknown, finding stuff out. With my pragmatic, applied science, hard-nosed Drug Discovery hat on, I know that it is not going to deliver over the time frame of any investment we can afford to make, so we should stay away.

However, in my big Pharma, senior leaders are already jumping up and down, fighting over who is going to lead the new initiative in this exciting new area, who is going to set up a new group, get new resources, set up collaborations, get promoted etc. Oh, and deliver candidates within 3 years.

Our response to new basic science is dumb and we are failing our investors and patients. And we don't learn.

Comments (16) + TrackBacks (0) | Category: Biological News

March 14, 2013

Does Baldness Get More Funding Than Malaria?

Email This Entry

Posted by Derek

OK, let's fact-check Bill Gates today, shall we?

Capitalism means that there is much more research into male baldness than there is into diseases such as malaria, which mostly affect poor people, said Bill Gates, speaking at the Royal Academy of Engineering's Global Grand Challenges Summit.

"Our priorities are tilted by marketplace imperatives," he said. "The malaria vaccine in humanist terms is the biggest need. But it gets virtually no funding. But if you are working on male baldness or other things you get an order of magnitude more research funding because of the voice in the marketplace than something like malaria."

Gates' larger point, that tropical diseases are an example of market failure, stands. But I don't think this example does. I have never yet worked on any project in industry that had anything to do with baldness, while I have actually touched on malaria. Looking around the scientific literature, I see many more publications on potential malaria drugs than I see potential baldness drugs (in fact, I'm not sure if I've ever seen anything on the latter, after minoxidil - and its hair-growth effects were discovered by accident during a cardiovascular program). Maybe I'm reading the wrong journals.

But then, Gates also seems to buy into the critical-shortage-of-STEM idea:

With regards to encouraging more students into STEM education, Gates said: "It's kind of surprising that we have such a deficit of people going into those fields. Look at where you can have the most interesting job that pays well and will have impact on society -- all three of those things line up to say science and engineering and yet in most rich countries we see decline. Asia is an exception."

The problem is, there aren't as many of these interesting, well-paying jobs around as there used to be. Any discussion of the STEM education issue that doesn't deal with that angle is (to say the least) incomplete.

Comments (28) + TrackBacks (0) | Category: Drug Development | Drug Industry History | Infectious Diseases

Thallium Poisoning, Again

Email This Entry

Posted by Derek

I agree with something Chemjobber said about this case - there's clearly a lot more to it than we know. Last fall, a student at the University of Southampton in the UK was poisoned with arsenic and thallium. According to this article in Chemistry World, it was more than the usual lethal dose, and both accidental exposure and suicide have been ruled out. The student himself is making a slow recovery; I wish him the best, and hope to eventually report good news.

So, not an accident, and not suicide. . .well, that doesn't leave much but intentional poisoning, does it? As this post details, though, thallium is the murder weapon of idiots who think that they're being high-tech. The Dunning-Kruger effect in action, in other words.

Comments (15) + TrackBacks (0) | Category: The Dark Side | Toxicology

Scientists and Google Reader's Demise

Email This Entry

Posted by Derek

I suspect that many people follow this blog through its RSS feed. And I feel sure that many readers here follow the current scientific literature that way. Journals are updated constantly, and that's the most concentrated way to get all the new information in one place for flipping through. (No more "new journal table" in the library, is there?)

Well, as you've probably heard, the site that many of us have been using to do all this is closing. Google Reader is to be shut down on July 1. Problems had been apparent for some time now, but this still took me by surprise. And now the search is on for a replacement.

Feedly is apparently trying to clone the service on their own, so that's a possibility. And The Old Reader seems to be an effort to recreate the service as well, going back to some sharing functionality that Google stripped out a while back in the interest of promoting Google+. I'll be evaluating these and others.

What I already know is this: many RSS-based services seem to be colorful-picture-tile things (like Flipboard), and for the chemical literature, they're of no use to me. I am feeling more like a dinosaur every time I say this, but I don't own a tablet (or not yet), and I wish that web sites would find a way to deliver their content both ways: in concentrated blasts of scrolling info for people using a more conventional desktop (or who just like it that way) and in big, flippy, roomy, tablet-screen-sized chunks for those who like it that way instead. One size doesn't fit all.

And that's where Google Reader will be missed, unless someone else can step up. The scientific literature needs a tool like this - we have hundreds and hundreds of new articles coming along all the time, and while scrolling through them in RSS isn't ideal, it's a lot better than any other solution I've come across. Looking at the various comments around the web about Reader's demise, I see that it's hard-core information nerds that are mourning it most - well, if scientists don't fit that description, they should. We're industrial consumers of information, and we need industrial-strength tools.

Update: here's the best list of alternatives I've seen so far.

Comments (35) + TrackBacks (0) | Category: The Scientific Literature

March 13, 2013

Who to Manufacture an API?

Email This Entry

Posted by Derek

Here's a very practical question indeed, sent in by a reader:

After a few weeks of trying to run down a possible API manufacturer for our molecule, I am stuck. We have a straightforward proven synthesis of a 300 weight lipid and need only 10 kg for our trials. Any readers have suggestions? Alternately, we will do it ourselves and find someone to help us with the documentation. Suggestions that way, too?

That's worth asking: who's your go-to for things like this, a reliable contract supplier for high-quality material with all the documentation? I'll say up front that I don't know who's been contacted already, or why the search has been as difficult as it has, but I'll see if I can get more details. Suggestions welcome in the comments. . .

Update: this post has generated a lot of very sound advice. Anyone who's approaching this stage for the first time (as my correspondent clearly is) is looking at a significant expenditure for something that could make or break a small research effort. I'm putting this note up for people who find this post in future searches - read the comments; you'll be glad you did.

Comments (67) + TrackBacks (0) | Category: Drug Development

Getting Down to Protein-Protein Compounds

Email This Entry

Posted by Derek

Late last year I wrote about a paper that suggested that some "stapled peptides" might not work as well as advertised. I've been meaning to link to this C&E News article on the whole controversy - it's a fine overview of the area.

And that also gives me a chance to mention this review in Nature Chemistry (free full access). It's an excellent look at the entire topic of going after alpha-helix protein-protein interactions with small molecules. Articles like this really give you an appreciation for a good literature review - this information is scattered across the literature, and the authors here (from Leeds) have really done everyone interested in this topic a favor by collecting all of it and putting it into context.

As they say, you really have two choices if you're going after this sort of protein-protein interaction (well, three, if you count chucking the whole business and going to truck-driving school, but that option is not specific to this field). You can make something that's helical itself, so as to present the side chains in what you hope will be the correct orientation, or you can go after some completely different structure that just happens to arrange these groups into the right spots (but has no helical architecture itself).

Neither of these is going to lead to attractive molecules. The authors address this problem near the end of the paper, saying that we may be facing a choice here: make potent inhibitors of protein-protein interactions, or stay within Lipinski-guideline property space. Doing both at the same time just may not be possible. On the evidence so far, I think they're right. How we're going to get such things into cells, though, is a real problem (note this entry last fall on macrocyclic compounds, where the same concern naturally comes up). Since we don't seem to know much about why some compounds make it into cells and some don't, perhaps the way forward (for now) is to find a platform where as many big PPI candidates as possible can be evaluated quickly for activity (both in the relevant protein assay and then in cells). If we can't be smart enough, or not yet, maybe we can go after the problem with brute force.

With enough examples of success, we might be able to get a handle on what's happening. This means, though, that we'll have to generate a lot of complex structures quickly and in great variety, and if that's not a synthetic organic chemistry problem, I'd like to know what is. This is another example of a theme I come back to - that there are many issues in drug discovery that can only be answered by cutting-edge organic chemistry. We should be attacking these and making a case for how valuable the chemical component is, rather than letting ourselves be pigeonholed as a bunch of folks who run Suzuki couplings all day long and who might as well be outsourced to Fiji.

Comments (10) + TrackBacks (0) | Category: Drug Assays | Drug Development | Pharmacokinetics

March 12, 2013

Is GSK Up to Something Else, Too?

Email This Entry

Posted by Derek

The news about Sirtris prompts me to mention something else that I've been hearing about over the last few days. More than one source has told me that GlaxoSmithKline is thinking about doing some other rearranging/staff cutting, but I don't have enough detail beyond that to elaborate. I wonder if today's Sirtris announcement is part of such a move? At any rate, one of the places where these stories seem to be going around the most, naturally, is inside GSK itself. We'll see if any more announcements come in the near future.

Comments (36) + TrackBacks (0) | Category: Business and Markets

Sirtris Gets Shut Down in Cambridge

Email This Entry

Posted by Derek

Just heard rumors of this earlier this morning, and the rumors are true: GSK is shutting down the Sirtris operation in Cambridge. FierceBiotech has the goods:

GlaxoSmithKline has decided to shutter Sirtris's office in Cambridge, MA, opting to fully integrate their research work now underway into the giant pharma company's R&D operations. A spokesperson for GSK tells FierceBiotech that about 60 staffers currently work at the site in Cambridge, and an yet undetermined number will be given a chance to relocate to the Philadelphia area.

More details as I hear them. I didn't expect this to be Sirtris day around here, but you never know, do you?

Comments (16) + TrackBacks (0) | Category: Business and Markets

Resveratrol Gets Some Details Cleared Up

Email This Entry

Posted by Derek

I've been meaning to write on this paper, from David Sinclair and co-workers, on the mechanism of resveratrol action. The backstory is so long and convoluted that you're going to have to set aside some time to catch up if you're just joining it (paging back through this category archive will give you some play-by-play). But the basics are that resveratrol came on the scene as an activator of the enzyme SIRT1, which connection was later called into question by work that showed a lot of artifacts in the assay conditions used to establish it.

This new paper may well clear some of that up. The fluorescently tagged peptides that were producing the false positives may well be mimicking the natural protein partners, if this analysis is correct. SIRT1, as it turns out, recognizes a hydrophobic domain in the same region of each, which can be the fluorescent tag, or native hydrophobic amino acids themselves.

So it appears that resveratrol (and other synthetic sirtuin activators) are acting allosterically on the protein. This work found a single SIRT1 amino acid mutant (E230K) that doesn't affect SIRT1's catalytic activity, but does completely mess with resveratrol's ability to activate it (the other compounds in this class show the same effect). That makes for a neat story, and it would resolve several questions about the molecular mechanism of action.

But it leaves open the bigger questions: is SIRT1 a human drug target? Do activators exert beneficial effects, and do different ones have different profiles in living systems? There's already plenty of evidence for some of these; the problem is, the evidence points both ways (much of this is summed up and linked to in this post). Resveratrol itself is not, I would say, an appropriate molecule to answer the detailed questions (other than "What effects does resveratrol itself have?"). It does not have particularly good pharmacokinetic properties, for one thing, and it is known to hit a lot of other things besides SIRT1 (Sinclair himself has referred to it as a "dirty molecule", and I agree).

So it's the follow-on sirtuin activators that GSK has that are the real vehicles for answering these very interesting (and potentially important, and potentially lucrative) questions. A quick look at Clinicaltrials.gov shows that work has been done on SRT2104 and SRT2379, but many of these studies have been complete for a year or two now. (Here's the one that's listed as ongoing - it and several others have an anti-inflammatory bent). All we can deduce is that at least two SRT compounds are being (or recently have been) evaluated in the clinic. Fierce Biotech has a bit more from Sirtris CEO George Vlasuk:

About those clinical trials: GSK's massive investment in Sirtris has yet to lead to a drug or a prime-time drug candidate. Sirtris has ended clinical development of multiple synthetic compounds after initial human studies, Vlasuk said. And his team is hunting for the precise mechanism for activating SIRT1 in hopes of creating more potent compounds than resveratrol to treat diseases.

Hmm. I thought that maybe this new Science paper was the precise mechanism. But maybe not? The story continues. . .

Comments (9) + TrackBacks (0) | Category: Aging and Lifespan

March 11, 2013

The Baran Group Blog (And Some Others?)

Email This Entry

Posted by Derek

I wanted to mention that Phil Baran's group at Scripps now has a blog, where group members are putting up posts on several topics, from recent syntheses to jelly beans. Well worth keeping an eye on. And I'd like to take the opportunity to welcome Baran and his group to the blogging world.

Which reminds me: the blogroll at left is (as usual) wildly overdue for updating. I've got a list of sites that I'll put up there, but I'd be glad to take recommendations, because it wouldn't surprise me if I've missed some, too. Please add some to the comments, and I promise to actually clean things up this week (!)

Comments (15) + TrackBacks (0) | Category: Chemical News

Suing A Generic Drug Maker: When And How?

Email This Entry

Posted by Derek

The great majority of prescriptions in this country are for generic drugs. And generic drugs are cheaper in the US than they are in Europe or many other areas - they're a large and important part of health care. And as time goes on, and more and more medicines move into that category, that importance, you'd think, can only increase.

So a case that's coming before the Supreme Court later this month could have some major effects. It concerns a woman in New Hampshire, Karen Bartlett, who was prescribed sulindac, one of the non-steroidal anti-inflammatory drugs that have been around for decades. All the NSAIDs (and other drugs besides) carry a small risk of Stevens-Johnson Syndrome, which is an immune system complication that ranges from mild (erythema multiforme) to very severe (toxic epidermal necrolysis, and I'm not linking anyone to pictures of that). Most unfortunately, Mrs. Bartlett came down with severe TEN, which has left her permanently injured. She spent months in burn units and under intensive medical care.

But now we come to the question that always comes up in modern life: whose fault is this? She sued that generic drug company (Mutual Pharmaceutical), and won a $21 million dollar judgment, which was upheld on appeal. But the Supreme Court has agreed to hear the next level of appeal, and a lot of people are going to be watching this one very closely. Mutual's defense is that the original manufacturer of the drug (Merck) and the FDA are responsible for these sorts of things (if anyone is), and that they are merely making (under regulatory permission) a drug that others discovered and that others have regulated over the decades.

A case with some similarities came before the court in 2010, Pliva v. Mensing. That one, though, turned on the labeling language, and how much control a generic company had over the label warnings. "Not much", said the court, which limited patients' ability to sue on those grounds. That seems proper, but, as that New York Times article shows, it also has the perverse effect of giving people more potential recourse if they take a drug as made by the original manufacturer as opposed to the exact same substance as made by a generic company, which doesn't make much sense.

This latest case does not argue label warnings; it argues that the drug itself is defective. Now, it does not seem fair that a generic company should have to pay for the bad effects of a drug it did not discover, did not take through the clinic, and did not reap the benefits of during its patent lifetime (when any bad effects in the real world should have become clear). On the other hand, there are problems with going the other way and sending all lawsuits back to the original developers of the drug. After all, sulindac has been on the market since the early 1980s, under the regulatory authority of the FDA, which could have pulled it from the market at any time and has not. The agency has also authorized several generic manufacturers to produce it since that time. From a regulatory standpoint, how defective can it be? Allowing the originating company to be sued for all the generic versions, after such a long interval, would seem to open up a "find the deep pockets" strategy for everyone who comes along. (And as that older post argues, if this is made the law of the land, it will add to the costs of current drugs, whose prices will surely then be adjusted to deal with decades of future liability concerns).

And if I had to guess, I would think that the Supreme Court is going to find a way out of coming down firmly on one side of the issue or the other. A 2008 decision, Riegel v. Medtronic, said that medical device makers were, in some cases, shielded from state-level tort claims because of regulatory pre-emption. (But note that this isn't always the case; nothing is always the case in law, which is so close to a perpetual motion machine that you start to wonder about the laws of thermodynamics). But an earlier attempt to use these arguments in a pharmaceutical case (Wyeth v. Levine) got no traction at all in the court. But to avoid having either of those outcomes in the paragraph above, I still think that the justices are going to find some way to make this more of a federal regulatory pre-emption case, and to distinguish it from Wyeth v. Levine.

And if that happens, it will mean what for Karen Bartlett? Well, it would mean that she has no recourse. Something terrible has happened to her, but terrible things happen sometimes. That's a rather cold way of looking at it, and I would probably not be disposed to look at it that way were it me, or a member of my family. But that might end up being the right call. We'll see.

Update: as detailed over at Pharmalot, the Obama administration has reversed course on this issue, and is now directing the Solicitor General to argue in favor of federal pre-emption in this case. But two former FDA commissioners (David Kessler and Donald Kennedy) have filed briefs in support of Bartlett, arguing that to assume pre-emption would be to assume too much ability of the FDA to police all these issues on its own (without the threat of lawsuits to keep manufacturers on their toes). So there's a lot of arguing to be done here. . .

Comments (30) + TrackBacks (0) | Category: Regulatory Affairs | Toxicology

March 8, 2013

Biopharma Startups in India and China - Do They Exist?

Email This Entry

Posted by Derek

So we all know about the amount of biopharma investment going into places like China and India - right? But it's important to keep the categories straight. There's manufacturing, which is its own thing, and there are service organizations, which are a very large part of the market. But neither of those are doing their own R&D. What part of the investment in these countries is going to what we'd think of as traditional venture capital and local research?

There's an article in Nature Biotechnology that tries to answer this question (and it's not an easy one). Here's the take-away:

. . .data on sources of venture capital (VC) that are supporting such innovative biotech startups are unclear because existing investment metrics include not only innovative enterprises but also manufacturing or service firms lacking R&D capability. The quality of published data is also poor, with only one study on healthcare VC activity in China providing data for a single quarter in 2008 and it does not separate innovative ventures. Here, we present a data set of life sciences VC in emerging markets to inform government innovation policy and VC investment strategy. Our data suggest that life sciences VC activity is low in the emerging economies we studied, despite growing levels of activity in that sector and in those regions.

The authors are basing their conclusions (on China, India, Brazil, and South Africa) largely on their own fieldwork, rather than relying on what's in the press, which is probably a wise decision. They found 116 firms backed by 148 financing deals, which may sound like a lot, but the total amounts aren't too impressive yet. Their estimate is that since 2000, about $1.7 billion has been invested, which (by comparison) would be considered a strong quarterly figure in the US. Most of these firms (about 70) are Chinese, and most of the rest are Indian (Brazil and South Africa are round-off errors). The outfits doing the fund-raising are also quite concentrated; there are some big players in both countries, and there's a scattering of everybody else. A lot of the money is from home as well. The great majority of these firms, as it turns out, are targeting oncology (a full 90% of the Chinese ones, for example).

So what are we to make of all this? These numbers are about as good as anyone is going to see, but they're probably still incomplete. At any rate, it seems clear that the amount of money going into new biopharma companies in these countries is still tiny by industry standards. There are surely several reasons for this - lack of a "startup culture" being a big (albeit vague) one. That covers a lot of ground, including physical infrastructure and fewer experienced investors. It's not like India and China have a long history of funding small new medical research firms - it takes a while to get the hang of it, for sure (assuming that anyone ever does!).

One possibility is that the innovative research being done in these countries is happening more inside the walls of the large international firms that have set up shop there. What I think people have been waiting to see is whether these operations will eventually lead to smaller companies spinning out. And then there's the other source of many startups in the US and Europe: academic labs. My impression has been that the academic research culture in China and India is very different from what we're used to in the US, and this is surely having an effect on the whole venture-capital-based world there, too. Eventually, though, the combination of the universities and the talent pool from the larger companies might cause something to happen.

But since no one's quite sure how to make a Boston/Cambridge or San Francisco Bay, it's hard to say what these countries should be doing differently, or whether any such recommendations would even be feasible. Efforts in the developing parts of Asia to make such things happen by fiat have not gone well - does anyone remember Malaysia's big push into the area? Here's a 2003 story on it - "Biovalley" was going to be the next big thing. Just a few years later, it was clear that it wasn't quite working out, and current information is rather hard to come by. India and China (and their investors) surely don't want to go through that experience. Letting things develop on their own, without too much over-targeted encouragement, might be the best course.

Comments (17) + TrackBacks (0) | Category: Business and Markets

March 7, 2013

I'll Just Take a Tour of Your Lab Drawers Here

Email This Entry

Posted by Derek

I enjoyed this from postdoc JesstheChemist on Twitter: "Busted. Just caught someone (who doesn't work in my lab) going through my lab drawers." Now that's a real-life lab comment if I ever saw one. It's a constant feature in academic labs, where there's usually limited equipment of one sort or another. There's less of it in industry, where we're relatively equipment-rich, but it certainly doesn't go away.

Glassware gets rummaged through, whether for that one tiny Dean-Stark trap, a funny-sized ground-glass stopper, or something as petty as a clean 25 mL round bottom. Run out of that fancy multicolor pH paper? The guy next to you keeps it in the second drawer. One-mL syringes ran out, and you need to dispense something right now? Third drawer.

I've seen people borrow things while they're in use. In grad school, I once had a short-path vacuum distillation going, with the receiving flasks cooled in a bath supported by a lab jack. I left for a few minutes while things were warming up, only to find my lab jack pilfered and replaced by a ragged stack of cork rings, which was not what I had in mind. Peeved, I hunted through the labs until I found the jack in the hood of a post-doc who was running something of his own. "I didn't think you were using it", was his response, which prompted me to ask what it looked like when I was actually using it.

Then you have reagent burgling, which is epidemic at all levels of bench chemistry. No one has everything to hand, and you always run out of things. The stockroom may be some distance away, or take too much time, or there may be only one bottle of 2-methyl bromowhatsicene in the lab (and you don't have it). This can be innocent, as in taking 500mg of some common reagent out of a large bottle that someone has handy. Or it can be more serious (but still well-intentioned), in the "I'm going to bring it right back" way. Further down the scale, you have plain nastiness, of the "I need this and screw the rest of you" kind. I told the story here of having had most of a fresh bottle of borane/THF jacked from me, and you know, that happened in 1986 and I'm still a little cheesed off about it. Many readers will have experienced similar sensations.

Once, during my grad school days, I went off on a rare vacation and left notes in the various drawers of my bench. "It's not here!" read one of them, and another advised people "Take this from (fellow student X). He has a lot more of them than I do". When I came back, people told me that they enjoyed my notes. There you have it.

Comments (55) + TrackBacks (0) | Category: Life in the Drug Labs

Peter Kim Retires From Merck

Email This Entry

Posted by Derek

The question now is, should the verb "retires" have quotation marks around it or not? Roger Perlmutter (ex-Amgen) will take over from him. Here's the Reuters story - more details when and if any emerge.

Comments (33) + TrackBacks (0) | Category: Business and Markets

Probing A Binding Tunnel With AFM

Email This Entry

Posted by Derek

UCP%20AFM.jpg
Every so often I've mentioned some of the work being done with atomic force microscopy (AFM), and how it might apply to medicinal chemistry. It's been used to confirm a natural product structural assignment, and then there are images like these. Now comes a report of probing a binding site with the technique. The experimental setup is shown at left. The group (a mixed team from Linz, Vienna, and Berlin) reconstituted functional uncoupling protein 1 (UCP1) in a lipid bilayer on a mica surface. Then they ran two different kinds of AFM tips across them - one with an ATP molecule attached, and another with an anti-UCP1 antibody - using different tether lengths as well.

What they found was that ATP seems to be able to bind to either side of the protein (some of the UCPs in the bilayer were upside down). There also appears to be only one nucleotide binding site per UCP (in accordance with the sequence). That site is about 1.27 nm down into the central pore, which could well put it at a particular residue (R182) that is thought to protrude into the pore space. Interestingly, although ATP can bind while coming in from either direction, it has to go in deeper from one side than the other (which shows up in the measurements with different tether lengths). And that leads to the hypothesis that the deeper-binding mode sets off conformational changes in the protein that the shallow-binding mode doesn't - which could explain how the protein is able to function while its cytosolic side is being exposed to high concentrations of ATP.

For some reason, these sorts of direct physical measurements weird me out more than spectroscopic studies. Shining light or X-rays into something (or putting it into a magnetic field) just seems more removed. But a single molecule on an AFM tip seems, when a person's hand is on the dial, to somehow be the equivalent of a long, thin stick that we're using to poke the atomic-level structure. What can I say; a vivid imagination is no particular handicap in this business!

Comments (6) + TrackBacks (0) | Category: Analytical Chemistry | Biological News

March 6, 2013

Safety Warning: Togni's Reagents

Email This Entry

Posted by Derek

togni%27s.png
Some of you may have used the second Togni reagent (shown) as a trifluoromethylating agent. Well, there's a new paper in Organic Process R&D that brings word that it's an explosive hazard. A group at Novasep, in Leverkusen, Germany, finds that it has a powerful exothermic decomposition, and (in addition) it's about as combustible as black powder. Sensitivity to impact and friction varied from sample to sample as well, which isn't what you want to hear, either. Their conclusion is that the reagent is "dangerously explosive and may only be transported by approval of the national competent authority." The first Togni reagent (with a dimethyl in place of the carbonyl) wasn't fully evaluated, but it may be just as bad. That one has been available from Aldrich; I imagine that this may change shortly. Too bad - these are useful reagents, but not if you have to suit up to use them safely.

Comments (11) + TrackBacks (0) | Category: Safety Warnings

Anonymity, Fakery, et al.

Email This Entry

Posted by Derek

I wanted to link to this piece at C&E News on the whole question of anonymity when it comes to comments on the chemical literature. This was brought on by the advent of Blog Syn, but it applied before that, and will continue to apply to other situations.

Its author, Fredrik von Kieseritzky, also calls for synthetic details to make it back into the body of papers, rather than being relegated to the Supporting Information (which is never as carefully refereed as the manuscript itself). That would be a good thing, but I despair of it happening, at least until the major journals break down and admit that their page count restrictions for submissions are, in large part, relics of the days when everyone read them in print. (They serve another useful function, though, which is getting people to tighten up their writing. "There wasn't enough time to make it shorter" is a real phenomenon).

But the rest of the commentary grew out of this piece by Rich Apodaca, who is having quite a morning around here. He wonders about the use of pseudonyms in science, where author recognition has long been a big motivating factor. von Kieseritzky's take is that he can see why people go anonymous (and Rich lists some very plausible reasons, too), but that he's never regretted using his own name online.

That goes for me, too. The topic of anonymity has come up here several times over the years: in chem-blogging, and in peer review of publications and grants. I'm glad that I've used my real name over the years on this blog (although it hasn't always been a smooth ride), but I also think that anonymity is a necessary option, although it certainly can be abused.

That opinion is not shared by the (pseudonymous) author of this piece in the Journal of Cell Science. It's a bit of a dystopian what-if, an agitated response to the (now taken down) Science Fraud site. "Mole VIII" relates how some people (an extremely small percentage) did indeed fake scientific papers, and how this embittered other people who had been unable to make the careers in science that they wished to. So they started web sites where they cried "Fake!" about papers of all kinds, which forced the authors to spend all their time defending themselves. Many of them were driven out of doing science, whereupon they turned to exposing their former colleagues as the next best thing. And then, in one generation, science was done - stopped forever, in a hurricane of finger-pointing and snide remarks.

What a load. For one thing, I think that fakery, while not rampant, is more widespread than many people think. And even if it isn't, I think that legitimate results stand up to challenges of this sort, while the shady ones collapse at a push. Furthermore, I find the whole cycle-of-bitterness conceit ridiculous. A look back at the history of science will show that accusations of fakery and bad faith have been with us forever, and often in much more vitriolic form than today.

One problem might be that the author is a bit too academic. Try this part:

Soon, there were very few scientists left. And then fewer. Public confidence for publicly funded research disappeared. The only research that was done any more was kept secret and in the corporations. And while this gave us many new package designs for the sale of established drugs, the actual idea of ‘doing science’, of making discoveries to share with a community of interested and devoted researchers, dwindled, and finally, vanished.

Yep, that's about the size of it - package designs. I try to stay alert to threats to the scientific endeavor, and I try not to take it for granted. But I'm willing to put my real name on the opinion that the author of this stuff is being foolish.

Comments (14) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

Open Access For ACS Articles?

Email This Entry

Posted by Derek

Rich Apodaca investigates something that I didn't know, either: that the ACS provides the corresponding authors of papers with links to their articles, which (1) allow for fifty free downloads during the first year after publication, and (2) allow for unlimited free downloads after that. I thought about that for a while, and couldn't remember any examples of such a link, not that I'd noticed, at any rate.

Apodaca's having trouble reducing it to practice, too. He has been trying to get such a link for one of his own papers in J. Med. Chem., and . . .well, his post will tell you about all the places he's looked so far. Let's just say that the ACS does not make it obvious where a corresponding author is supposed to obtain such a URL. Has anyone out there tried this, and has anyone had any success?

Comments (12) + TrackBacks (0) | Category: The Scientific Literature

March 5, 2013

TauRx's Funding Is Odd

Email This Entry

Posted by Derek

I still get inquiries about TauRx and their work on Alzheimer's. There's an awful lot of pent-up demand in that field, and it's getting worse every year. The latest is that the company has ten million more dollars in a follow-on investment option from the Dundee Corporation of Toronto.

Who they? That's what I wondered, too, and the press release occasions more questions than it answers:

Dundee Corporation is a Canadian independent publicly traded asset management company listed on the Toronto Stock Exchange (“TSX”) under the symbol “DC.A”. Asset management activities are focused in the areas of the corporation’s core competencies and include resources, real estate and infrastructure, and more recently, the agriculture sector.

What, then, are they doing investing in biopharma? You can lose your shirt over here, guys, and you can most especially lose it in Alzheimer's. TauRx also has major funding from the Genting Berhad group. And you may well ask "Who they?", too, because they're a large Malaysian company whose core business is casinos and resorts. Now, they're also into cruise ships, and oil and gas, and power generation and (perforce) real estate, but biotech would seem to be rather far down the list.

This is a. . .unique funding setup for a biopharma company. I have to think that there's a reason for it, but I'm not quite sure what the reason is. Speculation, anyone? Thanks to John Carroll of FierceBiotech on Twitter, who doesn't understand what's going on, either.

Comments (16) + TrackBacks (0) | Category: Alzheimer's Disease | Business and Markets

What Really Makes a Biopharma Hub?

Email This Entry

Posted by Derek

Luke Timmerman at Xconomy has a good post on biotech research hubs. A recent survey set him off, not because it ranked the Boston area #1 (a reasonable assessment, and not just because I live here), but because it ranked San Diego #2.

It's not that he has anything against San Diego (nor do I). But it does not outrank the San Francisco Bay area as a biopharma hub, not in any way that I can think of. Luke goes into the details, and shows how this latest survey went off the rails. But he's also calling for someone to come up with a better one, and he has a very realistic list of criteria that should be used.

So what's the harm? San Diego (and Raleigh-Durham, etc.) get to feel good when they finish high in such surveys, and why not? Well, the temptation might be to think that you already have what you need to succeed - heck, that you've already succeeded. But San Diego, for example, could use help in the local venture capital environment, and (as Luke points out) could also use some help even in things like its airport connections. Complacency is not your friend.

Comments (12) + TrackBacks (0) | Category: Business and Markets

March 4, 2013

The IBX Answer

Email This Entry

Posted by Derek

I wanted to point out what looks like the resolution of the Blog Syn story about IBX oxidations. See Arr Oh seems to have discovered the discrepancy that's been kicking the results around all over the place: water in the IBX itself. So it looks like this whole effort has ended up discovering something important that we didn't know about the reaction, and nailed down yet another variable. Congratulations!

Comments (9) + TrackBacks (0) | Category: Chemical News

Putting the (Hard) Chemistry Back in Med Chem

Email This Entry

Posted by Derek

While I'm on the subject of editorials, Takashi Tsukamoto of Johns Hopkins has one out in ACS Medicinal Chemistry Letters. Part of it is a follow-up to my own trumpet call in the journal last year (check the top of the charts here; the royalties are just flowing in like a river of gold, I can tell you). Tsukamoto is wondering, though, if we aren't exploring chemical space the way that we should:

One of the concerns is the likelihood of identifying drug-like ligands for a given therapeutic target, the so-called “druggability” of the target, has been defined by these compounds, representing a small section of drug-like chemical space. Are aminergic G protein coupled receptors (GPCRs) actually more druggable than other types of targets? Or are we simply overconcentrating on the area of chemical space which contains compounds likely to hit aminergic GPCRs? Is it impossible to disrupt protein–protein interactions with a small molecule? Or do we keep missing the yet unexplored chemical space for protein–protein interaction modulators because we continue making compounds similar to those already synthesized?

. . .If penicillin-binding proteins are presented as new therapeutic targets (without the knowledge of penicillin) today, we would have a slim chance of discovering β-lactams through our current medicinal chemistry practices. Penicillin-binding proteins would be unanimously considered as undruggable targets. I sometimes wonder how many other potentially significant therapeutic targets have been labeled as undruggable just because the chemical space representing their ligands has never been explored. . .

Good questions. I (and others) have had similar thoughts. And I'm always glad to see people pushing into under-represented chemical space (macrocycles being a good example).

The problem is, chemical space is large, and time (and money) are short. Given the pressures that research has been under, it's no surprise that everyone has been reaching for whatever will generate the most compounds in the shortest time - which trend, Tsukamoto notes, makes the whole med-chem enterprise that much easier to outsource to places with cheaper labor. (After all, if there's not so much skill involved in cranking out amides and palladium couplings, why not?)

My advice in the earlier editorial about giving employers something they can't buy in China and India still holds, but (as Tsukamoto says), maybe one of those things could (or should) be "complicated chemistry that makes unusual structures". Here's a similar perspective from Derek Tan at Sloan-Kettering, also referenced by Tsukamoto. It's an appealing thought, that we can save medicinal chemistry by getting back to medicinal chemistry. It may even be true. Let's hope so.

Comments (25) + TrackBacks (0) | Category: Chemical News | Drug Assays | Drug Industry History

von Eschenbach Takes Another Whack at Phase III Trials

Email This Entry

Posted by Derek

Here's a new editorial on clinical trials and drug development by Tomas Philipson and Andy von Eschenbach (former head of the FDA). It continues his earlier theme of scaling back Phase III trials (which I commented on here).

These Phase 3 clinical trials served us well in the past. Today, in an era of precision or personalized-drug development, when medicines increasingly work for very specific patient groups, the system may be causing more harm than good for several reasons.

First, because of their restrictive design and the way the FDA interprets their results, Phase 3 trials often fail to recognize the unique benefits that medicines can offer to smaller groups of patients than those required in trials.

Second, information technologies have created improvements in our ability to monitor and improve product performance and safety after medicines are approved for sale. Post-market surveillance can and should reduce dependence on pre-market drug screening in Phase 3 trials.

Third, reducing reliance on Phase 3 trials is unlikely to introduce an offsetting harm induced by more dangerous drugs, since evidence supporting safety is produced in earlier phases. Manufacturers also have powerful incentives to maintain drug safety, since they take enormous financial hits -- well beyond the loss of sales -- when drugs are withdrawn after approval.

I'm still of two minds about this proposal. The idea of moving to less preclinical study and more post-marketing surveillance is not a ridiculous one, but our current system (and the expectations it generates) doesn't make a good fit with it. The nasty details I noticed being glossed over earlier are still with us: how will health insurance companies deal with this change? How do we prevent unscrupulous gaming of the system, with companies rushing things to market and spinning out the postmarketing studies as thinly and cheaply as possible? What would keep the real bottom-of-the-barrel types from pumping out high-priced placebos for demanding diseases like Alzheimer's, which compounds would fly through safety studies and reap big profits until they (slowly) were proved ineffective? What would be the legal aspect of all this - that is, when would a patient have the right to sue if something goes badly wrong, and when would they have to just realize that they're taking an investigational drug and that they're part of a research study?

These are real problems, but you wouldn't imagine that they even exist when you read these editorial pieces. I'm a fairly libertarian guy, but these are the sorts of things that occur to me within the first few minutes of thinking about such proposals, which means that there must be many other wrinkles I haven't thought of yet. I agree that increasing the research productivity of the drug industry would be an excellent thing, but I'm really not sure that this is the way to do it.

Comments (9) + TrackBacks (0) | Category: Clinical Trials | Regulatory Affairs

March 1, 2013

The Finest Green in the Lab?

Email This Entry

Posted by Derek

nickelchloride.jpg
For Friday afternoon, I thought I'd put up another color post. That's nickel (II) chloride hydrate, and the only time I've used it was in a modified borohydride reduction. But that was a glorious prep, at least until the borohydride went in and everything turned black. Nickel chloride in methanol is as green as it gets - that's another one that I'm going to have to just take a photo of sometime.

It's fake-looking, like some sort of dye, especially when you see it in an organic chemistry lab. Green is one of the harder colors for "normal" organic compounds to take on, so a vivid lime-gelatin-mix reaction really stands out. Does anyone have any other candidates?

Comments (28) + TrackBacks (0) | Category: Life in the Drug Labs

Yuri Milner's Millions, And Where They're Going

Email This Entry

Posted by Derek

You'll have heard about Yuri Milner, the Russian entrepreneur (early Facebook investor, etc.) who's recently announced some rather generous research prize awards:

Yesterday, Milner, along with some “old friends”—Google cofounder Sergey Brin, Facebook CEO Mark Zuckerberg, and their respective wives—announced they are giving $33 million in prizes to 11 university-based biologists. Five of the awards, called the Breakthrough Prize in Life Sciences, will be given annually going forward; they are similar to prizes for fundamental physics that Milner started giving out last year.

At $3 million apiece, the prize money tops the Nobels, whose purse is around $1 million. Yet neither amount is much compared to what you can make if you drop out of science and find a calling in Silicon Valley, as Brin, Milner, Zuckerberg did.

Technology Review has a good article on the whole effort. After looking over the awardees, Antonio Regalado has some speculation:

But looking over the list (the New York Times published it along with some useful biographical details here), I noticed some very strong similarities between the award winners. Nearly all are involved in studying cancer genetics or cancer stem cells, and sometimes both.

In other words, this isn’t any old list of researchers. It’s actually the scientific advisory board of Cure for Cancer, Inc. Because lately, DNA sequencing and better understanding of stem cells have become the technologies that look most likely to maybe, just maybe, point toward some real cancer cures.

Wouldn't surprise me. This is a perfectly good area of research for targeted funding, and a good infusion of cash is bound to help move things along. The article stops short of saying that Milner (or someone he knows) might have a personal stake in all this, but that wouldn't be the first time that situation has influenced the direction of research, either. I'm fine with that, actually - people have a right to do what they want to with their own money, and this sort of thing is orders of magnitude more useful than taking the equivalent pile of money and buying beachfront mansions with it. (Or a single beachfront mansion, come to think of it, depending on what market we're talking about).

I've actually been very interested in seeing how some of the technology billionaires have been spending their money. Elon Musk, Jeff Bezos, Larry Page, Sergey Brin, etc., have been putting some money behind some very unusual ventures, and I'm very happy to see them do it. If I were swimming in that kind of cash, I'd probably be bankrolling my own space program or something, too. Of course, those sorts of ideas are meant to eventually turn a profit. In that space example, you have tourism, launch services, asteroid mining, orbiting solar power, and a lot of other stuff familiar to anyone who ever read an old John W. Campbell editorial.

What about the biopharma side? You can try to invest to make money there, but it's worth noting that not a lot of tech-era money has gone into venture capital in this area. Are we going to see more of it going as grants to academia? If so, that says something about the state of the field, doesn't it? Perhaps the thinking is that there's still so much basic science to be learned that you get more for your dollar investing in early research - at least, it could lead to something that's a more compelling venture. And I'd be hard pressed to argue.

Comments (20) + TrackBacks (0) | Category: Academia (vs. Industry) | Who Discovers and Why