Corante

About this Author
College chemistry, 1983

Derek Lowe The 2002 Model

After 10 years of blogging. . .

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly: derekb.lowe@gmail.com Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

Monthly Archives

September 30, 2009

Ignoring Patents?


Posted by Derek

There's a letter in the latest Nature from two researchers in Halifax that makes a point which isn't made often enough. Why do so many papers in the literature ignore patent references?

Why are patent citations so conspicuously absent across academic journals, with most even omitting formatting instructions for these in their author guidelines? Patents present novel, rigorously reviewed unpublished work, as well as providing an unmatched resource for detail.

We randomly selected one month (December 2008) and reviewed all citations in the reviews, articles and letters/reports in Nature (1,773 citations) and Science (1,367). These citations included textbooks, http://arXiv.org preprints and abstracts — but no patents.

They go on to point out that searching the patent literature, which traditionally was rather a pain, is much easier now, as is access to the patents themselves. And they have a point. When I was in graduate school in the 1980s, getting a procedure out of a patent was really considered an absolute last resort - it was a special order in the library, and you had this vague feeling that there was some sort of trickiness going on, that none of those syntheses were ever actually supposed to work, anyway.

Not so. While the patent literature is indeed full of junk, the open literature is, too. They're not exactly peer-reviewed, true - but journal papers have a much lower chance of having to stand up in court, so things sort of even out. And as far as organic synthesis is concerned, patents are full of real procedures to make real things (and often enough, with real spectral data to support the claims). Most of the compounds I've made in my career that have seen any light of day at all have done so in patents, and they're real as can be.

I've complained several times when refereeing papers for publication about the lack of relevant patent citations in them. And I'd advise others to do the same - this branch of the scientific literature deserves its due.

Comments (23) + TrackBacks (0) | Category: Patents and IP | The Scientific Literature

Microwaves Aren't Magic


Posted by Derek

Many synthetic chemists these days use microwave reactors to speed up their reactions, especially metal-catalyzed couplings. But there's been a debate ever since the technique became popular about why it works so well. Some people think that microwave irradiation is just a very efficient and fast way to heat up a reaction, while others have hypothesized some sort of microwave-specific effect, outside of the heating behavior. Metal catalysts have been particular favorites for this possibility.

The former view has been gaining ground, though, and I think we can now say that it's won. A new paper from the lab of microwave chemistry pioneer Oliver Kappe has an ingenious way to settle the argument. They've fabricated a microwave reactor vial out of silicon carbide. It's chemically inert and has very high thermal conductivity, but SiC is completely opaque to microwave frequencies. Reactions run in this vessel heat up just as quickly as those run in the same-sized glass tube, and reach the same internal pressures and working temperatures. But the contents experience no microwave irradiation at all.

Kappe and his co-workers ran a wide variety of reactions head-to-head in the two kinds of vial, including a range of metal catalysts. No differences were observed in the yields, purities, or side products for any of eighteen different types of reaction. That's good enough for me: unless someone can come up with a weirdo outlier catalyst, there is no nonthermal microwave effect on organic chemistry.

Comments (17) + TrackBacks (0) | Category: Chemical News

September 29, 2009

Nobel Season 2009


Posted by Derek

Fall is in the air, which (for a very small group of people) brings thoughts of a call from Stockholm. The Nobel Prizes will be announced next week, starting with Physiology or Medicine on Monday. And as in years past, people are lining up with predictions.

Predicting the Chemistry prize is tricky, since it's so often used as a surrogate for the nonexistent Biology prize (and, once in a while, as an overflow Physics one as well). But let's take a look at the field and see if the Scandinavians surprise us or not.

The two best roundups I've seen so far are from the Wall Street Journal and Thomson/Reuters. For Chemistry, the Journal has a pair of biology prize possibilities going to (1) Hartl and Horwich for chaperone proteins, or (2) Winter and Lerner for antibodies (humanized, monoclonal, catalytic). They also have a material-science one for Matyjaszewski (atom-transfer radical polymerization). Note that that last Wikipedia entry seems to show (at least as of this morning) the hand of an interested editor.

Meanwhile, the Thomson people, using a citation-based algorithm, have no overlaps with this list at all. They suggest (1) Michael Grätzel (dye-based solar cells), (2) Jackie Barton, Bernd Giese, and Gary Schuster (electron transfer in DNA), or (3) Benjamin List (asymmetric catalysis).

And over at the Chem Blog, the current favorites are Grätzel and also Richard Zare, Allen Bard, and William Moerner for single-molecule spectroscopy. Those last two have already picked up the Wolf Prize in Chemistry for that work in 2008, and Zare won one in 2005. It's worth noting that Richard Lerner, from the Journal's list, won back in 1994-1995, along with Peter Schultz, who also is often mentioned when Nobel time comes around.

I think that Grätzel is a good bet, considering that the work seems solid and that solar power is such a hot topic these days. I would like to see Bernd Giese get in on a prize, since I did my post-doc with him, but I consider the electron-transfer work to be more of a long shot, at least for now. List is probably the best shot at a "pure organic chemistry" prize, although I also doubt that this is the way it'll go this year. As always, it wouldn't surprise me a bit if things bleed over from biology - the committee might go as far as to consider telomeres to be chemicals and give it to Blackburn, Greider, and Szostak. And that's certainly worth an award, just not in Chemistry.

We'll know soon. Feel free to put your favorites into the comments, and I'll update this post with the list of suggestions. One of us has to get it right, you'd think.

Comments (73) + TrackBacks (0) | Category: Chemical News

September 28, 2009

Which Pfizer / Wyeth Sites Will Close?


Posted by Derek

I have no solid information on this question myself, but Eric Milgram over at Pharmaconduct is trying a wisdom-of-crowds approach. He's got a survey up of which sites people think will close, and it'll be interesting to see how well this matches up with the eventual reality.

At the bottom of the list, naturally, is Pfizer's site at Groton. I think we can safely predict that this one will stay open, but the New London site, right across the river, doesn't fare so well in the voting. In fact, it's the second-highest-ranking Pfizer site on the list, outdone only by St. Louis (the former Monsanto). The rest of the top contenders are all Wyeth, led by Madison and Princeton.

The "wisdom of crowds" method doesn't produce wisdom out of thin air, of course - it's supposed to be a more efficient way of getting to information that's already out there. In this case, though, I don't think that the information is out there, so this should be taken as more of a poll of sentiment, which is certainly how it's presented. To that aspect of it, one thing that Milgram's already noticed is that people who are currently employed at either Pfizer or Wyeth tend to believe that it's those other guys who are most likely to have to close some facilities. We shall see. . .

Comments (44) + TrackBacks (0) | Category: Business and Markets

Chew On This, Enzyme


Posted by Derek

File this one under "Department of Odd Ideas". There's a paper coming out in JACS that has a neat variation on an idea that's been kicking around for some years now: molecularly-imprinted polymers (MIPs). A MIP is a sort of molded form around some molecular template - you make your polymer in the presence of the desired target molecule, with the idea that you'll then form target-shaped cavities in the resulting gel.

These things have been worked on for years in the analytical chemistry field, since they have the potential to form very robust sensors for a wide variety of substances. The thought has also been that they might serve as pseudo-enzymatic catalysts for some reactions as well, although I get the impression that that's been harder to realize. From the outside, the whole area seems to be one of those that goes on for years as something that's still developing and hasn't quite taken off.
[Image: MIP scheme from the paper's abstract]
This latest idea may or may not change that, but it's ingenious. What this group (from two French labs) has done is anchor the initiation point of the polymer to an enzyme inhibitor molecule - in this case, to an amidine inhibitor of trypsin. The resulting polymer turns out to have strong inhibitory activity for the enzyme, about a thousandfold higher than the starting amidine - as well it might, if it's muffling the active site like a huge beach towel. They tried a number of potential polymeric systems, settling on some neutral methacrylates, since charged species didn't seem to give binding (or specificity) at all.

The control experiments support their interpretation of what's going on. The resulting polymers don't seem to recognize (or inhibit) a variety of otherwise similar proteins. If control polymers are formed without the anchoring group, they have no inhibitory effect. Similarly, if the experiment is done with an excess of non-polymerizable inhibitor, the effect goes away as well (since the active site is already occupied).

I'm not sure that these things will find much use as enzyme inhibitors in living systems, unless you're looking to shut down some sort of enzyme in the gut. (In that case, you might be able to give someone a glass full of soluble polymeric stuff, with the expectation that it wouldn't be absorbed and would emerge more or less unchanged.) But perhaps there are applications under blood filtration or dialysis conditions, or topical ones. At any rate, it's a neat idea which is now looking for a home. . .

Comments (3) + TrackBacks (0) | Category: Chemical News

September 25, 2009

The Details of the Baucus Bill


Posted by Derek

Since we were discussing the Baucus health care proposal here the other day, I thought that people would appreciate a chance to read through the provisions of the bill before forming an opinion of it.

Sorry! You can't. It's not online, and it won't even be online by the time the Finance Committee is through with it. Senator Baucus, though, would like you to know that it's because it's just too darn difficult to put it up.

So we'll just have to trust them. I suppose. We may get a chance to look things over before the House votes on anything. Unless some good reason comes up not to do that, of course. It has before.

Comments (23) + TrackBacks (0) | Category: Current Events | Regulatory Affairs

Faked Data at the ETH


Posted by Derek

A data-fabrication scandal has erupted at a place that doesn't see many of those: the ETH in Zürich. Peter Chen, a physical organic chemist there, has been dealing with problems with some earlier publications (from 2000) on the spectra and ionization energies of carbon radicals. Here's one of the papers, which has now been retracted.

These data couldn't be reproduced, as became clear in the years after these papers came out. An investigation by the ETH showed what appears to be clear evidence of fakery - things like the background noise being exactly the same in what are supposed to be several different experimental spectra of different species. In fact, all the parties involved with the suspect papers agree that data have been fabricated - but none of them admit to having done it.

That's not a happy situation, is it? The official ETH news release on the topic is informative, but only up to a point. It leaves things hanging, announcing that Chen is stepping down as the ETH's vice president for research. The Swiss press has picked up the story this week, though, and they're not shy about saying what the ETH doesn't seem to want to. Here's the Neue Zürcher Zeitung, saying (translation mine):

The experts who have investigated the scientific fraud case at the ETH-Zürich are sure of the guilty party. Peter Chen, leader of the research group, has been clearly exonerated. . .The Commission came unanimously to the conclusion that. . .it was likely that a former doctoral student "manipulated and fabricated" the published data. He performed most of the measurements, and could (through these machinations) have considerably shortened his work.

It also appears, if the reports I'm seeing are correct, that this person's lab notebooks have turned up missing, and are the only primary sources for the whole affair that can't be found. Lawyers representing this former student have blocked release of the entire ETH report, but it's leaked to a number of other outlets, including C&E News and Science. One way or another, the story has come out, and it's a pretty damned familiar one, too.

Comments (16) + TrackBacks (0) | Category: The Dark Side

September 24, 2009

The Grant Application Treadmill


Posted by Derek

There's a (justifiably) angry paper out in PLoS Biology discussing the nasty situation too many academic researchers find themselves in: spending all their time writing grant applications rather than doing research. The paper's written from a UK perspective, but the problems it describes are universal:

To expect a young scientist to recruit and train students and postdocs as well as producing and publishing new and original work within two years (in order to fuel the next grant application) is preposterous. It is neither right nor sensible to ask scientists to become astrologists and predict precisely the path their research will follow—and then to judge them on how persuasively they can put over this fiction. It takes far too long to write a grant because the requirements are so complex and demanding. Applications have become so detailed and so technical that trying to select the best proposals has become a dark art.

And a related problem is how this system tends to get rid of people who can't stand it, leaving the sorts of people who can:

The peculiar demands of our granting system have favoured an upper class of skilled scientists who know how to raise money for a big group [3]. They have mastered a glass bead game that rewards not only quality and honesty, but also salesmanship and networking. A large group is the secret because applications are currently judged in a way that makes it almost immaterial how many of that group fail, so long as two or three do well. Data from these successful underlings can be cleverly packaged to produce a flow of papers—essential to generate an overlapping portfolio of grants to avoid gaps in funding.

Thus, large groups can appear effective even when they are neither efficient nor innovative. Also, large groups breed a surplus of PhD students and postdocs that flood the market; many boost the careers of their supervisors while their own plans to continue in research are doomed from the outset. . .

The author is no freshly-minted assistant professor - Peter Lawrence (FRS) has been at Cambridge for forty years, but only recently relocated to the Department of Zoology and experienced the grantsmanship game first-hand. He has a number of recommendations to try to fix the process: shorter and simpler application forms, an actual weighting against large research groups, longer funding periods, limits to the number of papers that can be added to a grant application, and more. Anyone interested in the topic should read the whole paper, and will probably be pounding on the desk in agreement very shortly.

The short version? We think we're asking for scientists, but we're really asking for fund-raisers and masters of paperwork. Surely it doesn't have to be this way.

Comments (21) + TrackBacks (0) | Category: Academia (vs. Industry) | Who Discovers and Why

Obesity: Hope Springs Eternal (Summer 2009 Version)


Posted by Derek

Some very interesting papers from the obesity research field have been published in the last few months. There have been a number of these over the years, and (as is widely apparent), none of them have quite lived up to their initial promise. This latest mechanism has been written up by both academic groups and industrial ones, which leads to some speculation about the state of the field - read on.

First, some background: GLP-1 (glucagon-like peptide 1) is a very important metabolic regulator. Peptides that mimic it at its receptor (but with a longer half-life) are marketed diabetes therapies (Byetta (exenatide), liraglutide, and others), and the DPP-IV inhibitors, like Januvia (sitagliptin) and its upcoming competition, do something similar by inhibiting the enzyme that normally breaks the peptide down.

In addition to glycemic control, GLP-1 and related ligands also have complex effects on appetite in rodent models. These are still being unraveled, and depend on which peptide you use, and whether it's given out in the periphery or into the brain. More than one mechanism seems to be involved.

Glucagon is another key player in regulating glucose - it's another peptide hormone with its own receptor, and its most noticeable effect is as a sort of counterweight to insulin in glucose control. It stimulates the liver to break down glycogen and release glucose, among other things, and people have tried (so far, without success) to develop glucagon blockers as a treatment for diabetes.

There are several other important signaling peptides in this space, such as GLP-2 and oxyntomodulin, and it's been clear for a few years now that there's some sort of opportunity to come up with a mixed-activity ligand that might hit these various piano keys to produce the right chord. (Several such have been reported in the diabetes field). These peptides are all part of the gut-to-brain signaling system, which is rather complex and has been the target of a number of obesity research programs over the years. Signals for satiety, hunger, glucose handling, and energy expenditure are all tangled together there, but in ways that we don't understand well, so it's been a very attractive minefield. For the most part, compounds targeting these systems have been stabilized forms of peptides themselves, and thus have to be given by injection. Small-molecule ligands for these receptors have been much harder to come by.

Now for the new results. A team from Indiana, Kentucky, and Cincinnati reported back in July in Nature Chemical Biology that dual agonist peptides acting at both the GLP-1 and glucagon receptors do a tremendous job on obese rodents. (Here's a PDF from one of the authors). They took two of them into diet-induced-obese mice, and saw very significant weight loss, which appears to have been almost entirely body fat, and was driven by simultaneously higher energy expenditure coupled with lower food intake.

There would indeed be a market for that, and you can bet that the possibility hasn't escaped the metabolics groups at the large companies. At almost the same time, in fact, a group at Merck published a very similar study in obese mice with their own dual-agonist peptide, and saw the same sort of thing: weight loss, improvement in metabolic markers, decreased fat mass, the whole deal.

Now, what does all this mean for the state of the art? Merck wouldn't publish such interesting results without a good reason - you have to wonder if they're far enough along that they felt safe talking about such things, or (alternately) if there are clear problems with the approach that will keep this mechanism from ever being used. Nothing's shown up in the open literature about the latter possibility, as far as I can see. So the race would appear to be on. Is it?

Comments (19) + TrackBacks (0) | Category: Diabetes and Obesity

September 23, 2009

PNAS Shuts a Door


Posted by Derek

I've written before here about how I actually like reading the Proceedings of the National Academy of Sciences (PNAS). It's a journal that has published a lot of groundbreaking work, and a fair amount of nonsense, and that mix is largely due to its unusual paths to publication.

Well, one of those paths is drying up. The Academy has decided to stop the "communicated by" option (Track I), where someone can approach a PNAS member and ask them to send in a manuscript (each member could do this up to twice a year). Some members seem to mourn the passing of an old tradition, while others are glad that they don't have to pick and choose between manuscripts from their friends. Science has some details, and you can see the PNAS announcement here.

One of the things that may have either sped this along, or at least made people think about the decision more, was a recent paper by Donald Williamson, communicated by Lynn Margulis. Williamson presents an evolutionary hypothesis that is controversial to say the least, the idea that larvae (caterpillars, etc.) are the result of a wholesale gene transfer between completely different phyla. I think that this idea is very likely to be wrong, but in Williamson's defense, he proposes some ways to test it - and if by some chance he's right, he'll rewrite a big chunk of evolutionary theory.

Some people may look at the latest PNAS move and think "Good, now we won't have any more craziness like that caterpillar stuff". But I actually like to see a bit of such craziness, and I worry that there are already too few outlets for it to see publication. It may not have been an appropriate paper for PNAS - but where else would it have been published at all?

Comments (15) + TrackBacks (0) | Category: The Scientific Literature

Pay Them Now, Or Pay For It Later


Posted by Derek

Time for a brief comment on health care reform, now that Sen. Baucus has presented a bill to the Finance Committee (which, to be sure, I believe has already attracted over 500 proposed amendments). As is well known, the largest drug industry trade group, PhRMA, signed on to the whole idea of a large reform effort early, in exchange for a seat at the table (and a chance to make things go favorably). How's that working out so far?

As Steve Usdin at Biocentury writes, the answer is "fairly blatantly":

The parochial value of PhRMA’s controversial decision to cut a deal with the Senate Finance Committee and the White House became clear last week as details of the committee’s healthcare reform bill emerged that favor big pharma companies over their biotech siblings. The bill also pounds the medical device industry and slams laboratory service providers, sectors that didn’t agree on “voluntary” contributions to healthcare expansion. . .

. . .A 233-page summary of the America’s Healthy Future Act released by Finance Committee Chairman Max Baucus (D-Mont.) includes most of PhRMA’s healthcare reform wish list and has only one major provision pharma companies hope to kill: a commission with powers to constrain Medicare spending.

The tax put on medical devices by this bill has already been noted widely in the press, and I see that Sen. Kerry is already objecting to that provision - naturally enough, since Massachusetts has some big players in that area. The Senators from Guidant and Medtronic (also known as Indiana and Minnesota) are speaking up as well. The trade association for that industry (AdvaMed) apparently couldn't come to terms with Washington, so this tax is their reward - which, in a nutshell, is the sort of thing that keeps gradually turning me into a libertarian.

There are more examples. Biocentury goes on to detail an excise tax provision in the bill that's based on sales figures and market share. But this isn't calculated on total US sales, which is the method various biotech companies were pushing for. No, it's calculated by market share of sales to the US government, which (because of Medicare) tends to emphasize drugs for an older population. In general, if your drug is provided substantially through any government-supported program (HIV medications come to mind), you're going to see a higher fee. Orphan drugs, though, are exempt from the tax, which must gladden the hearts of several companies.

It's still way too early to get worked up over any specific provisions of any one bill, and there's plenty of room to wonder if anything substantial at all will get passed. But it is worth paying some attention to how the process works. When the same tactics are used in the private sector, the unfortunate phrase "protection racket" comes to mind. But government, well, that's different. Clearly.

Comments (28) + TrackBacks (0) | Category: Current Events | Regulatory Affairs

September 22, 2009

Statin Safety?


Posted by Derek

Senator Charles Grassley of Iowa has sent the FDA a letter asking if the agency has sufficiently considered adverse events from statin drugs. I've been unable to find the text of the letter, but here's a summary at Business Week. (Grassley's own list of press releases, like those of most other senators and representatives, is a long, long list of all the swag and booty that he's been able to cart back to his constituents.)

His main questions seem to be: has the agency seen any patterns in adverse event reports? Is there reason to believe that such events are being under-reported? Is there information from other countries where the drugs are prescribed that might tell us things that we're missing here?

Business Week's reporter John Carey has been on this are-statins-worse-than-they-appear beat for some time now, and it wouldn't surprise me if someone from Grassley's office sent him a copy of the Senator's letter on that basis. Those considerations aside, are statins really worse than they appear, or not?

The muscle side effects of the drugs (rhabdomyolysis) have been known for some time, and it's clear that some patients are more sensitive to this than others. But there are other possible side effects kicking around, such as cognitive impairment. The evidence for that doesn't seem very strong to me, at first glance, and could (as far as I can see) come out the other way just as easily. In the same way, I haven't seen any compelling evidence for increased risk of cancer, although it's quite possible that they may have effects (good and bad) when combined with existing therapies.

The one thing that you can say is that the epidemiological data we have for statin treatment is probably about as good as we're going to get for anything. These drugs are so widely prescribed, and have now been on the market for so many years, that the amount of data collected on them is huge. If that data set is inadequate, then so are all the others. I'm not sure what Sen. Grassley is up to with his letter, but that's something he should probably keep in mind. . .

Comments (20) + TrackBacks (0) | Category: Cardiovascular Disease | Regulatory Affairs | Toxicology

Colorful Junk


Posted by Derek

Last Saturday night I stayed out until 3:30 AM, then slept in the back of our van. Now, that may sound like a pretty good evening for some of you, but it might seem a little odd for a guy like me. There's a good reason, though - I was out at the Connecticut Star Party, a meeting of amateur astronomers out in the boonies of eastern CT. Fall is a good season for those get-togethers - there are a lot of interesting things in the sky, the weather tends to clear out as cold fronts come through (but it's still temperate, overall), and it gets dark at a reasonable hour. Conditions last weekend were about as good as they can get, actually - I won't go into what I observed, unless it turns out that there are a lot of readers to whom phrases like "Minkowski's Footprint" and "G-numbered globulars around M31" mean something.
[Photo: Jupiter, imaged by the Cassini spacecraft]
There were good views of Jupiter, though, and that always reminds me of the lab. I didn't spend much time looking at the planet (it tends to ruin your night vision for a while!), but the colors of the cloud belts are striking: yellow, brown, orange, tan, and (of course) the Great Red Spot, which is sort of a light brick color these days. (That's about the right color there in the photo, although that's a lot higher-resolution than you can see with the naked eye, taken as it was from the Cassini spacecraft on its way to Saturn. The black dot is the shadow of one of the moons, giving anyone in Jupiter's cloud deck a total solar eclipse).

What it reminds me of are the reactions on my bench (and some of those older stored samples), which are turning the same colors. And they're doing that for the same reasons. Jupiter's a gigantic stew of organic chemicals, which are being run through all kinds of temperatures and pressures (including plenty of conditions that are too bizarre to reproduce - so far - on Earth), being irradiated by the Sun and constantly zapped by huge lightning storms. The side reactions in my lab tend to make yellow, orange, red, and brown stuff, and Jupiter is nothing but side reactions.

So what is all that stuff? It's rather hard to characterize it, naturally, but I've always assumed that they're some sort of high-molecular-weight condensation products. (There's been some work done on trying to figure out what the astronomical versions of it, called tholins, must be). There must be a fair number of double bonds and a lot of conjugation in there, to get all those chromophores which push the transmitted light down to the yellow-orange part of the spectrum. All the higher-energy wavelengths of light, the purple/blue/green stuff, are being soaked up. No organic compound in my experience has ever decomposed to anything colored blue. They start by going yellow and then head down through orange and red, towards deep brown and thence to black.

So when I purify these things, and all the colorful stuff sticks to the top of the chromatography column and makes bands of who-knows-what up there, I often glance up at the stuff I'm throwing away, and think "Jupiter". And that's probably accurate.

Comments (10) + TrackBacks (0) | Category: Life in the Drug Labs

September 21, 2009

More on T2, and Degrees


Posted by Derek

Friday's article on the T2 explosion has had a lot of readers, thanks to links from various outside sources. One line from it has attracted a disproportionate amount of comment - the one where I mentioned that the two owners of the company had only undergraduate degrees. This needs some clearing up; I should have explained myself more clearly in the original post.

First off, there are two things I most definitely didn't mean. I do not, of course, mean to imply that anyone without a graduate degree is incapable of running a complex or hazardous chemical process. Nor am I assuming that there's some sort of magic in a graduate degree program that turns a person into someone who actually can run such things. I've seen enough smart people who didn't go to grad school (and enough fools with PhDs) not to believe either of those.

The key thing here (besides intelligence, which is necessary, but not sufficient) is experience. And what experience gives you, among other things, is a sense of knowing what needs to be worried about. That's what the T2 people seem to have lacked. It's no exaggeration that every time I've described this accident to an experienced scale-up or process chemist, their response has been outrage and incredulity. De mortuis nil nisi bonum, and my apologies in advance to any relatives or colleagues of the deceased, but these people were conducting a very hazardous chemical process, and the lack of care they showed while doing so is stunning. No calorimetry to look for exothermic reactions, a totally inadequate rupture disk for venting that large a reactor, no attempt to set up the process as a flow or feed (which also would have given you built-in temperature control), and no backup for the absolutely crucial cooling system.

Now, it's quite possible that if the people who set up the T2 reactor had been through a graduate program they would have gone on to do the exact same thing. But it might have helped a bit, which might have been enough to keep four people from being killed. Graduate work is supposed to involve research, experiments that haven't been run before. If you get a degree that's worth anything, you've had the experience of having to figure experimental setups out on your own, and that means that you should have had some chances to think about what might go wrong with them. And the larger the scale of your chemistry, the more you should think about that last point.

Having a couple of reactions take off and spray the inside of your fume hood brings home the problems of heat transfer and pressure relief in a way that no textbook can quite match, and that's not something that you'll experience as an undergraduate in most colleges. Now, it's true that you can experience that at work, too, where the lessons will be even more vivid. That's why in an industrial setting an experienced chemist without a doctorate is almost always much more worth listening to than a freshly arrived PhD - if they're any good, they've seen a lot and they've learned from it.

The people running T2 not only did not take proper precautions, they had been told that they needed to bring in a consultant to look over their process. In other words, "get someone in here who can see things that you're overlooking". But they didn't do that. It's also possible that they might have brought someone in and ignored their recommendations, too, and there's no degree program that can keep you from acting like that, either. They'd run this thing over and over just the way it was, and they probably thought that everything was under control. But it wasn't. And they had no idea.

Comments (30) + TrackBacks (0) | Category: Chemical News | Safety Warnings

September 18, 2009

175 Times. And Then the Catastrophe.


Posted by Derek

I noted this item over at C&E News today, a report on a terrible chemical accident at T2 Laboratories in Florida back in 2007. I missed even hearing about this incident at the time, but it appears to have been one of the more violent explosions investigated by the federal Chemical Safety and Hazard Investigation Board (CSB). Debris ended up over a mile from the site, and the explosion killed four employees, including one of the co-owners, who was fifty feet away from the reactor at the time. (The other co-owner made it through the blast behind a shipping container and suffered a heart attack immediately afterwards, but survived.) Here's the full report as a PDF.
[Photo: aerial view of the T2 site]
The company was preparing a gasoline additive, methylcyclopentadienyl manganese tricarbonyl (MCMT). To readers outside the field, that sounds like an awful mouthful of a name, but organic chemists will look it over and say "OK, halfway like ferrocene, manganese instead of iron, methyl group on the ring, three CO groups on the other side of the metal. Hmmm. What went wrong with that one?"

Well, the same sort of thing that can go wrong with a lot of reactions, large and small: a thermal runaway. That's always a possibility when a reaction gives off waste heat while it's running (that's called an exothermic reaction, and some are, some aren't - it depends on the energy balance of the bonds being broken versus the bonds being made, among other things). Heating chemical reactions almost invariably speeds them up, naturally, so the heat given off by such a reaction can make it go faster, which makes it give off even more heat, which makes it. . .well, now you know why it's called a runaway reaction.
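For the numerically inclined, here's a toy sketch of that feedback loop - just a crude heat balance with Arrhenius kinetics, and every number in it is made up for illustration rather than taken from the actual T2 process. With a working cooling term the temperature sits near its operating point until the reactant is used up; take the cooling away and the exponential term wins in a hurry.

```python
# Toy model of a thermal runaway: Arrhenius kinetics plus a crude heat balance.
# All parameters are illustrative - none of them are the real T2 numbers.
import math

def simulate(cooling_W_per_K, t_end_s=1200.0, dt=0.01):
    T = 450.0                         # batch temperature, K (~177 C)
    conc = 1.0                        # remaining reactant, arbitrary units
    A, Ea, R = 1.0e9, 1.0e5, 8.314    # Arrhenius prefactor (1/s), activation energy (J/mol)
    dH = 3.0e6                        # heat released per unit of reactant, J
    heat_capacity = 1.0e4             # J/K for the whole batch
    T_coolant = 445.0                 # cooling-water temperature, K
    t = 0.0
    # Crude explicit Euler integration; stop at 700 K and call that "runaway".
    while t < t_end_s and T < 700.0 and conc > 0.0:
        rate = A * math.exp(-Ea / (R * T)) * conc        # hotter -> faster
        q_generated = rate * dH                          # W from the reaction
        q_removed = cooling_W_per_K * (T - T_coolant)    # W out through the jacket
        T += (q_generated - q_removed) / heat_capacity * dt
        conc -= rate * dt
        t += dt
    return t, T, conc

for cooling in (2000.0, 0.0):    # working cooling jacket vs. a blocked water line
    t, T, conc = simulate(cooling)
    print(f"cooling = {cooling:6.0f} W/K -> stopped at t = {t:6.1f} s, "
          f"T = {T:5.1f} K, reactant left = {conc:5.2f}")
```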

On the small scales where I've spent my career, the usual consequence of this is that whatever's fitted on the top of the flask blows off, and the contents geyser out all over the fume hood. One generally doesn't tightly seal the top of a reaction flask, not unless one knows exactly what one is doing, so there's usually a stopper or rubber seal that gives way. I've walked back into my lab, looked at the floor in front of my hood, and wondered "Who on earth left a glass condenser on my floor?", until I walked over to have a look and realized where it came from (and, um, who left it there).

But on a large scale, well, things are always different. For one thing, it's just plain larger. There's more energy involved. And heat transfer is a major concern on scale, because while it's easy to cool off a 25-milliliter flask, where none of the contents are more than a centimeter from the outside wall, cooling off a 2500-gallon reactor is something else again. Needless to say, you're not going to be able to pick it up quickly and stick it into 25,000 gallons of ice water, and even that wouldn't do nearly as much good as you might think. The center of that reactor is a long way from the walls, and cooling those walls down can only do so much - stirring is a major concern on these scales, too.
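To put some rough numbers on that comparison - my own back-of-the-envelope arithmetic, idealizing both vessels as spheres, which is actually generous to the big one:

```python
# Surface-to-volume comparison: why a 2500-gallon reactor can't shed heat like a small flask.
# Both vessels are idealized as spheres; the numbers are purely illustrative.
import math

def sphere_stats(volume_m3):
    radius = (3.0 * volume_m3 / (4.0 * math.pi)) ** (1.0 / 3.0)
    area = 4.0 * math.pi * radius ** 2
    return radius, area / volume_m3   # wall area per unit of contents

flask_m3 = 25e-6                  # 25 mL
reactor_m3 = 2500 * 3.785e-3      # 2500 US gallons is roughly 9.5 cubic meters

for name, vol in (("25 mL flask", flask_m3), ("2500 gal reactor", reactor_m3)):
    radius, ratio = sphere_stats(vol)
    print(f"{name:>17}: radius {100 * radius:6.1f} cm, wall area per volume {ratio:6.1f} 1/m")
```

The big vessel ends up with something like seventy times less wall area per unit of contents, and that's before you worry about getting the heat from the middle of the batch out to the wall in the first place.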
[Photo: the T2 agitator]
What's worth emphasizing is that this explosion occurred on the one hundred seventy-fifth time that T2 had run this reaction. No doubt they thought they had everything well under control - have any of you ever run the same reaction a hundred and seventy-five times in a row? But what they didn't know was crucial: the operators had only undergraduate degrees (Update: here's another post on that issue), and the CSB report concludes that they didn't realize that they were walking on the edge of disaster the whole time. As it turns out, the MCMT chemistry was mildly exothermic. But if the reaction got above the normal production temperature (177C), a very exothermic side reaction kicked in. Have I mentioned that the chemistry involved was a stirred molten-sodium reaction? Yep, methylcyclopentadiene dimer, cracking to monomer, metallating with the sodium and releasing hydrogen gas. This was run in diglyme, and if the temperature went up above 199C, the sodium would start reacting energetically with the solvent. Update: corrected these temperature values

Experienced chemists and engineers will recognize that setup for what it is: a black-bordered invitation to disaster. Apparently the T2 chemists had experienced a few close calls in the past, without fully realizing the extent of the problem. On the morning of the explosion, the water cooling line experienced some sort of blockage, and there was (fatally) no backup cooling system in place. Ten minutes later, everything went up. In retrospect, the only thing to do when the cooling went out would have been to run for it and cover as much ground as possible in the ten minutes left, but that's not a decision that anyone usually makes.
[Photo: part of the T2 reactor vessel]
Here you see part of the company's reactor vessel, which ended up on some train tracks 400 feet away. The 4-inch-wide shaft of the agitator traveled nearly as far, imbedding itself into the sidewalk like a javelin. My condolences go out to the families of those killed and injured in this terribly preventable accident. The laws of thermodynamics, unfortunately, have no regard for human life at all. They cannot be brushed off or bargained with, and if you do not pay attention to them they can cut you down.

Comments (60) + TrackBacks (0) | Category: Chemical News | Safety Warnings

September 17, 2009

The Drug Business: A Turbulent Future?


Posted by Derek

One of this blog's regular correspondents has just been attending a chemistry outsourcing conference (program here), and heard a very interesting talk from Stefan Loren of a Baltimore investment advisory firm, Westwicke Partners. Loren's a product of the Sharpless lab, who went on to Abbott, then Wall Street (Legg Mason and into the hedge fund business), and had some very provocative things to say about our industry:

His talk, "The Pharma Titanic: It's Time to Root for the Iceberg" presented a sobering view of the challenges that big pharma will have to deal with if it wants to survive.

Loren opened with an overview of the US national health care debate. Regardless of the ultimate form that a national system takes, he believes we'll see mandatory insurance; this will be good for big pharma. He also believes that there will be strong pressure for mandatory comparative effectiveness testing...probably not good for big pharma. Who will pay for this and what resources this would require is another matter. Wearing his investment advisor glasses, he sees global pharma sales declining, led by North America, with future growth coming in Asia and Latin America. He also sees evidence of healthcare avoidance in the US: unfilled prescriptions, unfinished courses of prescriptions, and people just not visiting medical and dental practitioners - not a good trend.

The coming wave of patent expirations of the top 10 drugs will hit big pharma hard. Generics will grow: In 5 to 10 years, he predicts that 80 percent of ALL prescriptions will be generic. When coupled with the meager investments in bow wave research over the past 15+ years, as measured by IPOs, there's trouble ahead. Global biotech IPOs are in the toilet and the US is no longer viewed by the investment community as the global leader in biotech. There have been an unprecedented number of bankruptcies in biotech. There is going to be a huge oversupply of production capacity for small molecule manufacturing. ROIs for pharma and biotech are largely negative...it gets worse. He calls this the "death spiral."

Pharma pipelines are seen as very poorly run and wasteful. Poor projects linger far longer than they should. Too much emphasis is placed on me-too and line extensions. Too much emphasis is placed on acquisitions and licensing rather than innovation. Here it comes: he says "I have NEVER seen a merger that worked." We were then entertained by a chart showing Pfizer's stock market performance over the period of time from pre-WLA, through Pharmacia-Upjohn, and now Wyeth...you would not be a happy camper if you had put your retirement account in Pfizer management's hands and their merger mania. Wall Street has a saying "Two dogs don't make a kennel." Of course, what we hear is "this time it's different" along with the usual happy talk about synergies. Loren does believe that mergers can work and can be synergistic if the two companies merging are small...large mergers just don't work and large companies get paralyzed by bureaucratic inertia.

His solution? Break up large pharma into therapeutic areas and build shared networks between distinct entities. Small organizations can operate far more efficiently in decision making about research directions - use the network to maintain manufacturing efficiencies. Small focused companies will revitalize the industry and offer opportunities for scientists coming out of academia. In response to a question from the audience regarding Merck's ambitions to adopt this networked architecture, he doesn't believe they can make it work.

He does see light at the end of the tunnel with respect to supply chain assurance driving a return to sanity. The heparin, glycerin, and melamine disasters have awakened people and the cost of securing global supply chains is going to make US industry much more competitive. It also will focus serious scrutiny on big pharma. The "next heparin" case will have serious personal consequences for big pharma managers. . ."

Well, a good amount of this I agree with, but some of it I'm not sure about. Taking things in order, I don't know about a decline in US sales, but Asia is most definitely where a lot of companies are expecting growth. (And for "Asia", you could substitute "China" and be within margin of error). And his generic prescription figures may not be right on target, but the trend surely is. We've discovered a lot of useful drugs over the years, and anything new we find has to compete against them. The only way to break out of that situation is to find drugs in new categories entirely, and we all know how easy that is.

But as for the US not being the global leader in biotech - well, if we aren't, then who is? You could possibly make a case for "no clear leader at all, for now", but I think that's as far as I can go. And that coming oversupply of manufacturing for small molecule drugs, which may well be real, will be bad news for the companies that have already invested in that area, of course, but good news for up-and-comers, who will be able to pick up capacity more cheaply.

But Loren's comments about mergers I can endorse without reservation. I've been saying nasty things about big pharma mergers since this blog began, and nothing in the last seven years has changed my mind. And I certainly hope that his idea of smaller companies coming along to revitalize the industry is on target, because it's sure not going to be revitalized by (for example) Pfizer buying more people. I've made that Pfizer stock-chart point of his here, as well - like the rest of the industry, PFE stock had a wonderful time of it in the 1990s, but this entire decade it's been an awful place to have your money.

I expect these comments to bring in a lot of comments of their own - so, how much of this future are you buying?

Comments (23) + TrackBacks (0) | Category: Business and Markets | Drug Development | Drug Industry History | Regulatory Affairs

September 16, 2009

Real Electrons


Posted by Derek

I posted images of real pentacene molecules the other day, but now the single molecule/single-atom imaging field has reached another milestone. There's a paper coming out in Physical Review B from a team in Kharkov using a field emission electron microscope. At heart, that's a pretty old type of machine, first invented back in the 1930s, and it's long provided images of the arrangements of atoms in metal surfaces. (More precisely, you're getting an image of the work function, the energy needed to remove electrons from the material).

But this latest work is something else entirely. The researchers have improved the resolution and sensitivity, narrowing things down to single-atom tips. So instead of a tungsten surface, we have a single carbon atom at the end of a chain. And instead of the behavior of the electrons in a bulk metal, we have the electron density around one nucleus. Behold the s and p orbitals. Generations of students have learned these as abstractions, diagrams on a page. I never thought I'd see them, and I never thought I'd see the day when it was even possible. As always, I react to these things with interest, excitement, and a tiny bit of terror at seeing something that I assumed would always be hidden.
[Image: carbon atom orbitals]

Comments (53) + TrackBacks (0) | Category: General Scientific News

September 15, 2009

Lilly Shrinks


Posted by Derek

The latest re-org announcement is from Eli Lilly. The company is getting braced for the Zyprexa patent expiration (and the possibility that Prasugrel and others won't be able to make up for as much lost revenue as they thought). Their target is a 14% head count reduction by the end of 2011.

For everyone's sake there, if they're really going to do that, I hope they do it quickly. Having that sort of thing hanging around over everyone's head is, to put it mildly, not good for anyone's quality of life (whether they're being let go or not). I haven't heard how these cuts will be distributed (across research, sales, administrative, etc.), but I suspect that details will start leaking out soon. . .

Comments (14) + TrackBacks (0) | Category: Business and Markets

Industrial Research: More Grounded in Reality, or Not?


Posted by Derek

My post the other day on why-do-it academic research has prompted quite a bit of comment, including this excerpt from an e-mail:

I would also note that mediocrity is hardly limited to academia. I cannot tell you the number of truly dumb things that I continue to see happening in industry, motivated by the need to be doing something - anything - that can be quantified in a report. The idea that industry is where reality takes command is depressingly false, and I would guess that the same thing that distinguishes the best from the rest in academia also applies in the "real world."

Well, my correspondent is unfortunately on target with that one. Industry is supposed to be where reality takes command, but too often it can be where wishful thinking gets funded with investors' cash. I'm coming up on my 20th anniversary of doing industrial drug discovery. I've seen a lot of good ideas and a lot of hard work done to develop them - but I've also seen decisions that were so stupid that they would absolutely frizz your hair. And I'm not talking stupid-in-hindsight, which is a roomy category we all have helped to fill up. No, these were head-in-hands performances while they were going on.

I can't go into great detail on these, as readers will appreciate, but I can extract some recurring themes. From what I've seen, the worst decisions tend to come from some of these:

"We can't give up on this project now. Look at all the time and money we've put into it!" This is the sunk-cost fallacy, and it's a powerful temptation. Looking at how hard you've worked on something is, sadly, nearly irrelevant to deciding whether you should go on working on it. The key question is, what's it look like right now, compared to what else you could be doing?

"Look, I know this isn't the best molecule we've ever recommended to the clinic. But it's late in the year, and we need to make our goals." I think that everyone who's been in this business for a few years will recognize this one. It's a confusion of ends. Those numerical targets are set in an attempt to try to keep things moving, and increase the chance of delivering real drugs. That's the goal. But they quickly become ends in themselves, and there's where the trouble starts. People start making the numbers rather than making drugs.

"OK, this series of compounds has its problems. But how can you walk away from single-digit nanomolar activity?" This is another pervasive one. Too many discovery projects see their first job (not unreasonably) as getting a potent compound, and when they find one, it can be hard to get rid of it - even if it has all kinds of other liabilities. It takes a lot of nerve to get up in front of a project review meeting and say "Here's the series that lights up the in vitro assay like nothing else. And we're going to stop working on it, because it's wasting our time".

"Everyone else in the industry is getting on board with this. We've got to act now or be left behind." Sometimes these fears are real, and justified. But it's easy to get spooked in this business. Everyone else can start looking smarter than you are, particularly since you see your own discovery efforts from the inside, and can only see other ones through their presentations and patents. Everyone looks smart and competent after the story has been cleaned up for a paper or a poster. And while you do have to keep checking to make sure that you really are keeping up with the times, odds are that if you're smart enough to realize that you should be doing that, you're in reasonably good shape. The real losers, on the other hand, are convinced that they're doing great.

I'm not sure how many of these problems can be fixed, ours or the ones of academia, because both areas are stocked with humans. But that doesn't mean we can't do better than we're doing, and it certainly doesn't release us from an obligation to try.

Comments (27) + TrackBacks (0) | Category: Academia (vs. Industry) | Drug Development | Who Discovers and Why

September 14, 2009

Abstract Abstracts


Posted by Derek

Maybe I'm just cranky, or cranky on the subject of design and presentation, but this sort of chemical structure drawing rubs me the wrong way. I'd have a much better chance of understanding the transformation without all the colored dots; they don't seem to be adding anything.

And while I'm in get-off-my-lawn mode, the similar trendlet of coloring the interiors of ring structures annoys me, too. Nicolaou seems to enjoy doing this (here's a recent example), and I can't say that it adds much to my understanding.

OK, Andy Rooney mode off. For now.

Comments (43) + TrackBacks (0) | Category: The Scientific Literature

Norman Borlaug


Posted by Derek

Norman Borlaug has died at the age of 95, and he's definitely worth remembering. His tireless work on improving agriculture saved hundreds of millions of people from being born to starvation. And it also kept the world from having to tear up even more natural habitats to plant food crops. (Update: as pointed out in the comments, here's an excellent interview with Borlaug from 2000.)

People tend to forget (or have never known) about the way the world has managed to escape the Malthusian trap over the last two or three hundred years. (A Farewell to Alms is a book that makes this case at length; more here). And the way that birth rates drop once countries become more prosperous holds out the hope that we won't fall into an even greater version of the same thing. I think that once the Industrial Revolution happened, world population was going to explode eventually. Norman Borlaug was one of the key people who helped keep things together while that happened.

But what about natural, traditional means of growing crops, in harmony with the land and all that? It's easy to forget that agriculture is unnatural, and is a relatively recent invention. (In fact, perhaps it was that step, rather than the Industrial Revolution, that set the world on a path to an eventual population explosion. It just did so more slowly). Once we started clearing land and saving seed, we left the natural way of things behind. To put that another way, that's when the human race stopped playing only the cards it had been dealt. And using the highest-yielding seed and the most well-thought-out ways of growing it will keep us from having to clear more of the land we have left.

Comments (30) + TrackBacks (0) | Category: Current Events

September 11, 2009

Antioxidants and Cancer: Backwards?

Email This Entry

Posted by Derek

Readers may remember a study from earlier this year that suggested that taking antioxidants canceled out some of the benefits of exercise. It seems that the reactive oxygen species themselves, which everyone's been assuming have to be fought, are actually being used to signal the body's metabolic changes.

Now there's another disturbing paper on a possible unintended effect of antioxidant therapy. Joan Brugge and her group at Harvard published last month on what happens to cells when they're detached from their normal environment. What's supposed to happen, everyone thought, is apoptosis, programmed cell death. Apoptosis, in fact, is supposed to be triggered most of the time when a cell detects that something has gone seriously wrong with its normal processes, and being detached from its normal signaling environment (and its normal blood supply) definitely qualifies. But cancer cells manage to dodge that difficulty, and since it's known that they also get around other apoptosis signals, it made sense that this was happening here, too.

But there have been some recent reports that cast doubt on apoptosis being the only route for detached cell death. This latest study confirms that, but goes on to a surprise. When this team blocked apoptotic processes, detached cells died anyway. A closer look suggested that the reason was, basically, starvation. The cells were deprived of nutrients after being dislocated, ran out of glucose, and that was that. This process could be stopped, though, if a known oncogene involved in glucose uptake (ERBB2) was activated, which suggests that one way cancer cells survive their travels is by keeping their fuel supply going.

So far, so good - this all fits in well with what we already know about tumor cells. But this study found that there was another way to keep detached cells from dying: give them antioxidants. (They used either N-acetylcysteine or a water-soluble Vitamin E derivative). It appears that oxidative stress is one thing that's helping to kill off wandering cells. On top of this effect, reactive oxygen species also seem to be inhibiting another possible energy source, fatty acid oxidation. Take away the reactive oxygen species, and the cells are suddenly under less pressure and have access to a new food source. (Here's a commentary in Nature that goes over all this in more detail, and here's one from The Scientist).

They went on to use some good fluorescence microscopy techniques to show that these differences in reactive oxygen species are found in tumor cell cultures. There are notable metabolic differences between the outer cells of a cultured tumor growth and its inner cells (the ones that can't get so much glucose), but that difference can be smoothed out by. . .antioxidants. The normal process is for the central cells in such growths to eventually die off (luminal clearance), but antioxidant treatment kept this from happening. Even more alarmingly, they showed that tumor cells expressing various oncogenes colonized an in vitro cell growth matrix much more effectively in the presence of antioxidants as well.

This looks like a very strong paper to me; there's a lot of work in it and a lot of information. Taken together, these results suggest a number of immediate questions. Is there something that shuts down normal glucose uptake when a cell is detached, and is this another general cell-suicide mechanism? How exactly does oxidative stress keep these cells from using their fatty acid oxidation pathway? (And how does that relate to normally positioned cells, in which fatty acid oxidation is actually supposed to kick in when glucose supplies go down?)

The biggest questions, though, are the most immediate: first, does it make any sense at all to give antioxidants to cancer patients? Right now, I'd very much have to wonder. And second, could taking antioxidants actually have a long-term cancer-promoting effect under normal conditions? I'd very much like to know that one, and so would a lot of other people.

After this and that exercise study, I'm honestly starting to think that oxidative stress has been getting an undeserved bad press over the years. Have we had things totally turned around?

Comments (43) + TrackBacks (0) | Category: Biological News | Cancer

September 10, 2009

To What End?

Email This Entry

Posted by Derek

I was looking through my RSS feed of journal articles this morning, and came across this new one in J. Med. Chem. Now, there's nothing particularly unusual about this work. The authors are exploring a particular subtype of serotonin receptor (5-HT6), using some chemotypes that have been looked at in serotonergic ligands before. They switch the indole to an indene, put in a sulfonamide, change the aminoethyl side chain to a guanidine, and. . .wait a minute.

Guanidine? I thought that the whole point of making a 5-HT6 ligand was to get it into the brain, and guanidines don't have the best reputation for allowing you to do that. (They're not the easiest thing in the world to even get decent oral absorption from, either, come to think of it). So I looked through the paper to see if there were any in vivo numbers, and as far as I can see, there aren't.

Now, that's not necessarily the fault of the paper's authors. They're from an academic med-chem lab in Barcelona, and animal dosing (and animal PK measurements) aren't necessarily easy to get unless you have a dedicated team that does such things. But, still. The industrial medicinal chemist in me looks at these structures, finds them unlikely to ever reach their intended site of action, can find no evidence in the paper's references that anyone else has ever gotten such a guanidine hydrazone into the brain, either, and starts to have if-a-tree-falls-in-the-forest thoughts.

Now, it's true that we learn some more about the receptor itself by finding new ligands for it, and such compounds can be used for in vitro experiments. But it's not like there aren't other 5-HT6 antagonists out there, in several different chemical classes, and that's just from the first page of a PubMed search. Many of these compounds do, in fact, penetrate the brain, because they were developed by industrial groups for whom in vitro experiments are most definitely not an end in themselves.

I don't mean to single out the Barcelona group here. Their work isn't bad, and it looks perfectly reasonable to me. It's just that my years in industry have made me always ask what a particular paper tells me that I didn't know, and what use might some day be made of the results. Readers here will know that I have a weakness for out-there ideas and technologies, so it's not like I have to see an immediate practical application for everything. But I would like to see the hope of one. And for this work, and for a lot of medicinal chemistry that comes out of academic labs, I just don't see it.

Update: it's been pointed out in the comments that there's a value in academic work that doesn't have to be addressed in industry, that is, training the students who do it. That's absolutely right. But at the same time, couldn't people be trained just as well by working on systems that are a bit less dead on arrival?

And no, I'm not trying to make the case that academic labs should make drugs. If they want to try, then come on down. If they don't, that's fine, too - there's a lot of important research to be done in the world that has no immediate practical application. But this sort of paper that I've written about today seems to miss both of these boats simultaneously: it isn't likely to produce a drug, and it doesn't seem to be addressing any other pressing needs that I can see, either.

And yes, I could say the same about my own PhD work. "The world doesn't need another synthesis of a macrolide antibiotic", I told people at the time. "But I do". Does it have to be like that?

Comments (28) + TrackBacks (0) | Category: Academia (vs. Industry) | Drug Assays | Drug Development | The Central Nervous System | The Scientific Literature

September 9, 2009

"Scratch and Sniff" Turns Into "Zap and React"

Email This Entry

Posted by Derek

Here's an odd idea that might turn into something useful. A group at Berkeley (spanning both the chemistry and physics departments of Cal-Berkeley and the Lawrence labs) has reported a method for encapsulating organic molecules and releasing them inside a reaction when needed.

What they do is form microcapsules, small polymer spheres, from branched acid chlorides and amines. That technology is already known, but in this case they're also incorporating carbon nanotubes inside the capsules, as shown in the photo. If you do this from a solution of some reagent of interest, you now have it, the solvent, and the carbon nanotubes wrapped up in small polymer beads.
[Image: nanocapsules.jpg]
And if you irradiate these things, the carbon nanotubes heat up rapidly, causing the microcapsules to break open. There's the control mechanism. They've demonstrated this for reactions such as the "click" triazole formation and for olefin metathesis. You can follow the reaction progress, and it goes stepwise: conversion advances a bit further every time you hit the solution with a near-IR laser, then stalls until you do it again and release more coupling partner.
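
To make that stepwise behavior concrete, here's a toy model in Python. Nothing in it comes from the paper: the assumption that each near-IR pulse ruptures a fixed 25% of the surviving capsules is invented purely to illustrate the jump-then-plateau pattern you'd expect in the conversion.

    rupture_per_pulse = 0.25   # fraction of intact capsules burst per pulse (assumed, not from the paper)
    capsules_intact = 1.0      # fraction of capsules still sealed
    released = 0.0             # fraction of the encapsulated reagent now in solution

    for pulse in range(1, 6):
        burst = capsules_intact * rupture_per_pulse
        capsules_intact -= burst
        released += burst
        # If the released partner is limiting and reacts to completion before the
        # next pulse, conversion simply tracks the released fraction:
        print(f"after pulse {pulse}: {released:.0%} of reagent released")

Each pass through the loop is one laser shot; between shots nothing changes, which gives the plateau, and the real system presumably does the same with whatever rupture fraction the capsules actually have.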

The limits of this system, so far, are (1) that the microcapsules aren't compatible with the full range of organic solvents, (2) that heat-sensitive reagents probably won't do very well in a system that requires local heating to burst the capsules, and (3) that you eventually have to clean out (presumably by some sort of filtration) the capsule and nanotube residue after things have burst. But some of these can be addressed in further rounds of improvements.

For example, there must be different sorts of polymers that can form these beads. And if it's possible to encapsulate things on the surface of a larger sheet of solid material, you could just dip that in and pull it back out when you're through, which should cut down on the capsule residue. (That would also allow you to quantitate how much reagent you've released, by following the surface area of the sheet that you've irradiated with the laser). What would really make this something to see would be if a way could be found to release different sorts of capsules at different wavelengths, selectively. . .

Comments (3) + TrackBacks (0) | Category: Chemical News

September 8, 2009

Right Where You Want Them

Email This Entry

Posted by Derek

Imagine a drug molecule, and imagine it's a really good one. That is, it's made it out of the gut just fine, out into the bloodstream, and it's even slipped in through the membrane of the targeted cells. Now what?

Well, "cells are gels", as Arthur Kornberg used to say, and he was right. There's not a lot of bulk water sloshing around in there. It's all stuck to and sliding around with enzymes, structural proteins, carbohydrates, and the like, and that's what any drug molecule has to be able to do as well. And there's no particular reason for most of them to go anywhere particular inside the cell, once they're inside. They just diffuse around until they hit their targets, to which they stick (which is something they'd better do).

What if things didn't work this way? What if you could micro-inject your drug right into a particular cell compartment, or have it target a particular cell structure, instead of having to mist it all over the place? We now have a good answer to that question, but how much good it's going to do us drug discoverers is another thing entirely.

I'm referring to this paper from JACS, from a group at the University of Tokyo. They're targeting the important signaling enzyme PI3K. That's downstream of a lot of things, and in this case they used the PDGFR receptor in the cells, and a phosphorylated peptide that's a known ligand. To make the peptide go where they wanted, though, they further engineered both the ligand and the cells. The cells got modified by expression of dihydrofolate reductase (DHFR) in their plasma membranes, and the peptide ligand was conjugated to trimethoprim (TMP). TMP has a very strong association with DHFR, so this system was being used as an artificial targeting method. (It's as if the cell had been built up with hook-bearing Velcro on the inside of its plasma membrane, and the PI3K ligand was attached to a strip of the fuzzy side). Then, to see what was going on, they attached a fluorescent label to the peptide ligand as well.

Of course, this ligand-TMP-fluorescent fusion beast wasn't the best candidate for getting into a cell on its own, so the team microinjected it. And the results were dramatic. Normally, stimulating the PDGFR receptor in these cells led to downstream signaling in less than one minute. In cells that didn't have the DHFR engineered into their membranes, the fluorescent ligand could be seen diffusing through the whole cytosol, and giving a very weak PDGFR response. But in the cells with the targeting system built in, the ligand immediately seemed to stick to the inside of the plasma membrane, as planned, and a very robust, quick response was seen.

The paper details a number of control experiments that I'm not going into here, and I invite the curious to read the whole thing. I'm convinced, though, that the authors are seeing what they hoped to see. In other words, ligands which aren't worth much when they have to diffuse around on their own can be real tigers when they're dragged directly to their site of action. It makes sense that this would be true, but it's nice to see it demonstrated for real. I'll quote the last paragraph of the paper, though, because that's where I have some misgivings:

In summary, we have demonstrated that it is feasible to rapidly and efficiently activate an endogenous signaling pathway by placing a synthetic ligand at a specific location within a cell. The strategy should be applicable to other endogenous proteins and pathways through the choice of appropriate ligand molecules. More significantly, this proof-of-principle study highlights the importance of controlling the subcellular locales of molecules in the design of new synthetic modulators of intracellular biological events. There might be a number of compounds (not only activators but also inhibitors) that have been dismissed but may acquire potent biological activities when they are endowed with subcellular-targeting functions. Our next challenge is to develop cell-permeable carriers capable of delivering cargo ligands to specifically defined regions or organelles inside cells.

Where they lost me was in pointing out how important this is in designing new compounds. The problem is, these are very artificial, highly engineered cells. Everything's been set up to make them do just what you want them to do. If you don't cause them to express boatloads of DHFR in their membrane, nothing works. So what lessons does this have for a drug discovery guy like me? I'm not targeting cells that have been striped with convenient Velcro patches.

And even if I find something endogenous that I can use, I can't make molecules that have to be delivered through the cell membrane by microinjection. You can see from the last sentence, though, that the authors realize that part as well. But that "next challenge" they speak of is more than enough to keep them occupied for the rest of their working lives. These kinds of experiments are important - they teach us a lot about cell biology, and there's sure a lot more of that to be learned. But the cells won't give up their secrets without a fight.

Comments (9) + TrackBacks (0) | Category: Biological News

September 4, 2009

Pharma Whistleblowing: How It Works

Email This Entry

Posted by Derek

Here's more detail on the case that led to Pfizer's 2.3 billion dollar fine/settlement, courtesy of Bloomberg. Here's how things got started, apparently:

Pfizer Inc. sales folks had one tough customer in psychiatrist Stefan Kruszewski. He didn’t buy their pitch to prescribe the anti-psychotic drug Geodon to children, a use that hadn’t been approved by federal regulators.

Nor did he go for the so-called off-label uses they suggested, such as treating dementia in the elderly.

Kruszewski didn’t just say no. He went and checked the research and saw Geodon could have serious cardiac side effects not mentioned by the salesmen, who boasted of its relative safety, according to his lawyer, Brian Kenney. And he noticed that Pfizer was paying his peers to promote the drug to other psychiatrists.

But the worst for Pfizer was that Kruszewski didn’t keep it to himself. He found a lawyer, Kenney, who specializes in whistleblower cases, and they took what they had to the government.

So did John Kopchinski, who sold Pfizer’s arthritis drug Bextra but not as aggressively as the bosses wanted. They told the sales force to pitch it for post-surgical pain, acute pain, migraines and a host of other conditions for which the drug had been rejected by the U.S. Food and Drug Administration, says Kopchinski’s lawyer, Erika Kelton.

The six whistleblowers in the case are getting anywhere from $2.3 million to $51 million now that the settlement has been announced (that upper figure is Kopchinski, who seems to have provided the most serious evidence). As I mentioned the other day, I think this is a good thing. It takes a lot of nerve to step up when your employer is doing something outside the limits of the law (and asking you to do it as well). A chance to make up for the certain loss of your job (and the near-certain loss of any future prospects in the field) goes a long way.

And there's an interesting perspective on why a settlement was reached:

. . .Pfizer is the pharmaceutical equivalent of insurance giant American International Group Inc., which was too interwoven into the global economy to be allowed to fail. Likewise, if Pfizer were convicted of a crime, it would face debarment from federal programs. And that would mean that Medicaid and Medicare patients would have to either somehow pay out of pocket for vital medicines the company produces or go without.

Hadn't thought of that one. I wonder if any company will have the nerve to use this as a negotiating tactic? Perhaps Pfizer already did, come to think of it. . .

Comments (20) + TrackBacks (0) | Category: Business and Markets | The Dark Side

Sepracor: A Desirable Property?

Email This Entry

Posted by Derek

Well, I didn't see this one coming. Dainippon Sumitomo has announced that they're buying Sepracor. My first thought on reading this was "Are they sure they want to do that?"

I say that because the ostensible reason that the Japanese company is pulling out their wallet is that they're looking to replace declining revenues at home. In that case, why are they buying declining revenues over here? Sepracor's flagship product (Lunesta) is going to be going off patent in the not-too-distant future, and they don't have a gigantic pipeline of stuff behind it.

The answer seems to be a deficiency that many Japanese firms have felt: a lack of boots-on-the-ground sales staff over here. The US is the biggest single profit center for the worldwide drug industry, and it's impossible for a big company to ignore that. But realizing all those potential profits isn't easy, if you're coming in from a standing start. (It's not like Dainippon Sumitomo has a big profile over here). Says the Boston Globe:

In a note to investors on the sale, Credit Suisse analyst Scott Hirsch said the deal made sense for Sepracor. He noted that the company is generating $300 million to $400 million in cash a year but has a limited pipeline of new drugs in development and its existing products will face competition from generic drugs in coming years. Hirsch also doubted another suitor would step forward with a better bid.

“In our view, if a US firm wanted Sepracor, that likely would’ve happened already, as there have been plenty of lookers over the years,’’ said Hirsch, who has a neutral rating on the stock. “We think Dainippon Sumitomo is more interested in the sales platform and operating leverage than the revenue stream.’’

So where does that leave Sepracor's research operations? It's true that Takeda has apparently been very kind to Millennium's research staff, but that was a more research-driven deal than this one seems to be. I'm sure the folks at Sepracor are looking for a little more clarity on that question. The problem is, the company's revenues have come almost entirely from clever (albeit irritating) patent-busting moves (active metabolites, pure enantiomers, and so on), but these strategies ran out of gas some time ago as the rest of the industry tightened up its IP protection. Rightly or not, Sepracor doesn't have a reputation as an outfit with a lot of great in-house research ideas. Outside of a ready-made sales force, what exactly do they have to offer?

Comments (7) + TrackBacks (0) | Category: Business and Markets | Drug Industry History

September 3, 2009

Real Molecules

Email This Entry

Posted by Derek

Most of you will have heard about the recent accomplishment at the IBM Zürich labs, using an atomic force microscope with unprecedented resolution. They've imaged individual molecules so well that the atoms and bonds are alarmingly clear. I thought I'd put up one of the less-used images from the paper - here are some pentacene molecules (five fused benzene rings) sitting around on a surface. Not a simulation, not a model: real molecules. It gives me a slight chill, to tell you the truth.
[Image: pentacenes.jpg]

Comments (37) + TrackBacks (0) | Category: General Scientific News

A 2.3 Billion Dollar Attention-Getter

Email This Entry

Posted by Derek

No sooner do I write another post about pharma marketing than Pfizer finds itself paying 2.3 billion dollars in fines for doing it improperly. 1.2 billion of that is a criminal penalty, and needless to say, they've set the current record.

The issues were off-label promotion of Bextra, Geodon, Zyvox, and Lyrica, with the largest penalties coming from the first two. Pfizer's had three other settlements of this kind in the last few years, and that record was definitely a factor this time, as the Justice Department looked for a figure that might get the company's attention. Also supposed to get the company's attention is a five-year "integrity agreement" with the Department of Health and Human Services, but it's worth noting that the company was already supposedly operating under an earlier such agreement when it was promoting Bextra. I think the money has a better chance of being noticed, myself.

I think that these kinds of penalties should be levied, in case anyone's wondering. Our current system almost makes sure that it will happen over and over, but that's because we're splitting the difference between two competing principles. The first one is that physicians should have the freedom to practice medicine as they best see fit, which means that they can write prescriptions for drug uses that have not (yet) been approved by the FDA. The second principle, though, is that drug companies should not be free to promote such uses. And I agree with both of those, but sticking to both of them simultaneously leaves open a constant temptation to break the law.

But there are a lot of industries that operate under such conditions, and in each case, they're supposed to control themselves (and get hammered on when they don't). Perhaps this latest fine will be enough of an example to keep the marketing people thinking ahead a bit. If that won't do it, then the way this whole case came up might - it's another example of whistleblower laws at work. John Kopchinski, a sales rep who left Pfizer in 2003, looks to get around $50 million of the settlement for bringing key information to the government's attention, and others are involved as well. I think that's a good thing, too, a useful counterbalance to the financial incentives on the other side.

But for now, we're left with another huge black mark on the industry's reputation. Thank you, Pfizer.

Comments (16) + TrackBacks (0) | Category: Business and Markets | Regulatory Affairs | The Dark Side

September 2, 2009

Lexapro, Forest Labs, and the Hard Sell

Email This Entry

Posted by Derek

Forest Labs has done very, very well with Lexapro (escitalopram) over the years. They're a comparatively small company, and their collaboration with Lundbeck (also a comparatively small company) in the antidepressant field has been the biggest event in their history.

Lexapro is the pure enantiomer of the earlier Lundbeck drug Celexa (citalopram), and it's been a very successful follow-on. (For a nasty spat over generic production of citalopram, see here). I'm generally not too keen on the follow-up-with-the-single-enantiomer strategy, I have to say. In general, I think it's slowly disappearing from the world as regulatory agencies look down on racemic mixtures. (I've never worked on a program myself where we seriously considered taking a racemate to the clinic - we always assumed that we'd end up developing a single enantiomer).

The New York Times has an article out detailing some of Forest's marketing plans, as revealed in documents before a Senate committee. Some of what the article has to say I agree with, and some of it I have to raise an eyebrow at, and we'll get to both of those. First off, in an area as large and competitive as antidepressants, I don't think that anyone should be surprised at what was in Forest's plan: lots and lots of lunches for physicians' offices, plenty of continuing medical education lectures (with plenty of food), and so on. One line shows that the company budgeted $34.7 million to pay 2,000 physicians to deliver about 15,000 talks on the drug to their colleagues.

The Senate seems to be shocked at all this - well, pretending to be shocked, because no national politician can ever really be surprised at any way that money is used to influence anyone's decisions. But I'm not shocked, either. Leaving aside (just for a moment) the question of whether drugs should be promoted this way, the fact is that they are promoted this way, and have been for a very long time. And breaking down that lecture figure, that means a bit over $2,000 per lecture, and we don't know if that figure is supposed to cover just the honoraria for the speakers, or the whole cost of the lectures. Even if we assume the former, that comes to nearly eight lectures per physician per year, giving each of them about $17,000, pre-tax. Compared to the cost of advertising in the medical journals, general-interest magazines, or especially on television, that probably represents an excellent return for the money.
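
For anyone who wants to check that arithmetic, here's a minimal back-of-envelope sketch in Python. It uses only the figures quoted from the article, and it assumes - as in the paragraph above - that the entire $34.7 million goes to the speakers' honoraria rather than to the surrounding costs:

    budget = 34_700_000   # dollars budgeted for the speaker program
    physicians = 2_000    # physicians paid to give talks
    talks = 15_000        # talks delivered

    per_talk = budget / talks            # a bit over $2,000 per lecture
    talks_per_doc = talks / physicians   # 7.5 lectures per physician per year
    per_doc = budget / physicians        # roughly $17,000 per physician, pre-tax

    print(f"${per_talk:,.0f} per lecture")
    print(f"{talks_per_doc:.1f} lectures per physician")
    print(f"${per_doc:,.0f} per physician, pre-tax")

Run it and you get about $2,300 per talk and a bit over $17,000 per physician, which is where the numbers in the previous paragraph come from.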

And Forest has been spending plenty of it. The article mentions that Vermont, for example, found that Forest (despite their size) was outspent in that state only by Lilly, Pfizer, Novartis, and Merck. Considering that those companies have many more drugs to sell than Forest does, that's an impressive figure. Of course, the only reason you spend money on marketing is to make even more of it back in sales, and they've certainly been doing that, too.

There are several questions here, and perhaps it's best to take them one by one. First off, is Lexapro worth what people (and insurance companies) are paying for it? The snappy economic answer is that of course it is, since that's the price that's willingly being paid, but let's talk utility instead. It does seem to be a good drug, arguably better than many of the others. It's been run head-to-head with Cymbalta (duloxetine), which is no poor performer itself, and shown to be superior. And earlier this year, a Lancet article analyzed 117 controlled trials and found that there were clear clinical differences between the various antidepressants, and that Lexapro and Zoloft (sertraline) stood out as better than the rest.

The article recommended starting with the latter in new patients, I should note, and sertraline's now generic. I think that Forest's battle in the market is both against their similarly expensive competitors (where I think that they can claim to have an edge) and against cheap sertraline, where they may well not. (Update: and against their own (now generic) racemate - I'm digging into that comparison, and it'll be the subject of a follow-up post.) That said, depression is a famously heterogeneous field, and patients often have to try several drugs before something works, for reasons that are unclear. So yes, overall, I think that Lexapro is a useful drug, and that patients are getting benefit for their money.

The New York Times article is rather disingenuous on this point, by the way - you'd never know from it that there were differences between antidepressants, since they treat Lexapro and Prozac as interchangeable, and you'd never know that there was evidence that Forest's drug might well be near the top of the list.

Next question: is Lexapro worth what Forest is spending to promote it? That question also splits into two, economically, depending on what we mean by "worth". As in the price question, from a strictly accounting perspective, we have to presume that Forest is seeing a financial benefit from their marketing activities; marketing does not run at a loss, not for long, it doesn't. And from a utility/societal benefit perspective, if Lexapro really is superior to most of their competitors, then I think the company is justified in making that case as loudly as they can.

Now we get to the tough one: are Forest's marketing activities appropriate or ethical? The arguing can now commence, because this is where we try to figure out what "as loudly as they can" actually means. I think the industry would be better off if there were less of an arms race in the marketing area. (Update: just to pick one benefit, it would make us look, in general, less sleazy, which is not to be underestimated). Even though marketing doesn't run at a loss, the return from it could be still higher if it were less expensive to do. Huge sales forces are expensive, and one of the reasons the sales forces are so big is that the competition's sales forces are so big, and so on. It's hard for any one company to climb down from its position, just from a game-theory point of view, so the most likely way for this to happen is through across-the-board restrictions on marketing, as enforced by the FDA, the FTC, or by physicians themselves. (I should mention, though, that there has been a voluntary retreat in the area of brand-covered swag). We're already seeing this pendulum swing back in the last few years, and it's fine with me if the process continues for a while longer. Doctors are perfectly free to close their doors in the faces of drug reps, and if I were in their position, I'd be tempted to do just that in many cases.

So if we come back around to that Times headline, it reads "Document Details Plan to Promote Costly Drug". And to that, I can say yes, it's a costly drug, set as high as the company thinks that people will pay for it, and to a level that they think they can make the most money with before its patent expires. And yes, Forest has a plan to maximize those profits, and if I were a shareholder (I'm not), I'd be righteously steamed if they didn't. And they did indeed write that plan down, so there are plenty of documents. I'd rather, myself, that the plan looked different than it does, and that's the way the world seems to be heading. But no matter what regulations come into force, there will always be plans to promote things that cost money.

Comments (37) + TrackBacks (0) | Category: Business and Markets | Drug Prices | Press Coverage | The Central Nervous System | Why Everyone Loves Us

September 1, 2009

Another Iron Reaction Hits The Mat

Email This Entry

Posted by Derek

Beware of iron! That's the lesson that's being hammered home these days in synthetic chemistry. I wrote recently about the discovery that a series of iron-catalyzed couplings were actually being caused by trace amounts of copper compounds. Now there's another re-examination of some similar iron couplings that were reported last year.

If you click on that last link, you'll see that there was already trouble with the original work. The authors themselves appear to have had a hard time repeating it, and earlier this year they retracted the paper. This latest publication (from other workers) details their own attempts to reproduce the original iron-catalyzed work. In most cases, they got nothing at all, but once (and only once) they had a wonderful spot-to-spot reaction take place with para-bromoacetophenone, which must have been just the sort of thing that excited the original researchers.

But it could never be reproduced. The best guess is that this one reaction may have been catalyzed by trace amounts of palladium. That's plausible, because, as it turns out, the coupling can be run at high conversion with one ten-thousandth of a per cent of palladium acetate. Yes, a substrate-to-catalyst ratio of one million to one is sufficient, and that's the kind of activity that makes it very, very hard to assume that trace amounts of palladium salts aren't doing the work.
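
To put that loading in perspective, here's a quick arithmetic sketch in Python. The million-to-one ratio follows directly from the quoted figure; the 1 mmol reaction scale is just an assumption for illustration:

    loading_percent = 1e-4                    # "one ten-thousandth of a per cent"
    loading_fraction = loading_percent / 100  # 1e-6 mole fraction relative to substrate

    print(f"substrate : catalyst = {1 / loading_fraction:,.0f} : 1")   # 1,000,000 : 1

    # On an assumed 1 mmol scale, that works out to about 0.2 micrograms of Pd(OAc)2:
    mw_pd_oac2 = 224.5      # g/mol, palladium(II) acetate
    substrate_mol = 1e-3    # 1 mmol of the aryl bromide (illustrative)
    pd_mass_ug = substrate_mol * loading_fraction * mw_pd_oac2 * 1e6   # grams -> micrograms
    print(f"Pd(OAc)2 required: {pd_mass_ug:.2f} micrograms")

A fifth of a microgram of catalyst in a millimole-scale reaction is far below anything you could weigh out deliberately, which is exactly why trace contamination is such a plausible culprit.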

It also makes you wonder why anyone would use anything else, at least for activated systems like para-Br acetophenone. In the future, anyone trying to come up with a non-palladium coupling protocol had better stick with the tough reactions that don't work well anyway. That will keep this sort of thing from happening again - and those are the kinds of reactions we need help with, anyway. A new catalyst for coupling red-hot electron-poor aryl bromides, on the other hand, will be greeted with yawns, and with suspicion as well.

Comments (24) + TrackBacks (0) | Category: Chemical News

Back

Email This Entry

Posted by Derek

Just wanted to let people know that I am actually around. Yesterday was a home improvement day, so I didn't contribute much to the march of science (or to the blog!). And this morning I have a lot of catching up to do on that march-of-science front, but I'll have a post up at lunchtime.

Comments (2) + TrackBacks (0) | Category: Blog Housekeeping