Corante

About this Author
[Photo: College chemistry, 1983]

[Photo: Derek Lowe, the 2002 model]

[Photo: After 10 years of blogging. . .]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany as a Humboldt Fellow during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly (derekb.lowe@gmail.com) or find him on Twitter (Dereklowe).

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

October 31, 2007

Resistant Little Creatures

Posted by Derek

The post here the other day on resistant bacterial infections prompted some readers to wonder why the drug industry isn’t doing more to come up with compounds in this field. It’s not like there’s no money to be made, and it’s not like there’s no history of antibiotic research, after all. But since my industry doesn’t have a history of knowingly leaving money on the table (what industry does?), you’d figure that there’s more to the story.

Money aside, there’s a real problem with finding good targets. For as long as I can remember in the industry, the infectious disease field has suffered from a relatively small target landscape. Almost all the known drugs in the area work through just a handful of basic mechanisms, and adding new ones to the list has been very difficult for at least the last twenty or thirty years.

That was supposed to change, in theory, starting about ten years ago. I interviewed around then at a company that was working in the field, and everyone was quite excited about the bacterial genome sequences that were starting to appear. Surely this would open the sluice gates and let that long-delayed swell of new targets come washing down the flumes. Hasn’t happened. Not yet, anyway.

I have the impression that the same problems that have hampered the translation of human genomic data into new drugs have been at work here as well. In some cases, not as many genes came out as some people were hoping for. And the function of many of those was (to put it mildly) obscure. Of the ones whose use was at least partially known, many have proved not to be useful targets for killing the bacteria or limiting their growth. And of the ones that made that cut – and we’re down to an all-too-manageable set by now – screening hasn’t turned up much chemical matter for people like me to work on.

In fact, there’s a persistent feeling among many people in the field that bacterial and fungal proteins have a lower hit rate than you’d assume they would. Even enzymes that are fairly homologous to those in higher organisms, so the story goes, don’t turn up as many hits in the screens as expected. I’m not sure if this is true or not, but as folklore it’s pretty well known. The combination of all these factors with the perceived lack of opportunities for profits (even if you do find something) has made for slow going.

In recent years it’s become clear that the medical need has grown to the point that antibiotic research can indeed be financially worthwhile – but there are any number of financially worthwhile drug outcomes that we haven’t been able to realize. (See obesity, Alzheimer’s, and many other therapeutic areas for examples of multibillion-dollar opportunities waiting for a good idea to come along.) Resistant bacteria have their name on one more sword stuck in yet another stone.

Update: there's clearly another reason why developing good antibacterials is hard, and it's the same reason we need more of them. Bacteria are well-stocked with efflux pumps to get rid of molecules they don't like (and with other weapons as well), and they evolve so fast that you can watch them do it. I wrote about efflux on the site a while back - another post is well worth doing soon.

Comments (15) + TrackBacks (0) | Category: Drug Development | Infectious Diseases

How Not to Do It: Hydrogen Bromide

Posted by Derek

I’ve written before about all the fun you can have in a lab with compressed gas cylinders. We use the things all the time in chemistry, but as pieces of apparatus, they can only be pushed so far. The problem is that they demonstrate their unhappiness by venting great quantities of stuff that you’d rather not breathe (if you’re lucky), or by taking off like an unguided missile and punching holes in the walls and ceilings (if you’re not). That latter behavior is flat-out guaranteed to show up if you abuse them – for one of the more spectacular examples, see here.

I’ve never had one take off on me, fortunately, but I haven’t always stuck to the straight and narrow with the things, either. My worst behavior has usually been with lecture bottles, the dilettante-sized gas cylinders that bench chemists often use. Most chemistry departments have a few of these sitting around, generally charged with foul reagents that are needed every three or four years or so. Sulfur dioxide, boron trifluoride, phosphine – that’s the sort of thing. They’re low-use almost by definition. If there’s a regular need for a gaseous reagent, you buy larger cylinders of it, because lecture bottles are by far the most expensive way to go.

In graduate school, I was setting up some Prins reactions, which take some sort of acid component to make them run. If you use an aqueous one, you generally get an alcohol out of them (from water picking up the final cation), but if you go anhydrous you can get all sorts of other compounds. I needed a bromide, so anhydrous hydrogen bromide it was.

We had a fairly crusty lecture bottle of it around, and I eventually located some dubious-looking small regulator valves. I picked the least-corroded looking one and screwed it on. Lecture bottles have a main metal-faucet style valve up at the top, like all gas cylinders, and once you open that it’s up to the regulator valve to stop things down to manageable flows. I had my reaction set up, so I worked some tubing onto the thing and had a go at opening it up.

No dice. Boy, was that thing tight. I reached into the hood and wrestled around with it, to no avail. I took it out and got a better grip in another part of my hood, away from my reaction setup. Tighter than two quarts of fresh frogs in a half-pint pickle pot, as Walt Kelly once put it – the valve wouldn’t budge.

I’ll skip over a couple of intermediate stages and cut right to the final scene, which is me clutching the darn lecture bottle to my chest, hopping around the lab grunting and cursing as I put all my strength into trying to force the stupid valve open. You won’t see a pictograph of that method in the instruction booklet, I’m pretty sure. A recording of what happened next would have gone something like this: “Urk! Unk! Ark! WHOA!”

The valve opened, finally responding to my Conan-the-Barbarian technique, and the cheap regulator then hissed out an orange cloud of gaseous HBr right into my shirt pocket. Not a good storage compartment, actually. I whooped, shut the valve, laid the gas cylinder down fast and stripped off my shirt where I stood. You haven’t really lived in a lab until you’ve taken off your clothes in it, I always say. I staked my claim to this one by standing in it shirtless, splashing saturated sodium bicarbonate on my chest, and glaring at the remains of what used to be one of my favorite shirts.

Comments (23) + TrackBacks (0) | Category: How Not to Do It

October 29, 2007

What We Don't Know About Enzymes

Posted by Derek

There was an intriguing paper published earlier this month from Manfred Reetz and co-workers at the Max Planck Institute. It's not only an interesting finding, but a good example of making lemonade from lemons.

They were looking at an enzyme called tHisF, a thermostable beast from a marine microorganism that's normally involved in histidine synthesis. It has an acid/base catalytic site, so Reetz's group, which has long been involved in pushing enzymes to do more than they usually do, was interested in seeing if this one would act as an esterase/hydrolase.

And so it did - not as efficiently as a real esterase, but not too shabby when given some generic nitrophenyl esters to chew on. There was some structure-activity trend at work: the larger the alkyl portion of the ester, the less the enzyme liked it. Given a racemic starting material, it did a good job of resolution, spitting out the R alcohol well over the S isomer. All just the sort of thing you'd expect from a normal enzyme.

Next, they used the crystal structure of the protein and previous work on the active site to see which amino acids were important for the esterase activity. And here's where the wheels came off. They did a series of amputations to all the active side chains, hacking aspartic acids and cysteines down to plain old alanine. And none of it did a thing. To what was no doubt a room full of shocked expressions, the enzyme kept rolling along exactly as before, even with what were supposed to be its key parts missing.

Further experiments confirmed that the active site actually seems to have nothing at all to do with the hydrolase activity. So what's doing it? They're not sure, but there must be some other non-obvious site that's capable of acting like a completely different enzyme. I'm sure that they're actively searching for it now, probably by working through a list of likely point mutations until they finally hit something that stops the thing.

So how often does this sort of thing happen? Are there other enzymes with "active sites" that no one's ever recognized? If so, do these have any physiological relevance? No one knows yet, but a whole new area of enzymology may have been opened up. I look forward to seeing more publications on this, and I'll enjoy them all the more knowing that they came from a series of frustrating, head-scratching "failed" experiments. Instead of pouring things into the waste can, Reetz and his co-workers stayed the course, and my hat's off to them.

Comments (10) + TrackBacks (0) | Category: Biological News

Bacterial Infection: Better Or Worse Than Cancer?

Posted by Derek

There’s been a steady stream of reports in the news about methicillin-resistant Staph. aureus. It’s not a new problem, but (like other nasty infections) it does get a lot of press when the media start paying attention. Works in reverse, too – on the viral front, have you noticed the much reduced number of bird-flu-will-kill-us-all stories this year as we head toward winter? This despite the likelihood of bird flu killing us all being as high (or low) as ever, as far as I can tell.

But the resistant bacteria problem is certainly no joke, and there doesn’t seem to be any reason why it won’t gradually get worse over time. It struck me the other day that antiinfectives, as a drug research field, might be moving toward a similar spot to oncology. In both cases, you have a problem with rapidly multiplying cells, giving you a serious medical outcome - often in cancer, and increasingly with infections. The average tumor is a lot more worrisome than the average infection, of course, but that’s something we can only say with confidence in the industrialized world, and we've only been able to say it for the last sixty or seventy years. As cancer gradually becomes more manageable and infections gradually become less so, the two might eventually meet – or even switch places, which would be bad news indeed. (In some genetically bottlenecked species, in fact, the two problems can overlap, which is fortunately extremely unlikely in humans).

There are, of course, a lot of differences between the two fields, not least of which is that you’re fighting human cells in one case and prokaryotes (or worse, viruses) in the other. But many of those differences actually come out making infectious diseases look worse. The transmissibility of bacteria and viruses makes them serious contenders for causing havoc, as they have innumerable times in human history, and they can grow more quickly in vivo than any cancer. It’s only the fact that public health measures allow them to be contained, and the fact that we’ve had useful therapies for many of them, that makes people downrate the infectious agents. If either (or both) of those change, we’re going to be rethinking our priorities pretty quickly.

What this means for drug development is that some researchers will have to rethink their attitudes towards antiinfective drugs. For serious infections, we're going to have to think about these projects the way we've traditionally thought of oncology agents - last-ditch therapies for deadly conditions. Anticancer therapies have long had more latitude in their side effects, therapeutic ratios, and dosing regimes, and antibiotics for resistant infections are in the same position. For some years now, there's been a problem that new drugs in this field would perforce have small markets, since they'd be used only when existing agents fail. That market may not be as small as it used to be. . .

Comments (13) + TrackBacks (0) | Category: Cancer | Drug Development | Infectious Diseases

October 25, 2007

Looking Backwards

Posted by Derek

A colleague reminded me the other day of a project that he and I had worked on back at the Wonder Drug Factory seven years ago. "Seven years ago", I thought. . .I was the project leader on that one, trying to keep things alive as weird toxicology kept torpedoing everything. In the end, we held it together long enough to get four compounds into two-week tox testing, whereupon every one of them wiped out for yet another set of ugly reasons. Ah, yes. No one's going to have to work on that stuff again, that's for sure.

Hmm, I thought. What was I doing seven years before that? Well, I was back at my first drug industry job in New Jersey. The company had just moved into a new building the year before, and the old site was on its way to becoming a Home Depot. I was spending my days cranking out molecules hand over fist. Boy, did I run a lot of reductive aminations. It's safe to say that during those years I ran the majority of all the reductive aminations that I'll ever run in my life, unless something rather unforeseen crops up. We made thousands of compounds on that project, and I remember pointing out in a talk that nobody makes that many compounds if they really understand what they're doing. This was not a popular line of reasoning, but it's hard to refute, unless saying how much you don't like something counts as a refutation.

And seven years before that? Still in the lab. I was midway through grad school, wrestling with the middle of what turned into twenty-seven linear steps by the time I pulled the plug. (At this point, I began to reflect that I've been doing chemistry for quite some time now). In 1986 I didn't know that I wasn't going to end up finishing the molecule, and I was still hauling buckets of intermediates up the mountainside, only to find them always mysteriously lighter and smaller by the time I got to the top. My response, naturally enough, was to start with larger buckets - what else was there to do?

And seven years before that? That finally takes me over the chemistry horizon, back to my senior year of high school in Arkansas, and to what might as well be a different planet entirely. Although I was interested in chemistry - as I was in most all the sciences, something I've never lost - I'd never heard of a Grignard reagent, and I didn't know what a nucleophile was. Counting up, I see that some time next year will mark the point at which I will have spent a slim majority of my lifetime doing organic chemistry, which is an odd thought. And it makes me wonder what I'll be up to seven years from now. . .

Comments (11) + TrackBacks (0) | Category: Graduate School | Life in the Drug Labs

October 24, 2007

Come On. Improve, Already.

Posted by Derek

GSK opened up their books today, and the magnitude of their Avandia problem has become clear. This was a big part of the company’s sales, and the recent cardiovascular worries have really knocked down the numbers. The response, as has been the trend this year, is for the company to announce layoffs.

And man, have there ever been layoffs in the industry this year. There’s a list of the larger ones over at FierceBiotech, and it does not make for cheerful reading. In January, Pfizer announces 10,000 job cuts and closes their Ann Arbor site. That same month, Bayer-Schering closes the doors on research buildings in Connecticut and California (layoffs which were announced in 2006 and are thus not on the Fierce list). Bayer-Schering, who really should have run that B-S initial thing past a couple of native English speakers, announces 6,100 more job cuts in March. In July, AstraZeneca doubles down on its earlier layoff announcement and says that 7,600 jobs will disappear, and J&J announces a 4% reduction in its workforce (5,000 jobs). Then in August, Amgen cuts over 2,000 jobs of its own.

In September, most everyone held on to their jobs for the moment. But Novartis said this month that they’re going to trim over 1,200 positions in the US, mostly through attrition. And now we have GSK with disappointing earnings and an announcement of unspecified layoffs, and bear in mind, this is just the news from the big outfits. The usual turmoil has been going on among the smaller companies (Idenix, Palatin, Sonus, and others), whose fortunes depend more on single drugs.

What a year – and hey, there’s still time to announce more layoffs before the holidays, so we may not be through yet. It’s tempting for some people to look at a list like this and say “Outsourcing! China! India!”. And I can’t deny that some of these jobs have headed there, just as some possible hiring expansions have been muted for the same reasons at other companies.

But outsourcing isn’t the whole story. Many of these job cuts have been in the sales forces, and they’re definitely not outsourcing the sales reps to Shanghai. Ditto for the people in Regulatory Affairs and Legal. Outsourcing is changing the size and shape of layoffs, but it’s not providing the motive force for them. That force, simply enough, is just that we’re not selling enough drugs, mostly because we don’t have enough good drugs to sell. Some areas have had too few projects even to start with (anti-infectives?), and everyone has had too few make it all the way through the clinic and the FDA.

And some of those failures have been extraordinarily large and expensive. Unfortunately, this has been the case for a while now. Over the last few years, we’ve had drugs that have failed terribly late in the clinic (torcetrapib, among others), drugs that have made it through trials but failed at the FDA (rimonabant, among others), and (most expensively of all) drugs that have made it to market and been pulled back early in their product lifetimes, after the big promotion money’s been spent and before any of it gets made back (Bayer’s Baycol and Pfizer’s Exubera – among others).

Add in the ones that never lived up to their planned potential (Iressa, Macugen, yes, yes, among others) and you have a gigantic revenue shortfall. Now, it’s true that not all of these would have made it under any conditions. Drugs fail. But do they fail like this, so relentlessly and so expensively? And it’s not that we aren’t killing all sorts of stuff off earlier in the development pipeline – no, these things are what’s left after the dogs are gone.

What to do? If I knew how to answer questions like that, I'd be dictating this from the deck of my yacht. The glass-half-full perspective is that there sure are a lot of opportunities for anything that can open up some new therapeutic areas or help with drug failure rates in the existing ones. It won't take much, considering where we're starting from. Yesterday I was encouraging people to try out some high-risk ideas, and here, in case anyone was wondering, is an excellent place for them.

Comments (13) + TrackBacks (0) | Category: Business and Markets | Clinical Trials | Drug Development

October 23, 2007

Vial Thirty-Three, And More

Posted by Derek

My apologies for no post today - home events kept me away from the computer for a while, but everything's settled back down now.

I've had several e-mails the last couple of months asking about "Vial Thirty-Three", the saga of which can be found (in reverse chronological order) here. (More specifically, the first time that particular experiment worked was the May 18, 2006 entry, and you can scroll up from there if you wish). When last heard from, I was cranking away on a batch of experiments to finish before the Wonder Drug Factory closed its doors at the end of January.

The last ones got run just before they pulled the electrical plugs out of the walls, and a lot of interesting things came out of them. They were interesting enough, in fact, that they suggested a whole new series of ideas to me during the months I was between jobs. Of course, that did little good, since this isn't the kind of stuff that you can easily pull off in your basement.

But I'm very glad to report that my current employer is interested in this sort of thing, and in plenty more weird stuff besides. That's the good news, and good news it surely is. I have an explicit mandate to look at ideas and technologies beyond what the company is currently doing, and a group to tackle these things full-time. This is just the sort of thing I like to do, and having it as my main job responsibility is so enjoyable that I may never get used to it.

The bad news is that I won't be able to talk about what I'm up to. At the Wonder Drug Factory, my odd experiments were a sideline and were a long shot to work at any rate. I felt safe talking obliquely about them. But now I spend my whole day on this kind of thing - the mutant progeny of Vial Thirty-Three and several other similarly odd ideas. It's a wonderful feeling to see this sort of thing get resourced and watch it move forward, but it's all completely proprietary.

But even if I can't say much, I just wanted to let people know that things are continuing. I'm doing full-time what I used to have to squeeze in as a sideline. Working on this kind of idea has been, in retrospect, one of the best decisions I ever made as a scientist. If any of you have some wild thoughts about experiments that sound a bit weird, but just might work - well, my advice is to somehow make time for them. Sometimes they work. . .

Comments (6) + TrackBacks (0) | Category: Birth of an Idea

October 22, 2007

Surveying the Exubera Crater

Posted by Derek

Pfizer has pulled the inhaled insulin Exubera from the market, and not because of the FDA, and not because of the lawyers. They’re giving up on it because they can’t take the pain any more. The company sold 12 million dollars’ worth of the stuff so far this year, a horrifyingly tiny amount. That represents about 0.3% of the insulin market, which we can round off to "zero". The ticket out is a mere 2.8 billion dollar charge against earnings. It's the first time I can remember a company pulling a drug just because it was losing so much money - of course, Pfizer is not a normal company, and these are not normal times, especially for them.

There are plenty of post-mortems around, from the front page of the Wall Street Journal onward. (See the Journal’s Health Blog, Matthew Herper’s blog at Forbes, Pharmalot and the folks at Invivoblog for more). I have my own, naturally, since a disaster of this size admits of many interpretations. Here’s what it says to me:

1. Marketing isn’t everything. The next time someone tells you about how drug companies can sell junk that people don’t need through their powerful, money-laden sales force, spare a thought for Pfizer. The biggest drug company in the world, with the biggest sales force and the biggest cash reserves, couldn’t move this turkey. People didn’t want it, and they didn’t buy it.

The flip side of this is that even the drugs that folks love to hate, the ones that no one can figure out why they do as well as they do, must be doing something for some people. Perhaps other, cheaper drugs would do something quite similar, and we can discuss cost/benefit ratios, but you couldn’t sell them if people didn’t feel that benefit in the denominator. Not many people felt it from Exubera.

2. Internal sales estimates can be a joke. People inside the drug companies have known this for a long time, although they’d often rather not think about it. Analysts have known it, too, but they're forced to pay attention to those numbers anyway. But man, look at the magnitude of this one. Just as Warner-Lambert tried to kill Lipitor before they brought it to market (who needs another statin?), Pfizer was telling analysts a few years ago that their projections for Exubera sales (a billion dollars a year) were just too darn low. Two billion a year by 2010, thank you and please correct the error. Only off by a factor of one hundred, and what’s two log units between friends?

Sales forecasts are not science, and they only bear a superficial resemblance to math (where the phrase "imaginary number" is rather more strictly defined). They are guesses, and some of them are good guesses and some of them are awful, and unfortunately when you first look them over, they all smell about the same.

3. Groups aren’t necessarily smarter. This is the flip side of all the “Wisdom of Crowds” stuff, which only works when a lot of people (who think of a lot of different things) all get a crack at a subject. Inside a company, though, diversity of opinion sometimes doesn’t get much respect, and the problem gets worse in areas like marketing (and worse as you go into the higher ranks). Think of what would have happened to a Pfizer exec who forecast a 0.3% market share and a 2.8 billion dollar charge for Exubera when everyone else was revising their figures up a billion. It would have taken a fantastic amount of nerve to make a call that contrarian, and the rewards for being right (if any) would definitely not have been worth it. Even if someone had a terrible suspicion, it was surely much safer to keep quiet.

Groups of people can, in fact, be quite stupid. People will deliberately not bring their minds to bear on a problem, in order to get along with their co-workers, to not stick their heads up, or just to make the damned meetings end more quickly.

4. Pfizer is in vast amounts of trouble. While not an original thought, it's an unavoidable one. We all know the problems they have, and believe it, they do too. But what to do? I remarked a few weeks ago that Pfizer's situation reminded me of a slow-motion film of a train running toward a cliff, and a colleague of mine said "Yeah, me too, but in this case they're still boarding passengers and loading their luggage".

Comments (19) + TrackBacks (0) | Category: Business and Markets | Diabetes and Obesity | Drug Industry History

October 18, 2007

Understanding Dawns

Posted by Derek

My graduate school lab, like most of them, had an assortment of people from different countries. That kept things going at all sorts of hours, since we’d get the occasional Japanese post-doc who never really seemed to get off JST and worked the zombie schedule. It also made for some adventures in communication. English was the lingua franca of the lab, naturally, but there were a lot of varieties spoken (and attempted).

And although it’s risky to generalize, I think that the ones with the biggest language gap were the aforementioned Japanese. Friends of mine from the country have blamed the problem on the traditional state of English teaching there, and the way that too many students are taught the language as if its phonics really did conform to what’s available in Katakana.

That’s the writing system used in Japan to render words phonetically. Reading fancy Japanese (Kanji) takes some real practice, but any hack (like me) can plow through Katakana with a chart and a little practice. I’ve been asked many times, in wondering tones, if I read Japanese after I pulled out a useful reagent name from a Japanese patent, but I wasn’t reading Japanese – I was reading English. Sort of.

Sounding out the words makes you sound like the most unfortunate expatriate Japanese post-doc you’ve ever heard. “Cyclohexyl”, for example, comes out as “shi-ku-ro-he-ki-shi-ru”, and “chloro” is “ku-ro-ro”. I’d probably sound even worse than that if I had to speak Japanese, but it does give you some insight as to where the stronger features of the accent come from.

One way or another, we all did communicate in the end. I remember talking with one of our post-docs, trying to learn some Japanese profanity (a well-known gateway into a foreign language, of course). But I couldn’t get the concept across. “Bad word?” he asked, puzzled. “Curse word?”

An idea hit me. “OK”, I said, pointing at his rota-vap where a 1 mL flask was spinning. “How long did it take you to make that stuff?” That stuff, he informed me, was step 17 of his synthesis, and had taken weeks and weeks of work. “Fine,” I said, “what happens if it falls into the water bath?” “Ah! Terrible!” he said, looking fearful at the very thought. “Right”, I told him: “If that happens, what do you say?”

Enlightenment! “Oh! Yes! Those words! Bad word, yes, now I understand!” A great moment in international understanding. We went on to explore the sorts of phrases that are absolutely guaranteed to come in handy in any research lab, no matter where.

Comments (21) + TrackBacks (0) | Category: Graduate School

October 17, 2007

Biogen on the Block?

Posted by Derek

Well, Carl Icahn has been buying large boatloads of Biogen stock over the last few months, and as those of you who follow the biotech industry are aware, he’s got the company putting itself on the block. That’s one of the normal sequels to an Icahn buying spree, and it looks like it’s going to work out well for him again (see Medimmune for another recent example).

Will it work out for Biogen, though? Looking over the potential buyers, it should also come as no surprise that the serious prospects make a pretty short list. Let’s see. . .huge, but with little presence in biologicals. Loaded with cash, but with serious pipeline problems and some nasty patent expirations coming on. Not averse to doing great big acquisitions to pump up their numbers. Gosh, who does that leave? Yep, Carl Icahn is playing marriage broker between Biogen and Pfizer.

There are also rumors of Pfizer buying Genzyme, for the exact same reasons. (That deal would at least have the desirable side effect of not further enriching Carl Icahn). If Amgen were doing better, we’d be hearing about Pfizer buying them, too, no doubt, but for now takeover rumors are one of the few things they probably don’t have to worry about. (Although you can make a case that they're the value play in the market. . .)

Does a Pfizer deal make sense? Well, they apparently considered a run at Biogen a few years ago, but backed down. And now Biogen is a lot more expensive, so you’d think that would answer the question. But you have to remember that Pfizer’s more desperate than it was back then, too. The disastrous loss of what could have been the biggest drug ever, the CETP inhibitor torcetrapib, changed things most unpleasantly (witness the closure of the Ann Arbor research site). And the market failure of the expensively developed inhaled insulin Exubera isn’t helping, either.

Oh, I wouldn’t want to run Pfizer, even if you paid me. . .well, what people get paid to run Pfizer. Honestly, it wouldn’t surprise me to see them up and buy Biogen, even if they are already in the multiple sclerosis market. They have a bunch of people who know how to manage takeovers, just like the Mongols had, and they could use the revenue, which will be pretty safe until the FDA gets its act together on biogenerics. Pfizer’s used to buying wasting assets, since that’s what you get when you buy a drug company for its big selling products and ditch most everything else, and biologicals are wasting more slowly than small molecules do. God knows the investment bankers are motivated to see something like this go through at these prices.

But it would be a shame for Biogen. It was a shame when it happened to Warner-Lambert, and Pharmacia-Upjohn, and Affymax, and Sugen. (Update - as pointed out in the comments, Sugen was purchased by P-U before the Pfizer deal.) A lot of people would lose their jobs, and a lot of good research would get crippled or lost in the clouds of dust. And Pfizer would hold itself together until the time came to feed again.

Comments (16) + TrackBacks (0) | Category: Business and Markets

October 16, 2007

Three Things You Need

Posted by Derek

Scientists who’ve spent a lot of time in research labs will have noticed that self-confidence seems to pay big dividends. If you think back to the people you’ve worked with who got the most things to work (especially the difficult things), you’ll likely also recall them as people who set up experiments relatively fearlessly, with expectations of success. Meanwhile, the tentative, gosh-I’m-just-not-sure folks generally compiled lesser records.

You can draw several conclusions from these observations, but not all of them are correct. For example, a first-order explanation would be to assume that experiments can sense the degree of confidence with which they’re approached and can adjust their outcomes accordingly. And sometimes I’m tempted to believe it. I’ve seen a few ring systems that seem to have been able to sense weakness or fear, that’s for sure, and other molecular frameworks that appear to have been possessed by malign spirits which were just waiting for the right moments to pounce.

But besides being nuts, this explanation is complicated by the small (but statistically significant) number of confident fools who think everything they touch will work, no matter how ridiculous. These people tend to wash out of the field, for obvious reasons, but there’s a constant trickle of them coming in, so you’re never without a few of them. If self-assurance were all you needed, though, they’d be a lot more successful than they are.

No, I think that confidence is necessary, but not sufficient. Brains and skilled hands are big factors, too – but they aren’t sufficient by themselves, either. You need all three. Most people have them in varying quantities, of course, but you can learn a lot by looking at the extreme cases.

For example, I’ve seen some meticulous experimenters, not fools, who were undone in the lab by their lack of the confidence leg of the tripod. Their tentative natures led them to set up endless tiny series of test reactions, careful inch-at-a-time extensions into the unknown. This sort of style will yield results, though not as quickly as onlookers would like, and will probably never yield anything large or startling. Still, you can hold down a job with this combination, which is more than I can see for the next category.

Those are the confident fools mentioned earlier, who lack the brains part of the triad. They get involved in no-hope reactions (and whole no-hope lines of research) because they lack the intelligence to see the fix they’ve gotten themselves into. The whole time they essay, with reasonable technical competence and all kinds of high hopes, experiments which are doomed. As I said above, these people don’t necessarily have such long careers, but in the worst cases they can pull others of similar bent in their wake (while their more perspicacious co-workers leave, if possible, when they catch on to what’s happening).

Then there are the folks who lack the skilled hands. “Lab heads,” I can hear a chorus of voices say. “These are the people who become lab heads and PhD advisors.” There’s a lot of truth to that. Plenty of people can have good, bold ideas, but be incapable of physically carrying them out at the bench. Even controlling for age and lack of experience, there are plenty of Nobel-caliber people you wouldn’t want near your lab bench. Some of them are out of practice, but many of them were just as destructive when they were younger, too. Surrounded with good technicians, though, they can do great things. Many just face facts and confine themselves to the blackboard and the computer screen.

But if you have reasonable amounts of all three qualities, you’re set up to do well. Confidence is perhaps the limiting reagent in most natures, which is why it stands out so much when it’s combined with the others. A scientist with a lot of nerve is more likely to discover something big, and more likely to recognize it when it comes, than someone who undervalues their own abilities. They’re more prone to setting up weird and difficult experiments, knowing that the chances of success aren’t high, but that sometimes these things actually come through. That’s probably the source of the correlation that I led off this post with: it’s not that confidence makes these ideas work. Rather, if you don’t have it you probably don’t try many such things in the first place.

Comments (19) + TrackBacks (0) | Category: Who Discovers and Why

October 15, 2007

Checking The Numbers on the Alzheimer's Test

Posted by Derek

The news of a possible diagnostic test for Alzheimer’s disease is very interesting, although there’s always room to wonder about the utility of a diagnosis of a disease for which there is little effective therapy. The sample size for this study is smaller than I’d like to see, but the protein markers that they’re finding seem pretty plausible, and I’m sure that many of them will turn out to have some association with the disease.

But let’s run some numbers. The test was 91% accurate when run on stored blood samples of people who were later checked for development of Alzheimer’s, which compared to the existing techniques is pretty good. Is it good enough for a diagnostic test, though? We’ll concentrate on the younger elderly, who would be most in the market for this test. The NIH estimates that about 5% of people from 65 to 74 have AD. According to the Census Bureau (pdf), we had 17.3 million people between those ages in 2000, and that’s expected to grow to almost 38 million in 2030. Let’s call it 20 million as a nice round number.

What if all 20 million had been tested with this new method? We’ll break that down into the two groups – the 1 million who are really going to get the disease and the 19 million who aren’t. When that latter group gets their results back, 17,290,000 people are going to be told, correctly, that they don’t seem to be on track to get Alzheimer’s. Unfortunately, because of that 91% accuracy rate, 1,710,000 people are going to be told, incorrectly, that they are. You can guess what this will do for their peace of mind. Note, also, that almost twice as many people have just been wrongly told that they’re getting Alzheimer’s than the total number of people who really will.

Meanwhile, the million people who really are in trouble are opening their envelopes, and 910,000 of them are getting the bad news. But 90,000 of them are being told, incorrectly, that they’re in good shape, and are in for a cruel time of it in the coming years.

The people who got the hard news are likely to want to know if that’s real or not, and many of them will take the test again just to be sure. But that’s not going to help; in fact, it’ll confuse things even more. If that whole cohort of 1.7 million people who were wrongly diagnosed as being at risk get re-tested, about 1.556 million of them will get a clean test this time. Now they have a dilemma – they’ve got one up and one down, and which one do you believe? Meanwhile, nearly 154,000 of them will get a second wrong diagnosis, and will be more sure than ever that they’re on the list for Alzheimer’s.

Meanwhile, if that list of 910,000 people who were correctly diagnosed as being at risk get re-tested, 828,000 of them will hear the bad news again and will (correctly) assume that they’re in trouble. But we’ve just added to the mixed-diagnosis crowd, because almost 82,000 people will be incorrectly given a clean result and won’t know what to believe.

I’ll assume that the people who got the clean test the first time will not be motivated to check again. So after two rounds of testing, we have 17.3 million people who’ve been correctly given a clean ticket, and 828,000 who’ve correctly been given the red flag. But we also have 154,000 people who aren’t going to get the disease but have been told twice that they will, 90,000 people who are going to get it but have been told that they aren’t, and over 1.6 million people who have been through a blender and don’t know anything more than when they started.
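For anyone who wants to check this arithmetic, here's a minimal sketch in Python, assuming (as the figures above do) one symmetric 91% accuracy for both kinds of error - as the update below notes, the real test's false-positive and false-negative rates may well differ:

    # Back-of-the-envelope check of the testing numbers above
    population = 20_000_000   # 65-to-74-year-olds, rounded
    prevalence = 0.05         # NIH estimate for that age bracket
    accuracy = 0.91           # assumed the same for both error types

    sick = int(population * prevalence)      # 1,000,000
    healthy = population - sick              # 19,000,000

    false_pos = healthy * (1 - accuracy)     # 1,710,000 wrongly flagged
    true_pos = sick * accuracy               # 910,000 correctly flagged
    false_neg = sick * (1 - accuracy)        # 90,000 wrongly cleared

    # Second round of testing for everyone flagged the first time:
    print(f"{false_pos * (1 - accuracy):,.0f} healthy people flagged twice")  # ~153,900
    print(f"{true_pos * accuracy:,.0f} sick people flagged twice")            # ~828,100
    print(f"{false_neg:,.0f} sick people sent away reassured")                # 90,000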

Sad but true: 91% is just not good enough for a diagnostic test. And getting back to that key point in the first paragraph, would 100% be enough for a disease that we can't do anything about? Wait for an effective therapy, is my advice, and for a better test.

Update: See the comments for more, because there's more to it than this. For one thing, are the false positive and false negative rates for this test the same? (That'll naturally make a big difference). And how about differential diagnosis, using other tests to rule out similar conditions? On the should-you-know question, what about the financial and estate planning implications of a positive test - shouldn't those be worth something? (And there's another topic that no one's brought up yet: suicide, which you'd have to think would be statistically noticeable. . .)

Comments (21) + TrackBacks (0) | Category: Alzheimer's Disease | Biological News

Enzyme Humility

Posted by Derek

There was a fascinating comment added to the recent discussion here on ammonia synthesis. It was pointed out that the amount of nitrogen made available by the man-made Haber process is outclassed by the amount fixed biologically. The legumes do their share, but a lot more is handled by free-living single-celled organisms. What's really startling is the estimate for the total amount of nitrogenase enzyme, by weight, that is responsible for the production of at least 100 million metric tons a year of reduced nitrogen: about twelve kilos.
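Taking the 100-million-ton and twelve-kilogram estimates above at face value, a one-line calculation puts that throughput in perspective:

    # Rough yearly throughput of the world's nitrogenase, from the figures above
    fixed_nitrogen_kg = 100e6 * 1000   # 100 million metric tons, in kg
    enzyme_kg = 12
    print(f"{fixed_nitrogen_kg / enzyme_kg:.1e} kg of nitrogen fixed per kg of enzyme per year")
    # ~8.3e12 - each kilogram of enzyme turns over trillions of times its own weight annually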

It's important for us, as chemists, to contemplate figures like that lest we forget how unimpressive our own techniques are in comparison. Not all enzymes are that impressive, but many of them are extremely impressive indeed. One of Clarke's laws gets quoted a lot, the one about any sufficiently advanced technology being indistinguishable from magic. But there's no magic involved - these are things that we could do, if we just knew enough about how to do them.

Enzymes use a variety of effects to work these wonders, but a lot of it comes down to holding the reacting species in one place and lining everything up perfectly. It isn't as important to hold on to the starting materials or products as it is to interact with and stabilize the highest-energy species in the whole process, the fleeting transition state. Various chemical groups can be brought to bear that activate or deactivate specific bonds, and everything works, at its best, with near-perfect timing. If you want molecular-level nanotechnology, this is it, and there's absolutely no reason why it has to be done inside a peptide backbone. If we understood enough, all sorts of other polymers, with all sorts of new functionality built into them, could presumably do things that Nature has never needed to do, under conditions that we could select for.
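To put some illustrative numbers on what transition-state stabilization buys you in rate terms, here's a small sketch using the standard transition-state-theory relationship (rate proportional to exp(-ΔG‡/RT)). The stabilization energies below are made-up round numbers for illustration, not measurements on any particular enzyme:

    import math

    R, T = 8.314, 298.15   # J/(mol*K), kelvin

    def rate_enhancement(stabilization_kj_per_mol):
        # Factor by which the rate rises if the barrier drops by this much
        return math.exp(stabilization_kj_per_mol * 1000 / (R * T))

    for dg in (5.7, 10, 20, 40):
        print(f"{dg:>4.1f} kJ/mol lower barrier -> {rate_enhancement(dg):.1e}x faster")
    # 5.7 kJ/mol (RT times ln 10) buys one order of magnitude in rate;
    # 40 kJ/mol is roughly a ten-million-fold acceleration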

But we're unfortunately a long way from that. There's still a tremendous amount of argument about how even model enzymes actually work, with some rather exotic mechanisms being proposed. And if we don't understand what's going on, we sure can't design our own imitations. Making enzymes from scratch brings together a whole list of Very Hard Problems, from protein folding to femtosecond reaction dynamics, and making enzymes out of something other than proteins will be even harder. We're going to need to be a lot smarter, as a species, to figure out how to do it.

But learning more about such stuff is one of the things we do best. At least for the last few centuries it has been, and if we keep it up, there seems no reason why we shouldn't be able to figure out this one, too. Then, at long last, human ingenuity will have pulled even with blue-green algae, the bacteria that live in rotting logs, and various sorts of pond scum. The little guys have had a big head start, but we're gaining fast.

Comments (8) + TrackBacks (0) | Category: General Scientific News

October 12, 2007

Unnatural, And Proud Of It

Posted by Derek

The Haber-Bosch ammonia synthesis doesn’t intrude itself into the public consciousness much, but this year’s Nobel gave it a bit of a push. One thing I’ve noticed, though, is that whenever the topic of artificial fertilization comes up, it always kicks up a small dust storm of comment around it.

These vary widely in their reasonableness. Pointing out that artificially fixed nitrogen moved agriculture from (ultimately) a solar-powered base to (largely) a fossil-fuel base is both accurate and a good starting point for further discussion. See the comments to the Nobel post for an example – a person can argue that the Haber process didn’t require fossil fuels per se, or that we use more of them cooking the food than we do growing it (which may be true), or that we use more of them moving the food around (which I think is almost certainly true, and which opens up another set of questions) and so on.

Other good topics for discussion are how close various parts of the world were to a Malthusian food crisis when the ammonia synthesis came along, the other industrial effects of relatively cheap ammonia, the tradeoff of intensive fertilized farming in smaller areas versus more traditional routes in larger ones, etc. But if you’d like an example of an unreasonable comment, I’ll let this one over at Megan McArdle’s Atlantic Monthly blog stand in for a lot of similar fuzzy-mindedness:

"Higher yields due to the petroleum rich Haber-Bosch method also mean faster soil erosion and increased need of rotation etc. Combined with applying this method for inefficient livestock agriculture - it has destroyed NOT saved the rainforest and other ecosystems. Chemical fertilizer in ecology are like statism for the economy. You can force short-term results but nothing more!

At least 800 million people still go hungry.. their way forward into a sustainable future is less livestock agriculture and (more) organic natural farming.

Haber-Bosch is on the same environmental level as coal, oil! Not good, not sustainable, ideologically toxic for survival. We have to get rid of it pronto if we want our children to have "a nice life".

. . .All the social sciences, all the non-biological sciences like chemistry and physics should drop immediately what they are doing and learn more about their mother (and forget as much as possible about their "father" - you know who I mean?)!"

It’s hard to know where to start with this sort of thing. But I think I’ll do what Richard Dawkins did for Prince Charles a few years ago. Dawkins’s “You’re an idiot” style of debate isn’t always productive (for example, I think he does more harm than good to his cause as an atheist), but in this case I think the board across the nose was a good idea. He pointed out that if we’re going to use “naturalness” as a criterion, then agriculture isn’t going to make the cut, either. And that doesn’t mean factory farming and Roundup-Ready seeds; that means agriculture of any kind beyond remembering where the good patch of wild blueberries is and getting there before the bears do:

I think you may have an exaggerated idea of the naturalness of "traditional" or "organic" agriculture. Agriculture has always been unnatural. Our species began to depart from our natural hunter-gatherer lifestyle as recently as 10,000 years ago - too short to measure on the evolutionary timescale.

Wheat, be it ever so wholemeal and stoneground, is not a natural food for Homo sapiens. Nor is milk, except for children. Almost every morsel of our food is genetically modified - admittedly by artificial selection not artificial mutation, but the end result is the same. A wheat grain is a genetically modified grass seed, just as a pekinese is a genetically modified wolf. Playing God? We've been playing God for centuries!

The large, anonymous crowds in which we now teem began with the agricultural revolution, and without agriculture we could survive in only a tiny fraction of our current numbers. Our high population is an agricultural (and technological and medical) artifact. It is far more unnatural than the population-limiting methods condemned as unnatural by the Pope. Like it or not, we are stuck with agriculture, and agriculture - all agriculture - is unnatural. We sold that pass 10,000 years ago.

Dawkins is correct. We live in an unnatural world, and that goes for a lot of prehistory, too. Our world has been unnatural ever since we started applying our intelligence to it. When humans first started building shelters to get out of the cold and rain, I suppose you could say that this is no more than what an animal does when it digs a den. Killing a mammoth partly in order to use its bones for a house is a step beyond that, but in the same league as what beavers do to birch trees. But clearing land, planting seeds in it, tending and harvesting a crop, and saving some of its seeds to plant again is another order of living. Just because it all happened a long time ago (and because no one yet knew how to write it down) doesn’t make it any more in tune with ancient natural harmonies or whatever. (Try this PDF on for size).

We've been trying to fertilize the soil for thousands of years with whatever was on hand - manure, dead fish, the ashes of the plants that were burnt to make the field. And we've been modifying the genetic profile of our food crops over that same time with awe-inspiring persistence and dedication. (Good thing, too). No, when we move from that to artificial fertilizers and genetically engineered seeds, we’re talking about differences in degree rather than differences in kind. Large differences in degree, true, and worth discussing they are, but not on the basis of either their antiquity or their "naturalness".

Comments (21) + TrackBacks (0) | Category: Current Events | General Scientific News

October 11, 2007

Let Us Now Turn To the Example of Yo' Mama

Posted by Derek

Now we open the sedate, learned pages of Nature Methods, a fine journal that specializes in new techniques in molecular and chemical biology. In the August issue, the correspondence section features. . .well, a testy response to a paper that appeared last year in Nature Methods.

“Experimental challenge to a ‘rigorous’ BRET analysis of GPCR oligomerization” is the title. If you don’t know the acronyms, never mind – journals like this have acronyms like leopards have spots. The people doing the complaining, Ali Salahpour and Bernard Masri of Duke, are taking issue with a paper from Oxford by John James, Simon Davis, and co-workers. The original paper described a bioluminescence resonance energy transfer (BRET) method to see if G-protein coupled receptors (GPCRs) were associating with each other on cell surfaces. (GPCRs are hugely important signaling systems and drug targets – think serotonin, dopamine, opiates, adrenaline – and it’s become clear in recent years that they can possibly hook up in various unsuspected combinations on the surfaces of cells in vivo).

Salahpour and Masri take strong exception to the Oxford paper’s self-characterization:

“Although the development of new approaches for BRET analysis is commendable, part of the authors’ methodological approach falls short of being ‘rigorous’. . .Some of the pitfalls of their type-1 and type-2 experiments have already been discussed elsewhere (footnote to another complaint about the same work, which also appeared earlier this year in the same journal - DBL). Here we focus on the type-2 experiments and report experimental data to refute some of the results and conclusions presented by James et al.”

That’s about an 8 out of 10 on the scale of nasty scientific language, translating as “You mean well but are lamentably incompetent.” The only way to ratchet things up further is to accuse someone of bad faith or fraud. I won’t go into the technical details of Salahpour and Masri’s complaints; they have to do with the mechanism of BRET, the effect on it of how much GPCR protein is expressed in the cells being studied, and the way James et al. interpreted their results versus standards. The language of these complaints, though, is openly exasperated, full of wording like “unfortunately”, “It seems unlikely”, “we can assume, at best”, “(does) not permit rigorous conclusions to be drawn”, “might be erroneous”, “inappropriate and a misinterpretation”, “This could explain why”, “careful examination also (raises) some concerns”, and so on. After the banderilleros and picadors have done their work in the preceding paragraphs, the communication finishes up with another flash of the sword:

“In summary, we agree with James and colleagues that type-2 experiments are useful and informative. . .Unfortunately, the experimental design proposed in James et al. to perform type-2 experiments seems incorrect and cannot be interpreted. . .”

James and Davis don’t take this with a smile, naturally. The journal gave them a space to reply to the criticisms, as is standard practice, and as they did for the earlier criticism. (At least the editors know that people are reading the papers they accept. . .) They take on many of the Salahpour/Masri points, claiming that their refutations were done under completely inappropriate conditions, among other things. And they finish up with a flourish, too:

"As we have emphasized, we were not the first to attempt quantitative analysis of BRET data. Previously, however, resonance energy transfer theory was misinterpreted (for example, ref. 4) or applied incorrectly (for example, ref. 5). (Note - reference 4 is to a paper by the first people to question their paper earlier this year, and reference 5 is to the work of Salahpour himself, a nice touch - DBL). The only truly novel aspect of our experiments is that we verified our particular implementation of the theory by analyzing a set of very well-characterized. . .control proteins. (Note - "as opposed to you people" - DBL). . . .In this context, the technical concerns of Salahpour and Masri do not seem relevant."

It's probably safe to say that the air has not yet been cleared. I'm not enough of a BRET hand to say who's right here, but it looks like we're all going to have some more chances to make up our minds (and to appreciate the invective along the way).

Comments (21) + TrackBacks (0) | Category: Biological News | Drug Assays | The Scientific Literature

October 10, 2007

Ertl Wins: Down With Witchcraft

Email This Entry

Posted by Derek

As some had speculated, the Nobel in chemistry did take a turn toward physical chemistry this year, for the first time in quite a while. Gerhard Ertl has won for his work on reactions that take place on solid surfaces, an extremely important (and extremely difficult) field of research.

It’s hard because chemists and physicists have an easier time of it with bulk phases – all solid, liquid, or gas. When you start mixing them, or start trying to understand what happens where they meet, things get tricky. The border between two phases is very different from what’s on either side of it. The key zone is only a few atoms thick, and the interesting stuff there happens extremely quickly.

But some of the most important chemical reactions in the world take place down there. Take the Haber-Bosch process for producing ammonia – “Right,” I’m sure some readers of today’s newspaper are saying, “you take the Haber-Bosch process, whatever it is, and get it out of here.” But by making ammonia from nitrogen in the air, it led to (among other things) the invention of man-made fertilizers. That reaction has kept billions of people from starving to death, and kept huge swaths of wilderness from being turned into farmland. (Read up on Norman Borlaug, if you haven’t already, for more on this).

You can Haber-Bosch yourself some ammonia simply enough – just take iron powder, mix it with some drain cleaner (potassium hydroxide) and stir that up with some alumina and finely ground sand (silica). Heat it up to several hundred degrees, crank the pressure up to a couple of hundred atmospheres, and blow nitrogen and hydrogen across it; ammonia gas comes whiffing out the other end. Now, bacteria do this at room temperature in water, down around the roots of bean plants, but bacteria can do a lot of things we can’t do. For human civilization, this is a major achievement, because nitrogen does not want to do this reaction at all.
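For the record, the overall transformation is deceptively simple:

    N2 + 3 H2 ⇌ 2 NH3    (ΔH about –92 kJ per mole of nitrogen)

It’s exothermic, so the equilibrium favors ammonia at lower temperatures – where the rate, unfortunately, dies – and four molecules of gas become two, which is what all that pressure is for. The whole industrial art is balancing those two facts against a catalyst that works fast enough to be worth running.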

The industrial process was discovered in its earliest form nearly one hundred years ago, and was the subject of a Nobel all its own. But no one knew how it worked, which is a good example of how difficult interfacial work can be. You can see what has to happen eventually: the triple bond between two nitrogen atoms has to be broken and replaced by three bonds to hydrogen, whose own H-H bond is also broken. But that nitrogen triple bond is one of the strongest bonds in all of chemistry, so how is it breaking? Do the nitrogen molecules soak into the iron somehow, and if so, what does “soak in” mean on an atomic level, anyway? Do they sit on the surface, instead – and if they do, what keeps them there? Is that triple bond still in force when that happens, or has it started to break? If so, what on earth is strong enough on the surface of iron powder to do that? Where’s the hydrogen during all this, and how does its single bond get broken? What happens first, and why do you need the hydroxide and the other stuff? And so on.

Ertl and others had long studied hydrogen’s behavior on metal surfaces, while helping to figure out how catalytic hydrogenation works. (That was a reaction accurately described to me as an undergraduate in 1981 as “witchcraft”, and Ertl is one of the people who have helped to exorcise it). So they’d seen how hydrogen got broken into individual atoms and spread between iron atoms on the surface – the surprise for him and his co-workers was that nitrogen turned out to do the same thing, breaking that fearsome triple bond in the process. The biggest step in the whole mechanism happened very early. By running the reaction forward and in reverse (turning ammonia back into nitrogen and hydrogen, a perverse act under most other circumstances), they were able to work out all the individual steps and the energies involved. Along the way, they figured out what the potassium hydroxide was doing in there, too (donating some key electrons to the iron atoms).
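Put together, the sequence they mapped out looks roughly like this, with (ad) marking a species adsorbed on the iron surface:

    N2 (g) → 2 N (ad)    (dissociative adsorption – the slow, rate-determining step)
    H2 (g) → 2 H (ad)
    N (ad) + H (ad) → NH (ad)
    NH (ad) + H (ad) → NH2 (ad)
    NH2 (ad) + H (ad) → NH3 (ad)
    NH3 (ad) → NH3 (g)

Everything after that first nitrogen-splitting step is comparatively easy, which is why anything that helps the surface tear N2 apart – those potassium-donated electrons, for instance – speeds up the whole show.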

Observing this and other surface processes has pushed the limits of several spectroscopic techniques: Auger electron spectroscopy (AES), low-energy electron diffraction (LEED), various forms of photoelectron spectroscopy, and more. Ertl's work has been notable for using a wide variety of methods, since there's no one tool that can give you the answers to questions like these.

He and his associates have studied many other surface reactions, such as the sorts of things that go on in the catalytic converters in exhaust systems. Metal-surface reactions like this are crucial to industrial civilization, and their importance is, if anything, growing. If we're ever going to get fuel cells to work economically, use hydrogen as an energy medium, or do a better job cleaning up industrial wastes, we're going to be using such things. And keeping them in the category of witchcraft won't cut it. It never does. Congratulations to Gerhard Ertl!

Comments (23) + TrackBacks (0) | Category: Chemical News | Current Events

October 9, 2007

Nobel Chemistry Odds

Email This Entry

Posted by Derek

Paul Bracher over at Chembark has posted an extensive list of Nobel odds, just in time for tomorrow's announcement. For the record, I think that if it's a more biologically-oriented award - and hey, in recent years that's just what it's been - then Roger Tsien et al., for green fluorescent protein, is my guess. If it's straight organic chemistry, then my guess is Suzuki/Heck and whoever else they can decide on for transition-metal coupling reactions. In physical chemistry, I'd have to go with Richard Zare, for laser studies and various instrumental techniques.

Keep in mind, though, that my track record is pretty ugly. Of course, so is everyone else's.

Comments (7) + TrackBacks (0) | Category: Current Events

Blogroll Update

Email This Entry

Posted by Derek

For those of you who were taking the day off yesterday, there's a Nobel speculation post just below this one. Today, it's time for a long-overdue blogroll update. We have the In Vivo Blog to lead things off, and Away From the Bench, which I guess describes me right now, the fine line between Drugs and Poisons, industry news site Fierce Biotech, England's Peter Murray-Rust, the all-fluorous all the time F- Blog, a reluctant chemist doing Closeted Chemistry, Making Graphite Work (something I've never been able to do), the weirdly named Power of Goo, and the most aptly named Great Molecular Crapshoot.

Comments (5) + TrackBacks (0) | Category: Blog Housekeeping

October 8, 2007

Nobel Season

Email This Entry

Posted by Derek

The Nobel in Medicine has gone this year to the inventors of gene-knockout techniques for mice, which seems well-deserved, considering how much has been learned through such experiments. This is, in fact, one of those discoveries that you'd think was already recognized by a Nobel if you hadn't been keeping count, which is as good a criterion as any. (It's rather odd, for example, that gene knockouts were recognized after RNA interference, don't you think, since a good ten or fifteen years separate the two in real life?)

Wednesday morning is the announcement of the Chemistry award, so I'm throwing open the gates of speculation, as I do every year around here. Our track record (mine and the predictions in the comments) has not been very good, but nobody has a good batting average when trying to read the minds of the Nobel committees. I feel pretty safe in saying that this year will be a "real" chemistry prize - only one of the last four has been, with the others serving as overflow from the nonexistent Nobel in molecular and cell biology.

So, who's it going to be? Last year's uninformed gossip is here, and there's plenty more over at Chembark. Put your bets down, but only with money you can afford to lose. . .

Update: Still more speculation, and even more.

Comments (6) + TrackBacks (0) | Category: Current Events

October 4, 2007

No Problem At All

Email This Entry

Posted by Derek

I was listening to a seminar speaker today, who as an aside mentioned forming an ester as “about the easiest reaction that you can do”. He had a point. If you have a free carboxylic acid, combining it with an alcohol and a bit of acid catalyst will generally give you some amount of the ester, and most of the time it’s a high-yielding reaction. But there are problems if the acid is next to a chiral center, or if there are other functional groups that don’t like being cooked with acid catalyst. So while this one is easy, I’m not ready to give it the title. (The speaker had had trouble with it himself).
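That one’s the classic Fischer esterification, which for reference goes:

    RCO2H + R'OH ⇌ RCO2R' + H2O    (catalytic H+)

Note the equilibrium arrows – that’s why the standard tricks are a big excess of the alcohol, or something to haul the water off as it forms.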

But if it’s not the winner, what is? Displacement of a leaving group with a powerful nucleophile (fluoride, azide, cyanide) is usually pretty foolproof, but we’ve all dealt with structures that are too hindered to do it well (or that decide to eliminate and form an alkene instead when forced). Reduction of an aldehyde to an alcohol is also hard to mess up – good old sodium borohydride – but chiral centers next to aldehydes are untrustworthy in the extreme. You may get your alcohol, only to find that the dying aldehyde left you a scrambled stereocenter in its will.

How about forming an oxime from the aldehyde instead? I haven’t had to do that nearly as much as the other reactions mentioned, but every time it seemed like (as an old labmate of mine put it) “a reaction that my grandmother could do”. And if you want to cheat a bit, and start from a really reactive system, then it’s hard to beat formation of an amide from an acid chloride, which is why you see so many amides. But my pick for a reaction that can’t fail is another one from the leg-up category: the formation of a urea from an isocyanate. It has to be an awfully hindered or non-nucleophilic amine for that not to work, or a really weird isocyanate. Anyone have an easier reaction than that?
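For reference, here’s what those two leg-up reactions look like on paper:

    RCOCl + 2 R'NH2 → RC(O)NHR' + R'NH3+ Cl-    (the second equivalent of amine just mops up the HCl)
    R-N=C=O + R'NH2 → R-NH-C(O)-NH-R'

No catalyst, no heating, no excuses.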

The problem with talking about these things, though, is that when they fail you, you feel like a complete dodo. I once had a primary alcohol that I just could not put a TBDMS protecting group on, and I still remember the looks of pity in the eyes of my co-workers when I would complain about it. When something like that turns on you, you find yourself wondering if maybe you should have gone to truck-driving school instead, like Mom always wanted.

Comments (21) + TrackBacks (0) | Category: Life in the Drug Labs

October 3, 2007

More Layoffs, And What They Might Mean

Email This Entry

Posted by Derek

Unfortunately, it appears that Johnson & Johnson is continuing to trim their chemistry staff. I’ve heard from people there that another round of layoffs has hit, with most of the cuts to take effect later this year. And as usual, the company doesn’t seem to be making any public announcement about this. Readers with more details are welcome to add them in the comments. . .

This has clearly not been a good year to be a drug researcher here in the US, what with the Bayer and Pfizer upheavals earlier and now this. There seem to be several reasons for this, some of which are specific to the companies involved. Pfizer, for example, was faced with some hard choices after taking some grievous hits in their advanced clinical pipeline, with the torcetrapib disaster being the intolerable last torpedo. Bayer ended up paying a lot more for their merger with Schering AG than they expected to (a merger that was surely going to involve some job losses even before the price went up). J&J, for their part, seems by their actions to believe that their future lies in running fewer in-house programs and in-licensing more from other people.

And there are trends that affect everyone, on top of these local troubles. Low clinical research productivity at many big firms is proverbial, which is why some of these mergers and re-orgs are happening in the first place. In the preclinical world, a lot of routine (and some less routine) work is going overseas, which is no news to anyone. The changes in the industry are catching even really good scientists, so it’s definitely not safe to be doing an OK job on things that pretty much everyone else is doing. There aren’t any safe jobs in the business, and there haven’t been any for quite some time now, when you look back on it.

My belief is that we’re witnessing a broad shift in this country to a larger fraction of researchers being employed at the smaller companies. One thing that the US has which not many other places have imitated is our venture-capital culture. Our mechanisms for funding ideas are second to none in their speed and scope. Given that, I think that we may be heading into a world where drug research is broken down into smaller independent units – startups. These shops open up (and close down) with greater speed, and their successes and failures are likewise magnified.

Instead of Huge Company X moving along with some projects working really well and some dragging along, imagine each therapeutic area (or in extreme cases, each project) split out into a separate company. Some will work, some won’t, and some will move up and some will disappear. This affects the way these projects are run, naturally. In a smaller company, there’s more pressure to get something to the clinic (and the market), and at the same time there’s an increased willingness to take chances and try out new approaches to get there.

If this idea of mine is right, it means that we’re all, on the average, probably going to end up working for a longer list of companies than we might have planned on. (I already have!) It also means that the locations with the best small-company culture will have a leg up, since they offer a larger (and more easily tapped) pool of equipment, facilities, and potential employees. Keep in mind that this is the voice of someone who’s worked for larger companies, and is now working for a smaller one in Cambridge – but think about it.

Comments (33) + TrackBacks (0) | Category: Business and Markets | Drug Industry History

October 2, 2007

Why Now, And Not Before?

Email This Entry

Posted by Derek

Talking as I was the other day about flow chemistry makes me think again of a topic that I find interesting, perhaps because it’s so difficult to refute anything: the counterfactual history of science. It’s a bit perverse of me, because one of the things I like about the hard sciences is how arguments can actually be settled (well, at least until new data come along that upset everything equally).

But here goes: if flow chemistry does catch on to become a widely accepted technique – and it may well deserve to – then what will have taken it so long? None of the equipment being used, as far as I can see, is anything that wasn’t available twenty-five years ago or more. Some pumps, some tubing, a valve or two, and you’re off, at least for the simplest cases. Some guy at Pfizer published a home-made rig that many people could assemble from parts sitting around their labs. So why didn’t they?

Easier to answer is why flow chemistry didn’t become the default mode of organic synthesis. The requirement for pumps and pressure fittings made it unlikely to be taken up back in the days before HPLC systems were so common. Something could have been rigged up even a hundred years ago, but it would have been quite an undertaking, and unlikely to have caught on compared to the ease of running a batch of stuff in a flask.

But since the 1970s, the necessary equipment has been sitting around all over the place, so we get back to the question of why it’s finally such a hot topic here in 2007. (And a hot topic it surely is: the other day, Novartis announced that they’re handing MIT (just down the road from me) a whole bucket of money to work out technology for process-sized flow reactors).

My guess is that some of it has been the feeling, among anyone who had such ideas years ago, that surely someone must have tried this stuff out at some point. That’s an inhibitory effect on all sorts of inventions, the feeling that there must be a reason why no one’s done it before. That’s not a thought to be dismissed – I mean, sometimes there is a good reason – but it’s not a thought that should make the decision for you.

There’s also the possibility that some of the people who might have thought about the idea didn’t see it to its best advantage. The ability to have high temperatures and pressures in a comparatively small part of the apparatus is a real help, but if you’re thinking mostly of room-temperature stuff you might not appreciate that. Ditto for the idea of solid-supported reagents (which, in its general non-flow form, is another idea that took a lot longer to get going than you might have thought).

And there’s always the fear of looking ridiculous. Never underestimate that one. Microwave reactions, remember, got the same reception at first, and that must have gone double for the first people who home-brewed the apparatus: “You’re running your coupling reaction in a what?” I can imagine the rolling eyes if some grad student had had the flow chemistry idea back in the 1980s and started sticking together discarded HPLC equipment and hot plates to run their reactions in. . .

Comments (7) + TrackBacks (0) | Category: Who Discovers and Why

October 1, 2007

All Sorts of Holes

Email This Entry

Posted by Derek

One of the things I like most about science is how thoroughly you can be taken by surprise. A good check on a field’s vigor is whether or not its practitioners are being ambushed by new data. By that standard, what at first looks like an embarrassment for the ozone-hole chemists actually makes them look pretty good.

The chemistry of ozone depletion over the Antarctic is well understood. Or is it? One of the key molecules in the process is chlorine peroxide (also known as the chlorine monoxide dimer, ClOOCl). It’s understood to be split by sunlight into reactive free chlorine radicals, which go on to catalyze the conversion of ozone into plain oxygen. In the process, the peroxide forms again, and the whole cycle starts over. While this is by no means the only route by which chlorine depletes ozone, it’s long been thought to be the main one.
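For reference, here’s the catalytic cycle as it’s usually written (M is any third molecule, mostly N2 or O2, that carries off the excess energy):

    Cl + O3 → ClO + O2    (this step happens twice)
    ClO + ClO + M → ClOOCl + M
    ClOOCl + hν → Cl + ClOO
    ClOO + M → Cl + O2 + M
    Net: 2 O3 → 3 O2

The chlorine atoms come out the other end unchanged, ready to chew up more ozone, which is what makes small amounts of chlorine so destructive – and keep an eye on that photolysis step in the middle.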

But chlorine peroxide is a difficult molecule to work with. Extremely unstable by sea-level laboratory standards, it’s been hard to isolate in pure form for study. And despite the generally accepted cascade of ozone depletion reactions, it hadn’t even been detected in the Antarctic until 2004, a gap that had been largely chalked up to its short lifetime. Now, though, a team at JPL has produced the best synthetic samples of chlorine peroxide to date, and they’ve checked how quickly it decomposes in the presence of ultraviolet light. And, well. . .the problem is, the stuff falls apart much more slowly than anyone had predicted – many, many times more slowly. If they’re right, it’s hard to see how the accepted chemistry of chlorine peroxide-driven ozone depletion can be correct.

This has produced all sorts of surprised reactions in the atmospheric chemistry world, summed up here at Nature News. Everyone is taking this report seriously, as well they should, and a number of explanations are already being tentatively advanced. All of them are going to require a lot of revision of what we thought we knew, though. (I should note that the depletion of ozone itself isn’t in question; that’s an experimental fact. Just how it’s being lost is the problem). Nature quotes researcher Marcus Rex:

"Overwhelming evidence still suggests that anthropogenic emissions of CFCs and halons are the reason for the ozone loss. But we would be on much firmer ground if we could write down the correct chemical reactions."

I have little doubt that this will get figured out eventually. The reason I’m optimistic is that this area of research is going along the way it’s supposed to. People are spending the time and effort to check assumptions, and when something turns up unexpectedly, the results are published in a good journal for everyone to see and argue over. That will lead to another round of theorizing, then more rounds of experimentation as people try to prove the latest ideas right or wrong. And thus we close in on the truth. That’s exactly, exactly how it should work.

Comments (6) + TrackBacks (0) | Category: Chemical News