About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek email him directly: email@example.com
January 31, 2005
How do you know if someone's a good chemist? For now, I'll restrict that to "good bench chemist", because that's hard enough to answer on its own. People can put up a surprisingly good front, but there are some things to watch out for.
Knowing things like name reactions is just as often a blind as a good indicator. Those are a charmingly antiquated part of organic chemistry where reactions are referred to by their original discoverers (or popularizers, in some cases.) So you have ancient classics like the Dieckmann condensation or the Williamson ether synthesis, both of which are still in use every day of the week despite their 19th-century pedigrees, and more modern ones like the Suzuki coupling or the Heck reaction. There are scores of these things - one of these days I'll go on about them some more - and they're a semi-beloved, semi-loathed feature of all introductory organic classes. It's nice to have a good familiarity with them, because many of them are quite important, but name-dropping can also be a noisy distraction that tyros use to hide their other deficiencies.
Being quick with ideas and mechanisms up on the blackboard is usually a good sign, but not being so isn't always a bad one. Some people think differently than others, and at different paces. There are always people who need to go look out the window for a few minutes before coming up with the answer. The biggest quick-draw artists I've seen with blackboard reaction pathways have clustered at the top and bottom of my personal rankings. They were either very good chemists indeed, or that was about the only thing that they were good for.
Productivity at the bench is harder to fake, but it can be done in some cases. In the early days of combinatorial chemistry, some folks would hit on an easy-to-extend series of analogs (often by working out some handy way to set up and purify the reactions faster than usual), and run these out to impressive lengths to wow the onlookers. Back in the early 1990s you could really impress folks by suddenly turning in, say, 112 sulfonamides all at the same time, but now it's become a bit more commonplace. But being able to produce that many compounds means that you're at least fairly hard-working and organized, traits not to be underestimated.
One thing to watch is whether a person's chemistry can be duplicated by anyone else, and how good they are at duplicating things in turn. If someone consistently gets lower yields or messier products than other people running the same kind of thing, it's probably a warning sign. And if no one else can get a person's reactions to work as well as they did originally, it's almost always a red flag. It's very rare that someone has such consistently good hands that they always get higher yields. More likely, they're pulling the wool over your eyes, or over their own as well.
If you have a very green chemist coming into a lab - say, a summer undergrad or a first-year graduate student in academia - there's a foolproof way to test their bench skills. Have them reproduce a preparation from Organic Syntheses. That's a series (over 80 volumes' worth) of useful procedures in organic chemistry. They're either preparations of particularly useful intermediates or illustrate new reactions and the preferred ways to run them. They're more detailed than the standard writeup in a chemical journal or patent (especially some patents I've seen), and they're checked by another research group entirely, with their comments appended. They're completely foolproof, and if you find someone who can't get one to work, odds are that you are, in fact, dealing with a fool.
Category: Life in the Drug Labs
January 30, 2005
Although I generally don't comment on current political events here, I wanted to congratulate the Iraqis who voted in their election this weekend. From a scientist's point of view, it would be a fine thing if they (and the other countries in the region) could have their affairs in good enough order to join the research efforts that are going on in so many other countries.
You don't necessarily have to be a rich country to do some useful science, if you pick your targets well. Cuba, of all places, seems to have a pretty respectable expertise in biotech and vaccines. And (to be frank) the position of many Middle Eastern countries in the rankings of world science isn't due to lack of money. The Gulf States, for example, could bankroll some serious projects - but, for the most part, they don't. (I'm not going to comment on the large physics engineering project that seems to be underway in Iran!)
I'm showing my biases here, because I think that scientific research is one of the greatest endeavors of the human race. The more hands and minds we have working on the big problems, the better the chances of solutions. But the Middle East (broadly defined, and with the conspicuous exception of Israel) is a desert for science. Most of the countries in that part of the world are hardly visible in the scientific literature - in this PDF article, you'll see that this entire region (along with Africa) is completely ignored. In my field, I see occasional papers from Egypt and Iran, but that's just about it.
There are plenty of competent (and potentially competent) people in these countries - just look at what some of them have accomplished as expatriates. The social, economic, and educational problems in these countries are (among other things) a tremendous waste of human potential. We need it, they need it, and I hope that eventually it finds an outlet.
Category: Current Events
January 27, 2005
Having a project die after it goes into clinical development is never a good thing (unless you count the money saved by finding out that it was a loser and not taking it on even further.) There's a minor problem associated with this situation, though, that I've never heard discussed very often. What do you do with all those batches of compound that you made?
And there tend to be quite a few of them. As a compound goes on into the clinic, the serious scale-up chemists get to work on it, because there are a lot of compound-intensive tests that have to be done. Toxicity testing in animals moves to larger species and longer dosing times (four weeks, then ten to twenty weeks), and those things really chew up piles of compound.
Medicinal chemists like me start out making 15 milligrams of something, and we'll move up to a few grams when we have a winner. I've helped out on campaigns for the first rounds of real toxicity testing, making (say) a hundred-gram batch of compound, but only once or twice in my career. It's very rare for me to use over a hundred grams of a reagent, much less make a hundred grams of some product. So it's quite a sight to see rows of brown-glass screw cap bottles, each with 100 grams of final compound in them.
Or, for that matter, to see a two-kilo drum of the stuff with a snap-on closure on it like a batch of driveway sealer. What do you do with all this material after the compound has hit the wall? You already know that you're not going to take it into the clinic again. You don't need that much to stock the repository to run it through future assays. It's probably not much use as a starting material for other reactions. What good is it, then?
Every place I've worked has shelves of these things. My impression is that none of us have figured out what to do with them. In the end, it's just too painful to mark them down as chemical waste for the guys in coveralls to haul away.
Category: Drug Development
January 26, 2005
Next year will mark the 25th anniversary of the first chemical reaction I ever did in a research lab. So I've been around, but I'm not old enough to remember the days before standard glass fittings. I think you have to go back to the 1950s for that, back to the days of rubber stoppers and hand-bent glass tubing.
Nope, for my entire lab lifetime it's been standardized glass joints. Those of you in the labs will have hardly given a thought to these, since they're part of your everyday experience, but I find that nonscientists are often quite taken with them. The joints are just ground glass (the "frosted" look as you'd see in a decorative mug), made to a standard diameter, angle, and length. (Here are some shots of equipment with them.) They make all flasks, adapters, columns, condensers and what-have-you interchangeable, no matter where you bought it. You can assemble anything you feel like assembling, as long as you have the pieces and the patience. It's like glassware Lego.
Standard taper joints are described by two numbers, like 24/40. The first is the inside diameter of the wide part of the joint, in mm, and the second is the length of the ground glass section down to the other end. 24/40 is pretty much the standard in most organic research labs, except for small glassware, which is 14/20 (you can see by the ratio that that one isn't as deep a joint.) Most of my work has been done with those two.
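The two-number convention above lends itself to a quick back-of-the-envelope check. This little sketch (the joint sizes are the common ones mentioned in this post; the ratio is just my own illustration, not an official spec) shows why a 14/20 joint is shallower, relative to its width, than a 24/40:

```python
# Standard taper joints: first number = inside diameter of the wide end (mm),
# second number = length of the ground glass section (mm). Dividing length
# by diameter gives a rough "how deep is this joint for its width" figure.

joints = {"24/40": (24, 40), "14/20": (14, 20), "19/22": (19, 22), "29/42": (29, 42)}

for name, (diameter_mm, length_mm) in joints.items():
    print(f"{name}: length/diameter = {length_mm / diameter_mm:.2f}")
```

Run it and the 24/40 comes out proportionally deeper than the 14/20, which matches the eyeball impression from the glassware itself.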
Teaching labs, for reasons that are obscure to me, often use 19/22 glassware. I used it as an undergrad, but I haven't seen any in years, and it would probably look funny to me now. But I should talk: in the last few years, I've become a fan of the oddball 29/42 glassware, which (as the number shows) has wider openings than the usual stuff. Big flasks tend to have that size joint on them, but I've got all sorts of stuff with it now, in all sorts of sizes.
I like it because it's easier to get things out of the flask. Scraping stuck powders from the inside walls of a round-bottom flask is something every chemist spends a fair amount of time on, and with these flasks I don't have to bend a metal spatula to pieces to reach everything. Any working chemist has a collection of metal spatulas that have been tortured into all kinds of shapes to reach up into glassware, but I have fewer than most. Now it's the 24/40 glassware that looks odd and narrow to me.
Some of you may suspect that I like the 29 glassware because it keeps people from stealing (er, borrowing) my stuff. Well, I think that's how my predecessor in this lab got into it, but I've come to enjoy it on its own merits. I encourage my bench-chemist readers to give the size a try. Give your colleagues a chance to smile and shake their heads behind your back, too!
Category: Life in the Drug Labs
January 25, 2005
Last week, the Wall Street Journal ran an article (if you have a subscription, it's here) on the Novartis research center in Cambridge (the MA one) and its director, Mark Fishman. I wrote a couple of blog posts back when the place was first starting up, which I'll have to unearth at some point and compare to reality. So far, that reality looks a bit different from the way that the rest of Novartis (and much of the rest of the drug industry) is doing things.
The article makes much of the emphasis on genetic causes of disease that Fishman has spoken of, which is only natural, since he's known as a force behind the decipherment of the zebrafish genome. (If you're not in the biomedical sciences, you may wonder what the big deal is about zebrafish. Actually, they're one of the most heavily studied creatures in developmental biology, sort of a fruit fly with fins.)
That genomic strategy is probably going to be either a big winner or a big loser, but it's too soon to say which. He has several others which fall into that category, too. From the article:
"Dr. Fishman also is changing how Novartis selects diseases to study. Traditionally, it and other big companies have decided by calculating the size of the potential market. Dr. Fishman thinks Novartis has a better chance of increasing sales and curing people if it goes after illnesses whose genetic makeup it has the best chance of deciphering, and diseases for which few treatments are available. In some cases, that has meant studying illnesses that affect relatively few people, such as multiple sclerosis.
Most drugs today are tested on large groups of patients suffering from a common illness. But Dr. Fishman believes that many diseases, such as hypertension and diabetes, can be divided into subgroups. So Novartis has begun conducting more specific drug trials in smaller groups of patients, which Dr. Fishman says has the added benefit of saving money and time."
That first one is sort of the Gleevec story writ large, but I've said several times that Gleevec's status as a billion-dollar orphan drug says more about the state of the oncology market than it does about the compound. In general, drugs are going to sell in proportion to the size of their market. (Now, you can underestimate that size or overestimate it, of course - this is far from being a science. Pfizer underestimated Viagra's potential at first, and Pfizer's competitors seem to have turned around and then overestimated the exact same market.) I worry that some of these decipherable genomic targets will be for diseases that affect very few people. The resulting drug will end up with quite a price tag. But, still, Novartis is a Swiss company, and I assume that the Swiss have thought this through.
And as for the smaller trials, that's indeed where the industry would like to go. But there's a shortage of biomarkers (so far) for being sure that you're dividing up your patients into the right groups of responders and non-responders. Validating a new genetic marker could, in some cases, be just as expensive as developing a drug. And if you pick the wrong one, as one of my commentators once pointed out, you could find yourself testing your drug against a group that's actually worse than random chance would have given you.
I don't want to give the impression that Fishman's ideas won't work. They're good ones, and worth trying. I think, though, that some other drug companies may be just as glad that someone else's money is trying them out first. But perhaps Novartis will eventually have their revenge.
Category: Drug Development
January 24, 2005
So what are these cancer animal models that I was speaking of so poorly? On the face of it, they actually seem like they'd be pretty good, other than being rather disgusting. (That said, it's important to keep in mind that they're not as disgusting as watching people die from cancer when you could be doing something about it.)
What you do is take human cancer cell lines and implant them in a mouse, a process called xenografting. When they form a tumor, then you treat the mouse with your drug candidate and see if the growth rate slows, stalls, or reverses compared to untreated controls. Sounds pretty simple.
But the complications show up very quickly as you look closer. For one thing, these human cancer cells are often cell lines that have been propagated for a while in vitro, and there's room to wonder about how much they've changed since their days as primary tissue. There's also the issue of the number of different cancer cell types you could use - hundreds, thousands, more? We know what tissue they came from, and we know some of the biochemical differences between them, but nowhere near all. Not even most of the important differences, if you ask me, since we don't even know what some of those important differences are yet.
What we have are characterizations like "Cell line such-and-such, non-small cell lung cancer, resistant," or "colon, slow-growing, responds to everything." Each cell line has its own reputation. At least the fact that these reputations are pretty constant gives you some confidence that we're all talking about roughly the same cells, which is no small thing. (More than once in the history of cellular research, people have realized that cell lines which were all supposed to be the same thing had drifted apart.)
Another level of difficulty is that these things are implanted, rather than growing in situ in the tissue of interest. Any cell biologist will tell you that the matrix a cell grows in is one of the fundamental variables of cell culture. Now, once the tumor has formed, the cells are surrounded by other cancer cells, which is closer to the real situation. But they're still being vascularized by mouse blood vessels, which obviously respond to mouse signals and carry mouse blood. That's the fundamental animal model problem, and it's a tough one.
Finally, these aren't any old mice. In order to get the cells to "take" when they're injected, these mice have a severely compromised immune system. They mostly have no thymus, for starters (and no hair, either, as a side issue.) Here's one - if you find hairless dog and cat breeds cute, you probably won't mind these guys, either. They don't make very good pets, though, because (as you'd imagine), they will catch every disease available, and likely as not die from it.
At bottom, these models are probably too permissive. As I mentioned the other day, they can make compounds like Iressa look just fine, when we now know that they confer no real benefit in humans. (If our market were nude mice with good health insurance, we'd be set, though, as would the mice.)
So what good are they, and are we really doing a good thing by running them? Well, it's hard to imagine that your compound is going to do any good in humans if it doesn't at least work in the nude mice, so they serve a screening function. It's true, though, that for some years now, if the compound hasn't worked in the mice it's never gotten to humans, so we don't have as many checks on that idea as we'd need to be sure of that assumption. But we see a lot of disconnects like Iressa, which argues for false positives being more of a problem than false negatives.
And I'm not sure how good the models are at rank-ordering compounds, either. I can justify their use as a pass/fail, but that's about it. We should be doing better, and people are trying to. And a lot more are trying behind closed doors - better animal models would simultaneously help large numbers of desperate patients and save the drug industry about a billion dollars. More on all this in another installment.
Category: Cancer
January 23, 2005
So Johnson and Johnson is the latest company to try to broaden their market for a drug and run into cardiovascular side effects. Their Alzheimer's drug Reminyl (galantamine) makes some money, but is hardly a blockbuster. It's a natural product (derived from daffodil bulbs, of all things), and it's a cholinesterase inhibitor, the same mechanism as the two other Alzheimer's drugs on the market. None of them are gigantic sellers, because they don't do all that much for people, especially once they have serious symptoms. But if you could show beneficial effects in the pre-Alzheimer's population, then the potential number of patients could be much larger. I should, in fairness, point out that the potential benefits to the patients could be larger, too: earlier treatment before the disease has had more time to do irreversible damage.
Cholinesterase inhibition is a pretty crude tool to help Alzheimer's, but it's all that we have at the moment. The idea is to turn up the volume of neuronal signals that use acetylcholine as a transmitter molecule, by inhibiting the enzyme that would break it down and sweep it out of the synapse. I don't see an obvious connection between this mechanism and the cardiovascular effects that showed up in J&J's trial.
This is another illustration of the same thing that's bringing down the COX-2 inhibitors. The larger the population that takes your drug, and the more clinical trials you run, the better your chance of finding the side effects. All drugs have side effects, and if you turn over enough rocks you'll see them. But without expanding the patient population, you won't be helping all the people you could help, and you won't be making all the money you could make. It's like walking through a minefield. It's what we do for a living over here. What a business!
Category: Alzheimer's Disease | Toxicology
January 21, 2005
BTW, you may have noticed that the site has picked up some sponsorship, over there on the right. I'm glad to have 'em. Anyone who wants to reach the Pipeline readership - working in or interested in science, well-read, and (judging from my mail) not the sorts to keep things all bottled up inside them - feel free to apply within.
Category: Blog Housekeeping
January 20, 2005
Chemists use glass; biologists use plastic. That's more true than not, because the biologists are using living (or recently living) systems, which like watery stuff. Plastic pipets and petri dishes are just fine. But over in chemistry, we're using ethyl acetate, dichloromethane, and DMSO, which will turn those items into cloudy, sticky ghosts of their former selves.
Usually the plastic and glass worlds coexist reasonably well, like planets of oxygen-breathers and hydrogen-breathers in a science fiction novel. But there can be trouble. A glass-for-plastic switch is no problem (unless you drop the thing on the floor), but the reverse can be messy indeed. I recall a friend of mine down the hall from me in graduate school, who was preparing to purify one of his precious compounds on the lab's HPLC system. As a good chemist should, he took up his compound in some solvent and filtered it, so as not to introduce grit and crud into the pumps or the chromatography column.
He picked up a membrane filter. They all look like disks of plastic, to be placed in a fritted glass holder, and they're all full of microscopic holes to let your solvent pass through. He poured his solution in, and watched in horror and disbelief as the white membrane filter dissolved and merrily sluiced through the glass frit along with his compound. Clearly the wrong kind of white plastic disk. I heard his shouted curses all the way down the hall.
Not too long before he had that experience, I had come to the end of my tether with my current batch of starting material. This was around step 18 or 19 of the synthesis, and I had just enough material for me to run one more step - two, if it actually worked well. I was getting down to some pretty tiny sample volumes (by the standards of organic synthesis, anyway.) But we'd just gotten some disposable 1 mL syringes in, and that looked like just what I needed to take my compound up in 100 microliters of solvent and add it to my carefully prepared reaction. (Remember, this was graduate school, and twenty years ago, to boot. A box of 1 mL syringes was an event worth noting in my lab.)
Came the moment, and I tried to syringe in my solution. But the plunger was putting up a fight - in fact, it was good and stuck. What I didn't realize was that this was one of the generic medical-supply syringes with the black rubbery tip on the plunger - for a good seal, you know, with blood and so on. Most organic solvents make that stuff swell up like a puffer fish, though, and I was just finding this out. I pushed harder. Nothing. I had to get that stuff in there, though, so I really put my thumb into the job, and SPLAT! The barrel of the syringe blew apart from the needle, and my compound sprayed all over the front of my shirt.
I didn't handle it too well. When I realized what had happened, which didn't take long, I started yelling and cursing, and then spotted a stack of cork rings. With plenty of venting left to do, I started throwing them out the door, with a different obscene adjective to accompany each one. My labmate, realizing that something was amiss, had already moved out of the flight path. But one of the inorganic chemistry professors was coming down the hall and just missed catching a cork ring in the ear. "Think I'll go around the other way," he said, taking a quick look around the door into my lab.
Category: Life in the Drug Labs
January 19, 2005
I'll be occupied on and off in the next few months with writing several scientific papers (nothing wrong with bulking up the ol' resume, especially in this climate.) There's always the question of which journal to fire these cannonballs of wisdom towards. Two factors compete: where you'd ideally like to see the paper appear, and where you realistically think you can get it accepted. You aim for the intersection of those lines.
Sometimes the answer is clear - for example, if you've got a comprehensive report on a fairly new diabetes therapy, a good solid paper from discovery to clinic, you're probably going to send it to Diabetes. The situation is more complicated at the higher and lower ends of the scale. A startling head-turner of a paper has a number of venues to choose from, depending on its focus and who you might know in the various hierarchies - Science, Nature, or Cell, among others. I won't be sending any of this year's papers to those folks, sad to say.
You'd think that the premier journal in medicinal chemistry would be the Journal of Medicinal Chemistry. It may well still be, and I'm sure that the American Chemical Society (its owner) thinks so. I need to check this out, but it's my impression that the journal has had a shrinking percentage of industrial papers in it over the years. Some upstarts have siphoned off some of their raw material. A particular competitor is Bioorganic and Medicinal Chemistry Letters, which began life (and still spends a good part of its time) as a dumping ground, but has slowly changed into something more.
One big difference between the two is that J. Med. Chem. publishes both full papers and short communications, while BOMCL features only the latter. (That's the meaning of "Letters" in scientific publishing.) A full paper naturally includes a full experimental section, with preparative details of all the compounds and their physical characteristics. And there we come to the real split between the journals. For a full paper, J. Med. Chem. wants more details than I happen to have.
They want combustion analysis on important compounds, and I just flat out don't get that level of data on most of them. That's a primitive-sounding (but, in theory, very effective) method of checking a compound's purity. You burn a small sample of it and carefully measure the amount of carbon dioxide, water, etc. that come off. That gives you the percentage of the compound's weight that was made up of carbon, hydrogen, nitrogen, and the like, which is why we often call it a "CHN" analysis.
Then you see how well the experimental value matches up with your theoretical amounts. It comes out to a couple of decimal places, so you can distinguish pretty close matches, in theory. In practice, the compounds usually have to be thoroughly dried and handled carefully before this test, because many of them will soak up a bit of water from the air. Some of them actually crystallize with water molecules in their lattice, as part of the repeating crystalline pattern, and combustion analysis is a good way to see if your compound has done that. It also means that if you're willing to assume, say, one-third of a molecule's worth of water of crystallization, or some damn such, you can finagle the numbers to come out to most anything you need. (Mind you, a whole paper's worth of such fudge-factoring would get a frosty reception.)
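For the curious, the "theory" side of that comparison is just bookkeeping with atomic weights. Here's a minimal sketch of the calculation, using caffeine as a stand-in example (the compound choice and the function name are mine, purely for illustration; real combustion labs have their own software and conventions):

```python
# Theoretical CHN percentages: weight of each element divided by molecular
# weight. The `waters` argument shows the "assume some water of
# crystallization" fudge mentioned above, which adds 2 H and 1 O per water.

ATOMIC_WEIGHT = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def chn_percent(formula, waters=0.0):
    """Theoretical C/H/N weight percentages for a molecular formula,
    optionally with `waters` molecules of water of crystallization."""
    counts = dict(formula)
    counts["H"] = counts.get("H", 0) + 2 * waters
    counts["O"] = counts.get("O", 0) + waters
    mw = sum(n * ATOMIC_WEIGHT[el] for el, n in counts.items())
    return {el: round(100 * counts[el] * ATOMIC_WEIGHT[el] / mw, 2)
            for el in ("C", "H", "N")}

caffeine = {"C": 8, "H": 10, "N": 4, "O": 2}
print(chn_percent(caffeine))              # anhydrous theory
print(chn_percent(caffeine, waters=1/3))  # the "third of a water" finagle
```

Even a third of a water shifts the calculated carbon percentage by more than a percent, which is why that particular dodge works so well (and why referees frown on it).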
But, for the most part, I don't care very much if my compounds combust well. During a drug discovery project, we don't have time (or material) to send samples away to be analyzed (it's a specialized job.) We rely on NMR (proton, some carbon) and the combination of HPLC and mass spectrometry. Those are enough to characterize a compound for a patent (well, except for some outlier countries like Taiwan), and they're enough to convince us that we've made the right thing and affect the expenditure of millions of dollars. But it's not enough for J. Med. Chem.
Well, not until recently. They've slowly been loosening the noose the last few years, offering high-resolution mass spectral data or data from two different HPLC systems as alternatives. Not that we usually have those, either, but it's a start. But I think I'll let the combustion lab do the work: if I'm going to be sending J. Med. Chem. anything this year, I'd better start getting ready for a Wonder Drug Barbecue.
Category: The Scientific Literature
January 18, 2005
I've mentioned before that one of our big problems in the drug industry seems to be finding compounds that work in man. I know, that sounds pretty obvious, but the statement improves when you consider the reasons why compounds fail. Recent studies have suggested that these days, fewer compounds are failing through some of the traditional pathways, like unexpectedly poor blood levels or severe toxicity.
In the current era, we seem to be getting more compounds that make it to man with reasonable pharmacokinetics (absorption from the gut, distribution and blood levels, etc.) and reasonably clean toxicity profiles. Not all of them, by any means - there are still surprises - but the stuff that makes it into the clinic these days is of a higher standard than it was twenty years ago. But that leaves the biggest single reason for clinical failure now as lack of efficacy against the disease.
That failure is the sum of several others. We're attacking some diseases that are harder to understand (Alzheimer's, for example), and we're doing so with some kind of mechanistic reason behind most of the compounds. Which is fine, as long as your understanding of the disease is good enough to be pretty sure that the mechanism is as important as you think it is. But the floor is deep with the sweepings of mechanistically compelling ideas that didn't work out at all in the clinic - dopamine D3 ligands for schizophrenia, leptin (and galanin, and neuropeptide Y) for obesity, renin inhibitors for hypertension. I'm tempted to add "highly targeted angiogenesis inhibitors for cancer" to the list. The old-fashioned way of finding a compound that works, no matter how, probably led to fewer efficacy breakdowns (for all that method's other problems.)
Another basic problem is that our methods of evaluating efficacy, short of just giving the compound to a sick person and watching them for a while, aren't very reliable. If I had to pick the therapeutic area that's most in need of a revamp, I'd have to say cancer. The animal models there are numerous, rich in data, and will tell you things that you want to hear. It's just that they don't seem to do a very good job telling you about what's going to work in man. I will point out that Iressa, for one, works just fine in many of the putatively relevant models.
The journal Nature Reviews: Drug Discovery (which is probably the best single journal to read for someone trying to understand pharma research) published a provocative article a couple of years ago on this subject. The author, the late David Horrobin, compared some parts of modern drug discovery to Hesse's Glass Bead Game: complex, interesting, internally consistent and of no relevance to the world outside. They got a lot of mail. Now the journal has promised a series of papers over the next few months on animal models and their relevance to human disease, and I'm looking forward to them. We need to hit the reset button on some of our favorites.
Category: Animal Testing | Drug Assays | Drug Development
January 17, 2005
Over at Sean Carroll's "Preposterous Universe", there's a post on a physicist's advice to students who want to become scientists. Don't even try, he tells them. No jobs, no money, no thrill, no hope. It's depressing stuff. Carroll is a physicist himself, so he has quite a bit to say on the topic. (Link found via yet another physicist.)
Reading the whole thing, though, I was struck by how far from my own experience it is. The drug industry's going through a rough patch, for sure, but there are companies still hiring. And although we've had some layoffs, and more are in the offing, there are still thousands upon thousands of us out here. We're gainfully employed, working on very difficult and challenging problems with large real-world implications. (And hey, we're getting paid an honest wage while we're doing it, too.)
That's when it hit me: the article that Carroll's referring to isn't warning people away from becoming scientists. It's warning them away from becoming physics professors. Very different! Those categories intersect, all right, but they're not identical. There are other sciences besides physics (no matter what Rutherford said), and in many of them, there's this other world called industry. (The original article doesn't even mention it, and Carroll disposes of it in his first paragraph.)
Some of this is (doubtless unconscious) snobbery - academic science is pure science, after all, while industry is mostly full of projects on how to keep cat litter from clumping up in the bag or finding new preservatives for canned ravioli. Right? And some of it reflects the real differences between physics and chemistry. To pick a big one, research (and funding) in physics has been dominated for a long time by some Really Big Problems. The situation's exacerbated by the way that many of these big problems are of intense theoretical but hazy practical interest.
I am not knocking them for that, either, and I'll enter my recent effusions about the weather on Titan as evidence. I'd love to hear that, say, an empirically testable theory of quantum gravity has made the cut. But that kind of work is going to be the domain of academia. I think that it's a sign of an advanced civilization to work on problems like that, but advanced civilization or not, it's not likely to be a profit center. Meanwhile, chemistry doesn't have any Huge Questions at the moment, but what it has are many more immediately applicable areas of research. Naturally, there are a lot more chemists employed in industry (working on a much wider range of applications.)
Many of the other differences between the fields stem from that basic one. Chemistry has a larger cohort of the industrially employed, so the academic end of the business, while not a jolly sight, isn't the war of all against all that you find in physics, astronomy, or (the worst possible example) the humanities. The American Chemical Society's idea of worrisome unemployment among its members would be clear evidence of divine intervention in many other fields. So those of us who get paid, get paid pretty well. And we don't do three, four, five-year post-docs, either, which is something you find more of in fields where there aren't enough places for everyone to sit down. Two years, maximum, or people will think that there's something wrong with you.
All of this places us, on the average, in a sunnier mood than the physics prof who started this whole discussion (whose article, to be sure, was written four or five years ago.) I was rather surly during grad school, but for the most part I'm happy as the proverbial clam. As I've said, if someone had come to me when I was seven years old and shown me the work I do now, I would have been overjoyed. Who can complain?
+ TrackBacks (0) | Category: Academia (vs. Industry)
January 16, 2005
As a chemist, I can't help but be fascinated by the photos from Titan. Shorelines, watersheds - uh, make that "ethanesheds"? - pebbles (made of ice) that seem clearly to have been eroded by flow or tumbling. . .it's great stuff. The line I heard about Titan being a huge Urey-Miller experiment that's been running for a billion years seems about right, and that means that there could be all kinds of odd stuff piled up on the surface. Chemistry isn't fast at 180 Kelvin, but a billion years is a mighty long time. I just hope that the rest of the data (the mass spectrometry and so on) comes out soon.
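To put a rough number on that "slow but long" point, here's a back-of-the-envelope Arrhenius sketch. The activation energy of 80 kJ/mol is an assumption on my part, just a barrier typical of many ordinary organic reactions, not anything measured for Titan chemistry:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius_slowdown(ea_kj_per_mol, t_cold, t_warm):
    """Factor by which a reaction with activation energy Ea slows down
    going from t_warm to t_cold, per Arrhenius: k = A * exp(-Ea/RT)."""
    ea = ea_kj_per_mol * 1000.0
    return math.exp(ea / R * (1.0 / t_cold - 1.0 / t_warm))

# Assumed, illustrative barrier of 80 kJ/mol:
slowdown = arrhenius_slowdown(80.0, 180.0, 298.0)
print(f"~{slowdown:.1e}x slower at 180 K than at 298 K")
```

For a barrier of that size the slowdown comes out on the order of a billion-fold, which means a billion years at 180 Kelvin buys you very roughly a year's worth of room-temperature chemistry. Slow, but by no means nothing.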
One of the things that's struck me is the additive effect of small details of chemistry and physics. Think about it - if you were given the Earth's atmospheric composition, temperature, axial tilt and other variables, you could deduce a lot. You'd predict oceans and seasons, clouds and rain, and much else besides if you thought about it long enough. But could you predict the fantastic variability of the colors in sunsets and sunrises? The billowing shapes of cumulus clouds piling up into a thunderhead? The hundreds of patterns of frost, or how ice looks forming around the sides of a fast-running stream?
Titan must show the same kind of thing, up close. What do the waves look like in those lakes and swamps, with all our variables changed: lower gravity, higher pressure, lower temperature and with hydrocarbon liquids? What's that fog look like when it rolls in past the cliffs, and what shapes have those cliffs been carved into? Does the acetylene seep into the icy ground, hit thick deposits of ancient alkanes and carve out caves like nothing we've ever seen?
Know what I want? I want some sort of insulated, radioisotope-powered version of the Mars rovers running around down there. The sad part is that it's unlikely that such a thing will happen in my lifetime. Man, do we ever need a cheaper way off this planet. (Try Rand Simberg and his extensive links listing for others who agree with that sentiment.)
+ TrackBacks (0) | Category:
January 14, 2005
I'd like to remind everyone that something very unusual is happening today: we appear to have successfully landed on Titan, Saturn's largest moon. Word came in a few minutes ago that the Huygens probe has sent at least two hours of observations back, which at least means that its parachutes opened.
Huygens carries all sorts of fine spectroscopic equipment to figure out what's going on under Titan's massive cloud deck, along with down- and sideways-pointing cameras and a spotlight. At this point, we don't know if it landed with a thunk, a splat, or a splash, but we'll be finding out later today when the information is sent back to Earth by the Cassini spacecraft (in orbit around Saturn).
Figuring out what we're seeing might take a bit longer. Titan is one of the most alien places you could find in our solar system. Barring some really excellent new technology, which I certainly hope for, this will be one of the few landings on another world that we'll all get a chance to see. It's a great day for the species.
+ TrackBacks (0) | Category: General Scientific News
January 13, 2005
Not much blogging time tonight - I spent all day sitting in front of my computer, anyway, working on a manuscript for one of the chemistry journals. I find that writing blog entries is (usually) no problem, and writing things like my Contract Pharma column come fairly easily, too. But not scientific papers, that's for sure.
I think it's because there's not very much actual writing involved. The text of my article isn't much to look at - just a few paragraphs, really. But getting the figures and tables right, looking up all the data and resolving all the discrepancies - it's all finicky detail work. All the problems you uncover are probably about three years old, too, since that's the (youngest) vintage of stuff we can publish, and you can imagine how that helps resolve things quickly. The construction project can be enjoyable, but most of the time it's like building something with more parts than you're in the mood for.
But I'm going to get plenty of it this year, which isn't a bad thing. For various reasons, I have several pieces of work that have reached the, um, mature stage of publication-worthiness. I'd rather that they were reaching a sick patient in a Phase III trial somewhere, or better yet, a pharmacy shelf, but I'll have to settle for Bioorganic and Medicinal Chemistry Letters.
+ TrackBacks (0) | Category: The Scientific Literature
January 12, 2005
My lab is starting work on another new project, which this time means another five-membered ring system that I haven't worked on before. I've done a lot of them, but this class I've missed out on somehow.
If you're going to do organic chemistry in a drug company, you'd better be on good terms with nitrogen atoms. I'd say that drugs with them outnumber drugs without by 10:1. The reasons aren't mysterious - amines can take up protons or donate them at just around the pHs that you find in living systems. And even the not-so-basic ones are still essential players. Nitrogen is a fine partner for hydrogen bonds, both donating and receiving, and those are the common currency of biomolecules, in my opinion.
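That proton business is just the Henderson-Hasselbalch equation at work. A quick sketch (the pKa values below are rough textbook numbers I'm assuming for illustration, not anything from a specific drug):

```python
def fraction_protonated(pka, ph):
    """Henderson-Hasselbalch for a basic amine (BH+ <-> B + H+):
    fraction of molecules carrying the proton at a given pH."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

# A typical aliphatic amine (assumed pKa ~9.5) at physiological pH 7.4
# is almost entirely protonated:
print(f"{fraction_protonated(9.5, 7.4):.1%} protonated")

# A weakly basic heterocycle like pyridine (assumed pKa ~5.2)
# is almost entirely neutral at the same pH:
print(f"{fraction_protonated(5.2, 7.4):.1%} protonated")
```

That two-unit gap between pKa and pH is what flips a nitrogen from ~99% charged to under 1%, which is exactly why amines sit in the sweet spot for living systems.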
So you'll be making plenty of amines and amides, and you'll be making plenty of the small-ring nitrogen compounds - imidazoles, oxazoles, pyrazoles and so on. My advice to grad students and post-docs is to learn some heterocyclic chemistry if you plan to interview in the drug industry, assuming we ever get around to hiring people again. (OK, don't panic - it's not that bad, not yet - but I fear that there's going to be a sadly reasonable supply of experienced candidates available this year.) Tell 'em that you know how to crank out the thiazoles and the oxadiazoles - it sure won't hurt, and it might just help.
As those of you outside chemistry may have suspected from all that gibberish, these systems have a systematic nomenclature. It gets a little tongue-twisty after a while, though, particularly if you're not being paid to say them (and make them.) I turned out a whole series of isoxazolidinones once, for example, a name that doesn't come trippingly off the tongue without a little preparation. Active compounds, you ask? You bet! So active that every one of them did something in mice, and what's more, every one did something different. But that, unfortunately, is not a feature that we value. So much for our "commitment to diversity", eh?
+ TrackBacks (0) | Category: Life in the Drug Labs
January 11, 2005
Regular reader Qetzal pointed out in a comment to the "More Fun With DNA" post that a lot of neat discoveries seem - after you've heard about them - to be something that you could have thought up yourself. I know what he means. I've had that same "Yeah. . .that would work, wouldn't it. . ." feeling several times.
There's an even higher degree of the same thing, thinking that surely that new discovery has already been done. Hasn't it? Didn't I read that somewhere a year or so ago? I'm trying to remember the British literary/political figure who said it, but the quote was that the most important thing he had learned at Cambridge was not to be afraid of the obvious. I think that a lot of us are, and it's not to our benefit.
So there's a useful New Year's resolution, if anyone has room for a spare one. Shut that voice up once in a while, the one that shows up in your head when you have a wild idea, the one that says that if this were really as good as it sounds, someone would already have done it. A lot of really great stuff hasn't been done, and if too many people listen to the lesser side of their natures, it won't be.
+ TrackBacks (0) | Category: Who Discovers and Why
January 10, 2005
A few weeks ago I linked to Slate's Jack Shafer and his criticism of a New York Times article on the FDA. I had trouble seeing the point of that article myself, so Shafer's comments helped put my mind at ease.
Well, Times editor Bill Keller had some reaction to Shafer's attack, which (after an e-mail-induced delay) Shafer's now published along with his own rebuttal. It doesn't end there - Keller has still more words for him, joined by the reporter of the original story (Gardiner Harris) in one last Slate piece here. That's a lot of reading, admittedly, but it's interesting stuff, both for people who are interested in the FDA and the drug industry and for people who are interested in the New York Times.
Overall, I still come down more on Shafer's side of the argument. As much as I could make out, the original article seemed to be about how the FDA doesn't have (or use) the funds to monitor the safety of drugs any more, preferring to devote resources to getting new ones approved. (I know, I know, there are a lot of us in the industry sitting around tapping our feet and wondering where those resources must be going.) Keller and Harris dispute this interpretation, but I think that Shafer makes some good points.
But the exchange gets bogged down in the details of Gardiner Harris's original example, the mid-1990s safety problems with Seldane and the subsequent rise of Claritin in the antihistamine market. Shafer doesn't get this one very straight, and Harris finally says "Simply put, safety concerns about Seldane were the dominant force behind Seldane's fall and Claritin's rise in those early years. Anyone at Schering-Plough, Claritin's maker, would confirm this."
Well, as it happens, I was working at Schering-Plough in those days, and I have to say that Harris is right. There's something else that at first would seem to confirm Harris's thesis: Schering-Plough had a mighty long wait for Claritin to get approved. But I don't think it had anything to do with lack of funds. The drug was already over-the-counter in Canada before it made it out of the FDA here, about which there was much bitter comment, but that wasn't a general phenomenon.
In the long run, the delayed timing wasn't all that bad, since Seldane (and Hismanal) started having trouble early in Claritin's US lifetime, opening up a market opportunity that otherwise wouldn't have been there. That's the door through which Schering-Plough then famously rolled the fanciest direct-to-consumer advertising campaign that anyone had ever seen, which paid off very well indeed.
That's a point that I wish some critics of the drug industry would appreciate: companies spend money on advertising because they plan to make even more money than they spent. That's what advertising is supposed to do, y'know. In the end, Schering didn't have an answer for Claritin's patent expiration when that finally came, and the company went into the tar pit for a while. (That was after I'd left, I should add - post hoc ergo propter hoc, doubtless.) But that shortfall wasn't for lack of trying during the 1990s, and much of the money that financed those research projects came from the mighty sales of Claritin.
Oh yeah - there's something that Gardiner Harris says in his letter to Shafer that I wish someone would pass on to, say, Marcia Angell. I like it so much that I should set it to music. Quote Harris: "New drugs rarely supplant older drugs unless they are demonstrably better or safer." How true, how true.
+ TrackBacks (0) | Category:
Over at Uncertain Principles, Chad Orzel's commentators got into a discussion of how you list people in a large multi-author publication. My system is that the first author and the last author are the people who did most of the work and/or were in charge. It's worth a moment to think about the gap that can open up between those two descriptions. Between those bookend names, I've opted for alphabetical order when I've written or helped write papers, because the alternative is an institutional-sized easy-open pressurized Can O' Worms.
Assigning credit in a scientific project is an awful job. Multiple people will be sure that they thought of an idea first and that everyone else just borrowed it from them. Some people will be livid at how others on the team got by without seeming to ever contribute anything, while they carried the whole project on their backs. Meanwhile, some of that latter group will be furious at the first ones, who from their perspective got to do the easy stuff that generated all the cheap and flashy results while they labored in the salt mine.
Sometimes these things can be resolved by enough tedious effort, but most of the time they can't. And it's almost always not worth the effort - at least for a journal article. Now, for a patent, things are very different, as one of Chad's commentators rightly points out. Everyone listed on a patent has to be able to state clearly what their contribution to the invention was. If you can prove that a patent has people on it that did not contribute (or left out people who did), you can get the thing invalidated. That's not easy, but it has happened, and the mere threat is enough to make everyone take inventorship pretty seriously.
My quick-and-dirty test for inventorship has been to tell people to ignore the whole draft of the patent application except the claims. Go straight to those, and find something there that you thought of, and be ready to point to it. Ideally, you should check to make sure no one else is going to point to the same thing. Best are the things that you thought of first and were also the first to do. No one can take that away from you.
Next best are the things that you thought of first and handed off to someone else to accomplish - if they didn't add anything to your idea, you're probably an inventor and they certainly aren't. Being the first one to try someone else's idea in the lab doesn't mean much in inventorship terms, and quite rightly. Now, if the person you handed things off to added something meaningful, you may both be inventors, which is where things can become interesting. Sometimes the original idea has been mutated so thoroughly that the final claim is really the work of the second person, with nothing recognizable from the first one.
I tell people who work for me that if they want to be on the patents coming from our lab, they'd better have some ideas of their own to show when patent-writing time comes. Naturally, I try to fulfill my end of that deal by letting people work on their own stuff as much as possible. The only way we can end up in trouble is if we pick a total-loss part of the molecule to work on and end up with nothing worth including in the patents. You want to keep a sharp eye out for that situation, and be ready to steer yourself (and your lab) out of it.
| Category: Patents and IP | The Scientific Literature
January 6, 2005
I mentioned hooking up small molecules to DNA yesterday. A comment to that post prompts me to write about something I've been thinking about for some time: the work of David Liu at Harvard. I have several of his papers in my files, and he's recently published a long review article in Angewandte Chemie, for those of you with access to the journal (43, 4848, the International Edition, of course.) Turns out that he has an informative website summarizing the work, too.
In short, what he's been doing is trying to get chemical reactions to go in a much different way than chemists usually do. The inside of a reaction flask is a very weird and specialized environment. We have to really bang on things to make them react - high concentrations, special solvents, catalysts, lots of heat. By the standards of living systems, it's the Spanish Inquisition. Meanwhile, cells make all kinds of things happen by keeping the reactants around in very low concentration (or trickily compartmentalized, a factor not to be ignored), and then sticking them together with other reactants inside the active site of an enzyme. The middle of an enzyme is like a reaction flask that's just big enough for the two molecules, and all sorts of unlikely chemistry happens under those conditions, things that you just can't get away with in bulk solutions.
I should declare my biases here: I find this principle tremendously appealing, and I've had a number of idea spasms in this area myself, which have come on like malarial relapses over the last two years. A number of scattered reports of this kind of thing have shown up over the last few years; I long to join them. Reducing these brainstorms to practice hasn't been easy, but I continue to think that this general area of research has a huge amount of untapped potential for organic chemistry and drug discovery.
Liu has been taking advantage of the ferocious drive that single strands of DNA have to combine with their complementary partners. He and his group have added chemical linkers to the 3' and 5' ends of complementary strands and decorated them with molecules that could react with each other when they're jammed together by the zipping-up of the DNA ladder. This gives you several interesting possibilities by taking advantage of the huge molecular biology infrastructure of manipulating DNA. Foremost of these is, as I mentioned in the last post, the peerless signal amplification of the PCR reaction, which lets you run everything on microscopic scale and turn up the volume later to see what happened.
Liu's group has tried all sorts of variations on this idea, with different reaction types and different linkers at different positions up and down the DNA chain from each other, and results have been very encouraging. A lot of things are going on. They've found a number of different reactions that can take place under DNA-templating conditions, and they're still expanding the list. They act differently, in surprising ways. Sometimes it's the rate of DNA hybridization that determines the reaction course, and sometimes it's the rate of the small-molecule reaction they're trying to encourage. Along the way, they've shown that some reaction sequences that would normally be incompatible in the same flask can be made to happen in an orderly fashion on the DNA templates.
They've also recently reported using these systems to discover new reactions - splitting and recombining the reactants in classic combinatorial chemistry style, but with that microscale advantage that DNA labeling gives you. You could have thousands of reactions going on in amounts of solvent that a chemist like me wouldn't even notice in the bottom of a flask. Some of these reactions will only work under the DNA-template conditions, which is useful on that side of the research, but not so good for making real-world (that is to say, my-world) quantities of compounds. But some of them look like they can make the leap to non-DNA conditions.
That's just a quick overview - for more details, see Liu's site link above. This is a quickly evolving area, and I'm sure that a lot of neat ideas are waiting to be tried (or even thought of in the first place.) I'm a fan. This is something new, and the more completely new approaches we have to do organic chemistry, the better off we are.
+ TrackBacks (0) | Category: General Scientific News
January 5, 2005
You know what I don't miss about chemistry after years in the drug industry? Big, long, multi-step syntheses. Oh, we'll gear up to do eight- and ten- and thirteen-steppers here, even though some of those steps are just things like hydrolyzing methyl esters, stuff that blindfolded grannies should be able to do. But I'm happy to leave the mighty academic natural product synthetic schemes behind, the ones where step fourteen finds you just getting warmed up.
As I've mentioned here before, I did that kind of thing in graduate school, and I swear it's scarred me for life. I pulled the plug on my total synthesis at step 27, about six steps short of the end (that is, if everything had worked perfectly, obese chance.) I've never regretted it. The benefits of getting out of grad school are huge, spacious, and well-appointed compared to the benefits of being able to say that I finished my natural product. Any of my readers in grad school, take note.
Long linear sequences are a slog. You have to start them in the largest buckets you can find, because you're never, ever going to have enough material. Now, we do large scale work in the drug industry, yes indeed, but that's because we intend to finish on large scale. If you're going to do six-week toxicity testing, you'd better have a fine keg of material on hand before you start. But those academic syntheses need huge amounts at the beginning in order to have anything at all by the time they finish. You work until you can't handle or characterize the stuff any more, then you trudge back down the mountain and start porting the loads back up the trail.
An example: I got to the point where I needed to take an optical rotation on the material from about step 25 or so. For those outside the field, this is an analytical technique that involves shining polarized light through a solution of your compound. If it's not an even mix of left-handed and right-handed isomers, that is to say, if there's some chiral character to the sample, the light will rotate. The degree of rotation can be used as an indicator of compound purity - I'm tempted to add "if you're a fool." They're not the most reliable numbers in the world, because some things just don't make the light twist much. And in those cases, a small amount of an impurity that rotates light like crazy will throw everything off. It's happened more than once.
Well, in my case, I loaded a half milligram or so of my precious stuff into the smallest polarimeter tube we had and jammed it into the machine. Hmm, I thought, a rotation of 0.00 degrees. A singular result, since I knew for certain that the molecule had six pure chiral centers. So I went back upstairs and loaded the whole batch into the tube, walking very carefully down the hall with this investment of several months of my life held in both sweaty hands. This time I got a specific rotation of about 1.2 degrees, which means that all those chiral carbons were roughly canceling each other out. Did I believe that number? Not at all! Did I put it in my dissertation? You bet! Gotta have a number, you know.
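For the non-chemists, the number reported is the specific rotation, which normalizes the observed rotation by path length and concentration. The figures below are hypothetical ones I've made up to show the arithmetic, not my actual grad-school data:

```python
def specific_rotation(alpha_obs_deg, path_dm, conc_g_per_ml):
    """Specific rotation [a] = observed rotation (degrees) /
    (path length in decimeters x concentration in g/mL)."""
    return alpha_obs_deg / (path_dm * conc_g_per_ml)

# Hypothetical numbers: 5 mg of compound in 1 mL of solvent (c = 0.005 g/mL)
# in a standard 1 dm tube, with an observed rotation of 0.006 degrees -
# barely above instrument noise - works out to a specific rotation of 1.2.
print(specific_rotation(0.006, 1.0, 0.005))
```

You can see the trouble: with a weakly rotating compound, the observed value sits down in the noise, and a trace of some strongly rotating impurity can swamp the whole measurement. Hence my skepticism about my own 1.2 degrees.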
And that's how you work - purifying things through increasingly tinier columns, collecting them in slowly shrinking vials, running all the instruments for longer and longer with the gain turned up higher and higher, trying to prove that it's really still in there and really still what it's supposed to be. Then it's back to the buckets. Never again!
+ TrackBacks (0) | Category: Academia (vs. Industry)
January 4, 2005
Speaking of odd ideas that might have applications in drug discovery, there's an interesting one in the latest issue of Nature Methods (2, 31). A group at the Molecular Sciences Institute in Berkeley reports a new way to detect and quantify molecular binding targets. And if you think that this sounds like something we're interested in over here in the drug discovery business, you are correct-o-matic.
This idea piggybacks, as you might expect, on the mighty king of detection and quantification in molecular biology, PCR. The ability to hugely amplify small amounts of DNA is unique, the biochemical equivalent of a photomultiplier, and many people have taken advantage of it. In this case, they also make ingenious use of weird beasts called inteins, about which a great deal of background can be found here. Briefly, inteins are sort of DNA parasites. They insert into genes and are read off into an extraneous stretch of protein in the middle of the normal gene product. But then they quickly clip themselves out of the protein - they have their own built-in cut-and-splice mechanism - and leave the originally intended protein behind them, none the worse for wear.
The MSI group takes the molecule of interest - say, a protein ligand - and attaches an intein to it. They take advantage of its splicing mechanism to have the intein remove itself and attach a stretch of specially whipped-up DNA, which serves as a tag for the later PCR detection. They call this conjugate a "tadpole", for its shape in their schematics (the DNA tag is the tail, naturally.) Said tadpole goes off and does its thing in the assay system, binding to whatever target it's set up for, and you do a PCR readout.
The paper demonstrates this in several different systems, going all the way up to a real-world example with blood serum. What's impressive about the technique is that it seems to work as well as antibody methods like ELISA. Getting a good reliable antibody is no joke, but these folks can make smaller proteins with much worse intrinsic affinity perform just as well. And if you turn around and do the trick starting with an antibody, you can increase the sensitivity of the assay by orders of magnitude. And you get a real quantitative readout, with about +/- 10% accuracy. To give you the most startling example, the authors were able to detect as few as 150 single molecules of labeled bovine serum albumin in a test system.
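The arithmetic behind that 150-molecule figure is worth a glance: PCR doubles the tag count every cycle (in the ideal case), so even a handful of starting copies becomes a mountain. The cycle counts and efficiency below are my own illustrative assumptions, not numbers from the paper:

```python
def pcr_copies(start_copies, cycles, efficiency=1.0):
    """Copies after n PCR cycles; each cycle multiplies the count
    by (1 + efficiency), where efficiency = 1.0 is perfect doubling."""
    return start_copies * (1.0 + efficiency) ** cycles

# 150 tag molecules after 30 ideal cycles: about 1.6e11 copies,
# which is comfortably within reach of quantitative PCR readout.
print(f"{pcr_copies(150, 30):.2e}")

# Real reactions run below perfect efficiency; at 90% per cycle
# the yield drops, but it's still an astronomical amplification:
print(f"{pcr_copies(150, 30, 0.9):.2e}")
```

That exponential gain is the whole reason a DNA tag beats a fluorescent one for sensitivity: you can't photomultiply a fluorophore thirty times over.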
The "News and Views" piece on all this in the same issue points out that the technique gets round some real problems with the existing methods. Labeling proteins with DNA or fluorescent tags is a messy and imprecise business, and it can be very hard to tell how many labels your target carries (or how many different species are really present in your new reagent.) The intein method is one-to-one label-to-protein, with no doubts and no arguing. Cell biologists are going to have to get used to knowing what they're looking at, but I think that they'll be able to adjust.
The news article calls the technique "ultrasensitive, amplified detection of anything," and that's pretty close. As the MSI authors point out, it removes the limitations of antibody technology: no longer can you detect only the things that an immune system has a reaction to. Screening of protein libraries could provide low- to medium-affinity partners (which is all you need) for all kinds of poorly-studied molecules.
I'd be interested in seeing if the system can be adapted for small (i.e., drug-sized) molecules conjugated to DNA. They wouldn't be tadpoles any more, though - more like eels - and might behave oddly compared to their native state. But even if you stick with the larger protein molecules, important biology may well be a lot easier to uncover. And we've got an endless appetite for that stuff. It's good news.
+ TrackBacks (0) | Category: General Scientific News
January 3, 2005
I jumped the gun a bit with my year-end piece yesterday, in which I got my hand-wringing out of the way for now. The most casual observer can see that the drug industry is in something of a fix. What I'm asking now is for ideas on how to get ourselves out of it.
In the short term, I think we're just going to have to reef in the sails and hold on tight. On the new drug front, what's in our clinical pipelines now is what's in 'em, and we're not going to change that soon in any way that's going to help. On the political front, I don't have any faith at all in PhRMA (the industry trade association)'s ability to slide us out of this jam, and in fact, I worry that they'll end up making it worse.
Solutions will have to come in the longer term. I can think of a lot of things the drug industry needs - better clinical success rates, for example. Anything's better than the roughly 85% failure rate we have now. For that, we need a better understanding of basic biology, mechanisms of efficacy and toxicity. That's where most of the failures are coming from these days, from the hard parts.
But the most important thing we need, as far as I can see, is more drugs that people really need. I'm not talking about the lifestyle enhancement stuff here. People buy those products - sometimes - but they don't respect them, and they're not willing to do any favors for the people who make them. I mean the big tough ones - Alzheimer's, intractable cancers, diabetes, major depression. These are very hard areas to work in, and we don't always have a good idea of what we're doing. But we're going to need good products for conditions like these to remind people of what the drug industry can do.
And we're going to have to resist the temptation to hype the incremental advances that we make in these areas. Headlines about a Wonder Cancer Drug don't do anyone any good when the WCD turns out to help about 5% of its initially targeted patient population. Crazy hype about stem cells might move some IPO stock, but we all know that it's going to take many years of ridiculously hard work - and many bonfires of venture capital - to realize their potential. If we keep jumping up and down, jabbering about the great stuff that's coming real soon now, everyone's going to tune us out. Some have already.
January 2, 2005
Silly me. I thought that 2003 was a bad year for the drug industry, and said so here a year ago. How was I to know that 2004 would make it look like a bowl full of strawberry ice cream? That trend had better not continue, or by about 2007 I'm going to be wearing a paper hat and standing behind a french-fry machine.
On one level, things can't continue to deteriorate the way they did this year. The world still needs pharmaceuticals, and it still doesn't have any way to get them other than from the drug companies (large and small.) Those drugs are going to continue to have side effects (large and small!), and regulatory agencies, doctors, and patients are going to have to decide case by case if those are acceptable risks or not. We can't just pull everything off the market and start over. There's no "over" to start from. So in that sense, at least, things can't go on like this.
But there are some bad events coming toward us as if they were riding on rails. Celebrex is in trouble, clearly, and I'll sit down to a bowl of gym-sock soup if Novartis isn't delaying - or canceling - their own COX-2 launch. Crestor is in trouble, too, which means that AstraZeneca may have to decide what they want to do this year: take on Lipitor in a mighty struggle, or be the second company to have their statin yanked from the market. (They can ask the folks from Bayer how much fun that is.) And there are other companies and drugs with problems, which we'll be talking about in the coming weeks (hi, Lilly!)
Merck's already laying off people, and I suspect that they're going to end up laying off even more. Pfizer and AstraZeneca are going to have to follow suit if their compounds get pulled, which would make this a truly awful time to try to find a job in the pharmaceutical industry. The last time we went through something like this was during the uncertainty of the Clinton health care plan proposal, and people were holding on to their jobs with every adhesive at hand.
That wasn't a good time to go for that big promotion or pay raise, for those of you who weren't around back then. At some companies, the attitude was "Raise? You're lucky you have a job," and I wouldn't be surprised to see some of that again. (I remember being asked to hand out a 0.8% pay raise to one of my direct reports back then, which was a pretty memorable experience for both of us. I advised him not to spend it all in one place.)
But you know, we came out of that one. And we can come out of this situation, too. Some good clinical results, some decent product launches over the next couple of years, a few months without a major product going down in a buzzing cloud of lawsuits - is all that too much to ask? We're going to find out. I advise my readers in the industry to just try to hang on - we're going to be telling stories about this era for a long time, and we just need to hack our way through to the other side of it.
My other advice is going to sound a bit perverse, but it's sincere: if any of you research types have been harboring some odd ideas, some out-of-the-ordinary stuff that's seemed too risky to try - well, now's the time. We're going to need some new magic tricks, and they might as well come from you. This might seem like the worst time to take chances in the lab, but no one thinks about doing that stuff when times are good. It's mother-of-invention time, folks. Let's get cranking - the job you save may be your own.