About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek email him directly: email@example.com
February 28, 2007
Since I'm still on the job-hunting trail, after the events described here, I think I'd find it a bit therapeutic to complain about one part of the process that's a complete waste of time.
Now, there are open positions that are advertised, both online and in the various science and trade publications, and there are some that are handled mostly by recruiters. I'm working both of those, naturally, since at my level of experience it's generally harder to find a position. Friends of friends, former colleagues, company websites, online job boards, headhunters of every description - if this isn't the time to pull out all the stops, when is?
But there are recruiters, and there are recruiters. I've spoken with several who really seem to know their business, and I'm glad to have had the chance to contact them. But I've also spoken with several who don't seem to have the first idea of what they're doing. Let's just say that I've been pitched more than enough positions for "Formulations Chemist" and "Clinical Research Data Scientist" and God only knows what else. There are so many things wrong about these inquiries that I hardly know where to start.
For one thing, it shows that either the recruiter involved knows nothing about the industry, or they haven't even looked at my CV - and it's a good question as to which of those is a worse sign. I've had headhunters confidently forward me positions that focus on, say, developing generic injectables: what in my background makes that even remotely a match, unless all the other resumes they have on hand are from Linux developers and salespeople? The other day, I had someone pitch me a job that, while actually in medicinal chemistry, was at a level I wouldn't have interviewed for in 1992, much less now. And they seemed surprised that I wasn't considering it seriously.
Another problem with these is what's happening on the other end. Here's some company, paying a search firm to go out and beat the bushes for them, but the outfit's actually just randomly hitting up everyone who's walked across a drug company parking lot. You wonder what kind of progress reports these people are submitting on how their trained placement professionals are on the case, while in the background someone sits on the phone asking a cell biologist if they've ever considered running a mass spec lab. "Hello. . .hello? Cut off again. . ."
Well, at any rate, there are some good ones out there. But they sure stand out against the background.
Category: Closing Time | How To Get a Pharma Job
February 27, 2007
SciTheory has a post, complete with links to the relevant articles in Science, etc., on a recent batch of trouble in structural biology. Geoffrey Chang and his group at Scripps have been working on the structures of transporter proteins, which sit in the cell membrane and actively move nonpermeable molecules in and out. There are a heap of these things, since (as any medicinal chemist will tell you) a lot of reasonable-looking molecules just won't get into cells without help. It's even tougher at a physiological level, because (from a chemist's perspective) many of the things that need to be shuttled around aren't very reasonable-looking at all - they're too small and polar or too large and greasy.
Many of these transporters, especially in bacteria, fall into a large group known as the ABC transporters, which have an ATP binding site in them for fuel. (For the non-scientists in the audience, ATP is the molecule used for energy storage in everything living on Earth. Thinking of an ATP-binding site as a NiCad battery pack gets you remarkably close to the real situation). Chang solved the structure of one of these, the bacterial protein MsbA, by X-ray crystallography back in 2001, and it was quite an accomplishment. Getting good X-ray diffraction data on proteins which spend their lives stuck in the cell membrane is rather a black art.
How dark an art is now apparent - here's the original paper's abstract in PubMed, but if you look just above the abstract, you'll see a retraction notice, and it's not alone. Five papers on various structures have been withdrawn. As SciTheory says, anyone who doubted the original MsbA structure had some real food for thought last year when another bacterial transporter was solved at the ETH in Zurich. These two should have looked more similar than they did, to most ways of thinking, but they were quite divergent.
And now we know why. Chang's group was done in by some homebrew software which swapped two columns of data. In a structure this large and complicated, you can have such disruptive things happen and still be able to settle down on a final protein picture - it's just that it'll be completely wrong. And so it was. The same software seems to have undermined the other determinations, too.
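It's worth seeing just how insidious that kind of error is. A purely illustrative sketch (in Python, and nothing to do with the group's actual crystallography software): swapping two coordinate columns is geometrically a reflection, so every interatomic distance comes out exactly the same - the data can still look internally consistent - while the handedness of the whole structure silently inverts.

```python
# Illustrative only: swapping two coordinate columns is a reflection.
# All pairwise distances survive intact, so the data still looks
# self-consistent -- but the handedness of the structure has flipped.
import itertools
import math

def triple_product(p0, p1, p2, p3):
    """Signed volume of tetrahedron p0-p1-p2-p3; the sign encodes handedness."""
    a = [p1[i] - p0[i] for i in range(3)]
    b = [p2[i] - p0[i] for i in range(3)]
    c = [p3[i] - p0[i] for i in range(3)]
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
          - a[1] * (b[0] * c[2] - b[2] * c[0])
          + a[2] * (b[0] * c[1] - b[1] * c[0]))

points = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
swapped = [(x, z, y) for (x, y, z) in points]   # the two "swapped columns"

# Every interatomic distance is unchanged...
for i, j in itertools.combinations(range(4), 2):
    assert math.isclose(math.dist(points[i], points[j]),
                        math.dist(swapped[i], swapped[j]))

# ...but the handedness has inverted.
assert triple_product(*points) == -triple_product(*swapped)
```

That's exactly the trap: any check based on distances or densities alone can pass while the chirality is completely wrong.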
This is important (as well as sad and painful) on several levels. For one thing, transporters are essential to understanding resistance to antibiotics and cancer therapies, and they're vital parts of a lot of poorly understood processes in normal cells. We're not going to be able to get a handle on the often-inscrutable distribution of drug candidates in living systems until we know more about these proteins, but now some of what we thought we knew has evaporated on us.
Another point that people shouldn't miss is the trouble with relying too much on computational methods. There's really no alternative to them in protein crystallography, of course, but there always has to be a final "Does that make sense?" test. The difficulty is that many perfectly valid protein structures show up with odd and surprising features. At the same time, it's unnerving that the data for these things can be so thoroughly hosed and still give you a valid-looking structure, but that just serves to underline how careful you have to be.
And we're talking about X-ray data, which (done properly) is considered to be pretty solid stuff. So what does this say about basing research programs on the higher levels of abstraction found in molecular modeling and docking programs?
Category: In Silico
February 26, 2007
F. Albert Cotton's recent demise brings up a question that traditionally comes up in the fall, during Nobel season. Cotton himself never won the prize, although his name came up constantly in the list of contenders. There's a group of scientists (a select one) in every Nobel-bearing discipline that fills this role. Some of these people eventually get Nobel recognition, of course, and when that happens a good number of onlookers are relieved that ol' So-and-So finally got it, while another contingent is surprised, because they'd already sort of assumed that ol' So-and-So had received one years before.
But as time goes on, it seems to become clear that some eminent people are just not going to win, and I'd have put Cotton in that category. The Nobel committee had years in which to act on his behalf; they never did. The question then is why. Theories abound, some of them conspiratorial (and thus unprovable for another hundred years or so), but most trying to discern what makes some work Nobelish and some not.
One of the strongest arguments is that doing a lot of good work across several areas can hurt your chances. It seems to help the committee settle on candidates when there's a clear accomplishment in a relatively well-defined field to point at. Generalists and cross-functional types are surely at a disadvantage, unless they can adduce a Nobel-worthy accomplishment (or nearly) in one of their areas. That's not easy, given how rarely work at that level gets done even when you've devoted all your time and efforts to one thing.
The current example in organic chemistry is George Whitesides at Harvard. He's an excellent chemist, and has had a lot of good ideas and a lot of interesting work come out of his group. But it's all over the place, which is something I really enjoy seeing, but the Nobel folks maybe not as much. Just look at this bio page from Harvard, and watch it attempt to pull all his various research activities under some sort of canopy. It isn't easy.
To drag the late Isaiah Berlin into it again, Whitesides clearly seems to be a fox rather than a hedgehog. Hedgehogs tend to be either spectacularly wrong or spectacularly right, and that last category smooths the path to greater formal recognition. For more on fox/hedgehog distinctions in other disciplines, see Daniel Drezner (international relations), Andrew Gelman (statistics), and Freeman Dyson (physics), and for an application of the concept to drug research, see here. Which sort of creature does Whitesides stock his research group with? Paul Bracher would know.
(Readers are invited in the comments to submit their own candidates for scientists who always seem to be on the Nobel list, but haven't won, and any alternate theories about why this happens).
Category: Current Events | Who Discovers and Why
February 25, 2007
The other day I made a quick comment that I wasn't sure which would have a higher rate of return - biotech stocks or lottery tickets. Some folks liked the comparison, and others didn't, naturally. But there are some points worth thinking about in it.
For one thing, we have to distinguish between the gains realized by the companies themselves and those realized by their stocks. The former figures were calculated fairly recently (2004) by David Hamilton in the Wall Street Journal (subscriber link here), and I wrote about them at the time.
The best estimate was that since the first biotech company went public, total operating losses in the industry have amounted to some 40 billion dollars. Genentech and Amgen do what they can with all the black ink that they generate, but they're overwhelmed each year by the tide of the red stuff. I can only imagine what this figure would be if it included non-public biotechs, every single one of which (as far as I know) has run at a loss. After all, when you start to look like you're going to turn a profit someday, you're already public, right?
During this period, investors have put about 100 billion into the public companies, so we know where 40% of that money has gone, at any rate. Ah, but you're saying, these investors got stock in return, and how's that done, eh? Undeniably, some of the issues have made people fantastic amounts of money - Amgen, for example, has returned several hundred-fold on an investment at its IPO price in the early 1980s, although surely no human being has held it for that entire time. Of course, somewhere around 15 or 20 per cent of all the biotech companies that have gone public over the years turn out to have returned nothing at all, having disappeared in a blizzard of worthless stock, so that does cut into things. Still, biotech has been up over that time - but compared to what? As a whole, the article suggested, the sector has failed to even come close to the S&P 500's rate of return over the last 25 years. (And I'm not sure if that comparison includes transaction costs, which because of all the turnover in the sector would skin you alive over time).
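To put that "several hundred-fold" Amgen figure in annualized terms: treating it as, say, a 300x multiple over roughly 22 years (both numbers are my own illustrative guesses, not figures from the article), the implied compound annual return works out to about 30%:

```python
# Back-of-the-envelope annualized return implied by a large multiple.
# The 300x multiple and the 22-year holding period are illustrative
# assumptions, not figures from the WSJ article.
multiple = 300.0
years = 22
annualized = multiple ** (1 / years) - 1
print(f"implied return: {annualized:.1%} per year")  # roughly 30%, compounded
```

Which is exactly why nobody should expect that kind of return across a whole sector - one Amgen in a portfolio of dead companies looks rather different.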
So, how's that lottery ticket comparison look? If you're looking for the next Amgen or Genentech, well, those are two stocks out of several hundred that have gone public. Those are far better odds than the jackpot in a state lottery, true (although the jackpot has an even more insane rate of return). How about the overall odds of winning, though? Looked at more broadly, most state lotteries will cause you to lose about half of every bet that you put into them (a rate which casino operators can only envy). The figures above suggest that (on an operating basis), biotech has done worse, splitting about 41/59. On a stock investment basis, it appears that you'll make money overall, but not as much as you'd make by parking the same cash in the indices, and I'd call that a loss, myself. You may not think so, but if you don't, please send the difference to me so I can give it to Vanguard myself.
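As for those transaction costs: the compounding effect of a steady drag is easy to underestimate. A quick sketch, with a purely illustrative number - the 1% annual cost figure is my assumption, not anything from the article:

```python
# How a small, steady annual cost drag compounds over a long holding period.
# The 1% figure is an illustrative assumption, not a number from the article.
drag = 0.01                    # extra annual cost from all that turnover
years = 25
kept = (1 - drag) ** years     # fraction of value that survives the drag
lost = 1 - kept
print(f"about {lost:.0%} of the final value goes to costs")  # roughly 22%
```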
I should mention that the original WSJ article is itself full of comparisons to casinos, Las Vegas, and lotteries. The point, unfortunately, is well taken. Next time, we'll talk about probability of ruin, and things will really start looking grim.
Category: Business and Markets | Drug Industry History
February 23, 2007
F. A. Cotton died this week, and another gigantic name in chemistry departs. As an inorganic chemist, he was technically outside my field, but no one's really outside the range of influence of someone like that. If you're an organic chemist, you use organometallic reagents and catalysts, and if you use those, you owe F. A. Cotton some appreciation. 50 years of research, 1600 papers, some extremely influential books - he really cleared some brush, and we're unlikely to see his kind again.
Category: Current Events
February 22, 2007
An undergraduate reader sends along this request:
I was wondering if you had some recommended readings for a second year student, eg books that you have read and made a palpable impression on you when you were my age.
That's a good question, despite the beard-lengthening qualification of "when you were my age". The books that I would recommend aren't the sort that would require course material that a sophomore hasn't had yet, but rather take a wider view. I would recommend Francis Crick's What Mad Pursuit, for one. It's both a memoir of getting into research, and a set of recommendations on how to do it. Crick came from a not-very-promising background, and it's interesting to see how he ended up where he did.
Another author I'd recommend is Freeman Dyson. His essay collections such as Disturbing the Universe and Infinite in All Directions are well-stocked with good writing and good reading on the subject of science and how it's conducted. Dyson is a rare combination: a sensible, grounded visionary.
Another author to seek out is the late Peter Medawar, whose Advice to a Young Scientist is just the sort of thing. Pluto's Republic is also very good. He was a fine writer, whose style occasionally comes close to being too elegant for its own good, but it's nice to read a scientific Nobel prize winner who suffers from such problems.
I've often mentioned Robert Root-Bernstein's Discovering, an odd book about where scientific creativity comes from and whether it can be learned. I think the decision to write the book as a series of conversations between several unconvincing fictional characters comes close to making it unreadable in the normal sense, but the last chapter, summarizing various laws and recommendations for breakthrough discovery, is a wonderful resource.
Those are some of the ones that cover broad scientific topics. There are others that are more narrowly focused, which should be the topic of another post. And I'd also like to do a follow-up on books with no real scientific connection, but which are good additions to one's mental furniture. I have several in mind, but in all of these categories I'd like to throw the question open to the readership as well. I'll try to collect things into some reference posts when the dust eventually clears.
Category: Book Recommendations | General Scientific News | Who Discovers and Why
There's a whole list of posts below live-blogged from the CMPI conference that I attended on Wednesday, and I think that the organizers will be putting up some audio files of the proceedings soon. It was an enjoyable meeting, and I met a number of interesting people. Since it also involved politics (on which everyone's an expert), the discussion was often livelier than at many scientific conferences. The entry barrier for speaking up about (say) the effect of efflux transporters on toxicokinetics, fascinating subject though that is, is higher than for talking about where the FDA should get its money and how they should spend it.
While in DC, I also got a chance to meet Megan "Jane Galt" McArdle, and we had a nice talk on economics, scientific research, academia, kitchen implements and the eighteenth-century novel. (Get yourself a liberal arts education, and you'll never run out of conversational topics, is my advice). Next time I'm in the area, she's threatened to get Tyler Cowen to show up; he will probably take us to some Papuan or Zanzibari restaurant in a decrepit strip mall. (Not that he's wrong about those being reliable places for good ethnic food).
Blogging will now resume its one-a-day pace here: five per day in real time is about my limit (particularly in a room where there's no place to plug in the laptop - although the CMPI people did well by us in getting wireless access set up outside the hotel's usual exorbitant charges).
Category: Blog Housekeeping
February 21, 2007
The last panel of the day (I missed a good part of one in between, unfortunately) is on the FDA's Critical Path initiative and personalized medicine in general. It's moderated by Greg Simon of FasterCures, and features Michelle Hoffman of Drug Discovery and Development, Robert McBurney of BG Medicine, Gualberto Ruaño of Genomas, John Swen of Pfizer, and Janet Woodcock of the FDA.
Hoffman makes the point that some of the hyper-sceptical reporting of drug and medical issues is a reaction to the genomics hype of a few years ago. (I know, some of you out there who've seen stories that were ripped right from an idiotic press release are wondering where this sceptical reporting is, but I think she's talking about, say, the New York Times.)
McBurney spoke about his academic background, saying that he cares even more about data now than he did back then, since millions of dollars are riding on the results. He also mentions the genomic craze, using a good analogy - that a caterpillar and the corresponding butterfly have exactly the same genetic sequence. "I have the same genome I did when I was born," he said, "but some things have changed along the way". His company has recently signed a deal with the FDA to look at preclinical liver toxicity, with funding from several large drug companies.
Ruaño is speaking about reverse genomics, "bedside to bench" work for figuring out drug and tox mechanisms. He's summarizing a recent paper in Mol. Psych. on the metabolic effects of antipsychotic drugs - the weight gain and prediabetic symptoms seen in a subset of patients. He and his company did a large parallel search for DNA markers between the patient populations on the two ends of the weight-gain distribution. As it turned out, in olanzapine-treated patients, an ApoE marker was higher in the heavy group, and an ApoE4 marker was higher in the lean group. For risperidone-treated patients, the leptin receptor and the NPY5 receptor fit the same pattern. They're starting to use their markers prospectively to predict how new patients will respond.
That leads into John Swen's view from Pfizer. He makes the point right at first that he doesn't blame the media for the overhyping of new technologies so much as the people promoting them. (He's got a point, although I'd share the blame out a bit more - compare Michelle Hoffman's view at the beginning of this post). His view of the Critical Path initiatives is that it's going to be a long slog to get biomarkers and translational medicine to work out - worth it, certainly, but not something that's going to start delivering in a short time frame. (No argument here!) He also thinks that we could be doing a lot better than we are in things like new clinical trial designs (which is interesting coming from a company that's run the first large published Bayesian clinical trial).
And finally, Woodcock of the FDA is being asked about how the whole Critical Path initiative is going to fare at its current level of funding. She also feels that the media are very cynical about the sorts of technologies that are being promoted, which corroborates the over-reaction theme. She also says that the parts of the scientific community that are "more vested in the reductionist model" are also pushing back a bit. (My take is that the minute something useful comes out of the whole personalized medicine field, most of the critics will shut up with great alacrity. Success has a thousand fathers, for sure, and nowhere more than in a drug company). She largely dodges the funding question, saying that it's not really the agency's job to lobby for funds, but says that the biggest obstacle she faces right now is getting enough reviewer time to evaluate proposals properly. She thinks that the single best use of the money, though, is personalized medicine (which I find a bit arguable at this point, but eventually she may well be right).
Category: Current Events | Press Coverage | The Central Nervous System
The third panel is on the CATIE and ALLHAT trials, the large comparative studies of antipsychotic and antihypertensive medications. These studies are taking a real beating, I have to say. Herbert Meltzer of Vanderbilt took on the CATIE work, saying that its design was too complex and tried to do too many things at once. He pointed out that the study's result - that older and newer antipsychotics were essentially equivalent - is very much at odds with evidence-based medicine. He says that its conclusions haven't had that much effect with clinicians, because they're so at variance with their experience.
Michael Weber of SUNY-Downstate has a lot of bad things to say about the ALLHAT study, too. He points out that the HAT part stood for "Heart Attack Treatment", and that although the diuretic treatment group showed somewhat better blood pressure data, the heart attack outcomes were no different. His other surprising claim was that a large number of African-American trial subjects ended up in groups that did not meet the best standard of care for that population, and he asked what would have happened if a drug company had run a similar trial. He was clearly frustrated with the initial coverage of the results in places like the New York Times, which he said were the result of a very well-planned press offensive by the study's authors.
Ralph Snyderman of Duke spoke about the problem of working on complex diseases that aren't driven by a single molecular defect (which, more and more, is what we're left to work on). These things are terribly heterogeneous, on more than one level - for instance, referring to his specialty, he said that as far as he's concerned rheumatoid arthritis is at least three diseases, and perhaps as many as six or seven.
Susan Horn of the Institute for Clinical Outcomes Research made the case for "practice-based medicine", trying to work out the real-world effects of compounds after they've been launched. Meltzer wasn't so sure about how well these sorts of studies replicate, though.
In other news, Matt Herper of Forbes has reluctantly admitted that he doesn't find medical journals to be the most exciting reading in the world - his challenge is turning these results into things that people will read voluntarily. He had a great quote about the difficulty of turning ambiguity into a story, mimicking an editor: "What do you mean these experts don't know? Call them back and get them to tell you!"
Post updated in sections - I've been recharging my laptop batteries - DBL
Category: Press Coverage | The Central Nervous System
Now I'm listening to Andy von Eschenbach, the new FDA commissioner, who's giving a speech on communication and regulation. I think I can refer to him as "Andy", since I'm eating a ham and cheese sandwich in front of him (not to mention blogging his speech).
The main thing I've taken away is that the agency plans to announce some new outlets and methods to disclose information - he's not ready to say what those are yet, but promises that details will be forthcoming. Now questions are coming from the floor - the first one is on direct-to-consumer ads and the recent recommendations by the Institute of Medicine. von Eschenbach answers by saying that the FDA has to recognize the right to free speech, but has to make sure that things are factual. (Not the time to get into a discussion of commercial speech, clearly).
Answering another question, von Eschenbach seems to want to move the FDA away from a reactive stance on drug-safety issues. That's probably a good idea, but considering the kinds of events that bring these things to the front page, reaction is surely always going to be a big part of the process.
Now there's a question about the adverse event reporting system - how to make it useful without overloading people. (This was a feature of the second panel discussion). He's answering that adverse events are only part of the problem - there's unexpected efficacy as well, and any system needs to be able to pick up on all sorts of events. (I agree, but I think that the former will always far outweigh the latter).
Now a representative of PhRMA is asking about transparency - as an MD, he's contrasting the open discussion at a mortality and morbidity conference among physicians with what takes place at the FDA/national press level. von Eschenbach replies that acquiring the data is only the first step, and that transforming raw information into knowledge needs to be more transparent. He's saying that the general public wants the end product, not so much all the raw data. (I'd add that these days there will always be people, few and far between but very committed and vocal, who will want to see the raw numbers, too).
An attendee from Pharmaceutical Executive magazine asks about making sure that different points of view are considered, and about whistleblowers in general. von Eschenbach's reply is that he'd like to have things run so that people wouldn't feel the need to go outside the usual processes. "If people wanted Andy von Eschenbach to do everything himself," he says, "there would just be the Andy Agency". He expects people to adhere to the way the FDA does business, and wants them to come to him if they have a problem.
Steve Projan of Wyeth is now saying that the FDA doesn't seem to have the resources to do what it wants to do, and asks about the renewal of the PDUFA legislation. (There's a whole panel on that in the afternoon). von Eschenbach's reply isn't very specific, as probably befits an issue that's the subject of current legislative wrangling. He regards PDUFA fees as straight fee-for-service, and regards them as useful, but only one part of his resourcing.
The last two questions are on drug labels - the questioner is asking about the inclusion of genomic information on warfarin and tamoxifen labels. And the final question is on the regulation of diagnostic tests, and the burden on direct-to-consumer genetic tests - the questioner is saying that many primary care physicians aren't that well trained in genetics, and that these tests might as well go to the consumer rather than using the MD as a gatekeeper. "Uh. . .how much time do I have left?" says von Eschenbach, mock-nervously.
He answers that drug labels are changing constantly, and that the agency has to be certain that any information that's given out so broadly is really accurate and valuable. He says that the various "omic" disciplines are going to have to make sure that they've got very well established data before it can go on a drug label, but that he knows that this is coming. As for the regulatory burden on tests, he seems leery of turning these things loose on the public, and would rather have them "integrated into the medical model".
Category: Current Events | Press Coverage
The second panel is going on now, moderated by Steve Usdin of BioCentury, and featuring Helen Boucher of Tufts, Frank Burroughs of the Abigail Alliance, Scott Gottlieb of the American Enterprise Institute, and Steve Projan of Wyeth.
One subject that's coming up a lot (as it did in the first panel) is the association of SSRI therapy with suicide (or suicidality). That's a good example of the tricky nature of drug regulation, crossing over from pre-approval to marketed compounds. Some of the earlier panelists (and questioners from the audience) bemoaned the media coverage on the issue - the current panel is talking about it as an example (some parts good, some bad) of how to study ongoing safety issues, with a big problem being who's going to pay for such things. Surveillance, everyone agrees, is probably the best way to get useful data on drugs and their performance in the real world, but (as has been pointed out), no one wants to hear about how that's surely going to drive up drug costs.
Other areas coming up are antibiotics (and the dearth of new ones) and off-label use of cancer therapies (and other drugs) and how much to regulate it.
The conflict between openness and giving lawyers bait to sue everyone is also being discussed - tort reform has been referred to more than once, as you'd figure. The debate about whether you want to report only data that's reached statistical significance has shown up as well (I think that the alternative is chaos, personally, but not everyone agrees).
Steve Projan made a good point about the problems with Ketek (which, as others have noted, haven't had anywhere near the coverage that the Vioxx problems did). As he says, if you drop Ketek and switch to ampicillin, you'll end up killing more people through anaphylactic shock.
Note: post edited after original version, to incorporate more info - DBL
Category: Current Events | Press Coverage
Well, I'm sitting in the audience now at the CMPI conference. My panel was the first of the day, and was pretty lively. Moderated by Rob Pollock of the Wall Street Journal, it featured Ed Silverman of the Newark Star-Ledger (and now of the very useful Pharmalot), Paul Coplan (who does risk management at Wyeth), Tim Hunt (public affairs at Biogen-Idec), Paul Seligman (safety policy at the FDA), and Diedtra Henderson of the Boston Globe.
Vioxx was a big point of discussion, as an example of media reporting on medical and pharma issues. There was a noticeable split between the reporters on the panel and the pharma people on this - the discussion was civil, but you could see the differences in opinion on how well the issue had been covered. With Biogen represented, the Tysabri withdrawal (and return) was also a big topic.
I suppose the main point I'd make in reference to that split came when Ed Silverman mentioned that a good thing that came out of the Vioxx coverage was that it started debate, and that that was always a good thing. I agreed with him, up to a point, adding, though, that I thought that informed debate was more useful. My problem with much of the Vioxx coverage was (as I said about that Michael Crichton op-ed the other day) that it made people feel as if they'd been informed when they hadn't been.
There was general agreement that risk/reward (especially absolute risk versus relative risk) was a key concept in reporting these things, but that it could be difficult to get across to a general readership. The other point of agreement was that companies should try to be as open as possible about clinical data and adverse events, with (naturally) different ideas about where the limits of the possible would fall.
Category: Current Events | Press Coverage
February 20, 2007
I have some down time here at the Hartford airport, which gives me a chance to talk about one of the routine, but pleasurable, things about doing organic chemistry: making stuff. By that I mean making something that most certainly wasn't there when you started.
For example, in the post the other day about the odors of various lab solvents, someone mentioned 2,2-dimethoxypropane. That's not in my top five, but it is pretty nice, and certainly distinctive. You can buy it by the liter, but it's also not hard to make (as grad students in underfunded academic labs know). You take some acetone, which as I mentioned the other day has a clear, strong solvent smell to it, and some methanol - thin and harsh. Add a couple of drops of sulfuric acid or the like (which you can forgo enjoying the aroma of, unless you're downright perverse), and heat it up.
After a few hours at a gentle boil, you can distill off the product. It's a clear liquid, and looks identical to the solvents you started with. The first clue is the different boiling point, and the second is the smell - strong and somewhat herbaceous. It's new, all right, and you made it with your own hands. (This sort of distillation has its own pleasures, which I'll go on about in another post sometime - I really haven't done much of it in recent years, and that's a bit of a loss).
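For the curious, the overall transformation is classic acid-catalyzed ketal formation - acetone plus two equivalents of methanol, losing a water (shown schematically; distilling off the product is what pulls the equilibrium along):

```latex
% Acid-catalyzed ketal (acetal) formation - schematic overall equation
\[
\mathrm{(CH_3)_2C{=}O} \;+\; 2\,\mathrm{CH_3OH}
\;\xrightarrow{\ \mathrm{H_2SO_4}\ \text{(cat.)}\ }\;
\mathrm{(CH_3)_2C(OCH_3)_2} \;+\; \mathrm{H_2O}
\]
```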
The effect is even more dramatic when you have liquid starting materials that produce solid crystalline products. All chemists enjoy crystals - if you don't, you either shouldn't get into synthesis, or you should strongly consider getting out. Having a forest of bright needles or beveled plates come out from what was, a few hours before, a mixture of thin, smelly liquids is something I've never tired of. It's something that would have passed for magic a few hundred years ago, and in a way, it still is.
Well, I've just been unexpectedly upgraded to first class, so this is already looking like a good trip. I'll try to blog some during the conference tomorrow, when I get a chance.
Category: Life in the Drug Labs
February 19, 2007
I'm headed out tomorrow afternoon to Washington, DC, for this conference on "The Media and Medical Science", sponsored by the Center for Medicine in the Public Interest. They've asked me to participate on one of the panels, namely "Does Media Coverage Reflect Reality, and Does It Matter?"
Since I've no problem unburdening myself of my opinions, this should be pretty enjoyable (for me, anyway - the audience will have to take its chances). Should any readers be in the area and able to attend, I'll look forward to meeting you!
Category: Blog Housekeeping
February 18, 2007
It's now been nearly three weeks since I smelled any ethyl acetate or acetone. Those were the two last vapors I was exposed to in my former lab, as I cleaned out some dirty flasks, and those are two of the most common solvents that organic chemists breathe in. Neither of them is particularly hard to deal with - acetone has a clear, penetrating solvent-y smell, and ethyl acetate, as a typically fruity ester, comes close to being pleasant. There's plenty worse out there. Hexane and methylene chloride are all over the place in a typical synthetic lab, too, and they're a bit less appealing with their flat paint-cleaner character. (They're rather less appealing from a toxicology standpoint too, for that matter).
Of the other common lab solvents, THF has a rather pungent ethereal smell - not something you'd line up for, by any means, and diethyl ether itself fills up your nose with great speed and thoroughness. Somehow, there's rarely a thin whiff of ether in the air - it's either nothing or a choking blanket of the stuff. Acetonitrile is something you'd think would have an interesting reek, but it defies expectations (and breeds doubt as to the broad-spectrum utility of the human nose) by having absolutely no smell at all.
Many of the really polar solvents have that feature. DMF has a smell to it, but it's surely traces of dimethylamine that account for most of it - in my experience, the pure stuff doesn't have much character at all. DMSO is the same way. There's something oddly scented there, and you can tell as it takes up olfactory room that you're not smelling regular air, but it's not as strong as you'd figure. As with DMF, you have to wonder how much is due to traces of impurities, such as reduced sulfur compounds, of which it wouldn't take much.
And the most pleasant of the bunch? Pure ethanol, for my money. It's not pleasure by association, either, because I don't really drink at all (and never have). But straight ethanol's combination of fruitiness and pungency is unique and appealing. Its cousins don't make the cut. Methanol's dim and harsh, and the propanols are no improvement: n-propanol (an uncommon solvent) is rather nasty, and isopropanol (the well-known rubbing alcohol smell) is not unpleasant, but rather strong, clinical, and somehow alien. n-Butanol, for its part, is quite foul in the manner of butyl compounds everywhere. Our noses have it in for straight four-carbon chains, and there's nothing to be done about it. Nope, it's ethanol, and it's not even close. Any other nominations?
Category: Life in the Drug Labs
February 16, 2007
I know that people have been having some problems leaving comments here the last few days - I've had some myself; it's rather disconcerting to find that comments you're making to your own blog are being treated like radioactive spam. But I think the issue has been resolved, so if anything else odd happens, please let me know.
If you don't blog yourself, you'd probably be amazed at the volume of comment spam that comes in. Well, if your e-mail address is out on a web page like mine, maybe you wouldn't, considering what that does to your inbox. I pull two or three hundred offers of winning lotteries, African fortunes, dubious business propositions, and outright gibberish per day at that address. But there's a nearly equal volume of comment spam, which has always seemed to me one of the most pathetic attempts at advertising I've ever seen. Why bother? Well, because it's basically free, and hey, one out of every ten million people might click by accident. . .
There are keyword and lookup filters behind the scenes here that do an excellent job of catching all this garbage. Before they were implemented, I'd come in to take a look at the site in the morning and find that the last fifty comments were a repeated offer to do things with farm animals or something, which was a welcome way to start the day, naturally. But if the settings get a little too aggressive, actual comments start getting flung into the bit-bucket. Every day or so I take a look at the pile and rescue a few strays, but we'll see how things look under the current settings.
Category: Blog Housekeeping
February 14, 2007
A lot of rather heated commentary is coming in on the subject of Michael Crichton's gene-patent article, and on gene patents in general. The subject is large enough that it'll need to be broken down to discuss. For today, here's my take on one aspect, what a patent lawyer would call "composition of matter".
The patenting of isolated genes as chemical entities is tricky. Yes, they are chemicals, and when they're isolated and purified like that they really are in a different state than found in nature. But their size is so far removed from many of the other things patented as substances that I can't help but wonder if a principle is being pushed too far. (The obvious other example here is the patenting of isolated proteins, which of course is also well established, for better or worse).
An analogy occurs to me, and working through it will show some of the complications of this area: suppose I isolated and purified a single molecular weight form (one particular isomer) of some long industrial polymer that's usually made and used as a mixture. Can I patent that? Can I then go after people who sell the mixture, because it includes my proprietary substance?
Now, there are some differences here compared to patenting a gene, because the original polymers I'm thinking of are man-made, and there's a lot of prior art around them. And no doubt some of it includes language that covers polymers of a range of molecular weights and the like, and the older ones have long since entered the public domain anyway. The biggest problem with using this as a path to riches is that I don't think I can turn around and go after people whose polymers have my patented isomer in them, because I believe I'd have to show that it's an essential part of their system (and it probably won't be). So I likely won't be able to sneak in on some of that big polyethylene money this way.
How about polymers that aren't man-made, like cotton or silk? We have no good way (at present) to produce or isolate individual single isomers of such things, the way we can with stretches of RNA or DNA. If I invent one, I'll most certainly apply for a patent on the method of doing that, just like someone who invents a new way to separate or purify DNA would. I don't think anyone should have a problem with that, because that would be an inventive step by anyone's definition. But can I then turn around and get composition-of-matter patents on some of the things I can isolate with my new technique? Judging from the genomics examples, I'd say that I could, if I could pass a further test.
That's a big one, though: having to show some utility for them. As I mentioned yesterday, that's an issue that came up with a lot of the early gene patent applications - back in the far-off days of the 1990s, people just immediately shotgunned the PTO with applications for every gene they came across, often with only the haziest uses in mind. Eventually the rules were tightened up - you can't just march in with your gene now and say "could be useful for a diagnostic test for a disease in which this gene is involved" and get a good reception. (There's also the problem that most of the genetic landscape is already the subject of one application or another by now!) I could have some difficulty showing a particular utility for a particular isomer of (say) a silk protein, but it could probably be done. Perhaps the presence of a particular one would prove to be important for imparting some property to the finished silk, for example.
There would be other patenting difficulties, even if I got mine issued. A big one would be the "doctrine of equivalents", which is the patent law way of saying that a difference that makes no difference is no difference. If I claim a newly isolated pure polysaccharide of X hundred or thousand monomer units, is there anything different about it compared to the X+1 isomer? There had better be a difference at some point if I want to have a patent that will do me any good, and to be on the safe side I'd better try to patent everything out to that point.
DNA, RNA, and proteins are perfectly suited to pass that test, though, since very small changes can be demonstrated to lead to totally different properties and functions. The doctrine of equivalents comes in when you start looking at silent mutations - a base change that doesn't change the amino acid that gets coded for, or (in a protein) a conservative amino acid switch in a part of the structure that doesn't affect anything. Court cases have been fought over just these sorts of issues.
As you can see, the thing that makes DNA, RNA, and proteins different is that they're structurally simple enough to make, handle, and isolate, and structurally complex enough so that there are huge numbers of potential variations. There are also highly evolved systems that can be exploited for their production and alteration, which gives everyone a big head start. Biologically, they're leveraged tremendously, so that seemingly trivial changes can sometimes have huge consequences - and, of course, these consequences bear on human health, which makes them of great social and financial importance. A better recipe for intellectual property wrangling I could hardly imagine. Next time, we talk utility, where even more fun is to be found.
February 13, 2007
Today's New York Times has a passionate op-ed by Michael Crichton on the subject of gene patents. Now, as my previous posts will demonstrate, I'm no fan of over-patenting. And the whole topic of gene (and protein) patents is a very interesting and important one.
Unfortunately, though, it's also very complex, and Crichton's piece manages to completely reduce the subject to tinkling fragments. The op-ed is so vigorously argued that its readers will probably come away feeling as if they've been informed, but I'm afraid that they're going to end up knowing less than when they started. I hate to be this blunt about it, but Crichton's done his cause a great disservice by spreading ignorance and confusion.
The official position of the Patent Office is that products of nature are not patentable. But. . .an isolated or purified one, in a form not found naturally, can be. Single genes, ripped out of their context in genomic DNA and expressed as a pure form, are considered to be new chemical substances, and thus can indeed be patented. We can argue about whether this is a proper interpretation or whether it's a good idea, but to ignore the point completely (as Crichton's piece does) isn't going to help anyone understand the problem.
You'd also never guess from reading Crichton that the subject of utility is of great importance in patent law. There's a profound difference between a patent on a gene, and a patent on a use for a gene. (That may sound trivial, but only if you've never been involved in writing or analyzing any patents). Ten years ago, the US Patent Office was getting swamped by gene applications with very little thought given to their use (other than some pro forma statements), but they raised their standards. Even if you decided that patents on genes per se shouldn't be allowed, you'd still have the use issue to deal with. The word "utility" does not appear in today's op-ed.
You'd also never know that the whole subject is being contested, very seriously and expensively, in court cases all over the world. The Metabolite case, which the Supreme Court recently dodged, is the one with the highest recent profile, and there will be more. It's not like the topic hasn't created controversy.
If you want a thoughtful analysis of the problems of gene patenting, start with this analysis (PDF) from the Congressional Research Service. Reading and understanding it will put you way ahead of the readers of the New York Times and, it seems, way ahead of Michael Crichton.
Category: Patents and IP | Press Coverage
February 12, 2007
So, you're asking yourself, "Why do people invest in biotech and small-pharma stocks?" You could especially ask yourself that after reading this New York Times article from Sunday, which describes how Xoma (yep, they're still around) has vaporized $700 million, and counting, in its 25-year history.
Well, here's why: as I write this, Onyx Pharmaceuticals is up a solid 90% on the day. They're partners with Bayer on the kinase inhibitor Nexavar (sorafenib), and the companies today reported positive data in treating hepatic cancer. This wasn't long after the drug had pretty much whiffed on melanoma, so the news came as a bit of a surprise (thus that 90% updraft).
My guess is that it came as a surprise to the people doing the study as well. Liver cancer is a bigger market than anything that Nexavar is approved for, and you'd think that it would have been one of the first trials run if it were considered a high-percentage play. But cancer is tricky, and we don't understand it worth beans. You have to do the experiments, and you have to realize going in that you only have a vague idea of how they might go.
So that's one reason that biotech stocks continue to get buyers - for the same reason that lottery tickets do. It would be interesting to know which one has returned more money over the years, although I'm afraid I already know the answer. But long-term, biotech has the edge, because (slowly and with infinite pains) we're learning what we're doing. . .
Disclosure: I have a financial interest in Bayer stock - I have no exposure to Onyx (damn it all) or Xoma.
Category: Business and Markets | Cancer
This weekend brought reports that the widely rumored Sanofi-Aventis / Bristol-Myers Squibb merger deal has been called off. No one at either company is confirming this, but then, no one at either company ever confirmed that a deal was being worked on in the first place.
I'm quite happy to hear this, naturally, since I've been ranting about pharmaceutical mergers for years now. But I'm afraid that this isn't a case of Sanofi-Aventis realizing that perhaps they shouldn't hogtie their research productivity just as they need it to expand. No, if the deal has indeed been put aside, it seems likely to have been done in by disagreements over Plavix and perhaps by the rise in the BMS share price. That last factor, although cited in some of the news reports, seems a bit odd. You'd think that a company would factor those things in when they cost out one of these ideas, but who knows?
So as the Plavix situation gets settled in court this year (one way or another), I would expect this deal to come back to life. S-A's chairman, Jean-Francois Dehecq, seems to enjoy this kind of thing, and once a CEO gets a taste for engulfing other companies they often don't seem to know when to quit. For their part, Bristol-Myers Squibb seems to want to remain an independent company, and I salute them for it.
Category: Business and Markets
February 8, 2007
Here's another one of those topics that is a bigger concern in academic labs than in industrial ones: stealing supplies from each other. The difference is easy to understand, and can be summed up (as a terrifying number of things can) by the word "money". Industrial labs generally are the Land of Research Plenty, so people don't spend much time looting and pillaging.
But boy, do they have to unlearn those habits. Most academic labs run on tight budgets, so valuable reagents and pieces of equipment get hoarded. People would practically steal things out of my lab coat pockets. I remember going on vacation in graduate school and leaving notes in the drawers in my lab: "Please don't take this. It's the only one I have" or "Go steal one of these from So-and-So. He has more of them than I do". When I came back, people told me how much they liked the notes.
Deprivation leads people to all sorts of money-saving (but time-wasting) attempts to economize. I mentioned a disastrous attempt to recycle lumpy, brown waste acetone here, and there are more stories like that to be found whenever chemists gather. Graduate student time is the one cheap commodity in academia, so you see people redistilling used solvents or washing and re-using silica gel, both of which are (to me) roughly the same as trying to dry out uneaten pasta so it can be boiled again later.
A group down the hall from me in those days used all sorts of exotic mixtures, and whoever made up decent quantities of them was sure to see pilferage. A friend of mine got tired of making things for his colleagues to steal, so he started labeling his bottles with the names of freshwater fish. A midnight raid on his cabinet would present the would-be shortcutter with a row of jugs labeled "Rainbow Trout" and "1:1 Catfish / Smallmouth Bass". That slowed things down for a while, anyway.
Category: Graduate School
February 7, 2007
When a drug candidate runs into toxicity trouble, the first question that comes to everyone's mind in the lab is: mechanism-based or not? If the project is a follow-on compound to something that's already made it to the market, the answer is probably already clear - after all, if the first one was clean, why shouldn't the second one be?
But if you're working on a new target, this is a major headache, with major implications either way. If the tox is related to the compound's mechanism of action, you're probably going to have to abandon the compound, and perhaps abandon any hope of a follow-up while you're at it. A really solid link to trouble can kill a target for you and for everyone else in the industry. That sounds like bad news, and in the short run it probably is - but in the long run, it's better to know these things. There are enough things to waste time on already, so getting rid of one isn't such a catastrophe, unless it's your own drug.
On the other hand, if the toxicity isn't mechanism-based, then it's likely due to something odd about the particular compound, some off-target effect that it has. Chasing these things down can be extremely difficult, and often there's no way to really tell what went wrong. You just have to move along another compound, from a different structural series if possible, and hold your breath. At least you know what to look for first. But there's always the horrible possibility that the follow-up compound will show an equally ugly but completely different tox profile, which brings on thoughts of truck-driving school, where you at least would know what the hell is going on.
Of course, the usual reservations apply here (toxicology is full of these). For example, it's always possible that the compound is toxic in one species, but not in another. Happens all the time, actually. But in that case, you'd better have a really, really plausible reason why humans are on the safe side of the line, and convincing ones can be hard to find. Maybe all the problems are caused by a metabolite, and not by the original drug (that one's far from unknown, too). Back to the lab you'll go for that one, too, because you don't know how humans will react to the metabolite, and you can't be quite sure how much of it they'll produce relative to the animals, anyway.
Barring these, though, either the compound is dead, or the whole structural class of compounds is dead along with it, or the whole universe of compounds that work the same way is dead. None of those are necessarily appealing, but those are the main choices, and there's nothing written down - anywhere - that says that you have to get one that you like.
Category: Drug Development | Toxicology
Antiviral drugs are one of those big unmet medical needs that we talk about in the drug industry. The reason we talk about them is, of course, that from a business standpoint - and this is a business, for sure - "unmet need" is equivalent to "unmade profit".
The problem is, the reason that some of these big opportunities are unclaimed is that they're not easy to address. As I've said here before, one big problem with antivirals is that there are a very limited number of good targets for drugs. After all, viruses are pretty stripped-down to start with: they do a limited number of things, but they do them very well indeed. Compared to a relatively target-rich therapeutic area like cancer, infectious disease is a desert.
One well-known oasis, though, contains the viral proteases. Many viruses carry these as a key part of their machinery, to help "unpack" necessary proteins from larger precursors. Famously, that's how many of the anti-HIV drugs work, and the same general strategy should be applicable to several other viral types.
Hepatitis C has been one of the big targets for many years now. Various development programs have come and gone, but no one has been able to really nail this one. Vertex is now in the middle of trying to, and as Adam Feuerstein points out, they're really betting a large part of the company on the attempt. Over the next few months, results should start coming out for their PROVE trials of telaprevir (VX-950), and for Vertex's sake, the drug had better work. A herd of competitors, probably led by Schering-Plough, is ready to take over should anything slip.
"Work" is defined as "work well enough so that people don't have to take injections of interferon". That'll depend, as always, on the balance of efficacy and toxicity, and it's the side effect profile that everyone will be watching, since it's widely assumed that the drug will in fact do some good against the disease. The nerve-wracking thing about working for a small-to-medium sized company has always been that your future ends up depending on single events like this, and I wish everyone at Vertex good luck. (Of course, as people at Pfizer will tell you, your future even at a gigantic company can end up depending on the results of one clinical trial - this industry is getting altogether too exciting for a lot of people to take).
Category: Clinical Trials | Infectious Diseases
February 5, 2007
Here's an interesting press release on a potential new class of anticancer drugs. It has a nice hook ("Lab mistake leads to cancer finding!"), and the work itself isn't bad at all. It's a neat biochemical result, which might eventually lead to something. You have to know a bit about drug discovery and development to spot the problem, though - and not that many people do, which provides the ecological niche for this whole blog, frankly.
The discovery (from the University of Rochester) has to do with PPAR-gamma compounds, an area of research I've spent some time in. I didn't spend enough time there to understand it, mind you - no one has spent enough time to do that yet, no matter how long they've been at it. I wrote about some of the complexities here in 2004, and things have not become any more intelligible since then. The PPARs are nuclear receptors, affecting gene transcription when small molecules bind to them. There are, however, zillions of different binding modes in these things and they affect a list of genes that stretches right out the door. Some get upregulated, some down, and these vary according to patterns that we're only beginning to understand.
The Rochester group found that a particular class of compounds, the PPAR-gamma antagonists, had an unexpected toxic effect on some tumor cell lines. Their tubulin system was disrupted - that's a structural protein which is very important during cell division, and is the target for other known oncology drugs (like Taxol). The PPAR ligands seem to be messing with tubulin through a different route than anyone's seen before, though, and that definitely makes it worth following up on.
But the tone of the press release is too optimistic. (I should turn that line into some sort of macro, since I could use it twenty times a day). It mentions "high-dose" PPAR antagonist therapy as a possible cancer treatment, but take a look at the concentrations used: 10 to 100 micromolar. Even for cells in a dish, that's really hammering things down. And there's hardly any chance that you could attain these levels in a real-world situation, dosing a whole animal (or human). As blood levels go, those are huge.
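A quick back-of-the-envelope conversion shows why. Assuming a typical drug-like molecular weight of about 400 g/mol (an assumed round number, not the actual value for these compounds), the mid-range assay concentration works out to:

```python
# Convert a cell-assay concentration to the equivalent plasma level.
# The molecular weight here is an assumed "typical small molecule"
# value, not that of the actual PPAR-gamma antagonists in the paper.
mw = 400.0          # g/mol (assumption)
conc_um = 50.0      # micromolar; mid-range of the reported 10-100 uM

conc_mol_per_l = conc_um * 1e-6             # mol/L
conc_mg_per_l = conc_mol_per_l * mw * 1000  # mg/L, i.e. ug/mL

print(f"{conc_um:.0f} uM at MW {mw:.0f} is about {conc_mg_per_l:.0f} mg/L")
# About 20 ug/mL of free drug, sustained - far above the peak blood
# levels that most dosed compounds ever reach.
```

Twenty micrograms per milliliter of circulating drug is the kind of exposure you just don't get from a tolerable oral dose, which is the problem with waving "high-dose therapy" around in a press release.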
But how about using more potent compounds? Of the three that are mentioned in the paper, BADGE is pretty dead, but the other two are actually quite potent. Tellingly, nothing happened at all with any of them up to 1 micromolar. These things will mess with other PPAR-gamma driven processes at much lower concentrations, so you have to wonder what's really going on here. And keep in mind that other PPAR compounds whose mode of action is roughly the opposite of these have been suggested as potential anticancer agents, too - this sort of thing happens all the time with nuclear receptors, and reflects their head-grabbing complexity.
This is still worth figuring out; don't get me wrong. There might be a new mechanism here that could lead to something, eventually, although it looks to be a tough problem. But that's the part of this work that's interesting - the level of activity seen here isn't. If I had a dollar for every compound that affects tumor cells at 50 micromolar, I wouldn't need to be sending my CV out these days.
Category: Cancer | Drug Assays
So begins my first week without employment since the late 1980s. And I'm not sure that that period counts, since it was just after my postdoc ended and I was looking for my first real job. I had a Humboldt fellowship in Germany - West Germany at the time, of course - and I'd tried sending letters from there back to potential employers in the US. I should have taken all of them and buried them under a rock by the light of the full moon - it wouldn't have produced any fewer results. I realized what was happening after a while, and prepared another thick pile of envelopes for my return. Once in a US airport, I promptly mailed them out, and then the phone began to ring at last.
I feel rather cut off from things, I have to say, because I'm used to constant SciFinder access and plenty of online journal subscriptions. There's not much I can do about either one of those, though - SciFinder's rates are astronomical for what-you-want when-you-want searching, for example, which makes me glad that I used the service so heavily while I had access to it. And I also feel cut off from doing what I usually do - think up weird research ideas and test them out. The burst of activity I detailed here is the last time I've been in the lab, well, other than to throw an awful lot of stuff away.
What will be interesting will be seeing what kinds of ideas I get after this break. Rather than going rusty, my guess is that I'll have some interesting stuff built up and ready to go. I've written about how one of the things that I disliked about graduate school was the constant, forced attention on one single project and problem. Situations like that have always done me harm when they've gone on too long - here's hoping that this one will do me good.
Category: Closing Time
February 4, 2007
The Scientist has a very interesting article in the latest issue, titled "Why Pharma Must Go Hollywood". The author, an executive in the industry, makes some good points. After pointing out the low-hanging-fruit component of everyone's recent productivity problems, out comes this:
"The second critical and fundamental cause of pharma's productivity problem, which fortunately is potentially remediable, is what former R&D director for Burroughs Welcome, Glaxo, and Warner Lambert, Pedro Cuatrecasas, has referred to as the "pervasive mismanagement" of the R&D process. Cuatrecasas noted in a recent article in the Journal of Clinical Investigation that the rot started in the early 1970s when managers with business school or legal backgrounds, but no significant foundation in science or medicine, began to invade the upper echelons of pharma and introduce structures and practices such as "management by objectives" from industries lacking any significant R&D enterprises. This invasion was motivated by a desire to increase the efficiency of R&D and to prioritize maximizing the return on investment.
An even more stifling trend has been the recent importation of the "six sigma" business improvement methodology into aspects of pharma R&D. Six sigma was designed to improve manufacturing processes, but has been well documented to quench innovation. The intellectual bankruptcy typical of many current pharma leaders is well illustrated by the typical pharma response to faltering productivity and the resultant fall in earnings. Take, for example, Pfizer's acquisitions since 2000 of Warner-Lambert and Pharmacia. Rather than investigating and addressing the fundamental etiologies of the problem and contrary to the readily available data in the business literature, the leadership plunges into the short-term fix and ego-satisfying drama of a merger, which is almost guaranteed to stifle innovation even further."
As you can imagine, my response to this is to stomp my feet and throw roses, because it's exactly the sort of thing I've been saying around here for a long time. (I'm not alone, either). It's bizarrely refreshing to hear the phrase "pervasive mismanagement" used to describe the drug industry. I find myself sitting around repeating it in my idle moments, with mental illustrations from my own experience.
The "Hollywood" part of the article is the author's prescription for the industry. Noting the similarities between drug launches and movie launches (an idea that's been floating around for a few years now), he (or she) advocates learning from the studios that have been best at developing and managing creativity. We may, the article claims, have learned about all we can from benchmarking each other - we need to look outside the list of other pharma companies.
Why do I say "he (or she)"? Because, most unusually, the article is written anonymously. I'm quite curious about where it came from, but in the end it doesn't really matter. Anyone who works for a big company will recognize what's being talked about - the fixation on short-term results, the we're-sticking-with-this-decision-no-matter-what mentality, the command-and-control leadership style. No, William Goldman was right when he said about the movie business that "no one knows anything". And the same thing applies to the drug industry, too, but no one in the executive offices wants to admit it.
February 1, 2007
Here's a question for the readership that should generate some interesting answers: what's the most valuable item you've seen someone ruin in a lab? I'll leave it broad enough to include both equipment and materials, and I expect to cringe numerous times on reading the comments.
I can put one into the hopper to start things off. Back some years ago, the guys down the hall from me had bought one of the largest Chiralcel columns that were then sold. (For the non-chemists in the audience, this is a large packed column used to separate mirror-image isomers of a compound - enantiomers - by pumping a mixture through it.) This was one of the ones where the chiral packing wasn't really bonded on to anything, but just sort of layered on another powdered solid support. And as the literature included with the column made clear, this meant that you could wash the stuff right off if you weren't careful with your solvent selection.
Well, it made it clear if you, like, read the sheet and everything. Which didn't stop someone from taking up their compound in methylene chloride and pumping it right onto the barely-used $15,000 (late 1980s money) column. And in the fullness of time (say, ten or fifteen minutes), out came the solvent front from the other end: cloudy, milky, swirling with opalescent shimmers like shampoo. Which shimmery stuff was, of course, the fifteen long ones of chiral resolving agent, scoured off the packing material by the cleansing wave of chlorinated solvent.
There: clean, simple, direct, and easily avoidable by spending two minutes reading a sheet of paper. That's the kind of thing I have in mind. Some additional examples?