About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis, and other diseases.
To contact Derek email him directly: firstname.lastname@example.org
July 31, 2009
I linked yesterday to a post by Megan McArdle about health care reform. And while I realize that everyone got into a shouting match in the comments to my own post on the subject - and people sure did in the comments to hers; it's endemic - I wanted to quote a section from her on drug discovery:
Advocates of this policy have a number of rejoinders to this, notably that NIH funding is responsible for a lot of innovation. This is true, but theoretical innovation is not the same thing as product innovation. We tend to think of innovation as a matter of a mad scientist somewhere making a Brilliant Discovery!!! but in fact, innovation is more often a matter of small steps towards perfection. Wal-Mart’s revolution in supply chain management has been one of the most powerful factors influencing American productivity in recent decades. Yes, it was enabled by the computer revolution–but computers, by themselves, did not give Wal-Mart the idea of treating trucks like mobile warehouses, much less the expertise to do it.
In the case of pharma, what an NIH or academic researcher does is very, very different from what a pharma researcher does. They are no more interchangeable than theoretical physicists and civil engineers. An academic identifies targets. A pharma researcher finds out whether those targets can be activated with a molecule. Then he finds out whether that molecule can be made to reach the target. Is it small enough to be orally dosed? (Unless the disease you’re after is fairly fatal, inability to orally dose is pretty much a drug-killer). Can it be made reliably? Can it be made cost-effectively? Can you scale production? It’s not a viable drug if it takes one guy three weeks with a bunsen burner to knock out 3 doses.
I don't think a lot of readers here will have a problem with that description, because it seems pretty accurate. True, we do a lot more inhibiting of drug targets than activating them, because it's easier to toss a spanner in the works, but that's mostly just a matter of definitions. And this description does pass over the people doing some drug discovery work in academia (and the people doing more blue-sky stuff in industry), but overall, it's basically how things are, plus or minus a good ol' Bunsen burner or two.
But not everyone's buying it. Take this response by Ben Domenech over at The New Ledger. We'd better hope that this isn't a representative view, and that the people who are trying to overhaul all of health care as quickly as possible have a better handle on how our end of the system works:
. . .But needless to say, this passage and the ones following it surprised me a great deal. Working at the Department of Health and Human Services provided me the opportunity to learn a good deal about the workings of the NIH, and I happen to have multiple friends who still work there — and their shocked reaction to McArdle’s description was stronger than mine, to say the least.
“McArdle clearly doesn’t understand what she’s writing about,” one former NIH colleague said today. “Where does she think Nobel prize winners in biomedical research originate, academic researchers or in Pharma? Our academic researchers run clinical trials and develop drugs. I’m not trying to talk down Pharma, which I’m a big fan of, but I don’t think anyone in the field could read what she wrote without laughing.”
Well, I certainly could make it through without a chuckle, and I'll have been doing drug discovery for twenty years this fall. So how does the guy from HHS think things go over here?
To understand how research is divided overall, consider it as three tranches: basic, translational, and clinical. Basic is research at the molecular level to understand how things work; translational research takes basic findings and tries to find applications for those findings in a clinical setting; and clinical research takes the translational findings and produces procedures, drugs, and equipment for use by and on patients. . .
. . .The truth, as anyone knowledgeable within the system will tell you, is that private companies just don’t do basic research. They do productization research, and only for well-known medical conditions that have a lot of commercial value to solve. The government funds nearly everything else, whether it’s done by government scientists or by academic scientists whose work is funded overwhelmingly by government grants.
Hmm. Well-known with a lot of commercial value. Now it's true that we tend to go after things with commercial value - it is a business, after all - but how well-known is Gaucher disease? Or Fabry disease? Mucopolysaccharidosis I? People who actually know something about the drug industry will be nodding their heads, though, because they'll have caught on that I'm listing off Genzyme's product portfolio (part of it, anyway), which is largely made up of treatments for such things. There are many other examples. Believe me, if we can make money going after a disease, we'll give it a try, and there are a lot of diseases. (The biggest breakdown occurs not when a disease affects a smaller number of people, but when almost no one who has it can possibly pay for the cost of developing the treatment, as in many tropical diseases).
But even taking Domenech's three research divisions as given - and they're not bad - don't we in industry even get to do a little bit of translational research? Even sometimes some basic stuff? After all, in the great majority of cases when we start attacking some new target, there is no drug for it, you know. We have to express the protein in an active form, work up a reliable assay using it, screen our compound collections looking for a lead structure, then work on it for a few years to make new compounds that are potent, selective, nontoxic, practical to produce, and capable of being dosed in humans. (Oh, and they really should be chemical structures that no one's ever made or even speculated about before). All of that is "productization" research? Even when we're the first people to actually take a given target idea into the clinic at all?
That happens all the time, you know. The first project I ever worked on in this industry was a selective dopamine antagonist targeted for schizophrenia. We were the first company to take this particular subtype into the clinic, and boy, did we bomb big. No activity at all. It was almost as if we'd discovered something basic about schizophrenia, but apparently that can't be the case. Then I worked on Alzheimer's therapies, namely protease inhibitors targeting beta-amyloid production, and if I'm not mistaken, the only real human data on such things has come from industry. I could go on, and I will, given half a chance. But I hope that the point has been made. If it hasn't, then consider this quote, from here:
“. . .translational research requires skills and a culture that universities typically lack, says Victoria Hale, chief executive of the non-profit drug company the Institute for OneWorld Health in San Francisco, California, which is developing drugs for visceral leishmaniasis, malaria and Chagas' disease. Academic institutions are often naive about what it takes to develop a drug, she says, and much basic research is therefore unusable. That's because few universities are willing to support the medicinal chemistry research needed to verify from the outset that a compound will not be a dead end in terms of drug development.”
The persistent confusion over what's done in industry and what's done in academia has been one of my biggest lessons from running this blog. The topic just will not die. A few years ago, I ended up writing a long post on what exactly drug companies do in response to the "NIH discovers all the drugs" crowd, with several follow-ups (here, here, and here). But overall, Hercules had an easier time with the Hydra.
Now, there is drug discovery in academia (ask Dennis Liotta!), although not enough of it to run an industry. Lyrica is an example of a compound that came right out of the university labs, although it certainly had an interesting road to the market. And the topic of academic drug research has come up around here many times over the last few years. So I don't want to act as if there's no contribution at all past basic research in academia, because that's not true at all. But neither is it the case that pharma just swoops in, picks up the wonder drugs, and decides what color the package should be.
But what really burns my toast is this part:
So Pharma is interested in making money as their primary goal — that should surprise no one. But they’re also interested in avoiding litigation. Suppose for a moment that Pharma produces a drug to treat one non-life threatening condition, and it’s a monetary success, earning profits measured in billions of dollars. But then one of their researchers discovers it might have other applications, including life-saving ones. Instead of starting on research, Pharma will stand pat. Why? Because it doesn’t make any business sense to go through an entire FDA approval process and a round of clinical trials all over again, and at the end of the day, they could just be needlessly jeopardizing the success of a multi-billion dollar drug. It makes business sense to just stand with what works perfectly fine for the larger population, not try to cure a more focused and more deadly condition.
Ummm. . .isn't this exactly what happened with Vioxx? Merck was trying to see if COX-2 inhibitors could be useful for colon cancer, which is certainly deadly, and certainly a lot less common than joint and muscle pains. Why didn't Merck "stand pat"? Because they wanted to make even more money, of course. The cash needed to develop Vioxx in the first place had already been spent, and cancer trials aren't as long and costly as those in some other therapeutic areas. So it was actually a reasonable thing to look into. If you're staying in the same dosing range, you're not likely to turn up tox problems that you didn't already see in your earlier trials. (That's where Merck got into real trouble, actually - the accusation was that they'd seen signs of Vioxx's cardiovascular problems before the colon cancer trial, but breezed past them). But you just might come up with a benefit that allows you to sell your drug to a whole new market.
And that might also explain why, in general, drug companies look for new therapeutic opportunities like this all the time with their existing drugs. In fact, sometimes we look for them so aggressively that we get nailed for off-label promotion. No, instead of standing pat, we get in trouble for just the opposite. Your patented drug is a wasting asset, remember, and your job is to make the absolute most of it while it's still yours. Closing your eyes to new opportunities is not the way to do that.
The thing is, Domenech's heart seems to be mostly in the right place. He just doesn't understand the drug industry, and neither do his NIH sources. Talking to someone who works in it would have helped a bit.
Category: Academia (vs. Industry) | Business and Markets | Drug Industry History
July 30, 2009
I haven't written about the various health care reform packages that are being hammered out and hammered through the various parts of Congress. That's partly because I began to think fairly early on in the process that we weren't actually going to see something happen as quickly as the administration wanted, which meant that there were still plenty of twists and turns left. I still think that's true - in fact, I have no idea when a final bill will ever get Frankensteined together for a vote, and no one else seems to have a good idea, either.
And since the main focus of this blog is pharmaceutical research, the first question I have to deal with is what effect such a bill will have on what I (and many of the readers here) do for a living. Absent a good idea of what the legislation will really look like, that's impossible to do in detail. But I can paint some broad strokes at this point, and they're probably not going to come as much of a surprise: I don't like what I see.
On the macro level, I don't like the administration's rhetoric on this issue. I do not believe that health care costs are crippling our economy, and the implication that they're tied to our current economic downturn seems specious. (And yes, that argument has been made, and more than once). Such an any-weapon-to-hand approach seems a bit different from what many people may have thought that they were voting for in the last election.
But I didn't vote for Obama, although I certainly wasn't crazy about the McCain-Palin ticket, either. My fears (expressed here) that he might turn out to be a zealous world-changing reformer have been amply confirmed. What do I have against zealous world-changing reformers, you ask? Why, I fear that the world is trickier than they are, for one thing. And too many of these people seem to come across as "If you people would just have enough sense to see that I'm doing this for your own good" types. At the rate we're going, that'll be the key phrase in a presidential speech right after Labor Day. (Mickey Kaus has been pointing out for some time now that this eat-your-peas-for-the-common-good approach is not doing the administration any favors).
The only big changes I'm in the mood for, generally speaking, are ones that give people more control over their own destiny, and if that's what we're seeing here, I've missed it. (I'm not alone). I guess that I just don't believe that systems this large and this complex are subject to wholesale intervention by the Wise and the Good. I worry that the Wise and Good will, in fact, decide that if they're truly going to control costs, they'll have to ration health care in ways that people aren't necessarily expecting. Part of that rationing may well have to be either de facto (or flat-out de jure) price controls on pharmaceuticals and other parts of the system - and if applied thoroughly enough, these will be an excellent way of creating shortages of just the things that are being controlled, in the same way that price controls have always functioned. Some of those shortages will be silent ones: the things we don't discover.
Alternatively, we could end up with a Great Big Plan that doesn't really attempt to cut costs, or defers those cost savings into the glorious future. It's worth considering that, as far as I can see, every single attempt to run a large state-sponsored health plan has ended up costing far, far more than even the most pessimistic initial estimates. And this time will be different. . . how, exactly?
And that leads us to the sort of bill that I think we're most likely to get: one that doesn't satisfy the biggest advocates of sweeping health care reform, since it's had to abandon the big proposals for the sake of political reality, but one that at the same time spends lots and lots more money, with no clear plan of how to raise these funds, all of that again for the sake of political reality. One, in short, that gives all the politicians involved a chance to pin "I Passed Health Care Reform!" buttons on their jackets while pissing off everyone who bothers to look at the thing closely, and one that commits us to spending oceans of money to accomplish not very much.
Perhaps I'm just in a bad mood. But that looks like what we're heading for.
Category: Current Events | Drug Prices | Regulatory Affairs
July 29, 2009
I've now returned from a family vacation, so regular blogging is set to resume. Before it does, though, I have a brief public service announcement for readers who are looking for airfare deals: beware of Cheaptickets.com. I went with them this time because they beat what I could find on Kayak, but TANSTAAFL: there ain't no such thing as a free lunch - or even, necessarily, a cheaper one.
Even when you've paid for your tickets and picked out your seats two months before, even after Cheaptickets sends you an e-mail with your reservation info, one that lists all your seat numbers and says "Your Seats Are Confirmed", don't just go and assume that those are, you know, your confirmed seat numbers. They aren't. You and your family can easily end up scattered throughout the plane - we sure did, at least until a helpful person from United was able to rearrange things.
Turns out the Cheaptickets people "forward your seat preferences" to the airlines, who then are free to do what they like with these suggestions. The whole seating-map thing is just a sort of Gedankenversuch, not meant to have any real-world application. So keep that in mind.
Category: Blog Housekeeping
July 27, 2009
I'm still on the road - just wanted to let everyone know that I'm still out here, and piling up topics to cover here. Should be another day or two before regular posting resumes. See you then!
Category: Blog Housekeeping
July 22, 2009
Just wanted to let people know that posting will be irregular around here for the next few days, due to some traveling. I'll probably be able to put some stuff up, but it'll show up at odd intervals. I assume that no gigantic science/pharma stories will break in late July, but I guess one never knows. . .!
Category: Blog Housekeeping
July 20, 2009
Here's an interesting look at the current state of the Alzheimer's field from Bloomberg. The current big hope is Wyeth (and Elan)'s bapineuzumab, which I last wrote about here. That was after the companies reported what had to be considered less-than-hoped-for efficacy in the clinic. The current trial is the one sorted out by APOE4 status of the patients. After the earlier trial data, it seems unlikely that there's going to be a robust effect across the board - the people with the APOE4 variant are probably the best hope for seeing real efficacy.
And if bapineuzumab doesn't turn out to work even for them? Well:
“Everyone is waiting with bated breath on bapineuzumab,” said Michael Gold, London-based Glaxo’s vice president of neurosciences, in an interview. “If that one fails, then everyone will say we have to rethink the amyloid hypothesis.”
Now that will be a painful process, but it's one that may well already have begun. beta-Amyloid has been the front-runner for. . .well, for decades now, to be honest. And it's been a target for drug companies since around the late 1980s/early 1990s, as it became clear that it was produced by proteolytic cleavage from a larger precursor protein. A vast amount of time, effort, and money has gone into trying to find something that will interrupt that process, and it's going to be rather hard to take if we find out that we've been chasing a symptom of Alzheimer's rather than a cause.
But there's really no other way to find such things out. Human beings are the only animals that really seem to get Alzheimer's, and that's made it a ferocious therapeutic area to work in. The amyloid hypothesis will die hard if die it does.
Category: Alzheimer's Disease | Clinical Trials | Drug Industry History | The Central Nervous System
Things are pretty quiet around the industry these days, so my blogging thoughts have been turning to Big General Problems. And here's one that I know that people are working on, but which I think we as chemists are going to have to understand much better: localization.
"Say what?" is the usual response to that, but hear me out. What I mean is the trick that living cells use for their feats of multistep synthesis. Enzymes aren't generally just floating around hoping to bump into things - well, some of them are, but a lot of them are tied to specific regions. They're either membrane-bound, or they're expressed in structures where they don't get a lot of chances to diffuse out into the mix. The interior of a cell, on the whole, is a pretty intensely structured place (as it would have to be).
And that allows specific reactions to take place away from other things that might interfere, which is something that we have a hard time doing in the lab. If you have a five-step synthesis, it's a pretty safe bet that you don't dump the reagents for all five steps into the pot at the same time and hope for the best. No, we generally have to fish out the product and take it on separately. It's often a real achievement (especially on larger scale) to be able to "telescope" two steps into one flask and skip any sort of product isolation between them. Doing it with more than one step is even more rare (and more useful when you can bring it off).
There's been a lot of work on one-pot cascade or domino reaction systems, and that's a step toward what we need. But most of these cases are reaction-driven: people find chemistries that can be run in this fashion, and then try to exploit them to make whatever can be made. Nothing wrong with that, but it would be nice to have product-driven approaches, where you'd look at a particular structure and figure out which multicomponent reaction scheme would work best for it. Generally speaking, we just don't have enough worked-out systems to be able to do that.
And that's where I think that some new technologies could help, specifically flow chemistry and/or microfluidics. Instead of figuring out reactions that can exist while all stirring around together in one pot, this approach takes it as a given that many transformations probably just can't be done that way. And if you can't have one big reactor with multiple things in it, then why not make multiple reactors, each with a different thing in it? Flow systems can, in theory, send compounds through a series of isolated reactions, moving the material physically through various zones and reagents. Not every reaction is perfect of course, but you can often use scavenger reagents along the way to strip out potential interfering impurities before the next step.
I like the idea, but there are a lot of things to be done to make it work. Probably the most advanced organic synthesis being done in this style is in Steve Ley's lab at Cambridge. I always enjoy reading their flow papers, which make clear that there's some significant optimization to be done before you can throw the switch and stand back. Some other multistep flow work can be found here and here, and the same comment applies: there's a lot of preparation involved.
My hope is that these kinds of things will eventually move toward more of a plug-and-play system, where you put in the various cartridges and choose a protocol from the list of best-general-fits for your planned reactions. We're quite a ways from that, but I don't see why it wouldn't be possible.
Category: Chemical News | Life in the Drug Labs
July 17, 2009
I seem to have been putting a lot of graphics up this week, so here's another one. This is borrowed from a recent Science paper on the future of natural-product-based drug discovery. It's interesting both from that viewpoint, and because of the general approval numbers:
And there you have it. Outside of anomalies like 2005, we can say, I think, that the 1980s were a comparative Golden Age of Drug Approvals, that the 1990s held their own but did not reach the earlier heights, and that since 2000 the trend has been dire. If you want some numbers to confirm your intuitions, you can just refer back to this.
As far as natural products go, from what I can see, the percentage of drugs derived from them has remained roughly constant: about half. Looking at the current clinical trial environment, though, the authors see this as likely to decline, and wonder if this is justified or not. They blame two broad factors, one of them being the prevailing drug discovery culture:
The double-digit yearly sales growth that drug companies typically enjoyed until about 10 years ago has led to unrealistically high expectations by their shareholders and great pressure to produce "blockbuster drugs" with more than $1 billion in annual sales (3). In the blockbuster model, a few drugs make the bulk of the profit. For example, eight products accounted for 58% of Pfizer’s annual worldwide sales of $44 billion in 2007.
As an aside, I understand the problems with swinging for the fences all the time, but I don't see the Pfizer situation above as anything anomalous. That's a power-law distribution, and sales figures are exactly where you'd expect to see such a thing. A large drug company with its revenues evenly divided out among a group of compounds would be the exception, wouldn't it?
The other factor that they say has been holding things back is the difficulty of screening and working with many natural products, especially now that we've found many of the obvious candidates. A lot of hits from cultures and extracts are due to compounds that you already know about. The authors suggest that new screening approaches could get around this problem, as well as extending the hunt to organisms that don't respond well to traditional culture techniques.
None of these sound like they're going to fix things in the near term, but I don't think that the industry as a whole has any near-term fixes. But since the same techniques used to isolate and work with tricky natural product structures will be able to help out in other areas, too, I wish the people working on them luck.
Category: Business and Markets | Drug Assays | Drug Development | Drug Industry History
July 16, 2009
I had a printout of the structure of maitotoxin on my desk the other day, mostly as a joke to alarm anyone who came into my office. "Yep, here's the best hit from the latest screen. . .I hear that you're on the list to run the chemistry end. . .what's that you say?"
This is, needless to say, one of the largest and scariest marine natural product structures ever determined (and that determination has been no stroll past the dessert table, either).
But that hasn't stopped people from messing around with it. And there's much speculation that other people are strongly considering messing around with it, too - you synthetic chemists can guess the sorts of people that this might be, and their names, and what it might be like to sit through the seminars that result, and so on.
I fear that a total synthesis of maitotoxin would be largely a waste of time, but I'm willing to hear arguments against that position. Just looking at it, though, inspires thought. This eldritch beastie has 98 chiral centers. So let's do some math. If you're interested in the SAR of such molecules, you have your choice of (two to the 98th) possible isomers, which comes out to a bit over (3 times ten to the 29th) compounds. This is. . .a pretty large number. If you're looking for 10mg of each isomer to add to your screening collection (no sense in going back and making them again), then you're looking at a good bit over half the mass of the entire Earth. And that's just in sheer compounds; we're not counting the weight of vials, which will, I'd say, safely move you up toward the planetary weight of a low-end gas giant. We will ignore shelving considerations in the interest of time.
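That back-of-the-envelope claim is easy to check. Here's a quick sketch: the 10 mg sample size comes from the text above, while the Earth-mass figure (roughly 5.97 × 10^24 kg) is my assumption.

```python
# Sanity-checking the stereoisomer arithmetic above.
n_centers = 98
n_isomers = 2 ** n_centers          # one configuration choice at each chiral center
print(f"{n_isomers:.2e} isomers")   # about 3.17e29

sample_kg = 10e-6                   # 10 mg of each isomer, expressed in kg
total_kg = n_isomers * sample_kg
earth_kg = 5.97e24                  # approximate mass of the Earth (assumed figure)
print(total_kg / earth_kg)          # about 0.53 - a bit over half an Earth, as claimed
```

The numbers hold up: ten milligrams of each stereoisomer really does come to just over half a planet's worth of material.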
Recall that yesterday's post gave a number of about 27 million compounds with 11 heavy atoms or fewer. You could toss 27 million compounds into a collection of ten to the 29th and never see them again, of course. But that brings up two points: one, that the small-compound estimate ignores stereochemistry, and we've been getting those insane maitotoxin numbers by considering nothing but. The thing is, with only 11 non-hydrogen atoms, there aren't quite as many chances for things to get out of control. The GDB compound set goes up only to 110 million or so if you consider stereoisomers, which actually isn't nearly as much as I'd thought.
But the second point is that this shows you why the Berne group stopped at 11 heavy atoms: the problem becomes intractable really fast as you go higher. It's worth remembering that the GDB people actually threw out over 98% of their scaffolds because they represented potential ring structures that are too strained to be very stable. And they only considered C, N, O and F as heavy atoms (even adding sulfur was considered too much to deal with, computationally). Then they tossed out another 98 or 99% of the structures that emerged from that enumeration as reactive and/or unstable. Relax your standards a bit, allow another atom or two, bump up the molecular weight - do any of those and you're going to exceed anyone's computational capacity. Update: the Berne group has just taken a crack at it, and managed a reasonable set up to 13 heavy atoms, with various simplifying assumptions to ease the burden. (If you want to mess around with it, it's here, free of charge.)
No, there are a lot of compounds out there. And if you look at the really big ones - and maitotoxin is nothing if not a really big one - there are whole universes contained just in each of them. (Bonus points for guessing the source of the name of the post, by the way).
Category: Chemical News | In Silico
July 15, 2009
I've been meaning to get around to a very interesting paper from the Shoichet group that came out a month or so ago in Nature Chemical Biology. Today's the day! It examines the content of screening libraries and compares them to what natural products generally look like, and they turn up some surprising things along the way. The main question they're trying to answer is: given the huge numbers of possible compounds, and the relatively tiny fraction of those we can screen, why does high-throughput screening even work at all?
The first data set they consider is the Generated Database (GDB), a calculated set of all the reasonable structures with 11 or fewer nonhydrogen atoms, which grew out of this work. Neglecting stereochemistry, that gives you between 26 and 27 million compounds. Once you're past the assumptions of the enumeration (which certainly seem defensible - no multiheteroatom single-bond chains, no gem-diols, no acid chlorides, etc.), then there is no human bias involved: that's the list.
The second list is everything from the Dictionary of Natural Products and all the metabolites and natural products from the Kyoto Encyclopedia of Genes and Genomes. That gives you 140,000+ compounds. And the final list is the ZINC database of over 9 million commercially available compounds, which (as they point out) is a pretty good proxy for a lot of screening collections as well.
One rather disturbing statistic comes out early when you start looking at overlaps between these data sets. For example, how many of the possible GDB structures are commercially available? The answer: 25,810 of them - in other words, you can only buy fewer than 0.1% of the possible compounds with 11 heavy atoms or below, making the "purchasable GDB" a paltry list indeed.
Now, what happens when you compare that list of natural products to these other data sets? Well, for one thing, the purchasable part of the GDB turns out to be much more similar to the natural product list than the full set. Everything in the GDB has at least 20% Tanimoto similarity to at least one compound in the natural products set, not that 20% means much of anything in that scoring system. But only 1% of the GDB has a 40% Tanimoto similarity, and less than 0.005% has an 80% Tanimoto similarity. That's a pretty steep dropoff!
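For anyone who hasn't run into the scoring system: the Tanimoto coefficient between two fingerprints is just the number of shared on-bits divided by the total on-bits across both, which is why high scores fall off so quickly. A minimal sketch, using made-up toy bit sets rather than the actual fingerprints from the paper:

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) similarity between two fingerprint bit sets:
    shared on-bits divided by total on-bits across both sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Toy fingerprints: each set holds the indices of the "on" bits.
fp1 = {1, 4, 7, 9, 12}
fp2 = {1, 4, 7, 20, 31}
print(round(tanimoto(fp1, fp2), 2))  # 3 shared bits / 7 total = 0.43
print(tanimoto(fp1, fp1))            # 1.0 - identical fingerprints
```

Since every bit that either compound sets goes into the denominator, a handful of mismatches drags the score down fast, which is the behavior behind that steep dropoff between the 40% and 80% similarity cutoffs.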
But the "purchasable GDB" holds up much better. 10% of that list has 100% Tanimoto similarity (that is, 10% of the purchasable compounds are natural products themselves). The authors also compare individual commercial screening collections. If you're interested, ChemBridge and Asinex are the least natural-product-rich (about 5% of their collections), whereas IBS and Otava are the most (about 10%).
So one answer to "why does HTS ever work for anything" is that compound collections seem to be biased toward natural-product type structures, which we can reasonably assume have generally evolved to have some sort of biological activity. It would be most interesting to see the results of such an analysis run from inside several drug companies against their own compound collections. My guess is that the natural product similarities would be even higher than the "purchasable GDB" set's, because drug company collections have been deliberately stocked with structural series that have shown activity in one project or another.
That's certainly looking at things from a different perspective, because you can also hear a lot of talk about how our compound files are too ugly - too flat, too hydrophobic, not natural-product-like enough. These viewpoints aren't contradictory, though - if Shoichet is right, then improving those similarities would indeed lead to higher hit rates. Compared to everything else, we're already at the top of the similarity list, but in absolute terms there's still a lot of room for improvement.
So how would one go about changing this, assuming that one buys into this set of assumptions? The authors have searched through the various databases for ring structures, taking those as a good proxy for structural scaffolds. As it turns out, 83% of the ring scaffolds among the natural products are unrepresented among the commercially available molecules - a result that I assume Asinex, ChemBridge, Life Chemicals, Otava, Bionet and their ilk are noting with great interest. In fact, the authors go even further in pointing out opportunities, with a table of rings from this group that closely resemble known drug-like ring systems.
But wait a minute. . .when you look at those scaffolds, a number of them turn out to be rather, well, homely. I'd be worried about elimination to form a Michael acceptor in compound 19, for example. I'm not crazy about the N,S acetal in 21 or the overall stability of the acetals in 15, 17 and 31. The propiolactone in 23 is surely reactive, as is the quinone in 25, and I'd be very surprised if that's not what they owe their biological activities to. And so on.
All that said, there are still some structures in there that I'd be willing to check out, and there must be more of them in that 83%. No doubt a number of the rings that do sneak into the commercial list are not very well elaborated, either. I think that there is a real commercial opportunity here. A company could do quite well for itself by promoting its compound collection as being more natural-product similar than the competition, with tractable molecules, and a huge number of them unrepresented in any other catalog.
Now all you'd have to do is make these things. . .which would require hiring synthetic organic chemists, and plenty of them. These things aren't easy to make, or to work with. And as it so happens, there are quite a few good ones available these days. Anyone want to take this business model to heart?
Category: Drug Assays | Drug Industry History | In Silico
July 14, 2009
What does it take for a new technology to catch on in the labs? There's an endless stream of candidates (I hope it's endless, anyway), from small gizmos that you can keep in your drawer to multi-hundred-thousand-dollar machines that need their own air handling systems. But all of them start out in the "is this thing any good?" zone, and not all of them emerge, no matter how much they might cost.
That's the first criterion: does the new equipment do anything useful? You'd think that this would have been worked out by, say, the team that developed the product in the first place, but hope does spring eternal. Companies do sometimes get some funny ideas about what their intended markets are clamoring for.
The second test is whether it does its thing in a way that doesn't mess up what you're already doing. "Useful but annoying" is an all-too-well populated category, and if the balance tips too far toward the latter, people will gradually find reasons to stop using the equipment. With some equipment, you start to feel as if you're paying twenty dollars for $20.03 in pennies, putting the whole process into the "not worth the trouble" bin.
Automation is often a factor here. Poorly engineered automation will drive people away like a skunk, of course. Lack of automation won't drive them away, but it won't give them an incentive to come back, either. But do it right, and you lower the perceived cost of using the equipment. Microwave reactors for chemical reactions are a good example of this. The first buckaroos who did these things used kitchen microwave ovens and homebrew reaction vessels. Then there was a generation of reaction carousels that fit into the oven compartment, but that fell into the "annoying" category. The more recent crops of dedicated machines, though, have caught on. They don't look like microwave ovens at all (for example), since the reaction chamber is much smaller (built, in fact, to fit the reaction vials). And they run from a software interface, allowing you to put your tube in the rack, set up your conditions, and walk away.
That phrase "and walk away" is the key idea behind good lab automation. You shouldn't have to stand in front of a machine to make sure that it's going to do what it's supposed to. You can walk away from NMRs, from LC/MS machines, from fraction collectors and many other devices. But if you can't, because the machine hasn't evolved to the point where automation is possible - or worse, if it has automation you can't trust - then the benefit of using the thing had better be substantial.
Lab-scale flow reactors are a good example of equipment that hasn't quite reached the walk-away stage yet (although I have hopes). I know that there are several machines out there that have some ability to do multiple unattended runs, but I'd be interested to know how many users actually manage to leave the things alone while they're doing them. I'm a fan of flow chemistry, but until the machines are more like the microwave reactors, their user base will be confined more to hairy, wild-eyed types like me. The companies in the business seem to realize, though, that my phenotype will not allow them to earn an honest living, and are taking steps.
Category: Life in the Drug Labs
July 13, 2009
The world may or may not have been waiting for this, but there's now some theoretical support for the Peter Principle. What relevance does this have to the pharma-biotech industry, you ask? Well, actually, you probably don't ask, because you know just the sort of thing I'm talking about. If you've spent any time in any sort of large organization, you've seen what looks like empirical proof of the Peter Principle already - actually, you may already be picturing specific examples and muttering to yourself.
The classic R&D form of the phenomenon is someone who's capable of doing good research, but just terrible at managing people. You don't have to go very far up the hierarchy to see this one. Sad to say, there are quite a few scientists who reach their "level of incompetence" (to put it in Peterian terms) as soon as they get their first direct report under them. People skills are often not necessary to get through graduate school - in some research groups, they might actually be a handicap - so not every fresh PhD is equipped with managerial skills, to put it mildly. (This topic came up around here a few months ago, in a discussion of whether you want a scientist as a CEO in this business or not).
And the problem, in research as everywhere else, is that educating a bad manager out of being bad is difficult at best, and impossible at worst. For one thing, a substantial number of poor managers have no idea, no idea at all, that anything might be amiss on their end. And the very deficiencies that keep them from realizing this also help to make them more impervious to attempts to change it. There's empirical support for this, too - often, the first thing that incompetent people are bad at is estimating their own competence.
Now that theorists are reproducing the Peter effects in model systems, that brings up the logical next question: can this help us do anything about the problem? The authors have some suggestions, but I don't see them being implemented any time soon. That's because the Peter Principle, if it's really true, necessarily implies that you should resist the temptation to always promote your best people:
We summarize in Table 1 the percentages of gain or loss obtained for the different strategies applied. These results confirm that, within a game theory-like approach, if one does not know what way of competence transmission is acting in a given organization, as usually one has in the majority of the typical situations, the best promotion strategies seem to be that of choosing a member at random or, at least, that of choosing alternatively, in a random sequence, the best or the worst members. This result is quite unexpected and counterintuitive, since the common sense tendency would be that of promoting always the best member, a choice that, if the Peter hypothesis holds, turns out to be completely wrong.
Try getting that one past the HR department!
Category: Business and Markets
July 10, 2009
I wanted to make another brief excursion here, since (as many of you will have seen on the news), the situation in Iran is still very volatile indeed. The proxy-server efforts that I've spoken about here have been overtaken by events - plaintext proxies are basically out of the picture, thanks to countermeasures by the Iranian government.
But there are other ways to get information in and out, as the number of video clips from yesterday's protests makes clear. For a roundup, see this post from Massachusetts's own Tehran Bureau: "Geeks Around the Globe Rally to Help Iranians Online". I'm glad to number myself among them.
One aspect of said geekdom is supporting Tor. I'm running a relay on my home computer - that's my machine, the relay named "levoglucosan" on this list of current routers. Setting up Tor took about five minutes (and no real geek skills whatsoever, as opposed to getting the proxy servers going). Tor's getting a lot of use, as the Tehran Bureau post makes clear:
“Before the election we were seeing about one to two hundred new users [from Iran] per day,” says Andrew Lewman, executive director of The Tor Project.
“Right after the election and as the protests started we started seeing that spike up into 700 – 1,000 per day. Now we’re up to about 2,000 new users a day and around 8,000 connections sustained at any time, which is a huge, dramatic increase.”
The Canadians are doing their part via Psiphon, which has also had thousands of Iranian users recently. Another new effort is Haystack, a new anonymous-access tool which has been specifically designed to circumvent the Iranian regime's web filtering tools. It's modeled on Freegate, which has been giving the Great Firewall of China fits (and has also been useful in Iran, although they've had to cut access back to keep their Chinese bandwidth up). Haystack appears to have had its first test inside Iran yesterday, and seems to be working just as planned. With any luck, it'll soon be giving fits to the Iranian web censors, too: the kind of government that beats unarmed protestors in the streets, that breaks down doors in the middle of the night to haul people away just for suggesting in public that they don't like their leaders.
As a scientist, I believe in freedom of expression and freedom of inquiry. I've donated money and time to the efforts linked to above, and I'd like to urge that others do the same if they can.
Category: Current Events
A new paper coming out in Nature is getting a lot of attention, and well it should. This is some of the more dramatic anti-aging news that's been reported to date. (The accompanying editorial is also surely the first time anyone's quoted "Stairway to Heaven" in Nature).
The work hinges on a kinase enzyme called TOR (you often see an "m" in front of it, for "mammalian"). TOR, in accordance with the best gotta-name-it-something traditions of biochemistry, stands for "target of rapamycin", by which you would deduce (correctly) that rapamycin was discovered well before TOR. Rapamycin's a complex natural product first isolated from bacteria in a soil sample from Easter Island (Rapa Nui) - right here, in fact. In the late 1980s and early 1990s it was (along with another macrolide immunosuppressant, FK-506) the subject of a huge amount of research. (Note that FK-506 and rapamycin, though similar, still have some major differences in mechanism - unraveling these was most definitely nontrivial). Both compounds have strong immunosuppressive properties - the hope was that one or the other might prove to be some sort of universal transplant drug, among other things.
Rapamycin isn't that, but it's still useful, particularly in kidney transplants. And since TOR is involved in a lot of important cellular processes (brace yourself), inhibition of it by rapamycin and synthetic molecules has been studied extensively for other actions. The most interesting (well, perhaps until now) has been as an anticancer therapy. That alone illustrates the trickiness of this area, since one problem with any immunosuppressive therapy is a significantly higher risk of cancer. Decoupling these two effects has occupied a lot of time and effort over the years; that last link should give you an idea of the magnitude of the task.
But rapamycin has also shown life-extending properties in simple organisms, and this latest paper extends this effect to mice. The NIH group studying this had their problems, though - just adding the compound to rodent chow wasn't enough to achieve useful blood levels. More formulation work had to be done to produce an encapsulated version that could make it past the upper gut, and by the time that was worked out, the large cohort of mice set aside for the experiment was. . .well, rather more aged than planned.
But they went ahead with the experiment anyway, starting them off at 600 days old - roughly equivalent to a 60-year-old human. Startlingly, the compound still extended life span, by about 14% in the female mice and 9% in the males. At ages where about 5% of the control mice were still alive, some 20% of the treated mice were still going. That's a very significant result, especially considering the late start. All in all, this looks like the most dramatic mid-to-later lifespan intervention that anyone's ever seen in a mammal. (Caloric restriction, for example, has been basically useless if started at the 600 day mark in mice, and no weight losses were seen here). There's a rapamycin study under way with mice in the prime of rodent life (starting at 270 days), and the preliminary results look quite similar (with again a stronger effect in the females).
The causes of death don't seem to have altered. A good sample of animals from both groups were checked by necropsy, and nothing significant was noted. That seems rather surprising, because the blood levels of the compound are (at least from what I can see) rather high. The paper mentions that the mice had 60 to 70 ng/mL rapamycin, and looking around, I find blood levels of 15 ng/mL mentioned as effective in tumor suppression in one mouse model, and the immunosuppressive doses seem to be similar. I'd be glad to hear from anyone who knows more about rapamycin dosing in mice, though; it's definitely outside my range of experience.
Are people going to run out and start taking the stuff? It wouldn't surprise me, although I'd have to say that that's a bad idea at the moment. There's an awful lot that we don't understand about the tradeoffs between aging, cancer, and the immune response, and I'd hate to end up on the wrong side of that bet. Jumping straight to humans is too big a leap for now, but remember - there are a lot of other mTOR inhibitors out there in development (try this paper for starters). If we can narrow down which pathways are important for lifespan (and believe me, there are people thinking hard about this right now, especially after this paper), then there could be some very interesting opportunities.
Category: Aging and Lifespan
July 9, 2009
Let's open up again that contentious subject of scientific jobs. In my entire memory, I have never once heard anyone editorialize that we are turning out too many scientists and engineers. A looming shortage has always been, well, looming. And these days, it's easy to wonder how much of a shortage there can possibly be. This USA Today article (link thanks to a longtime reader of this site) rounds up a lot of quotes from people in the game, and wonders about the same thing:
While there have been warnings for more than 50 years, a renewed push over the past four years has earned the attention of both the Bush and Obama administrations.
Speaking to the National Academy of Sciences in April, Obama announced "a renewed commitment to education in mathematics and science," fulfilling a campaign promise to train 100,000 scientists and engineers during his presidency.
Only problem: We may not have jobs for them all.
As the push to train more young people in STEM — science, technology, engineering and math — careers gains steam, a few prominent skeptics are warning that it may be misguided — and that rhetoric about the USA losing its world pre-eminence in science, math and technology may be a stretch.
I think that one muddying factor (as the article mentions later on) is that lumping all scientists, mathematicians, and engineers together isn't very useful. Civil engineering is very different from optimizing computational algorithms, which is quite different from medicinal chemistry, which is quite different from semiconductor research. When I hear people talk as if all these were part of a coherent whole, I sometimes get the impression that, because of the speaker's own educational background, they must seem to be one somehow. But it doesn't make sense to me.
That said, I know that employment prospects in our own field of drug research are very much on everyone's mind. The last year or two have been the worst I've ever seen for hiring in the industry. I go back only to 1989, but longer-serving colleagues report the same feelings. Looking over the ads that appear in the likes of C&E News certainly doesn't make a person think differently.
The unimpressive rate of successful new drug introductions, coupled with the rising costs of R&D (especially clinical trials), was already squeezing us before this whole economic downturn hit. Outsourcing was one big response to that (again, pre-downturn), and we've hashed over that issue around here several times. (The downturn's effect on the outsourcing business has been mixed, by the way, as far as I can see. Some companies may have increased their offshore work, but others have cut back on it as one form of discretionary spending).
But back to the big questions, which are pretty damned hard to answer: are there technical/scientific fields where the US has too many people for the jobs available? If so, are these situations part of various cyclical trends, or are they full secular downturns, or what? Did we get there by training too many people for a job market that was otherwise in reasonable shape, or did the number of positions start to fall and not hold up that end of the process, or both? And where are all these variables going in the future?
I don't know, and I'm willing to bet that no one else does, either. When you're listening to someone talk about these issues, though, I think that there are several things to look out for that might indicate that the person you're hearing has not thought things through well enough. First off, there's that everything-in-one-category problem that I mentioned above. Anyone who seriously wants to address the issue in that fashion hasn't, I'd say, worked on the problem long enough. Secondly, I think it's fair to say that anyone who seems to uncritically accept the idea of a severe shortage of manpower across the whole technical/scientific area is not arguing from a position of strength. Unfortunately, that category has, in the past few years, included people like Bill Gates, various cabinet secretaries, heads of the National Science Foundation, and other such riff-raff. This isn't helping to clear the air.
Next, anyone who brings up the numbers of Chinese and Indian graduates in these areas, especially anyone who just quotes numbers of "engineers" without breaking things down more, needs to think harder. It's true that impressively huge numbers can be quoted, but (sad to say) they're not all they're cracked up to be, at least not yet:
Even Asia's much-touted numerical advantage is less than it seems. China supposedly graduates 600,000 engineering majors each year, India another 350,000. The United States trails with only 70,000 engineering graduates annually. Although these numbers suggest an Asian edge in generating brainpower, they are thoroughly misleading. Half of China's engineering graduates and two thirds of India's have associate degrees. Once quality is factored in, Asia's lead disappears altogether. A much-cited 2005 McKinsey Global Institute study reports that human resource managers in multinational companies consider only 10 percent of Chinese engineers and 25 percent of Indian engineers as even "employable," compared with 81 percent of American engineers.
So there's that to consider. And we haven't even talked about the various solutions proposed, let alone stipulated what the problems are. Pour money into education? Industrial policy? Retraining? Tax incentives? It's a mess. I guess my main message is to beware of anyone who tries to tell you that it's a reasonably understandable one.
Category: Business and Markets
July 8, 2009
Anyone who defends the pharmaceutical industry has to be ready to hear, over and over and over, about how much it spends on sales and marketing versus R&D. This is thought to be a telling point about where the priorities really are. I've addressed this one several times, and my best response is to point out that sales and marketing are actually supposed to bring in more money than you spend on them, and do so more reliably than R&D in the short term.
There's now a very useful paper in Nature Reviews Drug Discovery looking at just this issue. The authors (from three universities in the US and Israel) are looking into the general question of which is the better use of money: put it into R&D for the long term, or promote existing products for the short term? I should make clear at the outset that those two options do line up in that way. R&D expenditures take years to pay off, if ever, given the amount of time that drug development takes. And marketing of a current product had better start paying off in a shorter time frame, because every patented drug is a wasting asset, constantly being eaten into by competition and by its time to patent expiration.
So which makes more financial sense? The authors pulled numbers from the Wharton databases on publicly traded drug companies, looking at those with more than $50 million in sales. Using the company stock prices as a measure of value (J. Finance LVI(6), 2431–2456 (2001), I'm giving you references here), they found, in general, that R&D investments have a net positive effect, while increased promotion has a negative effect. (See also Rev. Account Stud. 7, 355–382 (2002), another journal I don't reference much). Both effects are larger for smaller companies, as you might expect, but they held up across the industry. The effect also holds up if you factor out the compensation packages of the top five executives of each company (which is a nice control to run, I have to say). And yes, since you ask, there is a negative effect on stock price that correlates to higher executive compensation, and I'm willing to bet that this effect holds for more than just the drug industry.
Since we're talking about stock prices, which are generally forward-looking, the way to interpret these results is probably that investors expect R&D expenditures to pay off in the long term, but actually expect sales and marketing expenditures to reduce long-term value. If that's so, then why spend money on marketing? The reason the authors propose is just what I'd been talking about: short-term reliability. Drug discovery and development is inherently risky, and promotion of existing products is (at least comparatively) more of a sure thing. Companies engage in a mix of the two to try to even the cash flow out. (And as the authors note, if executive compensation is tied more to short-term performance, then there's an incentive to go with the short-term gains).
In general, though, you'd figure that companies should invest more in R&D. And here's the real kicker: that's exactly what's been happening. As this graph from the paper shows, over the last thirty years expenditures in the Sales, General, and Administrative area have risen only slightly as a per cent of sales. The Cost of Goods Sold category (materials, physical plant, manufacturing facilities, etc.) has gone proportionally down, with an interesting excursion in the mid-1990s. (Note also that this used to be the leading category). And R&D expenditures (again, as a per cent of sales) rose in the 1980s, were flat in the 1990s, and have risen since then. Overall, since 1975, the proportion of money spent on R&D has more than tripled, from 5% to 17%.
This, I hardly need point out, does not fit the narrative of some of the e-mails and comments I get. Some perceptions of the drug industry have us, Back In the Old Days, as spending our money on R&D, only to slimily slide into becoming pure marketing businesses as time has passed, with our recent years being especially disgusting and rapacious. According to these figures, this is at the very least not accurate, and comes close to being the opposite of the truth. Comments are welcome - most welcome, indeed.
Category: Business and Markets | Drug Development | Drug Industry History
July 7, 2009
While we're on the topic of hydrogen bonds and computations, there's a paper coming out in JACS that attempts to answer an old question. Why, exactly, does every living thing on earth use so much ribose? It's the absolute, unchanging carbohydrate backbone to all the RNA on Earth, and like the other things in this category (why L amino acids instead of D?), it's attracted a lot of speculation. If you subscribe to the RNA-first hypothesis of the origins of life, then the question becomes even more pressing.
A few years ago, it was found that ribose, all by itself, diffuses through membranes faster than the other pentose sugars. This result holds up for several kinds of lipid bilayers, suggesting that it's not some property of the membrane itself that's at work. So what about the ability of the sugar molecules to escape from water and into the lipid layers?
Well, they don't differ much in logP, that's for sure, as the original authors point out. This latest paper finds, though, using molecular dynamics simulations, that there is something odd about ribose. In nonpolar environments, its hydroxy groups form a chain of hydrogen-bond-like interactions, particularly notable when it's in the beta-pyranose form. These aren't a factor in aqueous solution, and the other pentoses don't seem to pick up as much stabilization under hydrophobic conditions, either.
So ribose is happier inside the lipid layer than the other sugars, and thus pays less of a price for leaving the aqueous environment, and (both in simulation and experimentally) diffuses across membranes ten times as quickly as its closely related carbohydrate kin. (Try saying that five times fast!) This, as both the original Salk paper and this latest one note, leads to an interesting speculation on why ribose was preferred in the origins of life: it got there firstest with the mostest. (That's a popular misquote of Nathan Bedford Forrest's doctrine of warfare, and if he's ever come up before in a discussion of ribose solvation, I'd like to hear about it).
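As a back-of-the-envelope aside (my own arithmetic, not the paper's): if that tenfold rate difference really does come down to partitioning into the membrane, it corresponds to a transfer free-energy difference of RT ln 10 at room temperature:

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.0      # room temperature, K

# Free-energy difference corresponding to a tenfold ratio in a Boltzmann factor
ddG = R * T * math.log(10)
print(f"{ddG:.2f} kcal/mol")  # about 1.36
```

That's about 1.4 kcal/mol - the scale of a single modest hydrogen bond, which fits nicely with the hydrogen-bond-chain picture.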
Category: Biological News | In Silico | Life As We (Don't) Know It
Hydrogen bonds are important. There, that should be a sweepingly obvious enough statement to get things started. But they really are - hydrogen bonding accounts for the weird properties of water, for one thing, and it's those weird properties that are keeping us alive. And leaving out the water (a mighty big step), internal hydrogen bonding is still absolutely essential to the structure of large biological molecules - proteins, complex carbohydrates, DNA and RNA, and so on.
But we don't understand hydrogen bonds all that well, dang it all. It's not like we're totally ignorant of them, for sure, but there are a lot of important things that we don't have a good handle on. One of these may just have been illustrated by this paper in Nature Structural and Molecular Biology by a group from Scripps. They've been working on understanding the fact that all hydrogen bonds are not created equal. By carefully going through a lot of protein mutants, they have evidence for the idea that H-bonds that form in polar environments are weaker than ones that form in nonpolar ones.
That makes sense, on the face of it. One way to think of it is that a hydrogen bond in a locally hydrophobic area is the only game in town, and counts for more. But this work claims that such bonds can be worth as much as 1.2 kcal/mole more than the wimpier ones, which is rather a lot. Those kinds of energy differences could add up very quickly when you're trying to understand why a protein folds up the way it does, or why one small molecule binds more tightly than another one.
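To put 1.2 kcal/mol in perspective: a free-energy difference ΔΔG shows up in an equilibrium (or binding) constant as a factor of exp(ΔΔG/RT). A quick sketch at room temperature:

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.0      # room temperature, K
ddG = 1.2      # kcal/mol, the hydrogen-bond strength difference cited above

ratio = math.exp(ddG / (R * T))
print(f"Binding-constant ratio for 1.2 kcal/mol: about {ratio:.1f}-fold")
```

A single hydrogen bond worth 1.2 kcal/mol more than its polar-environment counterpart buys you roughly a 7.6-fold difference in affinity - which is why getting these numbers wrong in a model hurts so much.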
Do we take such things into account when we're trying to compute these energies? Generally speaking, no, we do not - well, not yet. If these folks are right, though, we'd better start.
Update: note that the paper itself doesn't suggest that this is a new idea - they reference work going back to 1963 (!) on the topic. What they're trying to do is put more real numbers into the mix. And that's what my last paragraph above is trying to state (and perhaps overstate): it's difficult to account for these things computationally, since they vary so widely, and since we don't have that good a computational handle on hydrogen bonds in general. The more real world data that can be fed back into the models, the better.
Category: In Silico
July 6, 2009
There's been a raging battle going on in the comments to this post wherein I disparaged homeopathic medicine. I've been staying out of it, but I had to excerpt this comment, made by a persistent advocate for the miracle water:
In the meantime, homeopathy is practiced openly by learned men in Europe. Why is that? Are they THAT ‘superstitious’? That ‘stupid’? Or that ‘corrupt’. Seriously. Is Great Britain RULED by a bunch of superstitious idiots? The Royal family retains homeopaths as part of their medical staff.
I'll be glad to field that one. Why yes, since you ask, if the royal family pays homeopaths, then "superstitious idiots" seems to be a perfectly appropriate phrase. And anyone who believes that any member of a hereditary monarchy (or of any other rich family) has to be more intelligent because of their position. . .well, there are phrases to describe a person like that, too. Hey, we can even be thrifty and reuse "superstitious idiot". This is an old enough logical fallacy to have a Latin name; see above.
If you'd like to see someone else berate the House of Windsor for just these same failings, you can see Richard Dawkins do a first-class job of it here.
+ TrackBacks (0) | Category: Snake Oil
Someone's leaked an American Chemical Society memo to Nature, in which the VP of the publishing division talks about how the printed journals are going to be phased out. The ACS isn't confirming anything, but they're not denying it, either: it looks like the days of paper copies of their journals are numbered.
I've been expecting that. I used to have a print subscription to the Journal of Organic Chemistry back in the early and mid-1990s, and I took them with me in a move in 1997. I interrupted my subscription around that time, and never got around to renewing it. By then, online access was starting to become a more convenient way to locate old articles, and as the ACS improved their archives the advantages became overwhelming. Then I got used to following the new issues online, either by going to the journal's site or by RSS feeds.
So my boxed collection of several years of JOC sat in my basement, in bales of cobalt-blue-covered bricks of paper. I'd planned on moving them into my office, but didn't get around to it at first. That delay allowed the situation to turn into "Hmmm. . .not sure that I see the need to have these taking up the shelf space", which turned into "You know, I need to recycle these things". And gradually, that's just what I did.
When I joined the Wonder Drug Factory in '97, new print journals were still put out on a table in the library as they came in, for people to sit down and read. A few years later, the table was gone, and the whole idea was sounding downright Victorian in retrospect. The company where I work now doesn't even have much of a real, printed-on-paper chemistry library at all. It's been years since I last picked up a hard copy of any chemistry journal - when I see the cover illustration of a journal on its web site, I keep thinking of "Elegy Written in a Country Churchyard": Full many a flower is born to blush unseen / And waste its sweetness on the desert air. OK, I'm perhaps a bit weird in that respect. But you get the idea.
Printed copies of journals have some advantages. I used to read JOC in the laundromat when I lived in New Jersey, which kept the casual chit-chat down to a stark minimum, I can tell you. I think that the browsing effect of looking through a hard copy is only partially emulated by scrolling through an RSS feed - the old way, you could see all the details inside a paper as you flipped through, and often learned something. So in a way, I'll miss the bound versions. But then I think of those boxes in my basement, and I realize that there's really no other way.
+ TrackBacks (0) | Category: The Scientific Literature
July 3, 2009
I'll be taking today off, as an addition to the Fourth of July weekend. I hope that my American readers enjoy some warm, sunny weather (of the kind that's been in very short supply around here). No matter what the conditions, though, I'll be making a large slow-cooked pork shoulder with plenty of hickory wood. I'll see everyone on Monday!
+ TrackBacks (0) | Category: Blog Housekeeping
July 2, 2009
Moore's Law: the number of transistors on a chip doubling every 18 months or so, etc. Everyone's heard of it. But can we agree that anyone who uses it as a metaphor or prescription for drug research doesn't know what they're talking about?
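It's worth keeping in mind just what "doubling every 18 months" compounds to - this is just the arithmetic of Moore's Law itself, mind you, not any claim about drug research:

```python
# Compounded growth under Moore's Law: a doubling every 1.5 years
# means a factor of 2**(years / 1.5) overall.

def moore_factor(years: float, doubling_time_years: float = 1.5) -> float:
    """Overall growth factor after the given number of years."""
    return 2 ** (years / doubling_time_years)

for y in (3, 9, 15):
    print(f"{y} years -> {round(moore_factor(y))}x")
# 3 years -> 4x, 9 years -> 64x, 15 years -> 1024x
```

A thousand-fold improvement in fifteen years is what the metaphor is actually promising, which should tell you something about the people who reach for it.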
I first came across the comparison back during the genomics frenzy. One company that had bought into the craze in a big way press-released (after an interval) that they'd advanced their first compound to the clinic based on this wonderful genomics information. I remember rolling my eyes and thinking "Oh, yeah", but on a hunch I went to the Yahoo! stock message boards (often a teeming heap of crazy, then as now). And there I found people just levitating with delight at this news. "This is Moore's Law as applied to drug discovery!" shouted one enthusiast. "Do you people realize what this means?" What it meant, apparently, was not only that this announcement had come rather quickly. It also meant that this genomics stuff was going to be discovering twice as many drugs real soon. And real soon after that, twice as many more, and so on until the guy posting the comment was as rich as Warren Buffett, because he was a visionary who'd been smart enough to load himself into the catapult and help cut the rope. (For those who don't know how that story ended, the answer is Not Well: the stock that occasioned all this hyperventilation ended up dropping by a factor of nearly a hundred over the next couple of years. The press-released clinical candidate was never, ever, heard of again).
I bring this up because a reader in the industry forwarded me this column from Bio-IT World, entitled, yes, "Only Moore's Law Can Save Big Pharma". I've read it three times now, and I still have only the vaguest idea of what it's talking about. Let's see if any of you can do better.
The author starts off by talking about the pressures that the drug industry is under, and I have no problem with him there. That is, until he gets to the scientific pressures, which he sketches out thusly:
Scientifically, the classic drug discovery paradigm has reached the end of its long road. Penicillin, stumbled on by accident, was a bona fide magic bullet. The industry has since been organized to conduct programs of discovery, not design. The most that can be said for modern pharmaceutical research, with its hundreds of thousands of candidate molecules being shoveled through high-throughput screening, is that it is an organized accident. This approach is perhaps best characterized by the Chief Scientific Officer of a prominent biotech company who recently said, "Drug discovery is all about passion and faith. It has nothing to do with analytics."
The problem with faith-based drug discovery is that the low hanging fruit has already been plucked, driving would-be discoverers further afield. Searching for the next miracle drug in some witch doctor's jungle brew is not science. It's desperation.
The only way to escape this downward spiral is new science. Fortunately, the fuzzy outlines of a revolution are just emerging. For lack of a better word, call it Digital Chemistry.
And when the man says "fuzzy outline", well, you'd better take him at his word. What, I know you're all asking, is this Digital Chemistry stuff? Here, wade into this:
Tomorrow's drug companies will build rationally engineered multi-component molecular machines, not small molecule drugs isolated from tree bark or bread mold. These molecular machines will be assembled from discrete interchangeable modules designed using hierarchical simulation tools that resemble the tool chains used to build complex integrated circuits from simple nanoscale components. Guess-and-check wet chemistry can't scale. Hit or miss discovery lacks cross-product synergy. Digital Chemistry will change that.
Honestly, if I start talking like this, I hope that onlookers will forgo taking notes and catch on quickly enough to call the ambulance. I know that I'm quoting too much, but I have to tell you more about how all this is going to work:
But modeling protein-protein interaction is computationally intractable, you say? True. But the kinetic behavior of the component molecules that will one day constitute the expanding design library for Digital Chemistry will be synthetically constrained. This will allow engineers to deliver ever more complex functional behavior as the drugs and the tools used to design them co-evolve.
How will drugs of the future function? Intracellular microtherapeutic action will be triggered if and only if precisely targeted DNA or RNA pathologies are detected within individual sick cells. Normal cells will be unaffected. Corrective action shutting down only malfunctioning cells will have the potential of delivering 99% cure rates. Some therapies will be broad based and others will be personalized, programmed using DNA from the patient's own tumor that has been extracted, sequenced, and used to configure "target codes" that can be custom loaded into the detection module of these molecular machines.
Look, I know where this is coming from. And I freely admit that I hope that, eventually, a really detailed molecular-level knowledge of disease pathology, coupled with a really robust nanotechnology, will allow us to treat disease in ways that we can't even approach now. Speed the day! But the day is not sped by acting as if this is the short-term solution for the ills of the drug industry, or by talking as if we already have any idea at all about how to go about these things. We don't.
And what does that paragraph up there mean? "The kinetic behavior. . .will be synthetically constrained"? Honestly, I should be qualified to make sense of that, but I can't. And how do we go from protein-protein interactions at the beginning of all that to DNA and RNA pathologies at the end, anyway? If all the genomics business has taught us anything, it's that these are two very, very different worlds - both important, but separated by a rather wide zone of very lightly-filled-in knowledge.
Let's take this step by step; there's no other way. In the future, according to this piece, we will detect pathologies by detecting cell-by-cell variations in DNA and/or RNA. How will we do that? At present, you have to rip open cells and kill them to sequence their nucleic acids, and the sensitivities are not good enough to do it one cell at a time. So we're going to find some way to do that in a specific non-lethal way, either from the outside of the cells (by a technology that we cannot even yet envision) or by getting inside them (by a technology that we cannot even envision) and reading off their sequences in situ (by a technology that we cannot even envision). Moreover, we're going to do that not only with the permanent DNA, but with the various transiently expressed RNA species, which are localized to all sorts of different cell compartments, present in minute amounts and often for short periods of time, and handled in ways that we're only beginning to grasp and for purposes that are not at all yet clear. Right.
Then. . .then we're going to take "corrective action". By this I presume that we're either going to selectively kill those cells or alter them through gene therapy. I should note that gene therapy, though as promising as ever, is something that so far we have been unable, in most cases, to get to work. Never mind. We're going to do this cell by cell, selectively picking out just the ones we want out of the trillions of possibilities in the living organism, using technologies that, I cannot emphasize enough, we do not yet have. We do not yet know how to find most individual cell types in a complex living tissue; huge arguments ensue about whether certain rare types (such as stem cells) are present at all. We cannot find and pick out, for example, every precancerous cell in a given volume of tissue, not even by slicing pieces out of it, taking it out into the lab, and using all the modern techniques of instrumental analysis and molecular biology.
What will we use to do any of this inside the living organism? What will such things be made of? How will you dose them, whatever they are? Will they be taken up through the gut? Doesn't seem likely, given the size and complexity we're talking about. So, intravenous then, fine - how will they distribute through the body? Everything spreads out a bit differently, you know. How do you keep them from sticking to all kinds of proteins and surfaces that you're not interested in? How long will they last in vivo? How will you keep them from being cleared out by the liver, or from setting off a potentially deadly immune response? All of these could vary from patient to patient, just to make things more interesting. How will we get any of these things into cells, when we only roughly understand the dozens of different transport mechanisms involved? And how will we keep the cells from pumping them right back out? They do that, you know. And when it's time to kill the cells, how do you make absolutely sure that you're only killing the ones you want? And when it's time to do the gene therapy, what's the energy source for all the chemistry involved, as we cut out some sequences and splice in the others? Are we absolutely sure that we're only doing that in just the right places in just the right cells, or will we (disastrously) be sticking copies into the DNA of a quarter of a per cent of all the others?
And what does all this nucleic acid focus have to do with protein expression and processing? You can't fix a lot of things at the DNA level. Misfolding, misglycosylation, defects in transport and removal - a lot of this stuff is post-genomic. Are we going to be able to sequence proteins in vivo, cell by cell, as well? Detect tertiary structure problems? How? And fix them, how?
Alright, you get the idea. The thing is, and this may be surprising considering those last few paragraphs, that I don't consider all of this to be intrinsically impossible. Many people who beat up on nanotechnology would disagree, but I think that some of these things are, at least in broad hazy theory, possibly doable. But they will require technologies that we are nowhere close to owning. Babbling, as the Bio-IT World piece does, about "detection modules" and "target codes" and "corrective action" is absolutely no help at all. Every one of those phrases unpacks into a gigantic tangle of incredibly complex details and total unknowns. I'm not ready to rule some of this stuff out. But I'm not ready to rule it in just by waving my hands.
+ TrackBacks (0) | Category: Drug Industry History | General Scientific News | In Silico | Press Coverage
July 1, 2009
I wrote last summer about Vanda Pharmaceuticals and their difficulty getting a new antipsychotic, Fanapt (iloperidone), through the FDA. At the time, they'd received one of those wonderful requests for more information from the agency, of the kind that spread cheer whenever they appear. I couldn't see how the company could clear this up without (probably) having to spend a lot of money that it didn't have, and I was very pessimistic about their survival.
And I was wrong. Big-time. Vanda received approval for iloperidone, in what is a major surprise not just for me, but for the company's hardy shareholders and for the few analysts left covering them. After congratulating the company, I feel like asking them "So, how did you do that, anyway?" To the best of my knowledge, the company didn't go back into the clinic - and it's hard to see how they even could have. Less than a year isn't feasible from a standing start in an antipsychotic trial on logistic grounds alone, let alone the fact that Vanda doesn't seem to have had the funds to even try.
So was this all just a regrettable misunderstanding? And if so, on whose part? Did the FDA misinterpret something, only to be argued back by the company? Or did Vanda mess something up in the original regulatory package? We may never know.
The question now that the dog has caught the mail truck is what to do with it. No deal has been announced yet to market the compound, and Vanda still doesn't seem to have the funds to sell it by itself. (Moreover, they don't seem to be recruiting a sales force). Some observers think that the company may have a hard time selling itself off, and that the run in the stock was overdone for just that reason.
In the meantime, though, the company should enjoy its good fortune (as should anyone who was holding its stock when the news hit). And readers of this blog should make a note that, in case there was any doubt, I can be completely, totally wrong about the field I work in. . .
+ TrackBacks (0) | Category: Business and Markets | Regulatory Affairs | The Central Nervous System