About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek, email him directly: email@example.com
August 29, 2008
There’s no end to the variables that can kick your data around in drug discovery. If you concentrate completely on all the things that could go wrong, though, you’ll be too terrified to run any useful experiments. You have to push on, but stay alert. It’s like medical practice: most of the time you don’t have to worry about most of the possibilities, but you need to recognize the odd ones when they show up.
One particular effect gets rediscovered from time to time, and I’ve just recently had to take it into account myself: the material that your vials and wells are made out of. That’s generally not a consideration for organic chemists, since we work mostly in glass, and on comparatively large scale. There are some cases where glass (specifically the free OH groups on its surface) can mess up really sensitive compounds, but in drug discovery we try not to work with things that are that temperamental.
But when you move to the chemistry/biology interface, things change. Material effects are pretty well-known among pharmacokinetics people, for example, although not all medicinal chemists are aware of them. The reason is that PK samples (blood, plasma, tissue) tend to have very small amounts of the desired analyte in them, inside a sea of proteins and other gunk. If you’re going down to nanograms (or less) of the substance of interest, it doesn’t take much to mess up your data.
And as it turns out, different sorts of plastics will bind various compounds to widely varying degrees. Taxol (OK, taxotere) is a notorious example, sticking to the sides of various containers like crazy. And you never know when you're going to run into one of those yourself. I know of a drug discovery project whose PK numbers were driving everyone crazy (weirdly variable, and mostly suggesting physically impossible levels of drug clearance) until they figured out that this was the problem. If you took a stock solution of the compound and ran it through a couple of dilutions while standing in the standard plastic vials, nothing was left. Wash the suckers out with methanol, though, and voila.
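The arithmetic behind that disappearing stock solution is brutal. As a toy illustration (the 80% loss figure below is hypothetical, not a number from that project), here's how fast a compound vanishes if each plastic vial it touches adsorbs a large fixed fraction of it:

```python
# Hypothetical numbers: assume each plastic vial adsorbs 80% of the
# compound that enters it at these low assay concentrations.
loss_per_vial = 0.80
remaining = 1.0                       # fraction of compound still in solution
for vial in range(1, 5):
    remaining *= (1 - loss_per_vial)  # each transfer leaves only 20% in solution
    print(f"after vial {vial}: {remaining:.2%} of the compound left")
```

After just four transfers you're down to 0.16% of what you started with, which is more than enough to make clearance numbers look physically impossible.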
Here's a paper which suggests that polystyrene can be a real offender, and from past experience I can tell you to look out for polypropylene, especially the cheap stuff. You won't notice anything until you get way down there to the tiny amounts - but if that's where you're working, you'd better keep it in mind.
Category: Drug Development | Pharmacokinetics
August 28, 2008
You know, every time I point out a paper from PNAS, there are always a few comments to the effect of "Why do you bother reading that garbage heap, anyway?" Since I keep citing papers from the journal, it's obvious that I disagree, but I suppose I should take a minute to explain why.
The reason people are down on PNAS is the way that members of the National Academy can, if they choose, sort of jam things into the journal through a side entrance. Here are all the details. The unusual thing about the journal is the existence of "Track I". Basically, a member of the NAS can publish up to four of their own papers per year. Each of these has to be submitted with the comments of two qualified referees, but the author gets to pick them. So a reasonable member should be able to get any sort of interesting or at least non-insane paper in there, by judicious choice of colleagues for review. Members can also pass along up to two papers a year by others in their field, with a similar review process (Track III). Some NAS members take full advantage of these privileges, and some hardly ever do, even (so I'm told, in some cases) for their own papers.
It's a lot less rigorous than the open (Track II) submissions, that's for sure. For those, you're supposed to name three editorial board members, three NAS members, and five external referees, and the editorial board can still do whatever it wants with your paper or with the lists you've sent. (To be sure, they can also reject those direct-submission papers from members, although no figures are available on how often that happens). Two thirds of the Track II submissions are rejected before being sent out for review at all.
But hold on: according to the journal, 80% of the submissions are via Track II, but those make up only 40% of the published contents. Doing the math, that means that most of the Track I and Track III submissions have to get in. Assume 80 Track II manuscripts and 20 of the others. Rejecting two-thirds of the first group will give you about 27 papers to send out for review. If you've accepted all 20 of the others, that means that about half of those 27 will have to get canned during the later review process, to make that 40/60 proportion come out right. So the overall acceptance rate for open submissions has to be, at most, 16%.
But if you ditch some of the 20 member-track papers, you have to come down even harder on the open submissions, of course. If you only (only!) take 75% of the member submissions, that gives you 15 manuscripts. Now you have to reject not half of the open submission papers that made the first cut, but 63% of them, to knock it down to ten published Track II submissions. So with an acceptance rate of 75% for member submissions, it has to be about 12% for everyone else. And so on.
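For anyone who wants to check the arithmetic, here's the same back-of-the-envelope calculation in a few lines (the 100-submission batch and the 40/60 published split are the assumptions from above, not the journal's actual figures):

```python
# Assumed batch: 100 submissions, 80 via Track II, 20 via the member tracks.
track2_subs, member_subs = 80, 20
reviewed = track2_subs / 3           # two-thirds of Track II rejected before review

def track2_numbers(member_accept_frac):
    members_pub = member_subs * member_accept_frac
    track2_pub = members_pub * 40 / 60    # published mix must come out 40/60
    overall = track2_pub / track2_subs    # overall Track II acceptance rate
    cut_at_review = 1 - track2_pub / reviewed
    return overall, cut_at_review

for frac in (1.00, 0.75):
    overall, cut = track2_numbers(frac)
    print(f"member papers accepted at {frac:.0%}: "
          f"Track II overall {overall:.1%}, {cut:.1%} cut at review")
```

That reproduces the numbers above: a 16-17% overall Track II rate if every member paper gets in, dropping to 12.5% if a quarter of the member papers are bounced.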
So much for the numbers - it's clear that NAS members must put a lot of things of their own (or from their friends) into PNAS. The real question is: what does this do to the quality of the journal? As far as I can see, it's still a very interesting read, and definitely cannot be safely ignored. And the publication routes are out on the table: if you want to keep score and adjust your perceptions accordingly, the Track I papers are identified as "Contributed by" the member, and the Track IIIs are "Communicated by". I think, myself, that the advantage of letting members publish unusual or possibly controversial work outweighs the temptation to fill the journal with junk.
Category: The Scientific Literature
August 27, 2008
As Arthur Kornberg never tired of pointing out, cells are gels. It’s too easy for biologists and chemists to imagine cells as sort of like liquid-filled plastic bags – and while that’s an OK picture as far as it goes, it tends to make you picture the cytoplasm as a lot more dilute than it really is. Something about the consistency of gumbo is more like it – maybe even the first batch of gumbo I ever made, to the whole pot of which I cluelessly added a good strong dose of filé powder, causing it all to set up into something that could nearly be sliced like a meat loaf.
The point is, there’s nowhere near as much bulk solvent in a cell as there is in anything you’d willingly work with in a lab. And that means that a lot of things behave differently than you expect – proteins, for example. Spherical proteins are the easiest to deal with, since they’re probably going to stay that way no matter what happens to them, short of outright denaturation. (Spheres are good choices for extreme environments). But most proteins aren’t spherical, and their shapes are extremely important – how well do we understand their behavior under real-world conditions?
It’s not an easy question to answer. The standard ways to study protein structures are (1) X-ray crystallography, a rather artificial state of affairs, since proteins are rarely found in the crystalline state in vivo, (2) NMR spectra, which can be very informative but are usually taken from purified proteins in a clean buffer solution, and (3) molecular modeling. That last technique’s relation to reality depends (among other things) on the patience, skill, and computational resources of the people using it. But just making sure that you’re modeling a protein’s structure in the presence of water molecules, rather than in some sort of ideal mathematical vacuum, can be enough of a challenge. Including a stew of other proteins right around the one of interest just isn’t feasible, even if we knew which ones to put in.
There’s a recent open-access paper in PNAS that does a good job calling attention to this problem. The authors studied a roughly football-shaped protein, VlsE, which comes from Borrelia burgdorferi, the Lyme disease organism. Diagnostic tests for Lyme recognize one stretch of this protein - but the odd thing is, that region appears to be buried inside the hydrophobic core of its structure, which makes you wonder how anything could recognize it at all.
The team studied the protein under different levels of denaturing agents and non-denaturing additives, and found that several different structures present themselves under different conditions. To their evident surprise, this even agreed with their molecular modeling of the process. Both the speed of protein folding and the courses the folding takes are altered - and under cellular levels of crowding, it turns out the protein may well adopt a spherical state that exposes much more of that antigen sequence. That's shown in the illustration, where C is the structure that's suggested for real-world conditions, as opposed to A, and the antigen sequence is shown in green. (The Y axis relates to the volume fraction taken up by various crowding agents).
Drug discovery people have always been wary of the structures of membrane-bound proteins, because we don't understand much about them. We should be wary of the structures of the free-swimming ones, too - after all, they're certainly wary of us.
August 26, 2008
As all organic chemists who follow the literature know, over the last few years there’s been a strong swell of papers using Barry Sharpless’s “click chemistry” triazole-forming reactions. These reactions let you form five-membered triazole rings from two not-very-reactive partners, an azide and an acetylene, and people have been putting them to all kinds of uses, from the trivial to the very interesting indeed.
In the former category are papers that boil down to “We made triazoles from some acetylenes and azides that no one else has gotten around to using yet, and here they are, for some reason”. There are fewer of those publications than there were a couple of years ago, but they’re still out there. For its part, the latter (interesting) category is really all over the place, from in vivo biological applications to nanotechnology and materials science.
One recent paper in Organic Letters which was called to my attention starts off looking as if it’s going to be another bit of flotsam from the first group, but by the end it’s a very different thing indeed. The authors (from the Isobe group at Tohoku University in Japan, with collaborators from Tokyo) have made an analog of thymine, the T in the genetic code, where the 2-deoxyribose part has both an azide and an acetylene built onto it.
So far, so good, and at one point you probably could have gotten a paper out of things right there – let ‘em rip to make a few poly-triazole things and send off the manuscript. But this is a more complete piece of work. For one thing, they’ve made sure that their acetylenes can have removable silyl groups on them. That lets you turn their click reactivity on and off, since the copper-catalyzed reaction needs a free alkyne out there. So starting from a resin-supported sugar, they did one triazole click reaction after another in a controlled fashion – it took some messing around with the conditions, but they worked it out pretty smoothly.
And since the acetylene was at the 5 position of the sugar, and the azide was at the 3, they built a sort of poly-T oligonucleotide – but one that’s linked together by triazoles instead of the phosphate groups found in DNA. People have, of course, made all sorts of DNA analogs, with all sorts of replacements for the phosphates, but they vary in how well they mimic the real thing. Startlingly, when they took a 10-mer of their “TL-DNA” (triazole-linked) and exposed it to a complementary 10-residue strand of good ol' poly-A DNA, the two zipped right up. In fact, the resulting helix seems to be significantly stronger than native DNA, as measured by a large increase in melting point. (That's their molecular model of the complex below left).
Well, after reading this paper, my first thought was that it might eventually make me eat some of my other words. Because just last week I was saying things about the prospects for nucleic acid therapies (RNAi, antisense) - mean, horrible, nasty things, according to a few of the comments that piled up, about how these might be rather hard to implement. But when I saw the end of this paper, the first thing that popped into my head was "stable high-affinity antisense DNA backbone. Holy cow". I assume that this also crossed the minds of the authors, and of some of the paper's other readers. Given the potential of the field, I would also assume that eventually we'll see that idea put to a test. It's a long way from being something that works, but it sure looks like a good thing to take a look at, doesn't it?
Category: Biological News
August 25, 2008
You need access to vacuum if you’re going to work at the bench in chemistry. In fact, you need more than one kind. Reasonably hard vacuum (well, by our standards, which is laughable by the standards of the physicists) is down in the single Torr or below – that is, less than about 1% of normal air pressure. We use that for pulling out residues of water or organic solvents from our compounds. You can’t usually see it happening from the solid ones, but the syrupy liquids will foam up or blow a long series of thick bubbles when the vacuum is applied. The foam can be an irritating problem at times; some things will fill your flask with sticky bubbles and go right on up into the vacuum line if you’re not watching them.
The lesser vacuum lines are used for bulk evaporation of solvent (on your rotavap) and for filtering things off. We do an awful lot of both of those, too, and a full vacuum-pump pull is too vigorous for them in most cases. Evaporating down reactions is a constant task in an organic chemistry lab; I’d rather not think about how much of it I’ve done over the years. As for filtration, there are many cases where a solid product can be filtered out of the bulk liquid (which is good) or where some undesired solid by-product has to be filtered out before you can go on (not as good).
The low-tech way to get the sort of pull-it-through vacuum you need for these things is a water aspirator. You don’t see these as much any more, and you don’t see them at all in industry, since they necessarily pull solvent vapors into the water stream. But they work. An aspirator is basically a narrowing tube that hooks up to a hard-spraying water tap and has a sidearm fitting. The accelerating blast of water pulls the air in the tube along with it as it goes, creating a useful vacuum. If you wanted to make one rather more environmentally friendly, you’d keep a well-stocked dry ice condenser in line with it to trap out the solvent vapors before they go down the drain (which is what your rota-vap should have on it, anyway), but even with that, you’re always going to be turning the water flow into a waste stream. As I say, you don’t see them as much these days.
But we used them back when I was in grad school, that’s for sure, mostly for the rotavaps. If you wanted to keep things from splashing around back in your hood, you attached some rubber tubing to the other end of the thing and ran it further down the drain a bit.
Well, one day, one of the guys in the lab next door to me was shocked to see water blasting around in his hood. It was a real fountain, just geysering out full blast from what must have been a cracked water line or something in the back. He ran over and immediately shut off every tap – but to no avail. Roaring, showering water everywhere. Getting a look at the source, he realized, to his consternation, that the water was coming up out of the drain in the back of his hood. I remember standing there with him, staring at this in disbelief. It looked like a special effect. How on earth could you get water blasting up out of a drain pipe?
Suddenly it hit me. I ran around to the other side of the lab, where a new Japanese post-doc had taken up residence. “Masa”, I asked him, “Did you just put that rota-vap in your hood today?” “Yes, yes, just started it today”. There was a water aspirator flooshing away back in the back of his hood. “Did you put some rubber tubing on that thing?” “Tubing? Oh, yes” “How much?!” “Whoaaa. . .” He spread his arms to indicate the mighty extent of the rubber tubing he’d added.
Mighty, indeed. He’d run the stuff down his drain, through a horizontal pipe and right through a T joint, and back up out of the drain of the other guy’s hood, which backed on to his. So when he turned his water on full throttle, he immediately started irrigating his labmate’s space. We finally got things turned off, and trimmed back the rubber tubing to a more reasonable length (like, not seven feet), and order was restored. For a while.
Note: if you want to see How Not To Do It to a really expensive vacuum rig, try here.
Category: Graduate School | How Not to Do It | Life in the Drug Labs
August 22, 2008
The Boston Globe has a piece on the open-source science movement. Many readers here will have come across the idea before, but it’s interesting to see it make a large newspaper. (Admittedly, the Globe is more likely to cover this sort of thing than most metropolitan dailies, given the concentration of research jobs around here).
The idea, as in open-source software development, is that everything is out in a common area for everyone to see and work on. (Here's one of the biggest examples). Ideas can come from all over, and with progress coming more quickly as many different approaches get proposed, debated, and tried out. I like the idea, in theory. Of course, since I work in industry, it’s a nonstarter. I have absolutely no idea of how you’d reconcile that model with profitable intellectual property rights, and I haven’t seen any scheme yet that makes me want to abandon profit-making IP as the driver of commercial science. Of course, there's always the prize model, which is worth taking seriously. . .
Even for academic science, open source work runs right into the traditional ideas of priority and credit, and the article doesn’t resolve this dilemma. (As far as I can tell, the open-source science advocates haven’t completely resolved it, either). There’s always the lingering (or not-so-lingering) worry about someone scooping your results, and for academia there’s always that little question of grant applications. There have been enough accusations over the years in various fields of people lifting ideas during grant proposal reviews or journal refereeing to make you wonder how well a broader open-source system would work out, given the small but significant number of unscrupulous people out there.
On the other hand, maybe if things were more open in general, there would be less incentive to lift ideas, since the opportunities to do so wouldn’t be so rare. And if someone’s name is associated from the beginning with a given idea, on some open forum, it could make questions of priority easier to resolve. A subsidiary problem, though, is that there are people who are better at generating ideas than executing them – some of these folks, once unchained, could end up with their fingerprints on all sorts of things that they’ve never gotten around to enabling. Of course, that might be a feature rather than a bug: people who generate lots of ideas are, after all, worth having around. And over time, there might well be less of a stigma than there is now for someone else to follow up on these things.
The thing is, science has already been a form of open-source work for hundreds of years now. It’s just that the information has been shared at a later stage, through presentations and publications, rather than being put out there right after it’s been thought up or while it’s being generated. That’s why I always shiver a bit when I read about how long Isaac Newton waited before writing up any of his results – if Edmond Halley hadn’t pressed him to do it, he might never have gotten around to it at all, which would have been a terrible tragedy.
And it’s why stories like those told of physicist Lars Onsager strike me as somehow wrong. Onsager was famous for only publishing his absolute best work – which was pretty damned good – and putting the rest into his copious file cabinets (example here). (A related trait was that he was also apparently incapable of lecturing at any comprehensible level about his work). Supposedly, younger colleagues would come by once in a while and tell him about some interesting thing that they’d worked out, and ask him if he thought it was correct. Onsager would pause, dig through his files, pull out some old unpublished work that the new person had unknowingly duplicated, and say “Yes, that’s correct”. It seems to me that you don’t want to do that, withholding potentially useful results for the sake of what is, in the end, a form of vanity.
And although I'm not exactly Lars Onsager, this is as good a time as any to mention that my summer student, who’s finishing up in the lab this week, has been able to generate a lot of interesting data, and that I’m going to be trying to write it up this fall for publication. Readers may be interested to know that this work is based on more ideas I’ve had in the vein of the “Vial Thirty-Three” project detailed here, so with any luck, people will eventually be able to see some of what I’ve been so excited about all this time. And that’s about as open-source as this industrial scientist can get!
Category: Birth of an Idea | The Scientific Literature | Who Discovers and Why
August 21, 2008
Many readers will well remember when Merck bought the RNA-interference company Sirna in 2006. They paid over a billion dollars for them, and made the whole RNA area an even bigger field for speculation than it was already.
Another big player in that field is Alnylam, who have been making deals all over the place. Many shareholders have been waiting for someone to buy ALNY for a similarly hefty premium, but the wait has been long (and all those agreements make such an acquisition harder and harder to realize).
As that post (and this one, and this one from 2004) should make clear, I've been a bit cooler on the prospects for RNA therapies. I think the current RNA field is full of extremely interesting things, wonderful discoveries, fascinating research tools which could lead to all sorts of things - but I don't necessarily think it's full of new drugs per se. Nucleic acid-based therapies are just nightmarish to administer, and unless a real breakthrough in doing that appears, I think that (as drugs) they're always going to have their ankles tied together.
Well, Jonas Alsenas at Leerink Swann agrees, and he's not afraid to say so. According to Mike Huckman at CNBC, the firm initiated coverage of Alnylam with Alsenas saying that he thought the stock should be trading at about half its current value, and that he didn't see them developing any products for many years, if ever. And he went on to this statement, which I don't think anyone in the industry can deny:
The pharmaceutical industry is often swept by new technology fads. They are caused by sincere enthusiasm, fears of being left behind, and desperation to fill chronically depleted development pipelines, in our view.
I'm sure that the ALNY investors are not going to take this well, but hey, the truth hurts. For now, I continue to agree that modern RNA techniques are extraordinary research tools - but, in almost every case, not drugs.
Category: Business and Markets
August 20, 2008
Well, today’s subject isn’t a cheerful data set, but it certainly deserves some thought. Over at Pharmalot, Ed Silverman has some data from consulting firm AVOS Life Sciences, who have sat down to estimate how well various drug companies will do with revenue from new drugs over the next few years.
As of 2007, they have the industry average at about 77 cents coming from new products (defined as those launched within the previous five years) for every dollar lost from patent-expiring older ones. That doesn’t sound very good, but the average is a bit misleading, since it runs from the highs of Eli Lilly ($6.64/1), Amgen ($4.50/1) and Roche ($4.03/1) down to Sanofi-Aventis (11 cents new per dollar loss on the old). But it’s true that most everyone else is well under a dollar. It would be a lot of work, but it would be interesting to know (calculating by the same methods) how that ratio has changed over the last twenty years – that would give us some perspective on where we stand now.
But AVOS has gone on to estimate the picture in 2012, and it makes today’s numbers seem like a free buffet. Of the fourteen drug makers on their list, only Schering-Plough shows a robust increase in terms of how much it’s expected to make from new products versus its declining ones. GSK shows a modest improvement – and everyone else goes down.
That’s as in down, dooby doo, down down. The hardest-hit in terms of the actual numbers are Pfizer, AstraZeneca, Roche, and Sanofi-Aventis, all of whom are projected to be making pennies (or, gulp, nothing at all) from new products compared to what’s heading down the chute for them by then. In percentage terms, Roche and Eli Lilly are worst off – they look good now, as mentioned above, but the eventual losses of things like Zyprexa kick the ratios over good and hard. (Sanofi-Aventis goes down to zero, but only from that $0.11 figure, so it’s at least not going to be such an adjustment for them!)
As I say, I don’t have access to the underlying data, but the broad picture seems about right. There are a lot of big patent expirations coming up in the next few years, and not enough promising products coming on to replace them. According to AVOS, Roche and Sanofi-Aventis aren’t projected to have any new product launches at all between now and 2012, which can’t be good.
It’s worth remembering that figures like these are likely to show big swings even under normal conditions. Imagine a company with a big product that it launches, which gradually turns into a blockbuster. Near the end of its patent life, it launches another winner of the same type, which grows into another big seller. Everything’s fine! But the ratio of new revenue/expiring revenue is going to swing around a lot as you follow those sales numbers, sort of like derivatives in calculus, veering from too-high to too-low, although the company itself is sailing along pretty well. Let’s hope that this is some of the background for these numbers as well. The problem is, I don’t think that can explain all of them. . .
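You can see the whipsaw with a completely made-up two-product company. The revenue curve below (a five-year ramp to peak, a flat on-patent stretch, and a two-year generic cliff twelve years after launch) is my own toy assumption, not the AVOS methodology:

```python
def rev(t, launch):
    # Toy single-product revenue curve: 5-year ramp to a peak of 1.0,
    # patent expiry 12 years after launch, then a 2-year generic erosion.
    a = t - launch
    if a < 0:
        return 0.0
    if a < 5:
        return (a + 1) / 5
    if a < 12:
        return 1.0
    return max(0.0, 1.0 - 0.5 * (a - 11))

launches = [0, 9]   # second winner launched near the end of the first's patent life
for t in range(10, 23):
    new = sum(rev(t, L) for L in launches if 0 <= t - L < 5)           # launched in last 5 yrs
    lost = sum(max(0.0, rev(t - 1, L) - rev(t, L)) for L in launches)  # revenue rolling off
    ratio = f"{new / lost:.2f}" if lost else "n/a (no expirations)"
    print(f"year {t:2d}: new={new:.2f} expiring={lost:.2f} ratio={ratio}")
```

In years 10-11 the ratio is effectively infinite (new revenue, nothing expiring), in years 12-13 it reads a comfortable 1.6 to 2.0 as the second product covers the first one's cliff, and by year 21 the very same pipeline cadence reads as a flat zero, since the second winner is no longer "new" when its own patent goes. The metric careens between extremes by construction.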
Category: Drug Development | Drug Industry History
August 19, 2008
I wanted to recommend this post by Milkshake over at Org Prep Daily (and not just because he liked the recent column I wrote for Chemistry World). I was writing about the limited number of reactions that some med-chem labs get locked into, and the effect of this both on the compounds that get made, and on the motivation of the chemists. Milkshake has a good set of recommendations on how to avoid the boredom trap, and I recommend checking them out. He ends with the following:
You should care about the chemistry methodology and do things not just to crank out the final compounds to fill up the testing queue. Your boss (has) perhaps lost all his chemistry interest already and maybe he is unnerved about the project progress and pushes people hard - but while you try not to get fired you don’t necessarily want to think like your boss (and end up wretched). If you continue to look at your research project with curiosity and do things also for the sake of your chemistry interest you are likely to be more original because thinking about the methodology will suggest new directions in your medchem project. You may get accused of playing with chemistry and going off-tangent but you will likely remain more content and productive. . .
And this is all true. Most projects need some oddball compounds thrown into them, to keep things interesting (and honest), and it’s the people who are keeping up with the literature who will probably make them. I went through a period some years ago when I didn’t stay current with the journals very well, and if I’d let that continue to slide, it would have had a bad effect. (RSS was one of the things that saved me!)
But there’s another very good reason to stay sharp and run the unusual reactions: the boring reactions are increasingly going to be shipped to someone else, someone who probably works in a very different time zone. Yep, this is my “give ‘em something they can’t get in Shanghai” talk again. The outsourcing shops are there to pound out molecules as quickly as possible, and they’re going to use well-established chemistry as much as they can. Now, that’s the same pressure that operates in most med-chem projects, but I strongly recommend differentiating yourself if possible.
Be the person who runs the new stuff, who reads the literature and adopts things quickly, and who makes compounds that aren’t like all the stuff that’s already in the screening deck. You don’t have to go completely crazy, you know. There are plenty of good, reasonable structures that no one else is making at your company – have no doubt – and if you’re the person who makes them and who introduces new chemistry into the department, you have something with which to justify your salary (or a higher one!) On the other hand, if you’re the person who cranks out the sulfonamide libraries, well. . .they can get that cheaper somewhere else, you know.
Category: Life in the Drug Labs
August 18, 2008
So Genentech has told Roche to get lost – well, to a first approximation, anyway. I think what they’ve actually told them is to go open their Swiss wallets wider. What it comes down to now is how highly Genentech values itself versus how much Roche is willing to pay – the balance between those two will determine how things go. And then there are the large shareholders in Genentech to consider – if their idea of a good price clashes with the figure that Genentech’s board has in mind, then things could get more complicated. (And if the US dollar continues to climb against the Euro, that could complicate everyone's calculations, too - at the very least, it's speeding things up).
Personally, I think that Genentech is better off being left alone. But that’s no surprise – in a lot of the M&A deals I’ve seen in the industry, particularly between large companies, I’ve thought that the participants should have stayed home and spent their money elsewhere. A personality defect, to be sure, and clear evidence that I’d never make it at an investment bank.
The reasons I think that Genentech is better off unmolested are probably the same ones that its own employees have. The company seems to have a good research culture going – they’ve been productive and willing to take risks, which is all you can ask of a drug discovery organization. Roche, for its part, isn’t exactly an Evil Empire, but they’re not Genentech. And that, I think, is what gets me about most of these deals. I think that there is no one best way to do drug discovery, since the problems we face are so varied. And that means that the more different approaches there are being tried, the better. We need a healthy ecology in this industry, and the closer we get to a monoculture, the worse off I think we’ll be. I think that Genentech has something to offer all its own, and that it’s in danger of being lost if Roche buys (and Roche-ifies) the place.
Some people out there are worried more than others. Roche doesn’t have as much experience in biologics, so they’ll want to retain the protein groups. (The question is whether they'll want to work for Roche!) But Genentech has also made a push into small molecules in recent years, and medicinal chemistry might be an area that Roche feels it has enough of already – they’re not buying Genentech for small molecules, after all. We’ll see over the next month if they’re buying Genentech at all. . .
August 15, 2008
Just wanted to let people know that yes, I'm still out here. I've returned from vacation, and am dealing with the usual catch-up on everything that's going on. That includes a flood of interesting data at work, thanks to my summer student, which is always nice to come back to!
Regular posting will resume on Monday, and we'll get back to what passes for normal around here.
August 5, 2008
I've just been told (by a reliable source) that something big is up with the Roche-Palo Alto site. I don't know if this is part of their bid for Genentech or what, but the word "closing" has been mentioned. I hate to pass on news like this with no more details, but something does appear to be going on. Anyone with more details, please add them in the comments section.
So much for not posting on my vacation - I haven't even finished packing for my flight yet. What a year this is for the industry, and it's only August. . .
I wanted to let people know that starting tomorrow I'll be taking some vacation time. Internet access will be rather limited - I'll be checking my mail some in the evenings, but there will be no posting until the middle of next week. Science will have to march on without me for a few days!
August 4, 2008
As mentioned in the comments here (and as told to me by e-mail as well), a lot of Genentech employees are looking around for other options in the face of a possible Roche takeover. A lot of Genentech employees – some other Bay Area biotechs are apparently seeing shoals of CVs coming in. Does that ever give an acquiring company pause, when people start diving over the sides at its approach? I suppose it depends on whether they’re in it to buy the current pipeline or to buy some research productivity. But surely Roche wants some of the latter? If they do finally succeed in buying Genentech, what will they have bought by the time they finish?
And while we’re on the job-seeking topic, I’ve heard about some possibilities for ex-GSK people (and others out on the market from the various recent layoffs). Merck is hiring at their West Point, PA site, for one. EMD-Serono is expanding and looking for people in Rockland, MA. And a rare drug-discovery opportunity outside the industry is also available at the NIH Chemical Genomics Center. I have contacts for these if people want to send in CVs directly - just e-mail me and let me know.
August 1, 2008
The ax is falling again at GlaxoSmithKline. This time it’s the oncology group.
Last month the cardiovascular people got this same treatment, you’ll recall, and there was some disagreement about how many jobs were being affected. But it looks like the company is moving one by one through its Centers of Excellence in Drug Discovery (CEDDs) and running a most excellent scythe through them. By the time they’re through, the total number of layoffs looks like it will be substantial indeed.
That’s because inside each area so far the cutbacks are pretty sweeping. Total oncology head count is apparently being reduced by about 40%. Discovery chemistry seems, unfortunately, to be getting it a bit worse, since some of the sub-areas aren't losing head count at all. The estimates I have are that of the c. 120 chemists in the area, about 60 are losing their jobs. That includes the entire oncology med-chem group at the Research Triangle Park location, and from what I'm told, none of them are being relocated to the Philadelphia-area sites. So much for discovering Tykerb, et al.
Are all of the CEDDs going to get this same treatment, or to the same degree? GSK isn’t saying, but I’d certainly bet on this sort of thing happening again as the year goes on. What the company’s research arm will look like when it’s all over is anybody’s guess, too, but there’s one thing for sure: it’ll be a heck of a lot smaller.
And whether this new trimmed-down inlicensed/outsourced GSK will be any more productive is anybody’s guess either. But we won’t know that for a long time. It’ll take quite a while just for all of these changes to stop reverberating through the company, for one thing, and then it’ll be several years after that before it’ll be possible to look at the pipeline and have a majority of it be a product of the new organization. As I’ve said before, this is one of the biggest challenges in trying to engineer a large-scale change in a drug discovery shop – the lag time before you see the effects.
I’m already seeing resumes, but I’d like to invite any readers who know of openings for experienced drug discovery positions to either mention them in the comments or email me about them for a future post. (I did a lot of that during my own experience with a site closure, but of course, this time I don’t know most of the people involved personally). At the rate things are going, I’m going to have to start running classified ads down the right side of the page.