About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis, and other diseases.
To contact Derek email him directly: email@example.com
March 30, 2015
Rather quickly, we have some results from that conflict between the New York State Attorney General and the herbal supplements retailers. GNC has reached a deal with the office, and will (1) use DNA barcoding assays to verify that their plant-material supplements are actually the plant on the label, (2) state in its stores and on its website whether a given supplement contains plant material or an extract of same, and (3) ". . .spotlight that extracts are chemicals derived from plants after applying solvents such as liquid carbon dioxide. . ."
That plant matter/plant extract business is in response to one of the counterarguments brought after the original findings - merchants pointed out that the extract might well not contain any of the original source's DNA. This makes me wonder if some other supplement brands might see a marketing opportunity, if GNC is going to tell people that their pills contain chemicals extracted by solvents. Nothing yet from the other stores mentioned in the NYAG report (Target, Wal-Mart, Walgreens), but I wouldn't be surprised if similar deals are in the works.
Category: Analytical Chemistry
Here's a look from Bayer at their experiences providing vitamin D receptor ligands to the academic community over the last twenty years (which takes the compounds back into their Schering AG origins). Overall, they've found that the compounds seem to have helped the various research groups along quite a bit, and have led to a good number of publications (with about a 33% conversion rate). This comparison was interesting, I thought:
[In 2003-2013] a total of 121 requests were received (Fig. 1c). Briefly, to request compounds, the academic group completes a one-page non-confidential research project plan that is then reviewed by Bayer scientists. In addition, the academic institution has to review a sample transfer agreement (STA). The compound is dispatched if there are no reasonable objections against the research plan and if the STA is signed and returned.
The overall success rate for requests was high (71%, or 86 of 121 requests), although there were regional differences. Requesters from the United States had a lower chance of receiving the compound than did requesters from other countries. . .Based on archived e-mail communications, the review and execution of the STA was the main cause of delay. Quite frequently, US institutions had either questions or concerns about the STA, or the US investigator never returned the STA template to Bayer.
A first request from an academic scientist for a compound from the VDR ligand cluster took an average of 99 days to fulfill (data on 19 requests available). This time was mostly spent on processing the STA. A second request from the same scientist took an average of just 57 days to fulfil (n = 11 requests). Even when successful, requests from the United States took substantially longer, with an average request time of 152 days (n = 4 requests) compared with 85 days for non-US institutions (n = 15 requests).
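The quoted figures hang together nicely, and they're easy to sanity-check. A quick sketch (all numbers taken from the excerpt above):

```python
# Sanity check on Bayer's quoted request statistics (figures from the excerpt).
granted, total = 86, 121
success_rate = granted / total                    # matches the quoted 71%

# First-request fulfillment times: US (n = 4) vs. non-US (n = 15), 19 total
us_days, us_n = 152, 4
other_days, other_n = 85, 15
overall = (us_days * us_n + other_days * other_n) / (us_n + other_n)

print(f"{success_rate:.0%}, {overall:.0f} days")  # the weighted mean lands on ~99 days
```

The weighted average of the US and non-US first-request times comes out to about 99 days, which is exactly the overall figure the authors report, so the subgroup numbers are internally consistent.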
They say near the end of the article that they've now revised the language of the STA, but as far as I can see, they don't say if some part of it in particular was a common sticking point, or what changes they made. That seems like it would be a worthwhile piece of information; I'm surprised that it was left out. . .
Category: Academia (vs. Industry)
The fake peer review scandals just keep on coming. BioMed Central noticed problems with a few papers last fall, but they've now had to pull 43 papers from their journals:
Some of the manipulations appear to have been conducted by third-party agencies offering language-editing and submission assistance to authors. It is unclear whether the authors of the manuscripts involved were aware that the agencies were proposing fabricated reviewers on their behalf or whether authors proposed fabricated names directly themselves.
That's being very diplomatic. I would guess that the odds are very high that the authors involved either personally suggested fake friends to do the reviewing, or knew that they were paying someone to suggest some, or didn't care much one way or another as long as their paper got published. They're paying some agency to get that to happen, so why should they concern themselves with the details of how the goods are delivered?
What we have is a counterfeiting problem. In too many places, the currency of a scientific career is the number of papers that are attached to a person's name. And as with any valuable currency, the incentive exists to pass off fake versions of it as the real thing. Base metals are mixed into the coins; paper notes are copied. In some countries, generating a list of publications is the equivalent of printing off stacks of hundred-dollar bills down in the basement. In these days of modern times, as the Firesign Theatre guys used to say, we now have third parties who will let you time-share on their basement printing press. You chip in for the ink and paper, and they'll run you off some notes.
Counterfeiting goes on as long as these notes are valuable. So as long as there are places that count papers for promotion, tenure, etc., there will be people faking papers and faking the methods to get them published. I applaud the efforts of various publishers to try to police this kind of thing, but the real problem is on the demand side.
Category: The Dark Side | The Scientific Literature
March 27, 2015
MIT's Bob Langer has another idea: he's looking to change the way that entire courses of treatment are dosed. What if an extended-release formulation, for an oral drug, was really extended release? Days? Weeks? Instead of taking pills every morning, what if you took one, well, pill-like thing that emitted the drug substance on schedule, inside you?
That would certainly help the patient compliance issue, which is a bigger issue than you might think. But while there are "depot" formulations for such slow release, none of them hang around in the stomach. It's a challenge, but Langer and co-author Giovanni Traverso make the case that it's doable.
There are two basic engineering approaches to extend transit time. One is to slow a drug's passage through the GI system by using devices that increase friction with the gut's mucosal walls [4]. The other is to prolong retention by loading drugs into devices that are larger than the points in the gut that limit passage of materials beyond a certain size: the pyloric sphincter at the exit of the stomach and the anus [5]. Also crucial to the successful development of such technologies are ways to ensure that drugs survive the harsh GI environment.
If you're going to leave something along the way, the stomach seems to be the best bet. There are gastric balloons, as used in weight loss therapy, and many cases in the medical literature of what are known as bezoars, indigestible masses that persist without apparent harm (unless they get too large). Eating a large amount of fresh persimmons in one sitting, by the way, is one of the better-defined causes of bezoar trouble that does not involve mental problems - don't say you don't learn stuff around here. (That link is fine, but I assume that you might not want to do a Google image search on that term. Not going to try it myself). You will be interested to know that there are recent reports that these persimmon masses can be treated nonsurgically by ingestion of substantial amounts of Coca-Cola.
Back to drugs! Another key feature of this idea will already have occurred to some readers: you're also going to want the ability to demolish any such device or construct on demand, in case there's a bad reaction to either the drug or the delivery system. That's a significant complication - not a show-stopper, but certainly a show-slower. Ideally, such a dosing system would be simple and robust enough to be deployed in countries with less-developed health care systems, where patient compliance is especially difficult and especially hard to monitor. If this gets realized as some sort of expensive high-tech thingie that requires you to be within walking distance of a good medical center, its impact will be rather muted.
That said, Langer would seem to be just the person to get something like this going - drug delivery and materials science are longtime interests of his, and he can most certainly raise money and round up resources. This effort will be interesting to watch, but it's probably going to be a long one.
Category: Drug Development | Pharmacokinetics
March 26, 2015
This is a neat little structure, and it's truly a pain to synthesize (12 steps from myo-inositol). But it now seems to hold the record as the most polar aliphatic compound ever measured, and well it might. David O'Hagan of St. Andrews reported it at the ACS meeting in Denver - one side of that molecule is full of electronegative fluorines, and the other side has what's left. In bulk, this would be a very unusual solvent to play around with in chromatography and the like, but I don't think we're going to see any four-liter jugs of it around the lab any time soon.
Update: annoyingly for my speculations, the compound appears to be a solid. Maybe the smaller-ring analogs?
Category: Chemical News
You know, I'm not sure I should have cut Atomwise so much slack the other day. I just came across this piece on them, and. . .well, I'll let it speak for itself:
“Here I am just sitting in this house and I’m able to predict a cure to measles,” co-founder of Atomwise Alex Levy tells me over the phone from his apartment in Mountain View, Calif.
Atomwise, a health tech startup in the current Y Combinator batch, has launched more than a dozen projects in the last year to find cures for both common and orphan diseases – diseases that would otherwise be too expensive and time-intensive to tackle. It’s working with IBM to find a cure to Ebola and with Dalhousie University in Canada to search for a measles treatment. Levy says the startup went through 8.2 million compounds to find potential cures for multiple sclerosis in a matter of days.
. . .“It’s like having a virtual super intelligent brain that can analyze millions of molecules and potential interactions in days instead of years,” Levy says. . .“You still have to test but you take out all the guesswork before you get started.”
OK, guys, this is what's known as hubris. That's a term from the ancient Greeks, and it translates to something like "overconfident pride" in modern usage, and it's a good way to summon the goddess Nemesis. We get to meet her a lot in this business. Retribution lands on us several days a week around here in drug research, as you will note from our success rates in the clinic. But this sort of "Gosh, I'm curing diseases right here at my kitchen table" talk is pushing it even further, inviting a visit from another ancient Greek chick, the goddess Atë. She's in charge of delusional folly, and pitches in on the retribution as well. Atë's the one that Shakespeare has helping out Caesar's spirit in Mark Antony's speech, you know, cry "Havoc" and let slip the dogs of war and all that stuff.
What I'm getting at here is that this interview is the kind of publicity you don't need. The sorts of people that you're going to have to work with to get a drug project off the ground are going to roll their eyes (at best) in reaction to it, and it's going to be hard to get anyone to take you seriously if you go on in this vein. This work is hard enough as it is, without making it even harder.
Update, from the comments: "Alex Levy here, the fellow quoted in the piece, and longtime Pipeline reader. To Derek and most commenters, thanks for the critique. Simply put: we agree. We’re always looking for better ways to communicate about Atomwise, and computational drug discovery generally. It's challenging to help the press and public differentiate hits from cures, and evidence from proof. What a system like Atomwise actually does can be pretty opaque to a general audience.
We’re skeptical scientists too, and do extensive retrospective and prospective validation studies internally. We hope to publish more as time goes on, and are excited to share those results with the community here. Wouldn't it be nice if Atomwise works even half as well as it sounds on TechCrunch?"
Category: Drug Development
March 25, 2015
Here's an interesting report in the Wall Street Journal on plans to run a large clinical trial with metformin. That compound has a lot of effects, and many of them seem as if they could be beneficial in an aging population.
Dr. Barzilai expects to enroll more than 1,000 elderly participants in the randomized, controlled clinical trial to be conducted at multiple research centers and take five to seven years. The project is in the preliminary stages and permanent funding hasn’t yet been secured. Funding for the planning phase is coming from the American Federation for Aging Research, a nonprofit organization of which Dr. Barzilai is deputy scientific director.
The trial aims to test the drug metformin, a common medication often used to treat Type 2 diabetes, and see if it can delay or prevent other chronic diseases. (The project is being called Targeting/Taming Aging With Metformin, or TAME.) Metformin isn’t necessarily more promising than other drugs that have shown signs of extending life and reducing age-related chronic diseases. But metformin has been widely and safely used for more than 60 years, has very few side effects and is inexpensive.
I hope this gets off the ground, for just those reasons. The study itself will not be cheap, but (as the article notes) it could pioneer some ways of looking at aging in the clinic, and we need for people to be taking steps in that direction. The planet's population, on the average, is not getting any younger, as birth rates level off (or plunge outright), and healthy lifespan is a bigger and bigger issue.
Sandy Walsh, an FDA spokeswoman, said the agency’s perspective has long been that “aging” isn’t a disease. “We clearly have approved drugs that treat consequences of aging,” she said. Although the FDA currently is inclined to treat diseases prevalent in older people as separate medical conditions, “if someone in the drug-development industry found something that treated all of these, we might revisit our thinking.”
As well they might. This is worth keeping an eye on, for sure.
Category: Aging and Lifespan | Clinical Trials
Here's a followup to something I wrote in 2012. That was when a joint venture called TransCelerate was announced to address precompetitive drug development issues, clinical trial design, and so on.
Someone asked me the other day what had come out of this, and I had to admit that I was stumped. So I'd like to throw the question open to the readership. Does anyone know of some things that can be pointed to that have emerged from TransCelerate? There was a progress report of sorts in Nature Reviews Drug Discovery, which mostly seems to have covered the venture's first year in operation. Otherwise, details have been scarce, from what I can see.
Category: Clinical Trials | Drug Development
Since I've mentioned a recent book on academic drug discovery, I also wanted to highlight this article at Nature Biotechnology on university tech transfer offices (TTOs). The authors, mostly from a list of high-profile research universities (Oxford, Stanford, UCSF, University College London, Harvard et al.) and part of the Oxbridge Biotech Roundtable, have a lot of good advice for academic entrepreneurs who are going to have to negotiate a pretty complex landscape.
First-time academic bioentrepreneurs frequently confront a deal-making process they do not completely understand, in part because final deal terms are often held confidential, making it difficult for outside observers to understand fair market terms. At the same time, bioentrepreneurs sometimes do not fully recognize how the interests of the TTO and university may diverge from their own. In fact, one postdoc we spoke to said, “Knowing what I know now about my university's licensing process, I might have considered a different university for my postdoc.”
TTOs arguably benefit from information asymmetries in negotiations, whereas experienced bioentrepreneurs fear that disclosing details of prior deals could jeopardize their ongoing relationship with their TTO. Thus, they often stay silent on just how they got their asset outside the university walls.
Inexperienced faculty members, heading into lengthy negotiations in legal areas that they may not understand well, are certainly at a disadvantage. And what this article makes clear is that it's not just the slick operators from the biopharma business world that need to be watched out for, it's also the slick operators at the university itself. The interests of the institution and the interests of the inventor may well diverge, which is a particularly interesting situation when the inventor is an employee of said institution.
The article notes a number of differences between the US and the UK in these matters, a big one being that UK universities seem to expect a much larger equity stake in any spinout companies, but some fundamental advice is the same. Recognize everyone's own interests in the negotiations, and adjust your thinking accordingly. Get yourself the best and most experienced legal counsel you can find to look over the proposed IP and business arrangements. Remember that almost everything is negotiable, and a lot of it is renegotiable (but also remember that this means that you're going to have to be flexible on some issues yourself). And get ready for everything to take longer and cost more money than you were prepared for at the start - it's worse than renovating a kitchen.
Update: see the comments section - the university equity numbers in this article are being disputed there.
Update #2: the authors are apparently working on correcting this issue, with new tables forthcoming.
Category: Academia (vs. Industry) | Business and Markets
March 24, 2015
Last year I mentioned reports that the startup incubator Y Combinator was thinking of getting into the biopharma field. Here's a look at the current crop of potential companies.
One thing that stands out is that most of these seem to be focused on patient care or some sort of diagnostic. One exception is 20n, which is looking to engineer microbes to produce known pharmaceuticals or intermediates. That's not at all a crazy idea, but the example given on the site (acetaminophen) is not a particularly compelling one by itself, since it's extremely easy to make, from cheap precursors, on an industrial scale. And I'm not sure what to make of that "map of every chemical that can be made biologically". It's a nice graphic for the middle of the page, but there's no telling what it means. I like a lot of the ideas kicking around in the synthetic biology field, but I can't really say what 20n is up to yet.
The other one on the list I noticed is Atomwise, and I'll let them speak for themselves:
Medicines are getting more expensive to develop. These days, it takes about $1.8 billion and 15 years for a single new drug. Atomwise aims to change that by using supercomputers to predict, in advance, which potential medicines will work, and which won't. Our tools can tell the difference between great drug candidates and toxic ones, and discover new uses for old medicines.
Actually, what will speak for themselves are the results. If Atomwise can do this, at all, even poorly, then there are billions of dollars waiting out there for them to scoop up. But just the act of saying that you can do things like this makes me suspicious that they really can't do things like this. Here's a bit more:
Previous attempts haven't always met expectations. The techniques of the day were limited by the knowledge and computers available. Today, things are different. We have invented cutting-edge machine learning algorithms that are built specifically for the world's most powerful computers. We use one of the world's top supercomputers to analyze databases 1000 times larger than those used in the past. This lets us deliver what many others can't: precise and reliable medicinal predictions.
I could go on about this for a while, but in the end, these arguments are settled by data. Come on down and try it, guys. There's plenty of room, and plenty to work on, so let's see what you can get done. I'll be watching with interest, and so will others.
Update, from the comments: "Hello Everyone, I'm the CEO of Atomwise and a long-time semi-lurker here. (I'm the anonymous who keeps asking what it would take to convince people that in silico methods work.) I agree that the proof will come from data; that's one reason why we're doing a large-scale evaluation with Merck. Send me an email (firstname.lastname@example.org) and I'd be happy to present our data from previous prospective validation projects to you. Or, if you have some minimally proprietary data against which you'd like to evaluate our predictive capability, let's run the experiment. Best, Abraham Heifets"
Category: Biological News | Business and Markets | In Silico
These are words that you really like to hear: "stopped for efficacy". That's Merck's situation with their anti-PD-1 antibody Keytruda (pembrolizumab), which was in a clinical trial in advanced melanoma patients versus Yervoy (ipilimumab), which targets CTLA-4. Couple this with the kinds of data that Bristol-Myers Squibb and others are generating, and PD-1 looks like it's justifying its hype (which has been significant).
This antibody came from Organon, which was bought by Schering-Plough, which was bought by Merck, so it may be the main thing that Merck gets out of the whole deal. The cancer immunotherapy wave is showing no signs of breaking.
Category: Cancer | Clinical Trials
Here, folks, is someone to explain to you that "lemon water is very alkaline" (a direct quote), and that cayenne pepper has been "proven to boost your metabolism", and that "an acidic body promotes disease" and. . .oh the hell with it, I can't keep reading that crap. It's the Food Babe, of course, profiled in New York magazine, and it's the usual geyser of nonsense.
Is there any use in pointing out that the body regulates its own acidity and alkalinity very tightly? And that anyone talking about how disease is somehow related to a systemically out-of-whack body pH is almost certainly a fool or a con artist? Or that lemons are actually acidic, a fact known to many fifth graders? Or that someone who sets themselves up as a beacon of good sense and sound nutritional advice and who still doesn't know any of this might perhaps be just a bit out of their depth?
Why no, there isn't. Because anyone who would do such a thing is clearly an evil person who eats bowls of industrial waste for breakfast, whose liver remains unstimulated, and whose lemon water won't make him alkaline no matter how many gallons he downs. But soaked in chemicals as I am, I must somehow find the motivation to carry on.
Category: Snake Oil
March 23, 2015
I would definitely not want to be downwind of the release of five tons of titanium tetrachloride. This happened near Montreal over the weekend, and things seem to have turned out a lot better than one might have imagined (only two people hospitalized).
For those who haven't worked with it, "tickle-four" fumes wildly on contact with moist air, as it hydrolyzes to HCl and a haze of titanium dioxide. (The commercial solution in methylene chloride doesn't give you the true experience; connoisseurs insist on the neat liquid). I once saw someone nearly drop a liter-sized glass bottle of the stuff, and he had to sit down for a minute after that one. "I think we might all have had to leave for a little while" was his comment.
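For a sense of scale, here's a back-of-the-envelope estimate of the HCl that five tons could produce. This assumes complete hydrolysis (TiCl4 + 2 H2O → TiO2 + 4 HCl), which a real spill won't achieve, so treat it as an upper bound:

```python
# Upper-bound estimate: HCl from complete hydrolysis of 5 metric tons of TiCl4.
# TiCl4 + 2 H2O -> TiO2 + 4 HCl
MW_TICL4 = 189.68  # g/mol
MW_HCL = 36.46     # g/mol

ticl4_g = 5 * 1e6                          # 5 metric tons, in grams
mol_ticl4 = ticl4_g / MW_TICL4             # ~26,000 mol
hcl_tons = 4 * mol_ticl4 * MW_HCL / 1e6    # 4 mol HCl per mol TiCl4

print(f"{hcl_tons:.1f} metric tons of HCl")   # roughly 3.8 tons
```

Nearly four tons of hydrogen chloride, as a worst case, which makes the "only two people hospitalized" outcome look even luckier.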
Since no one seems to have been seriously hurt, I'll mention that the other thought that the sheer size of this leak brings to mind is a line from an old "Pogo" comic strip. "Say something weighty", Churchy La Femme begs Porkypine, who looks at him and deadpans "Fourteen ton of bi-toomi-nous coal". Churchy objects, saying that here he is, wanting to have a serious conversation, and asks again for Porkypine to say something even weightier. "Fifteen ton of bi-toomi-nous coal" is the reply. Five tons of titanium tetrachloride is a lot. I'm glad that the whole incident wasn't far worse.
Category: Life in the Drug Labs
This does not seem to me like a great advance in intellectual property law: this Bloomberg article says that some large investors are using the patent challenge process for their own purposes:
Taking advantage of new rules created by Congress three years ago, hedge funds have increasingly been filing challenges to pharmaceutical patents. Some may be angling for payouts to drop their claims, while others are shorting the stock, betting that the manufacturers’ shares will plummet.
Activist investor Kyle Bass sent a shudder through the drug industry earlier this year by embarking on a patent-challenging strategy. Now a New York hedge fund, Ferrum Ferro Capital LLC, has made an even more brazen move by seeking to invalidate an Allergan Inc. patent that has already been upheld in court. Neither investment firm has said whether it’s betting against specific stocks.
I'm generally a lot more sympathetic to short-sellers than most people are. I've gone short myself at times (although not in recent years), and I think that the markets need people who are attuned to the downside. This, though, seems to be crossing a line. The challenge only costs $23,000 or so under the rules put in place in 2012, so it's a weapon just sitting there ready to be used. And it appears to be getting used about five times as much as the PTO expected. An investment fund generally wouldn't have had the standing to challenge a patent in court, but now the door is open.
We'll see how much of a problem this turns out to be. The challenge system was loosened up in order to deal with yet another problem, patent trolling, where people use portfolios of overly-broad issued patents to shake down everyone within reach. And that's not a productive activity, either, but we may have to aim for a process that doesn't allow new shakedowns to arise in the place of the old ones.
Category: Business and Markets | Patents and IP
March 20, 2015
Well, if you needed any more proof about how powerful and easy to use the CRISPR/Cas9 technique is for gene editing, take a look at today's headlines on it:
A group of leading biologists on Thursday called for a worldwide moratorium on use of a new genome-editing technique that would alter human DNA in a way that can be inherited.
The biologists fear that the new technique is so effective and easy to use that some physicians may push ahead before its safety can be assessed. They also want the public to understand the ethical issues surrounding the technique, which could be used to cure genetic diseases, but also to enhance qualities like beauty or intelligence. The latter is a path that many ethicists believe should never be taken.
Someone's going to take it, though. Maybe not in this country, maybe not openly. But this gene editing technology is the most powerful technique yet for doing such things, and I have no doubt that it's going to be used for human germ-line editing followed by in vitro fertilization, etc. I'm pretty sure that I don't like the idea, either, not at this stage of our knowledge, but those scruples are not going to be universally shared. We're going to end up having this debate sooner than we think.
Note: here's a similar call for restraint using the zinc-finger editing technique.
Category: Biological News
If you're a rabid biotech investor, you already know all about Biogen's data this morning. If you're sane, or insane in some other, more interesting direction, then here's what's up: last December, the company released some Phase I data suggesting that their amyloid antibody, BIIB037 (aducanumab), was showing better-than-expected responses.
Some of you may be wondering about that. Clinical responses? In a Phase I study? Well, there's nothing stopping you from collecting the data, and in this case, since Biogen was dose-responsing the subjects, they looked at the amounts of amyloid (using Lilly's imaging agent for this purpose) and also gave the patients standard cognitive tests to see if something was happening. That's what has everyone so excited, because there does appear to have been a dose-response, by both measurements. I'm glad to hear it - there is a huge need for an Alzheimer's therapy, of course, and the failure rate has been so brutal in the clinic (arguably 100% - there are approved drugs, but they don't do much).
Here's more on this from Matthew Herper at Forbes. The thing is, although this is good news, many investors are taking it as incredible, world-changing news. Biogen stock has been gaining over the last few months on expectations, and today's data will probably keep the party going. That's an awful lot to pin on a bunch of Phase Ib data, I'd say. Especially since this is Alzheimer's. The only phase that matters in Alzheimer's is Phase III.
Biogen knows this - they're skipping right ahead to it, and good for them for taking the risk. But a risk it is. Every other Alzheimer's antibody trial has failed, even though some have tried to pretend otherwise. The data are not in yet on the more preventative trials, where you hit the earliest possible patient population, true. But the results have not been encouraging. Either Biogen is really on to something here (which is what the investors are betting on) or they're in for a crashing disappointment in a few years. Yep, a few years - that's how long it's likely to take to run a large, well-controlled Phase III in this area.
There are other things to think about. One big issue is that the sample sizes are very, very small. Adam Feuerstein (on Twitter) noticed that the cognitive decline in Biogen's placebo group was larger than one might have expected. And there does seem to be some brain swelling in patients who are APOE4 carriers. All of these things are going to have to be worked out, slowly and expensively, so if you see headlines that Biogen has cured Alzheimer's, don't believe them - yet. I hope they have, but no one knows that, or can know that, for some time to come.
As a side issue, the excitement in the company's stock is part of the general biotech stock run of the last two or three years. It has been a tremendous market, that's for sure - buying the biotech index in 2010 would have been a pretty slick move, although you'd have also lost some hair by now wondering how long the party would continue. My guess is that the people who did buy it back then have already sold and regretted it, and maybe more than once. As someone who works in the industry, I've certainly benefited from the runup, so by warning about investor euphoria I'm actually working against my own short-term interests!
+ TrackBacks (0) | Category: Alzheimer's Disease | Clinical Trials
March 19, 2015
I have a book review out in Nature Chemical Biology of A Practical Guide to Drug Development in Academia. As you'll see, I liked it, finding it a very useful guide to real-world drug discovery for people who are interested in what they'd need to do to get into it.
Here's a quote that some readers may well appreciate:
We - as academics - are often under the impression that drug development, unlike our own basic research, is a rather mundane and straightforward process. In fact, those of us who have spent time in the biopharmaceutical industry have found that drug discovery and development lies at the intersection of basic research and applied science and requires a great deal of creativity and rigor. Exceptional scientists populate both the biopharmaceutical industry and the regulatory agencies. Drug development can be every bit as challenging and require even more persistence than traditional academic research.
Beyond these general points, there's a lot of very solid practical advice in the book. I definitely recommend it to academic researchers, and to people starting out in the drug industry who'd like an overview of the whole process as well.
+ TrackBacks (0) | Category: Academia (vs. Industry) | Drug Development
How long does a drug molecule stay at its site of action? How long does it need to? These questions have had a higher profile in recent years, as more drug discovery teams pay attention to the idea of residence time. This is a short review, with an impressive list of examples from the recent literature, and it's a good place to get caught up on the topic.
You'd think that residence time would be closely related to the binding constants that we already measure - high potency would equal long residence time - but it's not always that straightforward. Even rather similar structures (with rather similar potencies) can have significant differences in off-rates and residence times, for reasons that are not always easy to work out unless you have a lot of high-quality structural and kinetics data (and sometimes not even then). One reason is that a compound binding to a target is not always the simple binary complex that we tend to picture. You can have a molecule that comes in and binds to one conformation of the protein, only for that event to set off a shift to another conformation that can bind the drug more tightly or more loosely (and through different interactions). Or perhaps the binding site is in equilibrium between several low-energy states even before the drug molecule shows up, but some of these states are competent to bind it, and some aren't. The ways in which a ligand can perturb these balances can be subtle, but important.
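The distinction between equilibrium affinity and residence time is easy to see with the textbook relationships Kd = koff/kon and tau = 1/koff. Here's a minimal numerical sketch with made-up rate constants (the compound names and numbers are purely illustrative, not from the review): two compounds with the same Kd can have residence times that differ by a hundredfold.

```python
# Illustration only: two hypothetical compounds with identical equilibrium
# affinity (Kd = koff / kon) but very different residence times (tau = 1 / koff).
# All rate constants below are invented for the sketch.

def kd_nM(kon_per_M_s: float, koff_per_s: float) -> float:
    """Equilibrium dissociation constant, converted from M to nM."""
    return koff_per_s / kon_per_M_s * 1e9

def residence_time_s(koff_per_s: float) -> float:
    """Mean residence time of the binary complex, tau = 1 / koff."""
    return 1.0 / koff_per_s

# Compound A: fast on, fast off
kon_A, koff_A = 1e7, 1e-2   # M^-1 s^-1 and s^-1
# Compound B: slow on, slow off -- same ratio, hence the same Kd
kon_B, koff_B = 1e5, 1e-4

print(f"A: Kd = {kd_nM(kon_A, koff_A):.2f} nM, tau = {residence_time_s(koff_A):.0f} s")
print(f"B: Kd = {kd_nM(kon_B, koff_B):.2f} nM, tau = {residence_time_s(koff_B):.0f} s")
```

A single-point potency assay run at equilibrium would call these two compounds identical; only a kinetic measurement (SPR, jump-dilution, and the like) would tell them apart - which is the point the review keeps coming back to.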
And then you have the princes and the kings of low-off-rate compounds, the reversible covalent compounds and the irreversible binders. These have distinct profiles in careful kinetics experiments, but (as this review mentions) there have still been many reversible binders that were not recognized as such until comparatively late in the game. Both types of compounds have moved from curiosities, to be treated with some wariness, to perfectly acceptable (and even sought-after) classes over the years.
But don't think that this is a guarantee of success, because the one constant in drug research (from Paul Ehrlich on up) is that there is no such thing. As this review shows, there are also a few examples of compounds whose in vitro behavior showed clear and reproducible variations in residence time, but whose behavior in animal models (or humans) hardly budged. Among other things, this effect can depend on the tone of the system you're trying to regulate: does it need to be inhibited for a long time, or will a quick mash on the pedal accomplish the same thing? Needless to say, this is often a question that we don't know the answer to up front - the only way to find out is to make some compounds and see what they do.
Which is, in the end, about the size of it for drug research in general. It would be nicer (and certainly cheaper) if we didn't always just have to find some compounds and see what they do, but often that's just what it comes down to. For many targets, though, you'd probably do well to consider residence times as one more way to get where you're going.
+ TrackBacks (0) | Category: Drug Assays | Drug Development
March 18, 2015
How does the press cover stem cell work? You probably already know the answer to that one, but here's proof: this paper examines the media coverage from 2010 to 2013, and finds that "highly optimistic" timelines for translation to the clinic are the rule. Unfortunately, they also find that quotes from scientists are the main source for this sort of thing. My belief is that some journalists are just predisposed to go with whatever sounds good and exciting, so whoever gives the most hopeful prediction is the person who gets quoted.
The whole translation-to-a-real-therapy process, though, is terribly underestimated by the general public (and by some people who really should know better, too). That's one of the roots of the NIH Fallacy, the idea that drugs are all discovered with NIH money and that drug companies just reach in and pluck those delicious fruits once they're good and ripe. You can only believe that if you don't know how much work it takes to go from a discovery to a drug, and how many, many things can go wrong along the way.
There's a related fallacy, the idea that there are all sorts of cures sitting on shelves (of both drug companies and universities) that no one has bothered to dust off. While it would not surprise me if there were, indeed, great ideas sitting around waiting for someone to rediscover them, they are surely surrounded by masses of not-so-great ideas. And figuring out the difference between those involves the piles of money and time that go into drug development - we haven't figured out any other way to do it.
+ TrackBacks (0) | Category: Drug Development | Press Coverage
Here's Peter Kenny on an important aspect of the concept of "PAINS" in screening (the sort of pan-assay-interference compounds that have been a big topic in the field the last few years). He notes that the original PAINS paper grew out of some campaigns using AlphaScreen technology. And although some of the classes of compounds that have been flagged probably hit other assay formats as well, to their disadvantage, some of them may be peculiar to that platform. Especially worth noting are compounds that can scavenge singlet oxygen, because the AlphaScreen mechanism will cause those to ping up as false positives. The various double-bond-to-sulfur compounds that many of us love to hate may well be acting through that mechanism.
He's certainly right about this, and his recommendation for AlphaScreen users to read up on the singlet oxygen reactivity literature is worth seconding. I think, though, that there are more reasons to dislike those sorts of sulfur compounds than just this - they have a bewildering set of reactivities and instabilities under various assay conditions, which is really what makes them so pernicious as screening hits. In fact, that's one of the key concepts that those of us who are big on the PAINS concept are hoping to communicate to other chemists, to biologists, and to others: that (1) there sure are a lot of ways that you can get false positives, (2) there are compounds that are capable of performing more than one of these tricks simultaneously, so (3) these things tend to be over-represented among plausible screening hits, but are difficult-to-impossible to carry on with.
+ TrackBacks (0) | Category: Drug Assays
I've complained in the past (and I'm not the only one) about total synthesis work that doesn't (or maybe can't) deliver relevant analogs of the final product. That's been one of the traditional rationales for the work, but it's not always followed up on. But here's one that does: Dale Boger's group at Scripps has published another paper on modifying vancomycin, work that has grown out of their total synthesis efforts in the area.
This is clearly an area with important applications - in fact, there are three synthetically modified antibiotics of this kind on the market (oritavancin, dalbavancin, and telavancin). These have modifications, notably the addition of hydrophobic side chains, that change their activity by helping them bind to the cell membrane, and that (at least for telavancin) also seem to give them new mechanisms of membrane disruption.
Vancomycin resistance is known, but it's been very slow to develop, compared to many other antibiotics. That's probably because it's not binding to a protein target (which is directly coded for by the bacterial DNA, providing a way out through mutation). Instead, vancomycin binds to D-Ala-D-Ala, a key component in the construction of the bacterial cell wall. That's a much harder mechanism for the bacteria to catch on to, as it were, but when they do, it's very bad news indeed, since vancomycin itself is often a last line of defense in the clinic against infections like MRSA. In this paper, the Boger group is adding one of the commonly used hydrophobic groups (a para-chlorobiphenyl) and simultaneously changing a key amide carbonyl, as found in their earlier binding-pocket work, in the hopes that the double modification would completely evade the defenses of the resistant bacterial strains.
Does it? They report a variety of changes to that amide, and the amidine and methylene variations turn out to have excellent potency against both the wild-type and resistant strains. This is a very nice result indeed, showing that the two modifications can work together, and this could point the way to a new generation of vancomycins that (with any luck) can continue confusing bacteria for many years to come. Congratulations to Boger and his group - this is very difficult chemistry indeed, and it's being done for excellent reasons. This, in fact, is just the sort of thing that it's hard to imagine any sort of automated synthesis machine ever being able to perform, and is the kind of high-level work that the advent of such machines should be freeing us up to do. There is no replacement for talented, hard-working organic chemists on projects like this.
Full disclosure: I was a summer undergrad in Boger's group over thirty years ago - and no, that time frame doesn't seem very plausible to me, either, but there it is. I did not enjoy myself that much, but neither did the grad student I worked for, I'm pretty sure. I was not exactly an ornament of the lab, and I think that Boger himself was able to deal with my departure at the end of the summer without too much strain.
+ TrackBacks (0) | Category: Chemical News | Infectious Diseases
March 17, 2015
Here's a good paper from Phil Baran and co-workers in Accounts of Chemical Research on the relationship between industrial and academic research. It's illustrated with examples from his own work, such as the ingenol synthesis, and with new synthetic methods discovered in collaboration with Bristol-Myers Squibb, Pfizer, Eisai, and Sigma-Aldrich.
The article uses these to show that there can be useful, productive collaborations between organic chemistry groups from both sides of the field. The academic side can bring in ideas and techniques that industry might not have the time or patience to look into, but the industrial side can in turn bring in expertise from areas that don't even exist in most academic settings, along with some real-world focus. We all have known these things for some time, though - or we should have. But collaborations as useful as the ones that Baran's group have had are relatively rare. Part of that, of course, is that he and his group are extremely productive, but I think that there's another factor at work, too.
My own take is that too many deals are made that are too loose and open-ended, in a sort of "Gosh, if you guys discover something we'll probably pay you some money for it eventually" way, which doesn't motivate anyone to focus much. Better for industry to go to academia saying "You know, we'd really like to be able to do something like Reaction X", a more specific goal. It might be harder for academic groups to approach industry, though, since they may or may not have a good idea of what people are looking for. To that end, I wonder if it would be useful to have something out there in public, a list of unsolved problems that would be sure to attract interest if someone has a way to approach them.
David Hilbert I'm not, but I'd be glad to hear nominations for such things in the comments. My own contribution, to start things off, is going to be uncontroversial (for once). Industrial organic chemists, it's safe to say, are going to be very interested to hear about any new late-stage oxidation or fluorination chemistries - the sorts of things where you could imagine taking a collection of final drug-like compounds, running them through the process, and producing a reasonable number of new derivatives in one pass. There's already work going on in this area, of course - I told you I wasn't going to be controversial. But anything reliable in this area is definitely worth hearing about, and that's what I think a list like this should be: a collection of things that are guaranteed to spark interest. Any ideas?
+ TrackBacks (0) | Category: Academia (vs. Industry)
In 2012, the Spiegelman lab at Harvard reported a new peptide hormone, irisin (derived from a known precursor, FNDC5), that seemed to be involved in (among other things) brown-fat-like energy usage and the beneficial effects of exercise. There have been questions from other researchers about this work, but even this time last year work was coming out on irisin's mode of action. (And as of last September, Spiegelman was still vigorously defending it).
But the controversy over these results is getting ready to spill over into the popular press, which was quite enthusiastic back in 2012 ("Harvard Team Finds Exercise Hormone", the headlines pretty much write themselves). Here's an article from MedPage Today, which is probably the fullest look at the situation outside of the primary literature or the metabolic research grapevine. It's prompted by this new paper in Nature's open-access journal Scientific Reports, which fires a shot right across the bow: "Irisin - a myth rather than an exercise-inducible myokine". That, by the standards of the scientific literature, is the equivalent of pushing over someone's Harley in front of the biker bar, an unignorable challenge. The authors have a powerful case. They provide evidence that the antibodies used for the ELISA assays that underpin most (all?) of the published irisin work are, in fact, nonspecific. What's more, even though they pick up a number of other proteins, they don't actually seem to recognize authentic (synthesized) irisin. Using a better detection system, however, the authors can find no evidence for meaningful amounts of irisin in human blood at all.
Spiegelman will surely have a response to this - you can't not have a response to a paper like this one. For now, though, the whole idea of irisin seems to be in doubt, and if it's indeed not real, a lot of people have been wasting their time.
+ TrackBacks (0) | Category: Biological News | Diabetes and Obesity
March 16, 2015
John Carroll at FierceBiotech has more on those Biogen Idec cuts last week. It was only about 20 people in chemistry and neurology - although if you were one of the ones affected, you may well ask (to quote Austrian writer Peter Altenberg) "What's so only?" There were also cuts of about the same size last fall in departments like clinical operations and QC, and Carroll is hearing that there may be some more:
Another person close to the move last fall tells me that Chief Medical Officer Al Sandrock outlined plans to create a "Biogen Idec Version 4.0" at an offsite company meeting last spring. "They want a leaner team," says one source, looking to outsource more jobs. And staffers are wary that the efficiency focus will spur upcoming cuts, fearing more workers will face the ax later in the year.
What has people paying attention to these moves, which are certainly not on the scale of what's been happening at GSK, AstraZeneca, and other companies over the last two or three years, is that Biogen is in great shape right now. They have big-selling drugs early in their patent cycles in MS and hemophilia, and they're banging away on some other high-profile projects as we speak. Data are expected on Friday on their Alzheimer's antibody program, which looked pretty impressive last time anyone saw any data, in December - and that's impressive on the absolute scale, not on the relative scale of other Alzheimer's antibody programs.
So if a company like Biogen, with big sales and big prospects, whose stock continues to rollick along to all-time highs, can be tweaking their head count, shuffling responsibilities, adjusting their outsourcing and all the rest of it - well, that tells you what things are like on the business end of this industry. Plenty of other drug companies can only dream about being in Biogen's situation, but there they are, with both hands gripping the wheel, watching every dial and listening to every stray sound from the engine.
+ TrackBacks (0) | Category: Business and Markets
The quinolone antibacterials have over fifty years of use in humans, although the first generation of them was not all that impressive. Most people either don't know where they came from, or will answer "nalidixic acid", which was the first of the breed to make it to market (1962, from Sterling). That one, so the story goes, was developed from an impurity isolated during a synthesis of the antimalarial chloroquine, and was found to have antibacterial properties.
But "so the story goes" is not much of a historical foundation. This new open-access article (from Gregory Bisacchi at AstraZeneca) is probably the most comprehensive look at the early days of the quinolones, and it tries to fill in a lot of the missing details. There are many. I found the story itself to be interesting, although those days in drug research now have a sort of otherworldly feel to them compared to how we work now. It's a bit like reading Reminiscences of a Stock Operator (basically a biography of Jesse Livermore under another name) and then looking at today's stock market. The actions are all recognizable, but appear to have taken place on another planet.
Adding to the problem is that even in the earliest days, the origins of this drug class were not clear:
Considering the vast and still growing literature on (fluoro)quinolone antibacterials and the medical and commercial importance of this class, it may seem surprising that the lines of research leading to the identification of nalidixic acid had been nearly a complete mystery to the drug discovery community for several decades following its first disclosures. David Greenwood stated in his 2008 book “Antimicrobial Drugs: Chronicle of a twentieth century medical triumph” that “Sterling-Winthrop was reticent about revealing details of the discovery”
There were references to at least two more detailed papers "to be published" that never appeared (so that's certainly not a new phenomenon!). But the switch from the original chloroquine impurity (which had a quinolone structure) to nalidixic acid (which is a 1,8-naphthyridone) has been mysterious. What this new paper shows, though, is that quinolones had already been identified as potential antibacterials in three patents from ICI, which seem to be almost forgotten now. This almost certainly influenced Sterling's own research and patent filings, but the exact details are beyond recovery.
There are some Australian academic papers mentioning compounds in this class from even before the ICI patents, but they make no mention of biological activity. And there are some Indian publications in between the ICI and Sterling work, which cite the ICI patent. But Sterling's own patents don't get around to citing any of the above. As Bisacchi puts it, "Conceivably, patents filed during this earlier time were governed by less rigorous standards regarding broad prior art citation than we are accustomed to today." Searching the literature was, of course, more of a pain back then than it is today, but at the same time, there was a lot less literature, and it's hard to imagine that Sterling's team was unaware of all this prior art - especially since the move to nalidixic acid itself looks very much like an attempt to get around it. For their part, as this paper notes, ICI never published their own quinolone work in any journal, and that along with Sterling's relentless lack of interest in mentioning it has left it almost entirely forgotten.
Bisacchi mentions that he has tried to dig into the AstraZeneca archives (which stretch back to the ICI days) to see what happened with that project, but that he's been unsuccessful. It does appear that the company missed a significant opportunity by not following up on their early quinolone work, though. The whole story brings up a larger problem in drug discovery and in scientific programs in general: it can be very difficult to reconstruct their history. Decisions get taken that made sense at the time, based on limited data, but in retrospect become hard to understand. Even the people directly involved can end up rationalizing such things and making them more of a coherent narrative in their own memories, and that doesn't even take into account more conscious forms of tale-telling. Very few research programs are ever retold in all their shaggy, Brownian-motion glory after the fact. The difference is roughly that of natural human speech, as transcribed from a tape, and dialog as written for a play.
But this isn't just a problem in science. It's a problem of history in general. What looks from the outside like an imposing structure of facts and eyewitness accounts can turn out to be rickety scaffolding when you start poking around inside it. A story about Sir Walter Raleigh, who began his "History of the World" during his imprisonment in the Tower of London, might be appropriate here. As it's told - and here's some perhaps not fully documented history as well - Raleigh heard some sort of major disturbance going on one day, well out of his sight, and he couldn't make out what had happened. In the coming days, he heard several completely different versions of what had taken place, and didn't know which one to believe. This, it's said, got him thinking about his own manuscript - how was he to write a history of the world, when he couldn't even be sure about what had just happened within his own earshot? Historians have faced this problem ever since. Journalist Ron Rosenbaum has written about this several times, describing the tone that you can hear in an interviewer's voice in an old tape recording, as the last surviving witness to some event is questioned. This is the only person left who can clear up the story - if they don't know, it will turn into yet another mystery. So it's not surprising that we in drug discovery have plenty of our own.
+ TrackBacks (0) | Category: Drug Industry History | Infectious Diseases
March 13, 2015
Now this is quite a development: 23andMe is the personal genomics company that had a run-in with the FDA and is still trying to come to terms with the agency over the reporting of their customers' genetic information. They also have a new collaboration with Pfizer to apply their genomics database to drug discovery. The fine print in the customer agreement is that 23andMe plans to use the sequences from their customers for just such purposes, and it is a valuable database. They have 800,000 sequences, and the company says that over 80% of them have consented to have their data used for research.
But they've apparently decided to go straight on to drug discovery with all this genomics data, which is a big leap. They've hired Richard Scheller, who was head of drug discovery at Genentech until his retirement, and that's a serious hire indeed. The company says that he's going to "help build a dedicated research and development team", so we'll see if they mean it.
Because if they do, they're talking about building a drug company from scratch. Some of it, or large chunks of it, will probably be outsourced in one way or another, because this will involve a lot of people with expertise that 23andMe just doesn't have. Or not yet. It'll be quite interesting to watch how this develops, and how much the company wants to do itself versus contracting out (and not to mention what disease areas it feels it has an opportunity in). But I'm always glad to see someone else getting into this game, and I wish them luck.
+ TrackBacks (0) | Category: Biological News | Drug Development
Well, the comments are certainly rolling in on yesterday's post about the potential "end of organic synthesis". So far, I'd say that a majority of them are of the opinion that I've lost my mind. And either I've had practice at people telling me that, or I've had practice at losing my mind, but I had a feeling that this would be the reaction. So I'll do one more post on the topic, and then we can agree to disagree and check back on the subject after a while to see what's happening. I should note up front that the (few) popular press stories about this work have mostly been terrible, with people talking about a "3D printer for molecules", which is so wrong that one hardly even knows where to start. So I definitely want to divorce myself from that stuff - I'm crazy in a different direction entirely.
Here, then, are some of the objections that have been raised, in the comments, in e-mail, and in some one-on-one conversations:
End of synthesis? You must be joking. This is not even close. As I tried (ineffectively) to make clear yesterday, I don't think that this particular paper is The End. But it's the first thing I've seen that makes me think that there is an end to a lot of traditional organic chemistry. For many of the things that we use organic synthesis for, the later iterations of this approach may serve very well. And that's another point: organic synthesis is there to be used for other things, which means that anything that makes it easier to use it as a tool should find a home. In the end, though, that's what organic synthesis is to me: a tool. A really interesting, complicated, fun, ever-changing tool, a means to other ends. There's more on this at the end of this post; this attitude of mine may be the root of some of the disagreements I've had talking about this latest work.
But this machine of Burke's is nothing more than another way to do Suzuki couplings. And we already know how to do those. Yeah, it's true that this whole thing is driven by boronic acid couplings to form C-C bonds, the widely used (or even "beaten to death") Suzuki reaction. But we have not yet exhausted the possibilities of boronic acids, not by a long shot, as a look at the literature will show. Alkyl couplings are doable in many cases, even alkyl-alkyl, and (as I mentioned yesterday), that's an active area of research indeed. The idea of using Burke's protocol to crank out sausage-strings of polyaryl compounds does not excite me, although that's surely the easiest thing to do with it right now. It doesn't have to stay at that level, and I'm guessing that it won't.
This solid-phase purification isn't much more than people do already (cartridges and so on). It is similar. But existing solid-phase purifications are often case-by-case, depending on the structure of the molecule (often whether or not it has any acid or base character to exploit). What stood out to me about the MIDA-boronate solvent switch method, though, is that it may be a general method. A wide variety of MIDA-containing structures are shown doing it in the paper. And not only does it look pretty general, it's also tied into the key reaction, carbon-carbon bond formation, so you're always going to have that handle on the molecule in every step. Tying these two together is one of the things that intrigues me.
It's worth distinguishing this sort of thing from solid-phase synthesis, where the molecule of interest is attached to a support. That's how peptide synthesis works, of course, and it can be really good stuff. But a lot of combichem work was set up that way, using linkers to beads, and there are some organic synthesis reactions that just don't work very well that way. (You also have to have a molecule that has a suitable linking group in its structure). The Burke approach sticks with solution chemistry, in which we have more elbow room, and uses the solid phase just in the purification step. But since I bring that up. . .
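The couple/purify/deprotect cycle described above can be caricatured as a simple loop. This is a schematic of the workflow only, not the published protocol: the building-block names, function names, and step order are all my own stand-ins, and each function is a placeholder for a real chemical operation (Suzuki coupling, the MIDA-keyed catch-and-release purification, and MIDA hydrolysis to free the boronic acid).

```python
# Hypothetical sketch of an iterative building-block assembly cycle.
# Each function is a stand-in for a real chemical step, not actual chemistry.

def deprotect(chain):
    """Stand-in for hydrolyzing the terminal MIDA boronate to a boronic acid."""
    return chain

def couple(chain, block):
    """Stand-in for a Suzuki coupling of the next building block onto the chain."""
    return chain + [block]

def purify(chain):
    """Stand-in for the generic solid-phase catch-and-release purification,
    which works because every intermediate still carries the MIDA handle."""
    return chain

def assemble(blocks):
    chain = [blocks[0]]
    for block in blocks[1:]:
        chain = deprotect(chain)      # expose a fresh boronic acid
        chain = couple(chain, block)  # form the new C-C bond
        chain = purify(chain)         # same purification step every cycle
    return "-".join(chain)

print(assemble(["aryl_A", "alkenyl_B", "aryl_C"]))  # aryl_A-alkenyl_B-aryl_C
```

The point the loop makes is the one in the text: because the purification is tied to the same functional group that drives the coupling, every iteration looks identical from the machine's point of view, which is what makes automation plausible in the first place.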
This is nothing more than combichem. And we tried going berserk with combichem in the 1990s, and look what happened. Good point. Back when the combichem craze was just starting, people were talking just like I was in my post yesterday, and time has shown them to be fools. I saw all that happening at first hand, so it's definitely on my mind here. But combichem, especially in its 1990s iteration, was mostly about forming amides and sulfonamides. Its practitioners (well, some of them) tried for that not to be the case, but the great majority of compounds produced went through those sorts of reactions. And not every useful molecule, sad to say, has an amide holding it together. But that gets back to what I just mentioned above: the currency in this new scheme is not amide formation, or reductive amination, or what have you: it's C-C bond formation, and that really is the baseline of organic chemistry. If boronic acid couplings continue to make inroads into alkyl carbon territory (and, for that matter, into C-X heteroatom bond forming territory), and they will, then this protocol just gets more powerful.
The yields aren't good/the reaction isn't atom-efficient/the whole thing isn't scalable. This is definitely not a replacement for process chemistry, that's for sure. It's for cranking out lots of early-stage variations, or (alternatively) for cranking out small but useful amounts of a specific dial-it-up combination of known building blocks. In fact, even if my entire lunatic vision from yesterday comes true, process chemistry will still be with us, because there are surely better routes for any given molecule than the generic bang-it-together that this system provides. But that's what you do when you've discovered something valuable and want to optimize it - this chemistry is there to help you discover something valuable in the first place.
What's gradually dawned on me about total synthesis of natural products is that it's something a bit like process chemistry, but applied to molecules that will never be scaled up. You're optimizing yields at every step and looking for the fewest steps possible, although you're not exactly minimizing costs or waste streams (the way real process chemists do). It's come to seem more and more bizarre to me, actually. But that leads us to. . .
How, exactly, then, is an ugly bang-it-together approach going to kill off total synthesis, which is all about making difficult molecules via the best possible routes? Actually, I think that total synthesis is the most vulnerable part of organic chemistry to this whole way of working. Even its proponents know that it's been losing ground over the years, as the reasons for doing it narrow. Its best practitioners, to my mind, have been concentrating on making it as general and free from case-by-case detail-chopping as possible. At the same time, the reasons for making many of its targets are eroding (structure determination being the most notable of these). There's always a case to be made that pushing into hard natural products structures will cause new reactions to be discovered, but those new reactions (in some cases) could also get discovered by deliberately searching for them, without bothering to get up to step twenty-three before starting to look.
One of the standard rationales for synthesis of active natural products, though, has been that it can generate analogs of these structures to learn more about their mechanism of action. That's absolutely right, but it's all too often been just lip service. The list of analogs produced ends up being short, in the drive to make the natural product itself, and no one bothers to make too many "unnatural" variations. This sort of building-block approach, though, is a natural fit for making lists of analogs in a combinatorial fashion. Burke's group has, in fact, done just that with Amphotericin B, and it's interesting stuff. Many years back, I watched a few people sweating away at making that exact natural product by the traditional approach, and the contrast between that and the Burke group's work is probably one of the things that got me thinking about this whole business from the angle I'm taking.
I realize that not everyone can get excited about this stuff - or at least not to this level. Fair enough - as mentioned above, we can agree to disagree, and time will sort us out, as it sorts out most things. But I think it's useful, even (or especially) for people who think this is crazy talk to be exposed to some of that crazy talk, and to realize that it's possible for people to think this way about it. I think that Wavefunction is right: I'm taking a more Whitesidesian view of organic chemistry. When I see someone making a molecule, I feel like asking "What's it for?", and anything that helps to answer that question is worth investigating. Back when I was working on my own PhD (total synthesis of a natural product!), I gradually realized that the only legitimate answer I had for that question was "To get me out of grad school". (Another way I put it was "The world does not need another chiral-pool synthesis of a macrolide antibiotic. But I do"). That affected me even more than I knew at the time, and I already thought it was affecting me a lot. Ever since then, I've been trying never to end up in that spot.
The other thing to keep in mind about yesterday's post is that (as I mentioned) I've been thinking about this whole thing on and off for months. So what that represented was a pretty good head of steam, all at once. I'm not saying that if you disagree, you'll come to agree with me after thinking about it longer - it's probably the opposite: you'll think I'm even more off my head. But that's one reason why that post was so long, among the others cited already, and why it took the tone it did.
Update: depending on the reserve levels in your sense-of-humor tank, you may like this ad from the future. . .
Second update: more thoughts from Wavefunction.
Third update: the backlash begins in earnest!
+ TrackBacks (0) | Category: Chemical News
March 12, 2015
Unfortunately, I have it from people who are definitely in a position to know that Biogen is laying off scientists this afternoon. A number of experienced chemists have been let go so far, and I'll put up more details as I learn them. This is particularly unexpected, considering that the company appears to be in terrific financial shape. . .
Update: the company says that this is a small layoff in chemistry and neurology, and that head count is not being reduced. From what little I've seen of the situation, though, it almost looks as if they're clearing out a lot of longtime employees and will then be replacing them with new ones, who are (presumably) cheaper. It's true that you have to check to make sure that you have the right people for your needs - and let people go if you don't - but there are a number of fine lines in this area.
+ TrackBacks (0) | Category: Business and Markets
"It's not the end of the earth. But you can see it from there." That was Lou Holtz, talking about coaching football in Fayetteville, Arkansas. But today I'm talking about a new paper from Marty Burke and his group at Illinois, and although it isn't the end of organic synthesis, you can see it from there.
Now, that sounds a bit frightening, or a bit idiotic, or maybe a bit of both. But have a look at the paper. I had a chance to see him talk about this work a few months ago - I found it fascinating and startling, and I've been thinking about the implications ever since. This paper is the perfect opportunity to talk about it all (here's a commentary at Science). It's a summary of a lot of work that the Burke lab has been publishing over the last few years, and when you put it all together, there are some far-reaching consequences. On one level, it's about assembling sets of molecules from modular building blocks, each containing MIDA boronates and bromides. That's been a worthwhile reaction to study, since these boronates are very easy to handle and shelf-stable. What Burke's group has found, though, is that the MIDA complexes have an unusual property: they stick to silica, even when eluted with MeOH/ether. But THF moves them right off.
This trick allows something very useful indeed. It's a universal catch-and-release for organic intermediates. And that, as the paper shows, opens the door to a lot of automated synthesis. You take a MIDA boronate intermediate, and deprotect it to the free boronic acid. You then couple it to another intermediate, which has a reactive bromide (or what have you) at one end, and another MIDA boronate at the other. The solvent switch lets you purify the crude reaction by loading it onto silica, washing everything else off with MeOH/ether, and then eluting the MIDA-containing product with THF. Then you do it again. And again.
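To make the shape of that cycle explicit, here's a toy sketch in code. Everything in it is invented for illustration - the strings are just labels, not real structures, and none of the function names correspond to any actual instrument or software API - but it shows the deprotect / couple / catch-and-release loop that the automation repeats:

```python
# Hypothetical sketch of the iterative MIDA-boronate cycle described above.
# All names and "structures" here are illustrative stand-ins, not real chemistry code.

def deprotect(mida_boronate):
    """Remove the MIDA ligand to expose the free boronic acid."""
    return mida_boronate.replace("-B(MIDA)", "-B(OH)2")

def couple(boronic_acid, building_block):
    """Suzuki-type coupling: the -B(OH)2 end joins the next block's -Br end."""
    assert boronic_acid.endswith("-B(OH)2") and building_block.startswith("Br-")
    return boronic_acid[:-len("-B(OH)2")] + "-" + building_block[len("Br-"):]

def catch_and_release(crude):
    """Purification stand-in: wash impurities off silica with MeOH/ether,
    then elute the MIDA-containing product with THF."""
    return crude  # in the real protocol, only the MIDA species sticks to silica

def iterative_synthesis(start, blocks):
    product = start
    for block in blocks:  # each block carries a bromide at one end, a MIDA boronate at the other
        acid = deprotect(product)
        crude = couple(acid, block)
        product = catch_and_release(crude)
    return product

result = iterative_synthesis("Ar1-B(MIDA)", ["Br-Ar2-B(MIDA)", "Br-Ar3-B(MIDA)"])
print(result)  # Ar1-Ar2-Ar3-B(MIDA)
```

The point of the loop is that every iteration ends the same way, with a purifiable MIDA boronate in hand, which is what lets a machine run it unattended.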
The paper shows a wide range of products produced in just this fashion. Yields are decent, although varied, but there's always product coming out the other end at some level. The number of possible compounds that can be made in this way is limited, at the first level, by the number of MIDA-boronate containing intermediates that you can synthesize, and you can certainly make a heap. At the second level, it's limited by the sorts of couplings that boronic acids can do, and we still don't have general methods to make them do bond formation between saturated carbons very well. But that's an area of intensive research, and it looks like a solvable problem, eventually. I would go so far as to suggest that this paper makes a good case for trying to get this to work with boronic acids (as opposed to alkylboranes, etc.), because of the immediate application of the catch-and-release purification, but we'll see what happens.
What gets me about this current paper, though, is the concept behind it. This has the potential to take a large part of organic synthesis into the realm now occupied by peptide and nucleotide synthesis. Those two are certainly easier problems - you have one kind of bond between every subunit, and a limited number of subunits themselves. But the advent of solid-phase iterative methods to synthesize these sorts of molecules was still a huge advance. It took making such things out of the realm of every-one-a-new-individual-challenge, and into the world of "Sure, we should be able to make that. Fire up the machine."
That first category, we should note, is where total synthesis of natural products has traditionally been. And proudly so. I've had a lot to say about that over the years around here, going back to 2002, but I'll summarize: I think that total synthesis was, at one time, one of the most vital and important parts of organic chemistry. But that day is past. Modern analytical methods have largely (although not quite totally) eroded the structure determination reasons for doing it, and modern synthetic techniques have put a vast number of molecules within theoretical reach. "Theoretical", in this case, meaning "Given enough postdocs, enough grant money, and enough time". That certainly wasn't always the case. When Woodward, Stork, or (fill in your favorite here) started out to synthesize some complex molecule back fifty years ago, it was often not very clear at all how one might go about it. Just coming up with a semi-plausible synthetic route was a real intellectual accomplishment, and dealing with what happened when these ideas met the real world was another. Total synthesis took all the brainpower and all the skill that could be brought to bear on it.
It's still not easy. But it's sure not the same. It's much harder to draw a molecule that's truly a stumper these days. We have so many reactions and approaches that you can generally come up with at least a paper synthesis - mind you, it may not be a very nice paper synthesis, but in the old days you probably couldn't even come up with that much. So if fewer and fewer molecules really are an adventure - or really promise to advance human knowledge in the course of making them - what's left?
What's left, I'd say, is for organic synthesis to get braced to take the next step. That is, it needs to stop being an end in itself, and start becoming a means to other ends. That's already what we use it for in drug research - the only reason we do organic chemistry is that we don't know any other ways to make small-molecule drug candidates. In the earlier stages of a project, we don't much care about the way we make things, just so long as they get made. As I'm fond of saying, in discovery med-chem, there are only two yields: enough and not enough. Did you make a sample of the compound that can be tested in the assay? That's enough. And that's the primary concern - how you made it is secondary. This is sometimes a bit of a surprise for people coming from high-powered academic synthesis groups, because you can do an awful lot of good med-chem using just reactions from the first semester of sophomore organic chemistry, and you can do an awful lot of good med-chem while putting up with reaction yields that no academic group would stand for. But one adjusts.
We may all need to adjust. What if this MIDA boronate protocol, or some later variant of it, starts turning big swaths of organic synthesis into a process of stick-the-pieces-together? Like peptide synthesis? These routes may not be the most elegant and highest-yielding things ever seen, especially not at first. But that leads to the question of why you're making these molecules in the first place. Are you making them so that you can do something with them - test them as drugs, use them as nanotech building blocks, make a new battery or solar cell, investigate a new kind of material? Then fine - you probably have enough now to get started on the next phase of that idea, thanks to this Synth-O-Matic over here. Or are you trying to make the best possible synthesis of your molecules (fewest steps, highest yield, etc.)? In that case, you need to be careful. That's a very worthy goal if you already know that this is a valuable molecule, which is what the process chemists do in industry. But if it's just another new molecule, then why are you optimizing its synthesis? If along the way you're discovering new and better synthetic reactions and protocols, then good for you - but I would define "better" as "better able to be used to crank out new molecules for other purposes", not "done in five fewer steps than the last group had to use to make the same molecule". Not that alone. Not any more.
If organic synthesis becomes modular, then the new chemistry and new reactions are going to go more into making new modules. All our problems are still there - tricky functionality, multiple chiral centers, quaternary carbons. But if we end up making large molecules mostly by looking for boronate disconnections and stitching the pieces together, then we're on a hunt to make the pieces, not to make the whole molecules.
But what about the art? What about the elegance? Well, we're going to have to say goodbye to some of it. The printing press drove fine hand copy from the world - you don't see so many gold-leaf illuminated letters any more. More recently, and in our own field, the advent of modern analytical chemistry drove out the classic methods of structure determination. Now there was a puzzle worthy of the finest thinking that could be thrown at it. Old-fashioned degradation and derivatization was a fiendishly difficult challenge, like playing chess with the lights off and the moves called out in a language you don't know. But that kind of chemistry is gone, totally gone, and it'll never come back. No one does it like that any more. There were chemists who just couldn't face that, when it happened back in the 1960s and into the 1970s, when they found that what they were really good at was no longer of value. It was hard. But organic synthesis may have to face up to the same sort of realization, that time has overtaken it and that ars gratia artis is no longer a fit slogan to work by. This paper today is the first one that's really made me think that this transition is in sight. For me, organic synthesis is never quite going to be the same.
But in science, when something dies it's because something else is being born. The idea, the hope, is that if the field does become modular and mechanized, that it frees us up to do things that we couldn't do before. Think about biomolecules: if peptides and oligonucleotides still had to be synthesized as if they were huge natural products, by human-wave-attack teams of day-and-night grad students, how far do you think biology would have gotten by now? Synthesizing such things was Nobel-worthy at first, then worth a PhD all by themselves, but now it's a routine part of everyday work. Organic synthesis is heading down the exact same road - more slowly, because it's a much harder problem, but, I think, inexorably. Get ready for it. We're going to need to stop being so focused on just making molecules, and start to think more about what we do with them.
Note: for previous (and partly superseded) thoughts here on automated organic synthesis, see this post.
Update: for more thoughts on this, see here.
+ TrackBacks (0) | Category: Chemical News
This article at NEJM is looking at how well clinical trial results are made public, which has been a big topic over the last few years. Let me say up front that the results are quite interesting, and that some news outlets appear to be misreporting them.
Since 2007, it's been required by law that anyone sponsoring a clinical trial in the US register it at clinicaltrials.gov, and report at least a summary of the results within one year after finishing data collection for the trial's primary endpoint (or within a year of stopping it for any other reasons). These authors (all from Duke) found that the clinicaltrials.gov data can be messy to work with. It's not clear which trials in the registry are subject to the above legal requirements, so they first used someone else's algorithm to identify over 32,000 "highly likely clinical trials". Then they picked out the ones that were listed as "completed" or "terminated" before August 31, 2012 (to give everyone time to report), and that took the number down to 13,327 trials, all of which ended between January 1, 2008 and that 2012 cutoff. Any trial reporting results (or filing a request for an extension) by September 27, 2013 was considered to be legally acceptable.
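The filtering logic the authors applied can be sketched in a few lines of code. Note that the records and field names below are entirely made up for illustration - the real clinicaltrials.gov export is far messier, which is exactly the authors' complaint:

```python
# Toy sketch of the NEJM paper's filtering pipeline, with invented records
# and field names (not the real clinicaltrials.gov schema).
from datetime import date, timedelta

trials = [
    {"id": "NCT-A", "status": "completed",  "end": date(2010, 6, 1),
     "reported": date(2010, 11, 1)},   # reported within a year: counts as timely
    {"id": "NCT-B", "status": "terminated", "end": date(2012, 3, 15),
     "reported": None},                # never reported
    {"id": "NCT-C", "status": "recruiting", "end": None,
     "reported": None},                # still running: excluded from the analysis
]

CUTOFF = date(2012, 8, 31)  # the paper's completion cutoff

# Step 1: keep trials that completed or were terminated by the cutoff.
eligible = [t for t in trials
            if t["status"] in ("completed", "terminated")
            and t["end"] is not None and t["end"] <= CUTOFF]

# Step 2: of those, which reported results within one year of ending?
timely = [t for t in eligible
          if t["reported"] is not None
          and t["reported"] <= t["end"] + timedelta(days=365)]

print(len(eligible), len(timely))  # 2 1
```

The hard part, as the authors found, isn't this logic - it's that the registry doesn't cleanly flag which trials are legally required to report at all, which is why they needed the manual review described below.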
How did everyone do? Only 13% of all the trials reported data within one year of completion, but the authors say that they still can't be sure how many of the trials being analyzed were required to report during that time (there are exceptions related to whether an intervention has been approved for marketing or not). Here's where they tried to correct for this:
We manually reviewed a sample of 205 HLACTs to determine requirements for reporting (Tables S15A and S15B in the Supplementary Appendix). By reviewing approval dates and labeling information, we determined that 44 to 45% of industry-funded HLACTs in this sample were not required to report results, as compared with 6% of NIH-funded studies and 9% of those funded by other government or academic institutions. On the basis of this review, we estimated that during the 5-year period, approximately 79 to 80% of industry-funded trials reported summary results or had a legally acceptable reason for delay. In contrast, only 49 to 50% of NIH-funded trials and 42 to 45% of those funded by other government or academic institutions reported results or had legally acceptable reasons for delay.
That's the real take-home of this article. The authors themselves say that:
Before the passage of the FDAAA, industry sponsors received particular scrutiny for selective reporting. Since the enactment of the law, many companies have developed disclosure policies and have actively pursued expanded public disclosure of data. Curiously, reporting continues to lag for trials funded by the NIH and by other government or academic institutions. Pfizer has reported that the preparation of results summaries requires 4 to 60 hours, and it is possible that the NIH and other funders have been unable or unwilling to allocate adequate resources to ensure timely reporting.
That much seems clear: the drug industry has been doing a significantly better job of complying with the law than publicly funded trials have. Some of the reports about this paper have picked up on this, but others have landed on that 13% overall figure and gotten stuck, even though (as the paper itself shows) many of the trials in that set were not even legally required to report data. The most detailed report in the press is probably this one from NPR. They get the results of the paper right, which is more than I can say for some others. I particularly noted, and not happily, that Ben Goldacre tweeted that figure along with a link to his book, "Bad Pharma", which juxtaposition implies that this number is both germane and the fault of the drug industry. I expected better.
+ TrackBacks (0) | Category: Academia (vs. Industry) | Clinical Trials
I wanted to note that I have a long post about a very interesting chemistry topic coming up, but I won't be able to put it up until its embargo period ends at 2 PM Eastern. So keep your eyes out for that one!
+ TrackBacks (0) | Category: Blog Housekeeping
March 11, 2015
Ouch. When the "unclick" work from the Bielawski lab at Texas was found not to hold up, the word was that other papers involving the now-hard-to-find co-author (Kelly Wiggins) were being looked over.
And here come three more retractions. They all cite scientific misconduct on the part of "one of the co-authors" who was affiliated with the university at the time. This is quite the stink bomb for everyone involved, and as usual when something like this happens, you wonder how much could have been done to prevent it. But the unnerving truth is that if someone is willing to really go all-out in faking data, it can be rather hard to catch them. For one thing, you don't expect someone to be just making it all up - it can be hard to get your head around that idea (and early in my chemistry career, I encountered a case of just that, so I speak from a minor sort of experience). And if the data have been hocused well, the numbers and results can look quite convincing. In the end, we're often taking each other's word for stuff in science, and if you want to abuse that, you can: for a while.
+ TrackBacks (0) | Category: The Dark Side | The Scientific Literature
Enough depression for now. The last couple of posts have not been cheerful, because there's a lot of not-so-cheerful stuff out there in the business side of the industry. But I wanted to remind everyone (and myself) that we're actually getting some good things done around here. Take a look at oncology - it's starting to look like this could be the era when people, decades from now, will say that the corner was turned. Some of the results coming out of recent trials have been really eye-opening, both the small molecules and the biologics. And the recent focus on immune-based therapies has shown a lot of dramatic progress (which is also attracting a lot of dramatic investment money). We're just barely starting on these things, and on the combinations that need to be tried. Real progress is really being made.
That's just one therapeutic area, although it's a big one. There are others advancing as well. That's the frustrating part, in a way - coming into this field de novo, you'd look around and see so many opportunities that you wouldn't know where to start. But many of the existing businesses (and in some cases, existing business models) are having a heck of a time fitting in. Running an organization the size of a Merck or a Pfizer, with those expenses and those legacy commitments and that overhead, really is a beastly job. But no business is owed some sort of right to always exist in its present form. Business-wise, this is an ugly period. Scientifically, it's really quite good. Bridging those two, now - that's where all the clanging noises are coming from.
+ TrackBacks (0) | Category: Business and Markets | Drug Industry History
Let's get the bad news out of the way first. Amgen's purchase of Onyx now has the widely expected sequel: they're closing down that site, and laying off 300 people.
Kyprolis is working out fine, and that's all Amgen wanted. The only surprise, sadly, is that it took them this long.
+ TrackBacks (0) | Category: Business and Markets
March 10, 2015
This is not a happy column by John Carroll at FierceBiotech. But it's an accurate one. He references the period just a few years ago when Pfizer closed its Sandwich site in the UK and Roche closed Nutley, NJ, among other upheavals.
That period of intense Big Pharma turmoil, though, has failed to create a new normal that can offer investigators greater confidence that they'll be able to keep their jobs. And the disruption is continuing with a new wave of restructuring every bit as traumatic as the first tsunami of makeovers.
Honestly, I don't know anyone in this business, large company or small, that can say truthfully that they have great confidence in keeping their jobs. Certainly not with any time horizon as great as, say, four years, the time between the closures mentioned above and now. Turmoil is the new normal; Carroll's right about that. It's been this way for years now, to the point where we hardly notice it, except when another dramatic shakeup hits.
That's why I tend to lose patience with critics of the drug business, even though they may have other valid points, when they start going on about big, complacent pharma, drowsy and bulging with cash. It's hard to square that picture with the experience of actually working in this industry, when it feels like some sort of lunatic combination of a roller coaster and a demolition derby. If drug companies are such unstoppable money machines, how come everyone seems to be running around in Headless Poultry Mode?
+ TrackBacks (0) | Category: Business and Markets | Drug Industry History
I remember when I was in my grad school research group, looking up at the shelves of lab notebooks from past students and postdocs. Some of these were taken down and used from time to time - "Oh yeah, so-and-so made that intermediate one time, must be in one of his notebooks" - but many of them rested undisturbed. The same goes for many a PhD dissertation (I know mine hasn't troubled very many people, that's for sure).
At the time (and we're talking mid-1980s, dang it all) I kept wondering if there were some way to get at all this information and make it searchable, especially by structure. Since my group worked in a fairly specialized area of organic synthesis, many of the former reactions had at least a chance of being relevant to some future grad student. With 1985 hardware and software that was a bit of a tall order, but when I use my current electronic notebook I'm seeing those thoughts made real, since I can search company-wide by structure (or whatever other criterion I can dream up).
But there are certainly masses of unpublished data sitting out there, and a group in the UK is trying to bring it out into the light. In a pilot project, they've gone over 750 PhD dissertations from 15 universities, and extracted 45,000 structures (75,000 if you count all the stereoisomers). This effort is funded by the Royal Society of Chemistry, so the structures are going into ChemSpider (and most of them were new to it, as well they might be).
As the article details, the original hope was for a physical collection of compound samples, but a variety of issues - not least, cost - has made that hard to realize. So this is a virtual set so far, which would be available for virtual screening. And here's where opinions really start to diverge. 75,000 physical compounds makes for a pretty good storage and dispensing effort, but that many virtual compounds is a tiny drop in a very large bucket. And if you'd like to double, triple, or quadruple that number of compounds, it would be short work, computationally. Of making virtual screening sets there is no practical end. (One wonders how many of the RSC dissertation compounds are already in the GDB sets). It's the chemical space problem - if you're just filling it out randomly, there's an awful lot to fill.
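A quick back-of-the-envelope calculation shows why 75,000 virtual structures is a drop in the bucket. The building-block counts here are invented purely for illustration, but the arithmetic is the point:

```python
# Illustrative chemical-space arithmetic: virtual sets grow combinatorially,
# so any fixed collection is tiny by comparison. Numbers below are made up.

blocks_per_position = 200   # suppose 200 plausible fragments per attachment point
positions = 3               # and a modest three-fragment assembly

virtual = blocks_per_position ** positions
print(virtual)              # 8,000,000 combinations from just three slots

dissertation_set = 75_000   # the RSC pilot's structure count (with stereoisomers)
print(virtual // dissertation_set)  # ~106x larger, from one trivial enumeration
```

And that's with only three positions; add a fourth and you're over a billion. Generating virtual compounds is nearly free, which is why a virtual set's size alone tells you very little.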
Now, that same argument can be applied to a physical compound collection, and such a collection will inevitably be even smaller than a virtual set. But it has a big advantage in utility, because the fact is that we really can't do virtual screening with a great deal of confidence yet. Real compounds against real proteins, that's the way to go if you can do it. The expense will be far greater, though. Just collecting the compounds themselves will be a major effort, and you need a building to store them and all the equipment that big screening collections call for. The curation of the set will be an even bigger pain: it's absolutely certain that a reasonable percentage of such compounds will be clean, but not what they say on the bottle, and what may be an even larger percentage will not be very clean at all. How to deal with those?
From what I can see, the RSC is working on those questions, and I wish them luck (and funding). What if everyone who did work in a publicly funded lab in the UK sent samples in to the National Compound Collection? What if we did that in the US? We have a lot of chemistry going on in academia, and untold numbers of compounds that get squirreled away in vials and stuck in desk drawers. How much money would it take to get them brought together, and would such an effort pay off?
+ TrackBacks (0) | Category: Chemical News | Drug Assays