About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship on his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek email him directly: email@example.com
October 31, 2011
Thoughts from Matthew Herper at Forbes about Steve Jobs, modern medicine, what innovation means, and why it can be so hard in some fields. This is relevant to this post and its precursors.
Category: Drug Development | Who Discovers and Why
Over the weekend, it became clear to me that there had been a case of identity spoofing in the comments to this post. A person had left comments while claiming to be a Merck HR employee, but this same Merck employee contacted me, in understandable confusion, when colleagues asked him about this. That's because he'd never posted anything here at all.
I'd wondered at the time about what was going on - neither of the comments was posted from a Merck IP address, but that wasn't necessarily surprising either way. But hearing from the person himself, most definitely from Merck this time, made up my mind very quickly.
There have been occasional games played around here in the comments section, and most of it's harmless. I'll let jokes through that claim to be from various Nobel Prize candidates - that's just the Internet doing one of the things it's good at. But deliberate assumption of identity like this is another thing altogether. The original comments have been deleted, and the responses to them are now "inoperative", as they used to say in Nixon's day.
Category: Business and Markets | The Dark Side
I occasionally cover odd attempts at alternative - very alternative - energy sources here, because there's a chemistry angle to many of them. The various cold fusion claims have always gotten a slightly less frosty reception among professional chemists than among professional physicists, on average. And yes, there are two good explanations of that, which are not mutually exclusive: (1) that the chemists are willing to be a bit more open-minded since (among other things) they have less invested in the state of physics as it is, and (2) that the chemists are willing to be more open-minded because they know less about physics.
So far, the track record on these things has been pretty close to 100% hardtack disappointment, dry as dust and crunchy as hell. But as Tyler Cowen put it over at Marginal Revolution, the expected value of such things is so high that a small amount of attention is worthwhile. The latest headline-grabber is a mysterious thingie from Italy called the E-Cat, which I mentioned briefly here back in July.
The inventors apparently concluded a larger-scale demonstration over the weekend, as reported here, at the request of an unnamed client from the US. The problem, as that article shows, is that we really don't have a lot more to go on: this "client" could plausibly be DARPA, or that could (also plausibly) just be what the device's backers would like for everyone to think, the better to fleece the unwary in the next round.
So for now, I'm just noting this with cautious interest. I certainly hope that the people behind this are operating in good faith, in which case I will in good faith wish them well. But we'll see what happens next, if anything. For now, the "snake oil" tag stays on.
Category: General Scientific News | Snake Oil
October 28, 2011
The cutbacks at Merck seem to have been pretty severe, if the messages that I'm getting from former Schering-Plough people are any indication. A lot of longtime R&D people have been let go, which is no surprise when you see what's been happening over the last few years with Pfizer's acquisitions (just to pick the biggest example). Experience, past accomplishments, and ability rank very low indeed among the factors being weighed when it comes to this point.
It's worth asking just how well that whole Schering-Plough deal is going for Merck, though. Here's a thorough breakdown of all the pipelines at the time the deal was going through. You can see that some of the areas (women's health, respiratory) have worked out as planned, but some others (cardiovascular, hepatitis C) have definitely not. And (as that link makes clear) one of the big variables when the deal went through was how much money would be left from the J&J deal after arbitration. If you look at the company's earnings, it's a mixed bag. Singulair is the biggest on the list, but that one's going off patent next year. Remicade is bringing in some money, after the territories were split up, with Merck holding on to Europe, Russia, and Turkey. The only other product from the Schering-Plough deal on the top-selling list is Nasonex, and that just makes the cut.
I just have to wonder how different this press release would have been if the deal hadn't gone through at all. But sales figures aside, what we don't see is the huge disruption in research and early development, just as you don't see that in Pfizer's deals over the years. You don't notice the drugs that don't get discovered, the early projects that don't quite advance. Was it all really worth it?
Like all the other mergers, this one only makes sense if you factor in big cost reductions - that DataMonitor link above makes this clear. And Merck does indeed look as if they're cutting their expenses as planned, so perhaps these numbers will come out right on target, and earnings-per-share will follow along. But what happened to Ken Frazier's brave attempt to withdraw EPS guidance entirely and focus on rebuilding the company's R&D? Was that just window dressing, was it an honest effort to change things that has now been abandoned, or what?
Category: Business and Markets | Drug Industry History
October 27, 2011
While mentioning lab equipment, I thought I'd also note that I've been contacted by a fellow who's trying to interest people in a newly invented high-throughput, low-footprint liquid-handler device that he's prototyped. I haven't seen it in action, and I don't know him personally, but he's out in the Bay Area for the next few days, and can be reached at carlcrott-at-gmail-dot-com, if you're in the market for that sort of thing. I figure in this business environment, people can use a break. . .
Category: General Scientific News
Talking about hydrogenation here the other day brought up another thought: there's a point where lab work becomes quite difficult, and there are not a lot of good options to help with that. I'm talking about scale-up work, the grey zone between benchtop synthesis and production.
The first of those is where I spend my time. I've often said that there are really only two yields to be calculated in medicinal chemistry: enough and not enough. For some cases, "enough" can be two milligrams, if all you need is one assay (although you're not adding to the screening collection that way). Twenty is plenty for a new compound; it'll be in stock for years at the usual rates of screening.
But later on, when you start to get interested in a particular molecule, those numbers inflate quickly. In vivo tests, PK, toxicology - all these can start to chew up much larger amounts of compound. Hundreds of milligrams, then grams, tens of grams, hundreds of grams to get through preclinical because it turns out that you need another large-animal run - those of you in the labs will be familiar with the progression. And well downstream of people like me, there's pilot plant work and real commercial production, where things are measured in kilos and up.
Those folks have a tricky job, but they have one advantage: they know what compound they're making, and eventually they've settled on a route that works. (If you can't do that, well, you don't have a drug). And for a real money-making compound, it's worth investing in dedicated equipment, whole dedicated facilities if need be, just to make it correctly. Earlier in the process, though, you have to be ready for all kinds of chemistries to make all kinds of compounds.
The medium scale is where those two worlds collide. A medicinal chemist can generally make things up to the tens of grams, maybe a hundred or two, using roughly the same techniques that are used on the smaller scale: round-bottom flasks, suction filtration, magnetic stirring, standard-sized rotary evaporators, silica gel columns. Everything gets bigger and more unwieldy, and it always takes more time (and more solvent) than you thought, but it can get done. Some of the more exotic small-scale chemistries do start to break down on you, which also adds to the time needed when you have to come up with alternatives.
But if you're always having to work on roughly the hundred-gram scale, you're straddling two regimes. The size of the glassware gets hard to manage - things are heavy, they tip over, they crack - and you really have to have more serious capacity for things like solvent evaporation. But this is way too small for industrial-plant equipment, the kind of thing where you design the process to start on the third floor to take advantage of gravity as you pump the contents of the big batch reactor downstairs for the next step. And it's getting too big for scaled-up versions of standard equipment, but at the same time, you need the versatility that general-purpose labware provides.
Some kinds of gear help to bridge this gap - overhead mechanical stirrers, outboard circulating chillers, and large-capacity rota-vaps come to mind. But there are many other cases where something that's neither benchtop nor pilot plant is needed, and doesn't necessarily exist. Hydrogenation is a case in point. It's done on an industrial scale; that's where all that partially hydrogenated vegetable oil comes from, for one thing. And hydrogenation is common on the gram scale or below. But hydrogenating three hundred grams of something can be a real pain in many labs. The common solution is roll-your-eyes-and-split-it-into-batches, but that gets old fast. . .
Category: Life in the Drug Labs
October 26, 2011
With all the recent talk about the NIH's translational research efforts, and the controversy about their drug screening efforts, this seems like a good time to note this interview with Francis Collins over at BioCentury TV. (It's currently the lead video, but you'll be able to find it in their "Show Guide" afterwards as well).
Collins says that they're not trying to compete with the private sector, but taking a look at the drug development process "the way an engineer would", which takes me back to this morning's post re: Andy Grove. One thing he emphasizes is that he believes that the failure rate is too high because the wrong targets are being picked, and that target validation would be a good thing to improve.
He's also beating the drum for new targets to come out of more sequencing of human genomes, but that's something I'll reserve judgment on. The second clip has some discussion of the DARPA-backed toxicology chip and some questions on repurposing existing drugs. The third clip talks about the FDA's role in all this, and tries to clarify what NIH's role would be in outlicensing any discoveries. (Collins also admits along the way that the whole NCATS proposal has needed some clarifying as well, and doesn't sound happy with some of the press coverage).
Part 5 (part 4 is just a short wrap-up) discusses the current funding environment, and then moves into ethics and conflicts of interest - other people's conflicts, I should note. Worth a lunchtime look!
Category: Academia (vs. Industry) | Drug Assays | Drug Development
Readers will recall my occasional pieces on Intel legend Andy Grove's idea for drug discovery. (The first one wasn't too complimentary; the second was a bit more neutral). You always wonder, when you have a blog, if the people you're writing about have a chance to see what you've said - well, in this case, that question's been answered. Here's a recent article by Lisa Krieger in the San Jose Mercury News, detailing Grove's thoughts on medical innovation. Near the end, there's this:
Some biotech insiders are angered by Grove's dismissal of their dedication to the cause.
"It would be daft to suggest that if biopharma simply followed the lead of the semiconductor industry, all would be well," wrote Kevin Davies in the online journal Bio-IT World.com. "The semiconductor industry doesn't have the complex physiology of the human body -- or the FDA, for that matter, to contend with."
In his blog "In The Pipeline," biochemist Derek Lowe called Grove "rich, famous, smart and wrong." Grove's recent editorial, Lowe said, "is not a crazy idea, but I think it still needs some work. ... The details of it, which slide by very quickly in Grove's article, are the real problems. Aren't they always?"
"Sticks and stones. ... There were brutal comments but I don't care. The typical comment is 'Chips are not people, go (expletive) yourself.' But to not look over to the other side to see what other people in other professions have done -- that is a lazy intellectual activity."
My purpose in these posts, of course, has not been to insult Andy Grove. That doesn't get any of us anywhere. What I'd like to do, though, since he's clearly sincere about trying to speed up the pace of drug discovery (and with good reason), is to help get him up to speed on what it's like to actually discover drugs. It's not his field; it is mine. But I should note here that being an "expert" in drug discovery doesn't exactly give you a lot of great tools to ensure success, unfortunately. What it does give you is the rough location of a lot of sinkholes that you might want to try to avoid. ("So you can go plunge into new, unexplored sinkholes", says a voice from the back.)
Grove's certainly a man worth taking seriously, and I hope that he, in turn, takes seriously those of us over here in the drug industry. This really is a strange business, and it's worth getting to know it. People like me - and there are still a lot of us, although it seems from all the layoffs that there are fewer every month - are the equivalents of the chip designers and production engineers at Intel. We have one foot in the labs, trying to troubleshoot this or that process, and figure out what the latest results mean. And we have one foot in the offices, where we try to see where the whole effort is going, and where it should go next. I think that perspectives from this level of drug research would be useful for someone like Andy Grove to experience: not so far down in the details that you can't see the sky, but not so far up in the air that all you see are the big, sweeping vistas.
And conversely, I think that we should take him up on his offer to look at what people in the chip industry (and others) have done. It can't hurt; we definitely need all the help we can get over here. I can't, off the top of my head, see many things that we could pick up on, for the reasons given in those earlier posts, but then again, I haven't worked over there, in the same way that Andy Grove hasn't worked over here. It's worth a try - and if anyone out there in the readership (journalist, engineer, what have you) would like to forward that on to Grove himself, please do. I'm always surprised at just how many people around the industry read this site, and to start a big discussion among people who actually do drug discovery, you could do worse.
Category: Drug Development
October 25, 2011
Novartis has had fewer examples of the layoffs and closures that have beset the rest of the drug industry, but no one's immune. Reports are that they're eliminating 1,100 jobs in Europe and about 1,000 here in the US (and here's more from the Basler Zeitung, if you read German). And meanwhile, yes, 700 positions will be added in India and China (not research - data handling and trial management).
This isn't on the scale of some of the Pfizer layoffs, but it's bad enough. And is anyone willing to think that this will be the end?
Category: Business and Markets
Today brings some pharma business news that makes no sense to me - see what you think. Via FierceBiotech, we have this report from Sky News in the UK. It's all about a new venture from Christopher Evans, which is said to be ready to "change the business models" of the big drug companies.
Howzat? Here, apparently is how:
Evans has lined up a heavyweight group of industry executives to join NCPharma Inc, which plans to raise billions of pounds to finance the acquisition of portfolios of medicines that are in development. They would then be sold back to the big pharmaceutical companies which founded them once they are ready to be taken to market. . .
. . .The company is initially aiming to raise $750m to finance the development of 32 clinical compounds licensed from Merck, according to insiders. As deals are done with other pharma companies, scores more products will be added to NCPharma’s development portfolio.
Conceivably that could mean that NCPharma is developing as many as 150 drug compounds within a few years if it persuades a handful of companies to partner with it.
Well, no, actually, that is not conceivable. $750 million is nowhere near enough to seriously pursue 32 compounds across several different therapeutic areas in the clinic, for one thing. And how, exactly, does Merck come to be sitting on 32 viable clinical candidates that they haven't quite gotten around to developing? A cynical mind might imagine that the company is cleaning out its desk drawers and wishing NCPharma good luck with its dusty collection of also-rans. Something does not add up here.
And how does this new company plan to succeed with the castoffs from the rest of the industry? Well you should ask:
The company believes it will be more effective than the integrated drugs companies at developing late-stage candidates because it will be “completely focused on clinical development of new pharmaceuticals and will not be affected by major overhead nor distracted by large corporate infrastructure, early research, large scale manufacture nor marketing activities. It will license-in only the best quality, premier drug portfolios from big pharma and biotech companies to develop in its unique, low-risk model.”
That is a string of grammatically correct English words, indeed it is, but it does not convey sense. Big pharma and biotech companies will not outlicense their best quality stuff - why should they? NCPharma will face the same costs and the same success rates as anyone else in the clinic - how can they not? And it's not like there aren't other companies out there that will provide outsourced clinical trial services for you; this is not some new business model that's just been thought of.
No, the only way I can make this add up is to note that there's a lot of talk about Middle East/Gulf States money in this venture. There's not much pharma expertise in that part of the world, and a cynical observer might see this whole thing as an arbitrage play. One group has a lot of money, but no drug industry. Meanwhile, the pharma industry has a lot of failed or sidetracked drug candidates, and no desire to spend more cash on them. And a third group, well, they exist to bring those two together. Am I missing anything?
Category: Business and Markets
October 24, 2011
I wanted to take a moment to highlight this series of posts over at Chemjobber. He's interviewing fellow chemists who have been laid off and asking for practical advice on how it went, how it's going, and what to do. Others are welcome to send along their own stories; see his e-mail contact at the site. It's a painful subject, but boy, is it a real one these days.
Update: there was a case of identity spoofing in the comments to this post, with someone claiming to be a Merck employee. I've removed the original comments, and (after some thought), I've also removed the (faked) name from the replies that they gathered. The replies themselves were posted in good faith, but I'm trying to get the stolen identity cleaned out.
Category: Business and Markets
We organic chemists have always liked the hydrogenation reaction. Take your compound up in a solvent, add a pinch of black catalyst powder, and put some hydrogen gas into the vessel. Come back a few hours later, filter off the catalyst, and there's your cleanly reduced compound, ready for the next step, often looking even better than it did before you ran the reaction.
For many decades, the standard ways to run these reactions have been to either take a balloon of hydrogen gas and attach it to the top of your round-bottom flask (as in this video clip), or run it on a "Parr shaker". That last piece of equipment has been with us, essentially unchanged, since the 1920s. It's simplicity itself: a thick-walled glass bottle for your reaction, a tube and stopper running into it (with a framework to hold it down under pressure), a hydrogen reservoir, and a motor to shake the bottle around. Its relentless dackadackadackadacka noise is one of the standard sounds of organic chemistry. These things are always off in separate hydrogenation rooms, and when you have several of them running in there at once the out-of-phase clatter makes sequential thought almost impossible. I wish that there were an audio file I could link to, but working organic chemists will all know the tune.
There are newer ways to run the reaction, and flow chemistry is the obvious choice. The "H-Cube" was an early entry into this space, and many of them are to be found around the chemistry world. Unfortunately, many of them are also found gathering dust. Uptake of the machine has been uneven, despite some obvious advantages. That's because the first-generation machine has some obvious disadvantages, too: you have to change the catalyst cartridge every time you want to try something different, because the machine holds only one at a time. The cartridges themselves are not too large, so if your reaction isn't efficient enough, you may not be able to run everything in single-pass mode. And there's no liquid handling - you have to load your sample and collect it by whatever means you see fit. Various people have modified the machine over the years to get around these limitations, and the company now sells a machine incorporating many of these ideas. And there are competitors out there as well.
So here's my question for the chemical audience: has anyone had enough nerve to ditch the Parr shakers completely? I've heard of places that have done it, but when you inquire closely, you often find that there are still a couple around that do a disproportionate share of the hydrogenations. Are there any flow solutions that work well enough to get away with this? You'd think that there would be advantages to a walk-up instrument, if it were robust enough - put your starting solution in position A-3 on the rack, tell it what pressure and temperature you want, which catalyst to use, and add your run to the queue. Come back after lunch and there it is, eluted into another container, ready for you to pick up. NMR machines work this way, and so do microwave reactors. But do hydrogenators? Today, in the real world? Experiences with such things welcome in the comments. . .
Category: Life in the Drug Labs
October 21, 2011
Science is reporting some problems with the NIH's drug screening efforts:
A $70-million-a-year program launched 7 years ago at the National Institutes of Health (NIH) to help academic researchers move into industry-style drug discovery may soon be forced to scale back sharply. NIH Director Francis Collins has been one of its biggest champions. But the NIH Molecular Libraries, according to plan, must be weaned starting next year from the NIH director's office Common Fund and find support at other NIH institutes. In a time of tight budgets, nobody wants it.
The fate of the Molecular Libraries program became “an extremely sensitive political issue” earlier this year when NIH realized it would not be easy to find a new home for the program, said one NIH official speaking on background. . .
. . .John Reed, head of the Sanford-Burnham Medical Research Institute screening center in San Diego, which receives about $16 million a year from the Common Fund, says his center has so far attracted only modest funding from drug companies. He expressed frustration with the Common Fund process. “NIH has put a huge investment into [the Molecular Libraries], and it's running very well,” he says. “If there's not a long-term commitment to keep it available to the academic community, why did we make this hundreds of millions of dollars investment?”
Good question! This all grew out of the 2003 "NIH Roadmap" initiative - here's a press release from better days. But it looks partly to be a victim of sheer bad timing. There's not a lot of extra money sloshing around the drug industry these days, and there sure isn't a lot in NIH's budget, either. You wouldn't know that there's a problem at all from looking at the program's web site, would you?
Since I know there are readers out there from both sides of this particular fence, I'd be interested in hearing some comments. Has the screening initiative been worthwhile? Should it be kept up - and if so, how?
Category: Academia (vs. Industry) | Drug Assays
I would like to heartily recommend the policy outlined in this post: that anyone advocating some political, economic, or social proposal should first be required to write a short essay explaining what the hell it is, and thus demonstrating that they have some minimal idea of what they're talking about. We will never see such a thing in this world, but a man can dream.
In an open forum, there is generally a good correlation between the passion with which some idea is advanced and the ignorance of the person advocating it. The comments section of any blog - this one not excepted - will demonstrate this to anyone with doubts. (That's also why I support this worthwhile initiative, one of many proposed by its parent web site.) Yeats had it right: "The best lack all conviction, while the worst / Are full of passionate intensity."
And so it has always been. In the purview of this blog, for example, no one in my experience ever offers the tentative conclusion that the drug companies might possibly be an evil conspiracy to poison the public. No, that view is delivered with powerful conviction, accompanied by an equally strong belief that anyone who thinks differently is either a moron or a bought-and-paid-for tool.
Correcting for ignorance, were it possible, would change the world. I recall this insight hitting me with some force about 25 years ago. I was watching TV coverage of the House debating a bill that would have provided aid to the Nicaraguan contras. A graphic came up on the screen of a public opinion poll on the issue - this many people thought we should give them money, this many didn't. But then a follow-up question was shown, where they asked the same sample who these contras were. And an alarming number of people answered either "don't know" or thought that they were part of the Nicaraguan government forces, which made me realize that no weight whatsoever should have been given to the answers to that earlier question. If you don't know who the contras are, in other words, why should anyone care what you think should be done about them?
Allow me to wander off topic a bit - anyone who wants can bail out at this point; the rest of this post will be idle political speculation. OK, that line of thought leads one to several interesting conclusions about voting. I've long thought along these lines. I think that strenuous efforts to get people to vote are misguided - if someone is not motivated enough to get out and vote in an election, then society is better off if they do, in fact, stay home. And I'm not advocating some sort of closed-off elite; the doors are always wide open. There are thousands upon thousands of ways for someone to become more informed about any issue or any candidate, and if a person does not avail themselves of any of them, they have (in my view) disqualified themselves from voting.
That, though, leads us back to Yeats and that passionate intensity problem. Doesn't this mean that a lot of strongly motivated voters will, in fact, be ignorant? My solution to that, which I've been advocating since I was about seventeen, is for all voting booths to have two doors. The inner one can be the usual curtain. The outer one, though, presents the prospective voter with a few questions on general political and social knowledge, randomly selected from a larger pool. How often is your state's governor elected - every two years, every four, every six? Which of these names is the name of your state's other senator, the one who's not up for re-election this time? Who writes budget bills, the House or the Senate? That sort of thing. But if you can't get a majority of these high-school-civics questions right, the outer door does not open, and you must go home. When I'm in a bad mood, I toy with the idea of rigging up some sort of trap door system as well, but that's harder to implement.
Oh, I'm just full of improving ideas. I'd also like to see "None of the above" be an option on all ballots. What if NOTA wins? Well, new election in sixty days, and none of the previous candidates can run. It's been pointed out to me that had this system actually been in force, we might be behind by several presidential elections by this point, but I'm still not sure whether that's a bug or a feature. And another reform that's often occurred to me would probably only be possible in a much smaller country than the US. I could imagine, though, getting everyone in such a state together and asking which of them really, really wanted to be President. Whoever raises their hand is disqualified. There really should be some way to weed out candidates whose life's burning ambition is to Be In Charge. I'm reading The Twelve Caesars by Suetonius and Gibbon's Decline and Fall these days, and you can't help but think that the Roman Empire ran things the exact opposite way: the people who climbed to the top were the ones who were willing to make it the organizing principle of their entire lives. The same goes for any autocracy.
And in fact, just to drag things back by force to the usual topics of this site, it often goes for large companies. Recall this stuff - Tiberius would have nodded and smiled. And you didn't want to see what made him smile.
Category: Current Events
October 20, 2011
You'd think that Georgia Tech's new undergraduate chemistry buildings would be decorated with chemical structures that (at least) don't violate the most basic rules of chemistry. You would be wrong. Who signed off on this stuff?
Category: Chemical News
So Abbott is spinning off the pharma business into a separate company - did anyone see that coming? (I take a day away from the computer, attending a meeting, and this happens). Let's look at this plan and try to figure it out.
First off, this is obviously a reaction to worries about prospects for the pharma side of Abbott's business. The medical devices side is doing fine; it's not like the high-flying pharma organization is trying to toss out a sandbag or something. A lot of that worry is probably centered around the long-term prospects for Humira, which is operating in an increasingly crowded space and accounts for a rather large share of revenues all by itself.
So in that sense, this is a move peculiar to Abbott. But the thinking behind it is common to all the large drug companies, as this Wall Street Journal story details. It's just that various companies are running off in various directions in response. You have some saying "Gosh, we've just got to get back to our core business and do pharma better", while others say "Gosh, we've got to diversify - let's get some consumer products in here, some medical devices, animal health, anything less crazy than drug discovery". And even allowing for the fact that these companies are starting off from different places, with different levels of difficulty, it seems clear that no one really has a strategy that's convincing enough even to themselves. Something Has to Be Done, so everyone's doing Something, and hoping for the best.
But in this case, you have to worry that the (so far unnamed) drug company that Abbott's spinning off will have its work cut out for it. The new company will be getting, what, three quarters of its revenue from Humira? That's a rough situation for any company with any drug, much less a drug that's heading into white water. And how much of the rest of the revenues are from TriCor and Niaspan, both of which face patent expirations? I know that they have things in the clinic, sure, but it's hard to see how this new company doesn't shed jobs at some point. I had a series of worried e-mails waiting for me last night from Abbott pharma people, and I think that they're right to be worried. I'd be very glad to hear counterarguments, let me tell you.
No, when you look at it, the company seems to have decided that amputation is just the cure that they needed. The fact that the Abbott name is staying with the medical devices company is all you need to know.
Category: Business and Markets
October 19, 2011
I haven't had any chance to verify this, but I've heard from a source I have no reason to doubt that Merck may be announcing details of an R&D reorganization later this week. Anyone else heard the same?
And on a similar topic, here's a post from John LaMattina asking what many people have asked at one point or another: how come Wall Street analysts get so much influence over how much a drug organization spends on R&D? His examples are Merck, Lilly, and Amgen, and his take is:
Now, I am all for monitoring R&D budgets to maximize the returns from these investments. And I am all for accountability – asking the R&D organization to deliver new candidates to the pipeline, having formal goals with rigorous deadlines, and for running clinical trials as expeditiously as possible while keeping a close eye on costs. But for Wall Street to reward a company for lowering R&D spending and attack those that want to commit to R&D is absurd. Like it or not, R&D IS the engine that powers a pharmaceutical company. It is also a high-risk endeavor. Furthermore, given all of the hurdles that now exist especially with regard to ensuring safety and having sufficient novelty to justify pricing, R&D is more expensive than ever. But, if you want to succeed, you have to invest – substantially. There are no short cuts.
Wall Street's answer, which may be hard to refute, is that if you want the access to capital that the stock market provides, then you have to accept the backseat driving as part of the deal. But do we get the same degree of it as other industries, or more?
Category: Business and Markets
October 18, 2011
I wanted to mention a book I've received a review copy of recently: Writing Chemistry Patents and Intellectual Property: A Practical Guide. The description is accurate. It'll be most useful for people who don't have access to a lot of well-paid legal talent - or at least would like to get things into shape as much as possible before calling them in and starting the meter running. It goes into detail on what makes a valid application, what patent examiners are trained to look for, and how to draft an application that will stand the best chance of surviving scrutiny. It's not a replacement for a patent attorney - you're still going to need one - but it can keep you from wasting the time of one, or from spending your own money while doing so.
Note added for legal reasons: that's an Amazon affiliate link, meaning that Amazon will (without raising the price to you) rebate a small amount of each purchase you make to me - not just that book, but whatever else you might purchase at the same time. I promise to spend it on the sorts of riotous living that one can fund only through Amazon gift cards.
Category: Book Recommendations | Patents and IP
Under the "Who'da thought?" category, put this news about cyclodextrin. For those outside the field, that's a ring of glucose molecules, strung end to end like a necklace. (Three-dimensionally, it's a lot more like a thick-cut onion ring - see that link for a picture). The most common form, beta-cyclodextrin, has seven glucoses. That structure gives it some interesting properties - the polar hydroxy groups are mostly around the edges and outside surface, while the inside is more friendly to less water-soluble molecules. It's a longtime additive in drug formulations for just that purpose - there are many, many examples known of molecules that fit into the middle of a cyclodextrin in aqueous solution.
But as this story at the Wall Street Journal shows, it's not inert. A group studying possible therapies for Niemann-Pick C disease (a defect in cholesterol storage and handling) was going about this the usual way - one group of animals was getting the proposed therapy, while the other was just getting the drug vehicle. But this time, the vehicle group showed equivalent improvement to the drug-treatment group.
Now, most of the time that happens when neither of them worked; that'll give you equivalence all right. But in this case, both groups showed real improvement. Further study showed that the cyclodextrin derivative used in the dosing vehicle was the active agent. And that's doubly surprising, since one of the big effects seen was on cholesterol accumulation in the central neurons of the rodents. It's hard to imagine that a molecule as big (and as polar-surfaced) as cyclodextrin could cross into the brain, but it's also hard to see how you could have these effects without that happening. It's still an open question - see that PLoS One paper link for a series of hypotheses. One way or another, this will provide a lot of leads and new understanding in this field:
Although the means by which CD exerts its beneficial effects in NPC disease are not understood, the outcome of CD treatment is clearly remarkable. It leads to delay in onset of clinical signs, a significant increase in lifespan, a reduction in cholesterol and ganglioside accumulation in neurons, reduced neurodegeneration, and normalization of markers for both autophagy and neuro-inflammation. Understanding the mechanism of action for CD will not only provide key insights into the cholesterol and GSL dysregulatory events in NPC disease and related disorders, but may also lead to a better understanding of homeostatic regulation of these molecules within normal neurons. Furthermore, elucidating the role of CD in amelioration of NPC disease will likely assist in development of new therapeutic options for this and other fatal lysosomal disorders.
Meanwhile, the key role of cholesterol in the envelope of HIV has led to the use of cyclodextrin as a possible antiretroviral. This looks like a very fortunate intersection of a wide-ranging, important biomolecule (cholesterol) with a widely studied, well-tolerated complexing agent for it (cyclodextrin). It'll be fun to watch how all this plays out. . .
Category: Biological News | Infectious Diseases | The Central Nervous System | Toxicology
October 17, 2011
I've had some problems over the years with the Singularity-Is-Near line of thought, and some problems with the "If we can build a new generations of microchips in five years, we ought to be able to cure cancer in ten" idea. Here's an article by Paul Allen in Technology Review that takes aim at both of these simultaneously:
The complexity of the brain is simply awesome. Every structure has been precisely shaped by millions of years of evolution to do a particular thing, whatever it might be. It is not like a computer, with billions of identical transistors in regular memory arrays that are controlled by a CPU with a few different elements. In the brain every individual structure and neural circuit has been individually refined by evolution and environmental factors. The closer we look at the brain, the greater the degree of neural variation we find. Understanding the neural structure of the human brain is getting harder as we learn more. Put another way, the more we learn, the more we realize there is to know, and the more we have to go back and revise our earlier understandings. We believe that one day this steady increase in complexity will end—the brain is, after all, a finite set of neurons and operates according to physical principles. But for the foreseeable future, it is the complexity brake and arrival of powerful new theories, rather than the Law of Accelerating Returns, that will govern the pace of scientific progress required to achieve the singularity.
Very true. Imagine a fiendishly complex chip diagram, but with not a single component of it standardized. It's one bespoke piece of hardware after another, billions of them, and the wiring between them was put together the same idiosyncratic way. And it's altering while you study it - in fact, it may be altering because you're studying it. Glorious stuff, and understanding it is going to give us extraordinary powers. But that's not happening soon, or on anyone's schedule.
Category: General Scientific News | The Central Nervous System
Harvard is announcing a big initiative in systems biology, which is an interdisciplinary opportunity if there ever was one.
The Initiative in Systems Pharmacology is a signature component of the HMS Program in Translational Science and Therapeutics. There are two broad goals: first, to increase significantly our knowledge of human disease mechanisms, the nature of heterogeneity of disease expression in different individuals, and how therapeutics act in the human system; and second — based on this knowledge — to provide more effective translation of ideas to our patients, by improving the quality of drug candidates as they enter the clinical testing and regulatory approval process, thereby aiming to increase the number of efficacious diagnostics and therapies reaching patients.
All worthy stuff, of course. But there are a few questions that come up. These drug candidates that Harvard is going to be improving the quality of. . .whose are those, exactly? Harvard doesn't develop drugs, you know, although you might not realize that if you just read the press releases. And the e-mail announcement sent out to the Harvard Medical School list is rather less modest about the whole effort:
With this Initiative in Systems Pharmacology, Harvard Medical School is reframing classical pharmacology and marshaling its unparalleled intellectual resources to take a novel approach to an urgent problem: The alarming slowdown in development of new and lifesaving drugs.
A better understanding of the whole system of biological molecules that controls medically important biological behavior, and the effects of drugs on that system, will help to identify the best drug targets and biomarkers. This will help to select earlier the most promising drug candidates, ultimately making drug discovery and development faster, cheaper and more effective. A deeper understanding will also help clinicians personalize drug therapies, making better use of medicine we already have.
Again with all those drug candidates - and again, whose candidates are they going to be selecting? Don't get me wrong; I actually wish everyone well in this effort. There really are a lot of excellent scientists at Harvard, even if they tell you so, and this is the sort of problem that can take (and has taken) everything that people can throw at it. But it's also worth remembering Harvard's approach to licensing and industrial collaboration. It's. . .well, let's just say that they didn't get that endowment up to its present size by letting much slip through their fingers. Many are those who've negotiated with the university and come away wanting to add ". . .et Pecunia" to that Latin motto.
So we'll see what comes out of this. But Harvard Medical School is indeed on the case.
Category: Drug Development
October 14, 2011
Amgen is out today speaking the sort of language that we've all come to fear. It appears that the local Ventura County Star picked up some rumblings from inside the Thousand Oaks headquarters, and when they asked the company about it, they got this:
"We are currently evaluating some changes within our Research & Development organization to improve focus and to reallocate resources to key pipeline assets and activities. . ."
Details to come on October 24th, when earnings are announced. But I have to say, "improving focus" is rarely a sign of good news.
Category: Business and Markets
There should be a decision soon on the controversial Avastin-for-metastatic-breast-cancer indication. I've written about that several times here, and my position is unchanged: the preliminary clinical data made it worth a provisional approval, but the follow-up data didn't back it up. This happens. The provisional approval should, I think, be withdrawn, because based on the best evidence we have (which is a lot more than we had when the approval was granted), Avastin is not effective for metastatic breast cancer, and carries notable risks all its own.
Now, via NPR's Scott Hensley, I see that one of the members of the FDA's committee on this issue has published a letter in the New England Journal of Medicine explaining his vote. Says Mikkael Sekeres of the Cleveland Clinic:
"The responsibility of ODAC is to carefully consider the scientific data presented as part of an FDA application for a cancer drug and weigh the benefits that the drug may provide to patients with cancer against the risks posed by the drug's side effects. We try to be dispassionate, but we always think about the person we face in clinic sitting a foot or two away from us in our cramped examination rooms, waiting to hear what treatment we can offer to get rid of her cancer. What kind of conversation would I have with such a patient if I were trying to convince her to take a treatment like this?
“Well, I can offer you a drug that will not make you live longer, won't make you feel better, and may have life-threatening side effects, but it will keep your cancer from worsening by an average of 1 to 2 months.”
Hope? Or false hope?"
Survival is the first thing you have to consider with a cancer therapy. And right next to it comes quality of life, because extending someone's life for a brief period at the cost of horrible side effects is no bargain, either. Should women with metastatic breast cancer take Avastin? It does not, as far as anyone can tell, extend their lives. And it does not improve their quality of life - if anything, it makes it worse. Avastin can be a good drug against other forms of cancer, but it's not for this one. I very much hope the FDA follows the recommendation of the advisory panel.
Category: Cancer | Regulatory Affairs
October 13, 2011
I really hesitate to bring this up again, considering the sorts of comments that came in the last time I mentioned XMRV around here. But I wanted to note a new paper that's come out. The authors reveal the crystal structures of the XMRV protease complexed with a number of known inhibitors. Some of them are what you'd expect from homology with similar enzymes, and some have unusual features.
But the details of the structures aren't the main point here - what's worth noting is that they exist. And they took time, and effort, and money to obtain. What's more, this sort of work also went on in several drug companies with an interest in antiviral research, not that any of that work will ever see the light of day, as opposed to this academic publication. Those people accusing the scientific world of callously ignoring the whole area should sit down with these X-ray structures for a few minutes.
No, XMRV was taken seriously by the medical research community, and a lot of serious effort was put into it. That's why it's such a shame that the whole hypothesis has ended up the way it has.
Category: Infectious Diseases
There's been an interesting dispute playing out over the last few weeks about science reporting. Here's a summary, but I'll give one as well: it all got started with David Kroll, aka "Abel Pharmboy" of the Terra Sigillata blog (and another, Take as Directed). On that latter site, he'd written about science articles in the popular press, and the line between having a scientist fact-check a piece about their work, and giving that same scientist editorial power.
Ananyo Bhattacharya, an editor at Nature, then wrote a column in the Guardian on the topic, where he warned that there was indeed a line that could be crossed:
". . .It's a trap I've fallen into in the past. Either a scientist you have talked to insists on checking the final version of the story with the threat of "withdrawing" their contribution to your piece (it feels churlish to point out that they have already agreed to speak to you on the record) or, an hour or two before deadline you're struck by a creeping fear that somewhere, something is dreadfully wrong and so you call on one or more of your friendly sources to read it over. . .Part of the problem is that many scientists interpret the journalist's request that they "check the facts and your quotes only please" rather loosely. Some are under the impression that because their lab carried out the work being reported, they have some sort of ownership of the subsequent coverage. This is not the case."
But then the Guardian ran a strongly dissenting view from three neuroscientists from Cardiff University. Their take was that peer review is the secret sauce, and that accuracy is the greater good:
Science is different for four reasons, one categorical, three of degree. The categorical difference is the process of peer review. Every research article in a reputable scientific journal has been through a process in which between two and five independent experts (normally anonymous) have made extensive comments. . .
Overall, since press credibility relies on both accuracy and independence, and since the question of allowing sources to check articles (or parts of them) raises a tension between these pillars, the burning question is: where should the balance be struck?
We believe that public trust in science, and in science reporting, is harmed far more by inaccuracy than by non-independence. Contrary to Bhattacharya's claim that "the reader is not a scientist's first concern," public understanding is our overriding concern when communicating with journalists.
As it happens, these very authors had recently been scorched by sensationalized reporting of their work in the British tabloids. Now, I agree that for an accurate picture of any given scientific project, I'd sooner bring in a paleolithic Amazonian shaman for his take before turning to the Sun or the Daily Mail. But I still have to disagree that accuracy is the absolute trump card - I'm willing to accept some moronic misrepresentations in order to keep things more honest, and I think that honesty is best served when things don't run quite so smoothly. We should all keep each other on our toes - the alternative is an invitation to logrolling and groupthink, which can do more harm, in the long run, than sensationalism.
And it should go without saying that the Cardiff researchers' appeal to peer review just doesn't stand up. Five minutes over on Retraction Watch will show you what peer review is capable of letting through. And there are plenty of good scientists who will tell you about what peer review is capable of keeping out of the journals as well. No, it's a very imperfect system. I'm not saying that I can think of a better one at the moment, but appealing to it as if it's one of the glories of civilization is silly. (I see that I'm not alone in reacting this way).
And a bit more than silly - it's arrogant as well. This is what we as scientists have to look out for, the de haut en bas attitude where we come in and explain all the complicated stuff to the peasants. People can detect that, you know, and when they do they get suspicious (and rightly so, at times) that we have something to hide. No, speaking as a scientist, and a blogger, and (mostly on the opinion side) perhaps a journalist as well, I think we're better off with a system where everyone keeps an eye on everyone else. If we get too cozy and consensus-driven, we're going to invite real trouble.
Category: Press Coverage
October 12, 2011
siRNA technology has famously been the subject of a huge amount of work (and a huge amount of hype) and, more recently, a huge amount of uncertainty. Now a new report will add to that last pile. A group at the University of Kentucky says that they've identified a toxic effect in the retina for a wide range of siRNAs, one that seems to be triggered independent of sequence:
"We now show a new undesirable effect of siRNAs that are 21 nucleotides or longer in length: these siRNAs, regardless of their sequence or target, can cause retinal toxicity. By activating a new immune pathway consisting of the molecules TLR3 and IRF3, these siRNAs damage a critical layer of the retina called the retinal pigmented epithelium (RPE). Damage to the RPE cells by siRNAs can also lead to secondary damage to the rods and cones, which are light-sensing cells in the retina. . ."
That's especially worrisome news, since several siRNA efforts have targeted eye diseases in particular. The eye is a privileged compartment, metabolically, and exotica like small RNA molecules have a better chance of surviving there. But if you're trying to help out with macular degeneration or diabetic retinopathy, affecting the retinal epithelium isn't what you need, is it?
As a side note, this effect seems to be mediated, in part, by TLR3. Its family, the toll-like receptors, were part of this year's Nobel in Physiology/Medicine.
Category: Toxicology
October 11, 2011
Now, I know that I'm not the first to notice this. And in the grand scheme of things, it's pretty trivial. But isn't it true, and hasn't it been true for many years, that the print advertisements of chemical companies are often strange and useless?
Here's an example from a recent issue of Chemical and Engineering News, one that was open on my desk to this very spot. Now, I don't know what a quarter-page goes for these days - probably not as much as the folks at C&E News would like for it to - but this was wasted money for sure. Let's count the ways. For one thing, the purple molecule graphic might be a neat-looking thing in a cosmetics ad, but not when placed in a magazine whose subscriber base is about 98% people with a chemistry degree. The slogan ("Our people make the difference") is such an ancient chunk of corporate goodthink that it can't even support a good covering of mold any more. And are we to infer that the model, a vaguely futuristic Eurofied Joni Mitchell, is one of those people? Not hardly. And what's with the cyber-gizmo dog collar thing she's wearing? One of those invisible-fence zappers, scaled up to human size?
The ad enjoins us to visit them at a conference booth in Geneva, which is at least a place where you're sure to find out what on earth Saltigo does. To be fair, the opposite page in the C&E News issue has another Saltigo ad, which has a couple of chemists in an unexciting but straightforward pitch that lets you know that they're a custom synthesis/process company that you can hire to try to save you money during production. (Interestingly, at least for me, I just now noticed that the first of the two, Andreas Stolle, is an old colleague of mine from my days at the Wonder Drug Factory in Connecticut - hello, Andreas! And tell your ad agency to make sure to spell "throughout" properly next time.)
No, I'm sure that Saltigo's a perfectly good outfit. But their ads aren't doing much to get that across. Nor are they the only company in that position - a glance through any issue of any magazine in the field will yield a rich harvest of ads that are drably functional at best, and baffling at worst. I wouldn't want the job of producing the things, I have to admit - but doesn't someone want to do it better than it's being done?
Category: Chemical News
According to Bruce Booth (@LifeSciVC on Twitter), Ernst & Young have estimated the proportion of drugs in the clinic in the US that are targeting cancer. Anyone want to pause for a moment to make a mental estimate of their own?
Well, I can tell you that I was a bit low. The E&Y number is 44%. The first thought I have is that I'd like to see that in some historical perspective, because I'd guess that it's been climbing for at least ten years now. My second thought is to wonder if that number is too high - no, not whether the estimate is too high. Assuming that the estimate is correct, is that too high a proportion of drug research being spent in oncology, or not?
Several factors led to the rise in the first place - lots of potential targets, ability to charge a lot for anything effective, an overall shorter and more definitive clinical pathway, no need for huge expensive ad campaigns to reach the specialists. Have these caused us to overshoot?
Category: Cancer | Clinical Trials | Drug Development | Drug Industry History
October 7, 2011
Now here's a paper, packed to the edges with data, on what kinds of drug candidate compounds different companies produce. The authors assembled their list via the best method available to outsiders: they looked at what compounds are exemplified in patent filings.
What they find is that over the 2000-2010 period, not much change has taken place, on average, in the properties of the molecules that are showing up. Note that we're assuming, for purposes of discussion, that these properties - things like molecular weight, logP, polar surface area, amount of aromaticity - are relevant. I'd have to say that they are. They're not the end of the discussion, because there are plenty of drugs that violate one or more of these criteria. But there are even more that don't, and given the finite amount of time and money we have to work with, you're probably better off approaching a new target with five hundred thousand compounds that are well within the drug-like properties boxes rather than five hundred thousand that aren't. And at the other end of things, you're probably better off with ten clinical candidates that mostly fit versus ten that mostly don't.
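To make the "property boxes" idea concrete, here's a minimal sketch of how one might screen a compound set against such cutoffs. The cutoff values below are illustrative round numbers in the Lipinski tradition, not the ones used in the paper, and the compound data are made up:

```python
# Hypothetical "drug-like property box" filter. The (lo, hi) ranges are
# illustrative placeholders, not values taken from the paper.
PROPERTY_BOXES = {
    "mol_weight": (0, 500),       # molecular weight, Daltons
    "clogp": (-1.0, 5.0),         # calculated octanol/water partition coeff.
    "psa": (0, 140),              # polar surface area, square Angstroms
    "aromatic_rings": (0, 3),     # count of aromatic rings
}

def in_drug_like_box(props):
    """Return True if every property falls inside its (lo, hi) range."""
    return all(lo <= props[name] <= hi
               for name, (lo, hi) in PROPERTY_BOXES.items())

# Two invented example compounds: one inside the boxes, one well outside.
compounds = [
    {"mol_weight": 342.4, "clogp": 2.1, "psa": 88.0, "aromatic_rings": 2},
    {"mol_weight": 612.7, "clogp": 6.3, "psa": 155.0, "aromatic_rings": 4},
]
passing = [c for c in compounds if in_drug_like_box(c)]  # keeps only the first
```

In practice these descriptors would come from a cheminformatics toolkit rather than being typed in by hand, but the filtering step itself is just this sort of range check.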
But even if overall properties don't seem to be changing much, that doesn't mean that there aren't differences between companies. That's actually the main thrust of the paper: the authors compare Abbott, Amgen, AstraZeneca, Bayer-Schering, Boehringer, Bristol-Myers Squibb, GlaxoSmithKline, J&J, Lilly, Merck, Novartis, Pfizer, Roche, Sanofi, Schering-Plough, Takeda, Wyeth, and Vertex. Of course, these organizations filed different numbers of patents, on different targets, with different numbers of compounds. For the record, Merck and GSK filed the most patents during those ten years (over 1500), while Amgen and Takeda filed the fewest (under 300). Merck and BMS had the largest number of unique compounds (over 70,000), and Takeda and Bayer-Schering had the fewest (in the low 20,000s). I should note that AstraZeneca just missed the top two in both patents and compounds.
If you just look at the raw numbers, ignoring targeting and therapeutic areas, Wyeth, Bayer-Schering, and Novartis come out looking the worst for properties, while Vertex and Pfizer look the best. But what's interesting is that even after you correct for targets and the like, organizations still differ quite a bit in the sorts of compounds that they turn out. Takeda, Lilly, and Wyeth, for example, were at the top of the cLogP rankings (numerically, "top" meaning the greasiest). Meanwhile, Vertex, Pfizer, and AstraZeneca were at the other end of the scale in cLogP. In molecular weight, Novartis, Boehringer, and Schering-Plough were at the high end (up around 475), while Vertex was at the low end (around 425). I'm showing a radar-style plot from the paper where they cover several different target-unbiased properties (which have been normalized for scale), and you can see that different companies do cover very different sorts of space. (The numbers next to the company names are the total number of shared targets found and the total number of shared-target observations used - see the paper if you need more details on how they compiled the numbers.)
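The normalization step behind that kind of radar plot is straightforward: each property is rescaled to a common range across the companies so that quantities with different units can share one axis. A sketch, using real company names from the paper but entirely invented numbers:

```python
# Min-max normalization of one property (mean cLogP per company) to [0, 1],
# the kind of rescaling used before plotting multi-property radar charts.
# Company names appear in the paper; the values here are made up.
mean_clogp = {"Vertex": 2.8, "Pfizer": 3.0, "Takeda": 4.4, "Lilly": 4.2}

def normalize(values):
    """Rescale a dict of numbers so the minimum maps to 0 and the maximum to 1."""
    lo, hi = min(values.values()), max(values.values())
    return {k: (v - lo) / (hi - lo) for k, v in values.items()}

norm = normalize(mean_clogp)
# With these invented inputs, Vertex maps to 0.0 and Takeda to 1.0;
# repeat for each property to get one spoke of the radar plot per property.
```

The same function would be applied to molecular weight, polar surface area, and so on, giving each company a comparable profile across all the spokes.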
Now, it's fair to ask how relevant the whole sweep of patented compounds might be, since only a few ever make it deep into the clinic. And some companies just have different IP approaches, patenting more broadly or narrowly. But there's an interesting comparison near the end of the paper, where the authors take a look at the set of patents that cover only single compounds. Now, those are things that someone has truly found interesting and worth extra layers of IP protection, and they average to significantly lower molecular weights, cLogP values, and number of rotatable bonds than the general run of patented compounds. Which just gets back to the points I was making in the first paragraph - other things being equal, that's where you'd want to spend more of your time and money.
What's odd is that the trends over the last ten years haven't been more pronounced. As the paper puts it:
Over the past decade, the mean overall physico-chemical space used by many pharmaceutical companies has not changed substantially, and the overall output remains worryingly at the periphery of historical oral drug chemical space. This is despite the fact that potential candidate drugs, identified in patents protecting single compounds, seem to reflect physiological and developmental pressures, as they have improved drug-like properties relative to the full industry patent portfolio. Given these facts, and the established influence of molecular properties on ADMET risks and pipeline progression, it remains surprising that many organizations are not adjusting their strategies.
The big question that this paper leaves unanswered, because there's no way for them to answer it, is how these inter-organizational differences get going and how they continue. I'll add my speculations in another post - but speculations they will be.
Category: Drug Assays | Drug Development | Drug Industry History
October 6, 2011
That's what this paper in Molecular Psychiatry is suggesting. The authors injected material from human Alzheimer's patients into the brains of normal mice, and saw what appears to be the induction of amyloid pathology. This didn't happen in control animals, got worse with time, and wasn't just noted at the point of injection. Their hypothesis is that Alzheimer's might be a prion-type disease of protein misfolding, and possibly capable of being spread by infectious particles. I recall ideas like this being advanced in the past, but this is the first time I've seen evidence like this (hasn't this sort of experiment been run before?) It's simultaneously fascinating and alarming, and I would very much like to see it repeated and confirmed.
This comes as broadly similar ideas are being advanced in Parkinson's disease, where recent work has shown misfolded alpha-synuclein protein (long known as a key factor) spreading slowly through infected neurons. No one has ever seen evidence of transmissible Parkinson's between humans, but it does seem to move between neurons like an internal epidemic.
And that comes as broadly similar ideas are being advanced in ALS. A recent paper in PNAS suggests that a mutant form of superoxide dismutase 1 (which had already been found to be associated with the disease) can be spread by the injection of precursor cells that express it. That makes you think that the SOD1 mutant (G93A, which is not the most common mutation in humans) may also have prion-like properties, and can induce other proteins to misfold along with it. What's especially interesting (and again, rather alarming) is that it apparently can recruit normal SOD1 into this state. (In this study, though, the effects were confined to the region around the introduction of the cells, so the spread was not that fast.) It's important to note again that, as in the case of Parkinson's, no one has ever seen evidence that ALS is transmissible from person to person - in fact, I don't think that anyone has ever seen this form of the disease in a patient without the mutation. But this does shed some light on what happens internally.
So taken together, the spreading-protein-misfolding mechanism seems to have a lot of momentum behind it. The big question is whether it can result in human-to-human transmission. Even in the cases where we've confirmed prion-based disease, transmission seems (fortunately) rather difficult, although this is a very active field of research, and definitely something to keep an eye on. The possible Alzheimer's connection is especially interesting, since that disease is both more common and lacks a strong genetic component. It occurs (as far as we can tell) mostly sporadically. The amyloid hypothesis for its cause has been taking some hits in recent years, but the other side of the story is still very much alive. . .
Category: The Central Nervous System
October 5, 2011
Well, the field bet won this year - no one had Dan Shechtman and quasicrystals in their predictions, as far as I know. This is one of those prizes that is not easy to communicate to someone outside the field, but if I had to sum it up in one phrase for a nonscientist, it would be "Discovery of crystals that everyone thought were impossible".
That's because they have five-fold symmetry, among other types. And the problem there is pretty easy to show: if you take a bunch of identical triangles (any triangle at all), you can tile them out and cover a surface evenly - imagine a tabletop mosaic or a bathroom floor. That works with any rectangle, too, naturally, and it also works with hexagons. But it does not work with regular pentagons (or with any other regular geometric figure). Gaps appear that cannot be closed. You can cheat and tile the plane with two types of bent pentagons or the like, but closer inspection shows that these cases all really belong to one of the allowed symmetry classes.
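The angle argument behind that paragraph is easy to verify for yourself: copies of a regular polygon meeting at a vertex have to use up exactly 360 degrees, so its interior angle must divide 360 evenly. A few lines of Python (my illustration, not anything from the post) confirm which regular n-gons qualify:

```python
# A regular n-gon tiles the plane only if its interior angle,
# (n - 2) * 180 / n degrees, divides 360 evenly - that is, if a
# whole number of copies can meet around a single vertex.

def tiles_the_plane(n):
    interior = (n - 2) * 180 / n
    return (360 / interior).is_integer()

results = {n: tiles_the_plane(n) for n in range(3, 9)}
# triangles (60 deg), squares (90 deg), and hexagons (120 deg) work;
# pentagons (108 deg), heptagons, and octagons leave gaps
```

The pentagon's 108-degree interior angle goes into 360 three times with 36 degrees left over, and that unclosable gap is exactly what made five-fold crystal symmetry look impossible.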
The same problems appear in three-dimensional crystals, and five-fold symmetry, in any of its forms, is just not allowed (and had never been seen). But in the early 1980s there came a report of just that. Dan Shechtman, working at the National Bureau of Standards, had found a metallic crystalline substance that seemed to show clear evidence of an impossible form. I was in grad school when the result came out, and I well remember the stir it caused. Just publishing the result took a lot of nerve, since every single crystallographer in the world would tell you that if they knew one thing about their field, it was that you couldn't have something like this.
As it turned out, these issues had already been well explored by two very different groups: medieval Islamic artists and mathematicians. What looks like unallowable symmetry in two (or three) dimensions works out just fine in higher-dimensional spaces, and these theoretical underpinnings were actually a lot of help in the debates that followed.
Here's a good history of what happened afterwards. One thing that I recalled was that Linus Pauling wasn't buying it for a minute. He was, of course, quite old by that time, but he was still a force to be reckoned with in his own areas of expertise, despite the damage he'd done to his reputation with all the Vitamin C business. He kept up the barrage for the remainder of his life, publishing one of his last scientific papers (in 1992) on the subject and arguing yet again that the quasicrystal idea was mistaken. As that above-linked paper from Shechtman's co-worker John Cahn put it:
Quasicrystals provided win-win opportunities for crystallographers: If we were mistaken about them, expert crystallographers could debunk us; if we were right, here was an opportunity to be a trail blazer. While many crystallographers worldwide availed themselves of the opportunity, U.S. crystallographers avoided it, to a large extent because of Pauling’s influence.
But time has shown that the quasicrystal hypothesis is correct. You can have local symmetries of this kind, and many other "impossible" examples have been discovered since. The resolution of the X-ray structures has gotten better and better, ruling out all the other explanations - Pauling would have found it painful to watch. The resulting solids have rather odd properties, although if someone asked me to name any effect that they've had on anyone's daily life, I'd have to answer "none at all". But I'm sympathetic to anyone who proves something in science that no one thought could be proved, so Nobel Prize it is, and congratulations.
A side note: anyone want to take bets on whether some ayatollah or other Iranian politician will pop up, claiming that the whole subject of the prize was anticipated by the 15th-century Darb-e Imam shrine in Isfahan? Let's set the odds. . .
Category: Chemical News
October 4, 2011
Here's an interview that I did recently with Paul Howard of the Manhattan Institute on the results of that immunotherapy leukemia trial (and on some broader topics around the current state of drug discovery). Anyone who would like to pitch a blockbuster syndicated radio show with me on structure-activity relationships and preclinical drug development, have your people call my people.
Category: General Scientific News
Today's announcement of the Physics Nobel came as no surprise. I remember when those results came out in 1998 (that the universe's expansion was accelerating rather than slowing down), and immediately thinking "Nobel if it holds up". I thought the same thing about RNA interference when I first heard about it, and there are many other discoveries in the same category. Not all of them have been given Nobels, but I mean the discoveries whose Nobel-worthiness is obvious the moment you hear about them.
Now, here's my question for today: how many of these have we had in chemistry? And how many have we had recently? It seems to me that, out of the indisputable chemistry-and-not-biology prizes, there aren't as many as you might find in other fields. Perhaps chemistry is a mature enough science that fundamental surprises and breakthroughs are not as common, and when they occur, they come on more slowly. Thoughts?
Category: General Scientific News
If you haven't seen this XMRV news, then you should. The very day after I wrote my most recent post on the subject came this one from ERV over at Scienceblogs.
There are two key figures in it: one from the original Science paper, showing infected patients expressing XMRV Gag protein (a sign of viral infection). And the other was presented recently by Judy Mikovits at a conference in Ottawa. It shows a different experiment - Gag protein being expressed in some other patients only after treatment with 5-azacytidine. The problem is. . .well. . .I'll let you go see for yourselves what the problem is. It most definitely needs explaining, and the explanations had better be good.
Update: continued unraveling. Mikovits herself has been fired from her research institute, apparently for other causes.
Second update: The Chicago Tribune is on this story, breaking it to a wider public. For a link to the alleged third version of the Mikovits figure, see the comments to this post below.
Category: Infectious Diseases
October 3, 2011
The first week in October is upon us again, and this Wednesday brings the Nobel Prize in Chemistry. So what can we say about who should get it (and about who actually will)?
Your first place to turn should be Paul Bracher's ChemBark post on the topic. He has a comprehensive list of candidates, but the only ones with better odds than the field bet are:
Spectroscopy & Application of Lasers (Zare/Moerner/+), 6-1
Nuclear Hormone Signaling (Chambon/Evans/Jensen), 7-1
Bioinorganic Chemistry (Gray/Lippard/Holm/–), 8-1
Techniques in DNA Synthesis, (Caruthers/Hood/+), 10-1
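For the curious, bookmaker-style odds like those above translate directly into implied probabilities: odds of a-to-1 against correspond to a chance of 1/(a + 1). A quick sketch (my own illustration; the category names are shortened from the list above):

```python
# Convert "a-to-1 against" odds into implied probabilities:
# a-to-1 against means a probability of 1 / (a + 1).

def implied_probability(odds_against):
    return 1.0 / (odds_against + 1.0)

board = {
    "Single-molecule spectroscopy": 6,   # 6-1
    "Nuclear hormone signaling": 7,      # 7-1
    "Bioinorganic chemistry": 8,         # 8-1
    "DNA synthesis techniques": 10,      # 10-1
}
probs = {name: round(implied_probability(a), 3) for name, a in board.items()}
# 6-1 works out to about a 14% chance, 10-1 to about 9%
```

So even the favorites on this board individually sit well under the field bet, which is how the field keeps winning.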
That's a very reasonable list, and I think that Zare/Moerner (and possibly et al.) are definitely going to win at some point. There's a strong case to be made for each of the others, too. Meanwhile, Thomson Reuters has narrowed things down quite a bit. Their three contenders are:
Electrochemistry (Bard et al.)
Molecular Dynamics (Karplus et al.)
Dendritic Polymers (Fréchet, Tomalia, Vögtle)
Those first two have been contenders for many years (which tends to move them down on the ChemBark list, and up on the Thomson one), but no one could complain about either of them. I really, really doubt that dendritic chemistry's going to win this year, though, and I'd like to know what put that topic so far up the list. Maybe some day, but not yet, in my opinion. My guess is that the sheer number of publications in the field has skewed things a bit (after all, keeping track of such things is Thomson Reuters' business). Wavefunction's list is a good one to check out, and seems much more in tune with reality.
There are some categories of research that I would like to see win at some point, although narrowing down the names won't be easy. I think that directed-evolution methods are a great area with a lot of potential, for one, and the whole activity-based protein profiling/in vivo cell labeling stuff is another. Eventually I expect some nanotech/molecular machine discovery to win, but only once it gets far enough along to connect with the real world. And the work on photochemical energy technology (carbon dioxide fixation, hydrogen generation, and so on) will also be a strong candidate when something looks world-changing enough. Other discoveries in the surely-Nobel-worthy category are GPCR structure (and that sort of thing has usually been moved over into the Chemistry prize) and DNA-based diagnostic methods (hard to narrow that one down, though).
This year? I think the committee will go back and pick up one of those prizes for long-time contenders; I don't expect any massive surprises. And I certainly don't expect anything in organic chemistry this time. But we'll see on Wednesday. What's that? You want me to really pick something? Fine. . .I'll guess the nuclear receptor people, or the single-molecule spectroscopy people, in basically a dead heat. I think that the ChemBark odds are correct.
Category: General Scientific News