About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek, email him directly: firstname.lastname@example.org
July 21, 2014
What a mess there is in the hepatitis C world. Gilead is, famously, dominating the market with Sovaldi, whose price has set off all sorts of cost/benefit debates. The companies competing with them are scrambling to claim positions, and the Wall Street Journal says that AbbVie is really pulling out all the stops. Try this strategy on for size:
In a lawsuit filed in February, AbbVie noted it patented the idea of combining two of Gilead's drugs—Sovaldi and an experimental drug called ledipasvir, which Gilead plans to combine into one treatment—and is therefore entitled to monetary damages if Gilead brings the combination pill to market. Legally, AbbVie can't market Sovaldi or ledipasvir because it doesn't have the patents on the underlying compounds. But it is legal for companies to seek and obtain patents describing a particular "method of use" of products that don't belong to them.
Gilead disputes the claims of AbbVie and the other companies. A spokeswoman said Gilead believes it has the sole right to commercialize Sovaldi and products containing Sovaldi's active ingredient, known as sofosbuvir. An AbbVie spokeswoman said the company believes Gilead infringes its patents, and that it stands behind the validity and enforceability of those patents.
You don't see that very often, and it's a good thing. Gilead is, naturally, suing AbbVie over this as well, saying that AbbVie has knowingly misrepresented to the USPTO that they invented the Gilead therapies. I'm not sure how that's going to play out: AbbVie didn't have to invent the drugs to get a method-of-use patent on them. At the same time, I don't know what sort of enablement AbbVie's patent claims might have behind them, given that these are, well, Gilead's compounds. The company is apparently claiming that a "sophisticated computer model" allows them to make a case that these combinations would be the effective ones, but I really don't know if that's going to cut it (and in fact, I sort of hope it doesn't). But even though I'm not enough of a patent-law guy to say either way, I'm enough of one to say, with great confidence, that this is going to be a very expensive mess to sort out. Gilead's also in court with Merck (and was with Idenix before Merck bought them), and with Roche, and will probably be in court with everyone else before all this is over.
This whole situation reminds me of one of those wildlife documentaries set around a shrinking African watering hole. A lot of lucrative drugs have gone off patent over the last few years, and a lot of them are heading that way soon. So any new therapeutic area with a lot of commercial promise is going to get a lot of attention, and start a lot of fighting. Legal battles aren't cheap on the absolute scale, but on the relative scale of the potential profits, they are. So why not? Claim this, claim that, sue everybody. It might work; you never know. Meanwhile, we have a line forming on the right of ticked-off insurance companies and government health plans, complaining about the Hep C prices, and while they wait they can watch the companies involved throwing buckets of slop on each other and hitting everyone over the head with lawsuits. What a spectacle.
Category: Business and Markets | Infectious Diseases | Patents and IP | Why Everyone Loves Us
It's getting nasty over at Allergan. They're still trying to fight off a takeover attempt by Valeant, making the case that the company's R&D efforts are not a waste of money (which, only slightly simplified, is the Valeant position regarding every company they're taking over).
But Allergan's had a lot of trouble getting one of their drugs (Semprana) through the FDA. Semprana is an inhaled version of the classic dihydroergotamine therapy for migraine, and had been rejected last year when it was still known as Levadex. The recent further delay isn't helping Allergan make its case, and Valeant is using this news to peel off some more shareholders.
This morning comes word that Allergan is cutting back staff. That Fierce Biotech report says that it looks like a lot of the cuts will be hitting discovery R&D, which makes you wonder if Allergan will manage to escape Valeant's grip only by becoming what Valeant wanted to make them.
Category: Business and Markets | Regulatory Affairs
July 18, 2014
I found this article from the Charlotte Observer on the "Food Babe" (Vani Hari) very interesting. A "menu consultant" for Chick-fil-A, is she? Who knew?
I've come across a horribly long string of chemistry misapprehensions, mistakes, and blunders while looking at her site - she truly appears to know nothing whatsoever about chemistry, not that this seems to bother her much. (Wavefunction has a good article on these). I noticed in the comments section of the newspaper's article that someone is apparently trying to crowdsource a fundraising drive to send her to some chemistry classes. I enjoy that idea very much, although (1) horse, water, drink, etc., and (2) she appears to have sufficient funds to do this already, were it of any possible interest to her. And more money coming in all the time. She may well make more money telling people that they're eating yoga mats than I do trying to discover drugs.
Category: General Scientific News
There's a new report in the literature on the mechanism of thalidomide, so I thought I'd spend some time talking about the compound. Just mentioning the name to anyone familiar with its history is enough to bring on a shiver. The compound, administered as a sedative/morning sickness remedy to pregnant women in the 1950s and early 1960s, famously brought on a wave of severe birth defects. There's a lot of confusion about this event in the popular literature, though - some people don't even realize that the drug was never approved in the US, although this was a famous save by the (then much smaller) FDA and especially by Frances Oldham Kelsey. And even those who know a good amount about the case can be confused by the toxicology, because it's confusing: no phenotype in rats and mice, but big reproductive tox trouble in rabbits (and humans, of course). And as I mentioned here, the compound is often used as an example of the far different effects of different enantiomers. But practically speaking, that's not the case: thalidomide has a very easily racemized chiral center, which gets scrambled in vivo. It doesn't matter if you take the racemate or a pure enantiomer; you're going to get both of the isomers once it's in circulation.
The compound's horrific effects led to a great deal of research on its mechanism. Along the way, thalidomide itself was found to be useful in the treatment of leprosy, and in recent years it's been approved for use in multiple myeloma and other cancers. (This led to an unusual lawsuit claiming credit for the idea). It's a potent anti-angiogenic compound, among other things, although the precise mechanism is still a matter for debate - in vivo, the compound has effects on a number of wide-ranging growth factors (and these were long thought to be the mechanism underlying its effects on embryos). Those embryonic effects complicate the drug's use immensely - Celgene, who got it through trials and approval for myeloma, have to keep a very tight patient registry, among other things, and control its distribution carefully. Experience has shown that turning thalidomide loose will always end up with someone (i.e. a pregnant woman) getting exposed to it who shouldn't be - it's gotten to the point that the WHO no longer recommends it for use in leprosy treatment, despite its clear evidence of benefit, and it's down to just those problems of distribution and control.
But in 2010, it was reported that the drug binds to a protein called cereblon (CRBN), and this mechanism implicated the ubiquitin ligase system in the embryonic effects. That's an interesting and important pathway - ubiquitin is, as the name implies, ubiquitous, and addition of a string of ubiquitins to a protein is a universal disposal tag in cells: off to the proteasome, to be torn to bits. It gets stuck onto exposed lysine residues by the aforementioned ligase enzyme.
But less-thorough ubiquitination is part of other pathways. Other proteins can have ubiquitin recognition domains, so there are signaling events going on. Even poly-ubiquitin chains can be part of non-disposal processes - the usual oligomers are built up using a particular lysine residue on each ubiquitin in the chain, but there are other lysine possibilities, and these branch off into different functions. It's a mess, frankly, but it's an important mess, and it's been the subject of a lot of work over the years in both academia and industry.
The new paper has the crystal structure of thalidomide (and two of its analogs) bound to the ubiquitin ligase complex. It looks like they keep one set of protein-protein interactions from occurring while the ligase end of things is going after other transcription factors to tag them for degradation. Ubiquitination of various proteins could be either up- or downregulated by this route. Interestingly, the binding is indeed enantioselective, which suggests that the teratogenic effects may well be down to the (S) enantiomer, not that there's any way to test this in vivo (as mentioned above). But the effects of these compounds in myeloma appear to go through the cereblon pathway as well, so there's never going to be a thalidomide-like drug without reproductive tox. If you could take it a notch down the pathway and go for the relevant transcription factors instead, post-cereblon, you might have something, but selective targeting of transcription factors is a hard row to hoe.
Category: Analytical Chemistry | Biological News | Cancer | Chemical News | Toxicology
July 17, 2014
There are quite a few headlines today about a link between Alzheimer's and a protein called TDP-43. This is interesting stuff, but like everything else in the neurodegeneration field, it's going to be tough to unravel what's going on. This latest work, just presented at a conference in Copenhagen, found (in a large post mortem brain study of people with diagnosed Alzheimer's pathology) that aberrant forms of the protein seem to be strongly correlated with shrinkage of the hippocampus and accompanying memory loss.
80% of the cohort with normal TDP-43 (but still showing Alzheimer's histology) had cognitive impairment at death, but 98% of the ones with TDP-43 mutations had such signs. That says several things: (A) it's possible to have classic Alzheimer's without mutated TDP-43, (B) it's possible to have classic Alzheimer's tissue pathology (up to a point, no doubt) without apparent cognitive impairment, and (C) it's apparently possible (although very unlikely) to have mutated TDP-43, show Alzheimer's pathology as well, and still not be diagnosed as cognitively impaired. Welcome to neurodegeneration. Correlations and trends are mostly what you get in that field, and you have to make of them what you can.
TDP-43, though, has already been implicated, for some years now, in ALS and several other syndromes, so it really does make sense that it would be involved. It may be that it's disproportionately a feature of more severe Alzheimer's cases, piling on to some other pathology. Its mechanism of action is not clear yet - as mentioned, it's a transcription factor, so it could be involved in stuff from anywhere and everywhere. It does show aggregation in the disease state, but that Cell paper linked to above makes the case that it's not the aggregates per se that are the problem, but the loss of function behind them (for example, there are increased amounts of the mutant protein out in the cytoplasm, rather than in the nucleus). What those lost functions are, though, remains to be discovered.
Category: Alzheimer's Disease | Biological News
Here's some big news: Ron Evans and co-workers at Salk report that treatment with the growth factor FGF1 appears to reverse type II diabetes in mice. (Article in Science Daily on this study here). Evans has been working in this field (diabetes, insulin sensitivity, and related areas like growth factors and nuclear receptors) for a long time, and I would definitely take this work seriously.
They reported a couple of years ago that FGF1 seemed to be involved in insulin sensitivity. It's induced in adipose tissue under high-fat diet conditions. FGF1 knockout mice, for their part, have a seemingly normal phenotype, but when they're put on high-fat diets they respond very poorly indeed, quickly showing abnormal glucose control and other defects.
This new paper shows that in normal mice with metabolic problems brought on by a high-fat diet, a single injection of recombinant FGF1 is sufficient to normalize glucose for up to 48 hours. Interestingly (and importantly), this mechanism doesn't seem to overshoot - you don't swing over to hypoglycemia, which is always a worry in this field. And repeated FGF1 therapy leads to increased insulin sensitivity, suppression of hepatic glucose production - basically, everything you'd want in a Type II diabetes therapy. It's great stuff, and the best candidate I've yet seen for the Real Mechanism behind the disease.
Now, FGF1 is a cellular growth factor, so there's a possibility for trouble. But the glucose/insulin effects seem to be mediated by one particular FGF receptor (FGFR1), which makes one hopeful that this axis can be separated out. I would expect to see a great deal of work coming on variants of the protein with longer plasma half-life and greater selectivity. In vivo, the protein seems to be secreted and used locally in specific tissues - it's not in wide circulation. But perhaps it should be - you can be sure that someone's going to try to find out. Overall, this is excellent, exciting news, and we're poised to learn a huge amount about type II diabetes and how to treat it.
Category: Diabetes and Obesity
July 16, 2014
If you ever find yourself needing to make large cyclic peptides, you now have a new option. This paper in Organic Letters describes a particularly clean way to do it: let glutathione-S-transferase (GST) do the work for you. Bradley Pentelute's group at MIT reports that if your protein has a glutathione attached at one end, and a pentafluoroaryl Cys at the other, GST will step in and promote the nucleophilic aromatic substitution reaction to close the two ends together.
This is an application of their earlier work on the uncatalyzed reaction and on the use of GST for ligation. Remarkably, the GST method seems to produce very high yields of cyclic peptides up to at least 40 residues, and at reasonable concentration (10 mM) of the starting material, under aqueous conditions. Cyclic peptides themselves are interesting beasts, often showing unusual properties compared to the regular variety, and this method looks as if it will provide plenty more of them for study.
Category: Chemical Biology | Chemical News
When you ask a bunch of medicinal chemists to look over a list of structures - screening hits, potential additions to the compound collection, that sort of thing - you'll find that everyone will cross some of them off. But the agreement between the chemists on which ones need to go, that's the tough part. It's been shown that we don't overlap very much in our preferences, at least when it comes to the structures we'd prefer not to try to advance. That's because we don't overlap as well as we think we do when it comes to the rules we're using.
So here's a question, which might illustrate the point: what compound classes or scaffolds have you been burned by? I think that's one big factor that we all use when we're evaluating one of those compound lists - which ones are in that "Fooled me once" category? For me, a recent experience with NH pyrroles has me reluctant to go there again. And I'm not interested in things with naphthalenes hanging off of them, naproxen notwithstanding. I'd also rather not see Mannich products, since I've personally seen a number of those misbehave.
So what's on your list? I think that everyone can agree on things like rhodanines, although even those have their partisans. But what semi-decent looking compounds will you go ahead and blackball, based on your own nasty experiences with them?
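If you wanted to write such a list down, by the way, it's easy enough to encode as substructure searches. Here's a minimal sketch using RDKit - the SMARTS patterns (rhodanine, NH pyrrole, naphthalene) are just illustrations pulled from this post, not any kind of vetted alert set:

```python
from rdkit import Chem

# Illustrative "fooled me once" alerts - examples from this post,
# not a validated filter collection.
alerts = {
    "rhodanine": Chem.MolFromSmarts("O=C1CSC(=S)N1"),
    "NH pyrrole": Chem.MolFromSmarts("[nH]1cccc1"),
    "naphthalene": Chem.MolFromSmarts("c1ccc2ccccc2c1"),
}

def blackball(smiles):
    """Return the alert names that a structure trips, if any."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return ["unparseable SMILES"]
    return [name for name, patt in alerts.items() if mol.HasSubstructMatch(patt)]

print(blackball("CC(=O)Oc1ccccc1C(=O)O"))         # aspirin: no flags
print(blackball("COc1ccc2cc(C(C)C(=O)O)ccc2c1"))  # naproxen: ['naphthalene']
```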
Category: Life in the Drug Labs
July 15, 2014
K. C. Nicolaou has an article in the latest Angewandte Chemie on the future of drug discovery, which may seem a bit surprising, considering that he's usually thought of as Mister Total Synthesis, rather than Mister Drug Development Project. But I can report that it's relentlessly sensible. Maybe too sensible. It's such a dose of the common wisdom that I don't think it's going to be of much use or interest to people who are actually doing drug discovery - you've already had all these thoughts yourself, and more than once.
But for someone catching up from outside the field, it's not a bad survey at all. It gets across how much we don't know, and how much work there is to be done. And one thing that writing this blog has taught me is that most people outside of drug discovery don't have an appreciation of either of those things. Nicolaou's article isn't aimed at a lay audience, of course, which makes it a little more problematic, since many of the people who can appreciate everything he's saying will already know what he's going to say. But it does round pretty much everything up into one place.
Category: Drug Development | Drug Industry History
Over the years, there have been more comments than anyone can count here on the often-grim employment picture for chemists and biologists in biopharma. Plenty of people here (myself included) can speak from experience. But we should also remember that the academic job market in the biomedical sciences is in awful shape, too, unfortunately. And since a disproportionate number of people start off grad school picturing themselves getting jobs in academia, a clear picture of what's going on is essential.
That's the point of this piece in Nature, in the Jobs section. The author, Jessica Polka (post-doc at Harvard Medical School), says that far too many of her colleagues don't have an accurate impression of the job market. She's created this graphic to get the point across. Some 16,000 students will start graduate school in biology in the US this fall. The best guess is that fewer than 10% of them will eventually become tenure-track faculty somewhere.
But at least half of them list that as their most preferred career path, which means that a lot of things are not going to work out as planned. Polka's right - the more people who understand this, and the earlier, the better.
Category: Academia (vs. Industry) | Business and Markets
July 14, 2014
Targacept has been working on some very hard therapeutic areas over the years, and coming up dry - dramatically so. They may have just done it again.
They've been testing TC-1734, a partial agonist at nicotinic receptors, in Alzheimer's over the last year or so. That was a long-shot mechanism to start with, although to be sure, every Alzheimer's drug is a long-shot mechanism. This would be a stopgap compound even if it worked, like the existing acetylcholinesterase inhibitor donepezil.
And the company has apparently released the results of the clinical trial on its web site, inadvertently, you'd have to assume. The news first came out from BioRunUp on Twitter, and the text of the announcement was that the compound had failed to show superiority to donepezil. The company has made no official announcement (as I write, anyway), and the press release itself appears to have been taken down a little while ago. But here's a screen shot, if you're interested. The stock (TRGT) has already reacted to the news, as you'd imagine it would, suddenly dropping like a brick starting at just before 2:30 PM EDT. Not a good way to get the news out, that's for sure. . .
Category: Alzheimer's Disease | Clinical Trials
What's the best carrier to take some sort of therapeutic agent into the bloodstream? That's often a tricky question to work out in animal models or in the clinic - there are a lot of possibilities. But what about using red blood cells themselves?
That idea has been in the works for a few years now, but there's a recent paper in PNAS reporting on more progress (here's a press release). Many drug discovery scientists will have encountered the occasional compound that partitions into erythrocytes all by itself (those are usually spotted by their oddly long half-lives after in vivo dosing, mimicking the effect of plasma protein binding). One of the early ways that people tried to do this deliberately was forcing a compound into the cells, but this tends to damage them and make them quite a bit less useful. A potentially more controllable method would be to modify the surfaces of the RBCs themselves to serve as drug carriers, but that's quite a bit more complex, too. Antibodies have been tried for this, but with mixed success.
That's what this latest paper addresses. The authors (the Lodish and Ploegh groups at Whitehead/MIT) introduce modified surface proteins (such as glycophorin A) that are substrates for Ploegh's sortase technology (two recent overview papers), which allows for a wide variety of labeling.
Experiments using modified fetal cells in irradiated mice gave animals that had up to 50% of their RBCs modified in this way. Sortase modification of these was about 85% effective, so plenty of label can be introduced. The labeling process doesn't appear to affect the viability of the cells very much as compared to wild-type - the cells were shown to circulate for weeks, which certainly breaks the records held by the other modified-RBC methods.
The team attached biotin tags and specific antibodies to both mouse and human RBCs, which would appear to clear the way for a variety of very interesting experiments. (They also showed that simultaneous C- and N-terminal labeling is feasible, to put on two different tags at once). Here's the "coming attractions" section of the paper:
The approach presented here has many other possible applications; the wide variety of possible payloads, ranging from proteins and peptides to synthetic compounds and fluorescent probes, may serve as a guide. We have conjugated a single-domain antibody to the RBC surface with full retention of binding specificity, thus enabling the modified RBCs to be targeted to a specific cell type. We envision that sortase-engineered cells could be combined with established protocols of small-molecule encapsulation. In this scenario, engineered RBCs loaded with a therapeutic agent in the cytosol and modified on the surface with a cell type-specific recognition module could be used to deliver payloads to a precise tissue or location in the body. We also have demonstrated the attachment of two different functional probes to the surface of RBCs, exploiting the subtly different recognition specificities of two distinct sortases. Therefore it should be possible to attach both a therapeutic moiety and a targeting module to the RBC surface and thus direct the engineered RBCs to tumors or other diseased cells. Conjugation of an imaging probe (i.e., a radioisotope), together with such a targeting moiety also could be used for diagnostic purposes.
This will be worth keeping an eye on, for sure, both as a new delivery method for small (and not-so-small) molecules and biologics, and for its application to all the immunological work going on now in oncology. This should keep everyone involved busy for some time to come!
Category: Biological News | Chemical Biology | Pharmacokinetics
Here's an article from David Shaywitz at Forbes whose title says it all: "Should a Drug Discovery Team Ever Throw in the Towel?" The easy answer to that is "Sure". The hard part, naturally, is figuring out when.
You don’t have to be an expensive management consultant to realize that it would be helpful for the industry to kill doomed projects sooner (though all have said it).
There’s just the prickly little problem of figuring out how to do this. While it’s easy to point to expensive failures and criticize organizations for not pulling the plug sooner, it’s also true that just about every successful drug faced some legitimate existential crisis along the way — at some point during its development, there was a plausible reason to kill the program, and someone had to fight like hell to keep it going.
The question at the heart of the industry’s productivity struggles is the extent to which it’s even possible to pick the winners (or the losers), and figuring out better ways of managing this risk.
He goes on to contrast two approaches to this: one where you have a small company, focused on one thing, with the idea being that the experienced people involved will (A) be very motivated to find ways to get things to work, and (B) motivated to do something else if the writing ever does show up on the wall. The people doing the work should make the call. The other approach is to divide that up: you set things up with a project team whose mandate is to keep going, one way or another, dealing with all obstacles as best they can. Above them is a management team whose job it is to stay a bit distant from the trenches, and be ready to make the call of whether the project is still viable or not.
As Shaywitz goes on to say, quite correctly, both of these approaches can work, and both of them can run off the rails. In my view, the context of each drug discovery effort is so variable that it's probably impossible to say if one of these is truly better than the other. The people involved are a big part of that variability, too, and that makes generalizing very risky.
The big risk (in my experience) with having execution and decision-making in the same hands is that projects will run on for too long. You can always come up with more analogs to try, more experiments to run, more last-ditch efforts to take a crack at. Coming up with those things is, I think, better than not coming up with them, because (as Shaywitz mentions) it's hard to think of a successful drug that hasn't come close to dying at least once during its development. Give up too easily, and nothing will ever work at all.
But it's a painful fact that not every project can work, no matter how gritty and determined the team. We're heading out into the unknown with these drug candidates, and we find out things that we didn't know were there to be found out. Sometimes there really is no way to get the selectivity you need with the compound series you've chosen - heck, sometimes there's no way to get it with any compound series you could possibly choose, although that takes a long time to become obvious. Sometimes the whole idea behind the project is flawed from the start: blocking Kinase X will not, in fact, alter the course of Disease Y. It just won't. The hypothesis was wrong. An execute-at-all-costs team will shrug off these fatal problems, or attempt to shrug them off, for as long as you give them money.
But there's another danger waiting when you split off the executive decision-makers. If those folks get too removed from the project (or projects) then their ability to make good decisions is impaired. Just as you can have a warped perspective when you're right on top of the problems, you can have one when you're far away from them, too. It's tempting to think that Distance = Clarity, but that's not a linear function, by any means. A little distance can certainly give you a lot of perspective, but if you keep moving out, things can start fuzzing back up again without anyone realizing what's going on.
That's true even if the managers are getting reasonably accurate reports, and we all know that that's not always the case in the real world. In many large organizations, there's a Big Monthly Meeting of some sort (or at some other regular time point) where projects are supposed to be reviewed by just those decision makers. These meetings are subject to terrible infections of Dog-And-Pony-itis. People get up to the front of the room and they tell everyone how great things are going. They minimize the flaws and paper over the mistakes. It's human nature. Anyone inclined to give a more accurate picture has a chance to see how that's going to look, when all the other projects are going Just Fine and everyone's Meeting Their Goals like it says on the form. Over time (and it may not take much time at all), the meeting floats away into its own bubble of altered reality. Managers who realize this can try to counteract it by going directly to the person running the project team in the labs, closing the office door, and asking for a verbal update on how things are really going, but sometimes people are so out of it that they mistake how things are going at the Big Monthly Meeting for what's really happening.
So yes indeed, you can (as is so often the case) screw things up in both directions. That's what makes it so hard to lay down the law about how to run a drug discovery project: there are several ways to succeed, and the ways to mess them up are beyond counting. My own bias? I prefer the small-company back-to-the-wall approach, of being ready to swerve hard and try anything to make a project work. But I'd only recommend applying that to projects with a big potential payoff - it seems silly to do that sort of thing for anything less. And I'd recommend having a few people watching the process, but from as close as they can get without being quite part of the project team themselves. Just enough to have some objectivity. Simple, eh? Getting this all balanced out is the hard part. Well, actually, the science is the hard part, but this is the hard part that we can actually do something about.
Category: Drug Development | Drug Industry History | Life in the Drug Labs
July 11, 2014
Another dose of reality for the "Terrible STEM Shortage!" folks, courtesy of Slate. Here's what author Jordan Weissmann has to say:
With a little cleaning up, however, the federal data do tell a pretty clear story: The market for new Ph.D.s in the much obsessed-about STEM fields—science, technology, engineering, and math—is stagnant. Over the last 20 years, employment rates are either flat or down in each major discipline, from computer science to chemistry. It’s not what you’d expect given the way companies like Microsoft talk about talent shortages.
Why no, it isn't, is it? There seems to be something a bit off. Weissmann is working with data from the NSF and its surveys of new doctorates in the sciences, and it shows several things. For one, the post-doc populations remain very high in every field, which isn't a good sign. The number of new doctorates who report being employed has not attained the levels seen in the late 1990s, for any field. And here's chemistry in particular:
Not a very pretty picture, to be sure. It's true that the number of postdocs has been declining over the last few years, but the slack seems to be picked up, more or less equally, by people who are getting jobs and people falling into the flat-out unemployed category. And remember, this is a snapshot of new doctorates, so the numbers for more experienced people are going to be different (but ugly in their own way, to judge from the waves of layoffs over the last few years). It's notable that the new chemistry doctorate holders who are unemployed have outnumbered the ones with non-postdoc jobs for the last few years, which may well be unprecedented.
Weissmann's figures for computer science doctorates and engineers are telling, too, and I refer you to the article for them. Neither group has made it back to its heights of 2000 or so, although the 2011-2012 numbers have picked up a bit. Whether that's a blip or a real inflection point remains to be seen. It's safe to say, though, that none of these charts support the "Just can't find anybody to hire" narrative that you hear from so many companies.
Category: Business and Markets
Here's the biggest fake-peer-review operation I've heard of yet. Retraction Watch, which does not seem to be in any danger of running out of material, reports that a researcher in Taiwan decided to not leave the review process at the Journal of Vibration and Control up to chance. He set up scores of fake identities in their online submission database, with as many as 130 fabricated e-mail addresses, and guess who got to review his manuscripts?
The journal has retracted sixty papers going back to 2010, and I'd like to know if that's the record. I haven't heard of anything better - well, worse, you know what I mean. The professor involved has been removed from his position in Taiwan, as well he might, and the editor of the journal has resigned. As well he might, too - that editor is not implicated in the publication scam, as far as I can tell, but what exactly were his editorial duties? Dozens of papers come pouring in every year from some obscure university in Taiwan, all of them with overlapping lead or co-authors, and you don't even so much as look up from your desk? Hardly a month goes by without another bulletin from the wildly productive engineers at Pingtung U, sometimes four or five of the damn things at once, and you think you're doing your job? And nobody else who reads this journal - assuming anyone ever does - wonders what's going on, either?
If the professor involved was really getting something out of this (tenure, promotion, grant money, what have you), then the people who awarded those to him were idiots, too. In fact, that's how I'd sum up the whole affair: a fool, faking papers for a bunch of incompetents, and rewarded for it by idiots. What a crew. You really cannot underestimate the low end of the scientific publishing industry, nor its customers.
Category: The Scientific Literature
July 10, 2014
We've had some big biopharma market events so far this year, but if you're wondering what's coming in the next few months, here's a handy rundown from Adam Feuerstein of what may be the top 14. There are a few regulatory events on there, but most of the entries are highly anticipated clinical trial results, which is where the action is, for sure. That's what makes the sector so attractive to legitimate investors and cult-like lunatics alike. These people, many of whom would cross the street to avoid each other in the real world, come together to make a market - and anyone with enough nerve and a little cash can join right in.
Category: Business and Markets
I've written several times about the NIH's NCATS program, their foray into "translational medicine". Now comes this press release that the first compound from this effort has been picked up for development by a biopharma company.
The company is AesRx (recently acquired by Baxter), and the compound is AES-103. This came from the rare-disease part of the initiative, and the compound is targeting sickle cell anemia - from what I've seen, it appears to have come out of a phenotypic screening effort to identify anti-sickling agents. It appears to work by stabilizing the mutant hemoglobin into a form where it can't polymerize, which is the molecular-level problem underlying the sickle-cell phenotype. For those who don't know the history behind it, Linus Pauling and co-workers were among the first to establish that a mutation in the hemoglobin protein was the key factor. Pauling coined the term "molecular disease" to describe it, and should be considered one of the founding fathers of molecular biology for that accomplishment, among others.
So what's AES-103? Well, you'll probably be surprised: it's 5-hydroxymethylfurfural, which I would not have put high on my list of things to screen. That page says that the NIH screened "over 700 compounds" for this effort, which I hope is a typo, because that's an insanely small number. I would have thought that detecting the inhibition of sickling would be something that could be automated. If you were only screening 700 compounds, would this be one of them?
For those outside the business, I base that opinion on several things. Furans in general do not have a happy history in drug development. They're too electron-rich to play well in vivo, for the most part. This one does have an electron-withdrawing aldehyde on it, but aldehydes have their own problems. They're fairly reactive, and they tend to have poor pharmacokinetics. Aldehydes are, for example, well-known as protease inhibitors in vitro, but most attempts to develop them as drugs have ended in failure. And the only thing that's left on the molecule, that hydroxymethyl, is problematic, too. Having a group like that next to an aromatic ring has also traditionally been an invitation to trouble - they tend to get oxidized pretty quickly. So overall, no, I wouldn't have bet on this compound. There must be a story about why it was tested, and I'd certainly like to know what it is.
But for all I know, those very properties are what are making it work. It may well be reacting with some residue on hemoglobin and stabilizing its structure in that way. The compound went into Phase I in 2011, and into Phase II last year, so it does have real clinical data backing it up at this point, and real clinical data can shut me right up. The main worry I'd have at this point is idiosyncratic tox in Phase III, which is always a worry, and more so, I'd think, with a compound that looks like this. We'll see how it goes.
Category: Clinical Trials | Drug Development
July 9, 2014
The proposed Scripps/USC deal is off, according to reporters Gary Robbins and Bradley Fikes at the San Diego Union-Tribune. No details on what comes next, though - but something presumably does come next.
Category: General Scientific News
Yesterday's post on yet another possible Alzheimer's blood test illustrates, yet again, that understanding statistics is not a strength of most headline writers (or most headline readers). I'm no statistician myself, but I have a healthy mistrust of numbers, since I deal with the little rotters all day long in one form or another. Working in science will do that to you: every result, ideally, is greeted with the hearty welcoming phrase of "Hmm. I wonder if that's real?"
A constant source for the medical headline folks is the constant flow of observational studies. Eating broccoli is associated with this. Chocolate is associated with that. Standing on your head is associated with something else. When you see these sorts of stories in the news, you can bet, quite safely, that you're not looking at the result of a controlled trial - one cohort eating broccoli while hanging upside down from their ankles, another group eating it while being whipped around on a carousel, while the control group gets broccoli-shaped rice puffs or eats the real stuff while being duct-taped to the wall. No, it's hard to get funding for that sort of thing, and it's not so easy to round up subjects who will stay the course, either. Those news stories are generated from people who've combed through large piles of data, from other studies, looking for correlations.
And those correlations are, as far as anyone can tell, usually spurious. Have a look at the 2011 paper by Young and Karr to that effect (here's a PDF). If you go back and look at the instances where observational effects in nutritional studies have been tested by randomized, controlled trials, the track record is not good. In fact, it's so horrendous that the authors state baldly that "There is now enough evidence to say what many have long thought: that any claim coming from an observational study is most likely to be wrong."
They draw the analogy between scientific publications and manufacturing lines, in terms of quality control. If you just inspect the final product rolling off the line for defects, you're doing it the expensive way. You're far better off breaking the whole flow into processes and considering each of those in turn, isolating problems early and fixing them, so you don't make so many defective products in the first place. In the same way, Young and Karr have this to say about the observational study papers:
Consider the production of an observational study: Workers – that is, researchers – do data collection, data cleaning, statistical analysis, interpretation, writing a report/paper. It is a craft with essentially no managerial control at each step of the process. In contrast, management dictates control at multiple steps in the manufacture of computer chips, to name only one process control example. But journal editors and referees inspect only the final product of the observational study production process and they release a lot of bad product. The consumer is left to sort it all out. No amount of educating the consumer will fix the process. No amount of teaching – or of blaming – the worker will materially change the group behaviour.
They propose a process control for any observational study that looks like this:
Step 0: Data are made publicly available. Anyone can go in and check it if they like.
Step 1: The people doing the data collection should be totally separate from the ones doing the analysis.
Step 2: All the data should be split, right at the start, into a modeling group and a group used for testing the hypothesis that the modeling suggests (a short code sketch of this step follows the list).
Step 3: A plan is drawn up for the statistical treatment of the data, but using only the modeling data set, and without the response that's being predicted.
Step 4: This plan is written down, agreed on, and not modified as the data start to come in. That way lies madness.
Step 5: The analysis is done according to the protocol, and a paper is written up if there's one to be written. Note that we still haven't seen the other data set.
Step 6: The journal reviews the paper as is, based on the modeling data set, and they agree to do this without knowing what will happen when the second data set gets looked at.
Step 7: The second data set gets analyzed according to the same protocol, and the results of this are attached to the paper in its published form.
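Step 2 is the familiar train/test split, done once and set in stone. Here's a minimal sketch of how that might look in practice - the record layout and field names are invented for illustration. Hashing a stable subject ID makes the assignment deterministic, so nobody can quietly re-draw the split after the data start coming in:

```python
import hashlib

def split_records(records, key_field, holdout_fraction=0.5):
    """Split records into a modeling set and a holdout (testing) set.
    The assignment depends only on each record's stable ID, so it
    cannot drift or be re-drawn as the analysis proceeds."""
    modeling, holdout = [], []
    for rec in records:
        digest = hashlib.sha256(str(rec[key_field]).encode()).hexdigest()
        # Map the hash onto [0, 1) and compare with the cut point
        if int(digest, 16) / 16**64 < holdout_fraction:
            holdout.append(rec)
        else:
            modeling.append(rec)
    return modeling, holdout

# Hypothetical usage: each subject carries a stable "id" field
subjects = [{"id": i, "broccoli_servings": i % 5} for i in range(1000)]
modeling_set, testing_set = split_records(subjects, "id")
```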
Now that's a hard-core way of doing it, to be sure, but wouldn't we all be better off if something like this were the norm? How many people would have the nerve, do you think, to put their hypothesis up on the chopping block in public like this? But shouldn't we all?
Category: Clinical Trials | Press Coverage
Here's a look at Emerald Biotherapeutics (a name that's unfortunately easy to confuse with several other former Emeralds in this space). They're engaged in their own drug research, but they also have lab services for sale, using a proprietary system that they say generates fast, reproducible assays.
On July 1 the company unveiled a service that lets other labs send it instructions for their experiments via the Web. Robots then complete the work. The idea is a variation on the cloud-computing model, in which companies rent computers by the hour from Amazon.com, Google, and Microsoft instead of buying and managing their own equipment. In this case, biotech startups could offload some of their basic tasks—counting cells one at a time or isolating proteins—freeing their researchers to work on more complex jobs and analyze results. To control the myriad lab machines, Emerald has developed its own computer language and management software. The company is charging clients $1 to $100 per experiment and has vowed to return results within a day.
The Bloomberg Businessweek piece profiling them does a reasonable job, but I can't tell if its author knows that there's already a good amount of outsourcing of this type. Emerald's system does indeed sound fast, though. But rarely does the quickness of an assay turn out to be the real bottleneck in any drug discovery effort, so I'm not sure how much of a selling point that is. The harder parts are the ones that can't be automated: figuring out what sort of assay to run, and troubleshooting it so that it can be reliably run on high-throughput machines are not trivial processes, and they can take a lot of time and effort. Even more difficult is the step before any of that: figuring out what you're going to be assaying at all. What's your target? What are you screening for? What's the great idea behind the whole project? That stuff is never going to be automated at all, and it's the key to the whole game.
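For those who haven't seen the cloud-lab model, the workflow the article describes amounts to something like this sketch. To be clear, the endpoint, field names, and responses below are entirely made up for illustration - this is not Emerald's actual interface:

```python
import requests  # generic HTTP client; the service below is hypothetical

# The cloud-lab idea: describe the experiment as structured data,
# submit it over the web, and let the robots do the bench work.
protocol = {
    "operation": "cell_count",      # invented assay type
    "sample_id": "plate-42-A3",     # invented sample identifier
    "replicates": 3,
}

base = "https://cloudlab.example.com/v1/experiments"  # invented endpoint
job = requests.post(base, json=protocol).json()       # submit the job
result = requests.get(f"{base}/{job['id']}").json()   # poll for the result
print(result["status"])
```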
But when I read things like this, I wonder a bit:
While pursuing the antiviral therapy, Emerald began developing tools to work faster. Each piece of lab equipment, made by companies including Agilent Technologies (A) and Thermo Fisher Scientific (TMO), had its own often-rudimentary software. Emerald’s solution was to write management software that centralized control of all the machines, with consistent ways to specify what type of experiment to run, what order to mix the chemicals in, how long to heat something, and so on. “There are about 100 knobs you can turn with the software,” says Frezza. Crucially, Emerald can store all the information the machines collect in a single database, where scientists can analyze it. This was a major advance over the still common practice of pasting printed reports into lab notebooks.
Well, that may be common in some places, but in my own experience, that paste-the-printed-report stuff went out a long time ago. Talking up the ability to have all the assay data collected in one place sounds like something from about fifteen or twenty years ago, although the situation can be different for the small startups who would be using Emerald (or their competitors) for outsourced assay work. But I would still expect any CRO shop to provide something better than a bunch of paper printouts!
Emerald may well have something worth selling, and I wish them success with it. Reproducible assays with fast turnaround are always welcome. But this article's "Gosh everything's gone virtual now wow" take on it isn't quite in line with reality.
Category: Drug Assays
July 8, 2014
There are all sorts of headlines today about how there's going to be a simple blood test for Alzheimer's soon. Don't believe them.
This all comes from a recent publication in the journal Alzheimer's and Dementia, from a team at King's College (London) and the company Proteome Sciences. It's a perfectly good paper, and it does what you'd think: they quantified a set of proteins in a cohort of potential Alzheimer's patients and checked to see if any of them were associated with progression of the disease. From 26 initial protein candidates (all of them previously implicated in Alzheimer's), they found that a panel of ten seemed to give a prediction that was about 87% accurate.
That figure was enough for a lot of major news outlets, who have run with headlines like "Blood test breakthrough" and "Blood test can predict Alzheimer's". Better ones said something more like "Closer to blood test" or "Progress towards blood test", but that's not so exciting and clickable, is it? This paper may well represent progress towards a blood test, but as its own authors, to their credit, are at pains to say, a lot more work needs to be done. 87%, for starters, is interesting, but not as good as it needs to be - that's still a lot of false negatives, and who knows how many false positives.
That all depends on what the rate of Alzheimer's is in the population you're screening. As Andy Extance pointed out on Twitter, these sorts of calculations are misunderstood by almost everyone, even by people who should know better. A 90 per cent accurate test on a general population whose Alzheimer's incidence rate is 1% would, in fact, be wrong about 92% of the time when it flagged someone as positive. Here's a more detailed writeup I did in 2007, spurred by reports of a similar Alzheimer's diagnostic back then. And if you have a vague feeling that you heard about all these issues (and another blood test) just a few months ago, you're right.
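The arithmetic is worth working through once. A minimal sketch, taking "90 per cent accurate" to mean 90% sensitivity and 90% specificity:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Fraction of positive test results that are true positives (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Screen 10,000 people at 1% prevalence: 100 have the disease, and 90 of
# them test positive. But of the 9,900 without it, 990 also test positive.
# So only 90 of the 1,080 positives are real.
print(positive_predictive_value(0.90, 0.90, 0.01))   # ~0.083
```

In other words, about 92% of the people the test flags would be false positives, even though the test itself is "90% accurate".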
Even after that statistical problem, things are not as simple as the headlines would have you believe. This new work is a multivariate model, because a number of factors were found to affect the levels of these proteins. The age and gender of the patient were two real covariates, as you'd expect, but the duration of plasma storage before testing also had an effect, as did, apparently, the center where the collection was done. That does not sound like a test that's ready to be rolled out to every doctor's office (which is again what the authors have been saying themselves). There were also different groups of proteins that could be used for a prediction model using the set of Mild Cognitive Impairment (MCI) patients, versus the ones that already appeared to show real Alzheimer's signs, which also tells you that this is not a simple turn-the-dial-on-the-disease setup. Interestingly, they also looked at whether adding brain imaging data (such as hippocampus volume) helped the prediction model. This, though, either had no real effect on the prediction accuracy, or even reduced it somewhat.
So the thing to do here is to run this on larger patient cohorts to get a more real-world idea of what the false negative and false positive rates are, which is the sort of obvious suggestion that is appearing in about the sixth or seventh paragraph of the popular press writeups. This is just what the authors are planning, naturally - they're not the ones who wrote the newspaper stories, after all. This same collaboration has been working on this problem for years now, I should add, and they've had ample opportunity to see their hopes not quite pan out. Here, for example, is a prediction of an Alzheimer's blood test entering the clinic in "12 to 18 months", from . . .well, 2009.
Update: here's a critique of the statistical approaches used in this paper - are there more problems with it than were first apparent?
Category: Alzheimer's Disease | Analytical Chemistry | Biological News
Pfizer's bid for AstraZeneca made headlines for weeks on both sides of the Atlantic. But there's another US drug company trying to buy a British one right now - AbbVie for Shire - and it's going on very quietly indeed.
Over at FierceBiotech, they're wondering why that should be so, after an article in the Telegraph. There are several reasons, some better than others. For one thing, the whole deal is a smaller affair than the Pfizer saga. Most importantly, it would involve fewer UK jobs, because Shire itself doesn't really have all that many employees in the UK (91% of them are elsewhere!) Some years ago, they reworked themselves into an Irish-domiciled company, anyway, for (you guessed it) tax purposes. But there's not much noise in Ireland about this deal, either.
Fewer politicians have an interest in what's going on. If names change on pieces of paper, and it hardly affects anyone in their constituencies, then they have other things to worry about. The financial reasons behind the deal are similar to Pfizer's - relatively generous corporate tax policies - but the principles behind those, and behind deals predicated on them, were never the primary political concern. You might have gotten a different impression from some of the speechmaking that went on during the Pfizer/AZ business, but that's what comes from listening to politicians, rather than watching their actions with the sound off. I recommend that technique; it improves the signal/noise immensely.
Category: Business and Markets
July 7, 2014
Catalysts are absolutely vital to almost every field of chemistry. And catalysis, way too often, is voodoo or a close approximation thereof. A lot of progress has been made over the years, and in some systems we have a fairly good idea of what the important factors are. But even in the comparatively well-worked-out areas one finds surprises and hard-to-explain patterns of reactivity, and when it comes to optimizing turnover, stability, side reactions, and substrate scope, there's really no substitute for good old empirical experimentation most of the time.
The heterogeneous catalysts are especially sorcerous, because the reactions are usually taking place on a poorly characterized particle surface. Nanoscale effects (and even downright quantum mechanical effects) can be important, but these things are not at all easy to get a handle on. Think of the differences between a lump of, say, iron and small particles of the same. The surface area involved (and the surface/volume ratio) is extremely different, just for starters. And when you get down to very small particles (or bits of a rough surface), you find very different behaviors because these things are no longer a bulk material. Each atom becomes important, and can perhaps behave differently.
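How different? Set the nanoscale physics aside entirely - the classical geometry alone makes the point. A quick sketch, treating the particles as ideal spheres (7,870 kg/m³ is the density of iron):

```python
import math

def surface_area_per_kg(radius_m, density_kg_m3=7870.0):
    """Surface area per kilogram for ideal spheres of a given radius.
    Area/mass works out to 3/(density * radius), so it grows as 1/r."""
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m**3
    area = 4.0 * math.pi * radius_m**2
    return area / mass  # m^2 per kg

print(surface_area_per_kg(0.01))   # 1 cm lump:     ~0.04 m^2/kg
print(surface_area_per_kg(5e-9))   # 10 nm spheres: ~76,000 m^2/kg
```

That's about two million times the exposed surface for the same kilogram of metal, before any of the edge, defect, or quantum effects even come into play.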
Now imagine dealing with a heterogeneous catalyst that's not a single pure substance, but is perhaps an alloy of two or more metals, or is some metal complex that itself is adsorbed onto the surface of another finely divided solid, or needs small amounts of some other additive to perform well, etc. It's no mystery why so much time and effort goes into finding good catalysts, because there's plenty of mystery built into them already.
Here's a new short review article in Angewandte Chemie on some of the current attempts to lift some of the veils. A paper earlier this year in Science illustrated a new way of characterizing surfaces with X-ray diffraction, and at short time scales (seconds) for such a technique. Another recent report in Nature Communications describes a new X-ray tomography system to try to characterize catalyst particles.
None of these are easy techniques, and at the moment they require substantial computing power, very close attention to sample preparation, and (in many cases) the brightest X-ray synchrotron sources you can round up. But they're providing information that no one has ever had before about (in these examples) palladium surfaces and nanoparticle characteristics, with more on the way.
Category: Analytical Chemistry | Chemical News
This article from David Cyranoski at Nature News is an excellent behind-the-scenes look at all the problems with the "STAP" stem-cell work, now retracted and apparently without any foundation at all. There were indeed problems with all of it from the start, and one of the key questions is whether these things could have been caught:
The committee was more vexed by instances of manipulated and duplicated images in the STAP papers. Obokata had spliced together gel lanes from different experiments to appear as one. And she had used an image of cells in a teratoma — a tumorous growth that includes multiple types of tissue — that had also appeared in her PhD dissertation. The captions indicated that the image was being used to represent different types of cell in each case. The committee judged that in both instances, although she might not have intended to mislead, she should have been “aware of the danger” and therefore found her guilty of misconduct. Obokata claimed that they were mistakes and has denied wrongdoing. . .
. . .Philip Campbell, editor-in-chief of Nature, says: “We have concluded that we and the referees could not have detected the problems that fatally undermined the papers.” But scientists and publishers say that catching even the less egregious mistakes raises alarm bells that, on further investigation, can lead to more serious problems being discovered.
Many say that the tests should be carried out on all papers. Christopher says that it takes about one-third of her working week to check all accepted manuscripts for the four journals published by EMBO Press. At Nature and the Nature research journals, papers are subjected to random spot-checking of images during the production process. Alice Henchley, a spokeswoman for Nature, says that the journal does not check the images in all papers because of limitations in resources, and that the STAP papers were not checked. But she adds that as one outcome of this episode, editors “have decided to increase the number of checks that we undertake on Nature’s papers. The exact number or proportion of papers that will be checked is still being decided.”
A complication is that some of the common image manipulations (splicing gel lanes, for example) are done in honest attempts to present the data more clearly, or just to save space in a figure. My guess is that admitting this up front, along with submitting copies of the original figures to the editors (and for inclusion in the Supplementary Material?) would help to clear that up. The article raises another good point - that editors are actually worried about confronting every example of image manipulation that they see, for fear of raising the competence of the average image manipulator. There's an evolutionary-arms-race aspect to all this that can't be ignored.
In the end, one gets the impression that Nature's editorial staff (a separate organization from the News people) very much regret ever having accepted the work, as well they might. Opinion seems divided about whether they could have caught the problems with the papers themselves - this was one of those cases where a number of reputable co-authors, at reputable institutions, all screwed up simultaneously when it came to cross-checking and verification. What remains is a portrait of how eager people can be to send in groundbreaking results for publication, and how eager editors can be to publish them. Neither of those is going to change any time soon, is it?
Update: from the comments, see also this timeline of events for a look at the whole story.
Category: The Dark Side | The Scientific Literature
July 4, 2014
This, at least, I have observed in forty-five years: that there are men who search for it [truth], whatever it is, wherever it may lie, patiently, honestly, with due humility, and that there are other men who battle endlessly to put it down, even though they don't know what it is. To the first class belong the scientists, the experimenters, the men of curiosity. To the second belong politicians, bishops, professors, mullahs, tin pot messiahs, frauds and exploiters of all sorts - in brief, the men of authority. . .All I find there is a vast enmity to the free functioning of the spirit of man. There may be, for all I know, some truth there, but it is truth made into whips, rolled into bitter pills. . .
I find myself out of sympathy with such men. I shall keep on challenging them until the last galoot's ashore.
- H. L. Mencken, "Off the Grand Banks", 1925
In those days the New York dockers were renowned for their truculence, inefficiency and sheer slowness. Four hours was supposed to be the standard and we got the standard. Nevertheless, the difference from the British equivalent did not strike me as very marked, and by the time we sailed out into the dusk. . .among the wondrous multi-colored lights of the New Jersey Turnpike, at that time utterly unparalleled at home - by then I knew. . .that this was my second country and always would be.
. . .I only ever spent a few nights in (New York City), but made a lot of day and evening trips and saw quite enough of the place to convince me that anyone who makes a business of hating it or being superior to it, and there were plenty then, home-grown and foreign, is a creep, and that anyone who walks up Fifth Avenue (say) on a sunny morning without feeling his spirits lift is an ***hole.
- Kingsley Amis, "Memoirs", 1991
There must be no barriers to freedom of inquiry. There is no place for dogma in science. The scientist is free, and must be free to ask any question, to doubt any assertion, to seek for any evidence, to correct any errors. ... Our political life is also predicated on openness. We know that the only way to avoid error is to detect it and that the only way to detect it is to be free to inquire. And we know that as long as men are free to ask what they must, free to say what they think, free to think what they will, freedom can never be lost, and science can never regress.
- J. Robert Oppenheimer, 1949
Category: Blog Housekeeping
July 3, 2014
Since the Fourth of July looks to be a rainy washout around here, I'm taking a day off to get a head start on the holiday weekend. So instead of advancing the cause of science in the lab today, I'm home with a big pork shoulder, which I rubbed down with salt and dry spices last night. It's now cooking over a low charcoal fire with plenty of green hickory wood (from a small shagbark hickory tree I spotted over in the forest near the back yard). That will cook the rest of the day, and be ready for dinner. Here's a prep for barbecued pork ribs from a couple of years ago.
Dessert will be the lime sorbet I mentioned here last year. If you're at all into lemon or lime as a flavor, give that one a try - it's easy, and the results from fresh lime juice are spectacular. Two large-scale, tested preparations, then, for today - I just have to keep an eye on things to make sure that everything is going according to plan.
Category: Blog Housekeeping
July 2, 2014
Yesterday's link to the comprehensive list of chemical-free products led to some smiles, but also to some accusations of preaching to the choir, both on my part and on the part of the paper's authors. A manuscript mentioned in the blog section of Nature Chemistry is going to be noticed mostly by chemists, naturally, so I think that everyone responsible knows that this is mainly for comic relief, rather than any sort of serious attempt to educate the general public. Given the constant barrage of "chemical-free" claims, and what that does to the mood of most chemists who see them, some comedy is welcome once in a while.
But the larger point stands. The commenters here who said, several times, that chemists and the public mean completely different things by the word "chemical" have a point. But let's take a closer look at this for a minute. What this implies (and implies accurately, I'd say) is that for many nonscientists, "chemical" means "something bad or poisonous". And that puts chemists in the position of sounding like they're arguing from the "No True Scotsman" fallacy. We're trying to say that everything is a chemical, and that they range from vital to harmless to poisonous (at some dose) and everything in between. But this can sound like special pleading to someone who's not a scientist, as if we're claiming all the good stuff for our side and disavowing the nasty ones as "Not the kind of chemical we were talking about". (Of course, the lay definition of chemical does this, with the sign flipped: the nasty things are "chemicals", and the non-nasty ones are. . .well, something else. Food, natural stuff, something, but not a chemical, because chemicals are nasty).
So I think it's true that approaches that start off by arguing the definition of "chemical" are doomed. It reminds me of something you see in online political arguments once in a while - someone will say something about anti-Semitism in an Arab country, and likely as not, some other genius will step in with the utterly useless point that it's definitionally impossible, you see, for an Arab to be an anti-Semite, because technically the Arabs are also a Semitic people! Ah-hah! What that's supposed to accomplish has always been a mystery to me, but I fear that attempts to redefine that word "chemical" are in the same category, no matter how teeth-grinding I find that situation to be.
The only thing I've done in this line, when discussing this sort of thing one-on-one, is to go ahead and mention that to a chemist, everything that's made out of atoms is pretty much a "chemical", and that we don't use the word to distinguish between the ones that we like and the ones that we don't. I've used that to bring up the circular nature of some of the arguments on the opposite side: someone's against a chemical ingredient because it's toxic, and they know it's toxic because it's a chemical ingredient. If it were "natural", things would be different.
That's the point to drop in the classic line about cyanide and botulinum toxin being all-natural, too. You don't do that just to score some sort of debating point, though, satisfying though that may be - I try not to introduce that one with a flourish of the sword point. No, I think you want to come in with a slightly regretful "Well, here's the problem. . ." approach. The idea, I'd say, is to introduce the concept of there being a continuum of toxicity out there, one that doesn't distinguish between man-made compounds and natural ones.
The next step after that is the fundamental toxicological idea that the dose makes the poison, but I think it's only effective to bring that up after this earlier point has been made. Otherwise, it sounds like special pleading again: "Oh, well, yeah, that's a deadly poison, but a little bit of it probably won't hurt you. Much." My favorite example in this line is selenium. It's simultaneously a vital trace nutrient and a poison, all depending on the dose, and I think a lot of people might improve their thinking on these topics if they tried to integrate that possibility into their views of the world.
Because it's clear that a lot of people don't have room for it right now. The common view is that the world is divided into two categories of stuff: the natural, made by living things, and the unnatural, made by humans (mostly chemists, dang them). You even see this scheme applied to inorganic chemistry: you can find people out there selling makeup and nutritional supplements who charge a premium for things like calcium carbonate when it's a "natural mineral", as opposed (apparently) to that nasty sludge that comes out of the vats down at the chemical plant. (This is also one of the reasons why arguing about the chemist's definition of "organic" is even more of a losing position than arguing about the word "chemical").
There's a religious (or at least quasi-religious) aspect to all this, which makes the arguments emotional and hard to win by appeals to reason. That worldview I describe is a dualist, Manichean one: there are forces of good, and there are forces of evil, and you have to choose sides, don't you? It's sort of assumed that the "natural" world is all of a piece: living creatures are always better off with natural things. They're better; they're what living creatures are meant to consume and be surrounded by. Anything else is ersatz, a defective substitute for the real thing, and quite possibly an outright work of evil by those forces on the other side.
Note that we're heading into some very deep things in many human cultures here, which is another reason that this is never an easy or simple argument to have. That split between natural and unnatural means that there was a time, before all this industrial horror, when people lived in the natural state. They never encountered anything artificial, because there was no such thing in the world. Now, a great number of cultures have a "Golden Age" myth, that distant time when everything was so much better - more pure, somehow, before things became corrupted into their present regrettable state. The Garden of Eden is the aspect this takes in the Christian religion, but you find similar things in many other traditions. (Interestingly, this often takes the form of an ancient age when humans spoke directly with the gods, in whatever form they took, which is one of the things that led Julian Jaynes to his fascinating, although probably unprovable hypotheses in The Origin of Consciousness in the Breakdown of the Bicameral Mind).
This Prelapsarian strain of thinking permeates the all-natural chemical-free worldview. There was a time when food and human health were so much better, and industrial civilization has messed it all up. We're surrounded by man-made toxins and horrible substitutes for real food, and we've lost the true path. It's no wonder that there's all this cancer and diabetes and autism and everything: no one ever used to get those things. Note the followup to this line of thought: someone did this to us. The more hard-core believers in this worldview are actually furious at what they see as the casual, deliberate poisoning of the entire population. The forces of evil, indeed.
And there are enough small reinforcing bars of truth to make all of this hold together quite well. There's no doubt that industrial poisons have sickened vast numbers of people in the past: mercury is just the first one that's come to mind. (I'm tempted to point out that mercury and its salts, by the standards of the cosmetics and supplements industries, are most certainly some of those all-natural minerals, but let that pass for now). We've learned more about waste disposal, occupational exposure, and what can go into food, but there have been horrible incidents that live on vividly in the imagination. And civilization itself didn't necessarily go about increasing health and lifespan for quite a while, as the statistics assembled in Gregory Clark's A Farewell to Alms make clear. In fact, for centuries, living in cities was associated with shorter lifespans and higher mortality. We've turned a lot of corners, but it's been comparatively recently.
And on the topic of "comparatively recently", there's one more factor at work that I'd like to bring up. The "chemical free" view of the world has the virtue of simplicity (and indeed, sees simplicity as a virtue itself). Want to stay healthy? Simple. Don't eat things with chemicals in them. Want to know if something is the right thing to eat, drink, wear, etc.? Simple: is it natural or not? This is another thing that makes some people who argue for this view so vehement - it's not hard, it's right in front of you, and why can't you see the right way of living when it's so, so. . .simple? Arguing against that, from a scientific point of view, puts a person at several disadvantages. You necessarily have to come in with all these complications and qualifying statements, trying to show how things are actually different than they look. That sounds like more special pleading, for one thing, and it's especially ineffective against a way of thinking that often holds that the more direct, simple, and obvious something is, the more likely it is to be correct.
That's actually the default way of human thinking, when you get down to it, which is the problem. Science, and the scientific worldview, are unnatural things, and I don't mean that just in the whole-grain no-additives sense of "natural". I mean that they do not come to most people as a normal consequence of their experience and habits of thought. A bit of it does: "Hey, every time I do X, Y seems to happen". But where that line of thinking takes you starts to feel very odd very quickly. You start finding out that the physical world is a lot more complicated than it looks, that "after" does not necessarily mean "because", and that all rules of thumb break down eventually (and usually without warning). You find that math, of all things, seems to be the language that the universe is written in (or at least a very good approximation to it), and that's not exactly an obvious concept, either. You find that many of the most important things in that physical world are invisible to our senses, and not necessarily in a reassuring way, or in a way that even makes much sense at all at first. (Magical explanations of invisible forces at least follow human intuitions). It's no wonder that scientific thinking took such a long, long time to ever catch on in human history. I still sometimes think that it's only tolerated because it brings results.
So there are plenty of reasons why it's hard to effectively argue against the all-natural chemical-free worldview. You're asking your audience to accept a number of things that don't make much sense to them, and what's worse, many of these things look like rhetorical tricks at best and active (even actively evil) attempts to mislead them at worst. And all in the service of something that many of them are predisposed to regard as suspicious even from the start. It's uphill all the way.
Category: General Scientific News | Snake Oil | Toxicology
July 1, 2014
If you've ever wondered about those deals where the large scientific publishers offer bundled discounts to libraries, wonder no more. There's a paper in PNAS whose authors used Freedom of Information Act requests to track down what various university libraries really paid for these deals, and it reveals that everyone paid something different.
Here's a comment in Nature on the study, which they can publish with a straight face, since the Nature Publishing Group wasn't included in the study (although the authors seem to think, in retrospect, that it should have been). These deals are always secret - the publishers make it a requirement not to disclose the terms. And that, as you might easily expect, benefits the publishers, since the library systems don't have a good way of finding out what the market price might be. The PNAS study reveals some odd discrepancies, with some universities getting noticeably better (and worse) deals than others. Wisconsin and Texas bargained hard, it appears, while BYU and Georgia could have done better for themselves.
As the article details, publishers used site licenses to take care of arbitrage opportunities, and the "Big Deal" bundles were used as incentives for the library systems and as tools for the publishers to figure out how much each customer might be willing to pay (using the print-based subscription data as a starting point). As you might have guessed, Elsevier comes out at the top of the pricing list when you just look at the dollar figures. On a cost-per-citation basis, though, they don't look so bad - in fact, they're the most cost-effective of the big publishers by that metric. (Sage and Taylor & Francis both look pretty bad in that table). For reference, the ACS bundle looks pretty decent, and it turns out that nearly 60% of the libraries that deal with the ACS choose the whole package (a high percentage compared to many other publishers). Interestingly, it turns out that some very wealthy schools (Harvard, MIT, Caltech) still don't take the full Elsevier bundle.
And the bundles are, naturally, a mixed bag. It's their whole purpose to be a mixed bag:
It would cost about $3.1 million at 2009 à la carte prices to buy all of the journals in Elsevier’s bundle, the “Freedom Collection.” The average research 1 university paid roughly $1.2 million, or 40% of the summed title-by-title prices, for access to the Freedom Collection. However, this bundle price is by no means equivalent to a 60% discount from journal-by-journal pricing. The Freedom Collection includes about 2,200 journals, many of which are expensive but rarely cited. The least cost-effective 1,100 journals contained in this collection supply fewer than 5% of the citations, but their prices add to more than 25% of the total of à la carte prices. A library that spent $1.2 million on Elsevier journals at listed catalog prices, selecting journals for cost-effectiveness, could obtain access to journals providing 79% of the citations to journals found in the Freedom Collection. Thus, for the average research 1 institution, the citation-scaled discount obtained from the Freedom Collection is about 21%.
Elsevier, though, drops its prices for smaller universities more quickly than many other publishers, and for Master's-level schools it's actually a better deal than many of the nonprofit publishers. We wouldn't know this, though, if these authors hadn't dug up all the info from FOIA requests, and I guess that's the take-home here: scientific publishing is a very opaque, inefficient market. And the publishers like it that way.
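To make the arithmetic in that quoted passage explicit, here's a quick sketch using only the figures given above (nothing else is assumed):

    # Re-deriving the "citation-scaled discount" from the numbers in the quote.
    list_total   = 3.1e6   # summed a la carte price of the Freedom Collection
    bundle_price = 1.2e6   # average research 1 university's bundle price

    print(f"nominal discount: {1 - bundle_price / list_total:.0%}")   # ~61%

    # But $1.2M spent title-by-title on the most cost-effective journals would
    # already buy 79% of the collection's citations, per the study. The bundle
    # only adds the remaining citations, so the citation-scaled discount is:
    citations_at_same_spend = 0.79
    print(f"citation-scaled discount: {1 - citations_at_same_spend:.0%}")  # 21%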
Category: The Scientific Literature
Here's a question for those of you who've used Selectfluor (Air Products trademark), the well-known fluorinating reagent. I've had an email from someone at Sigma-Aldrich, wondering if people have noticed corrosion problems with either glass or stainless steel when using or storing the reagent. I've hardly used it myself, so I don't have much to offer, but I figured that there was a lot of chemical experience out in the blog's readership, and someone may have something to add.
Category: Chemical News
Here's a comprehensive review of chemical-free consumer products, courtesy of Nature Chemistry. I'm flattered to have been listed as a potential referee for this manuscript, which truly does provide the most complete list possible of chemical-free cleaners, cosmetics, and every other class of commercially available product.
Along similar lines, I can also recommend this site as an accurate, clearly stated summary of the evidence for vaccines causing autism. These are important topics that many people are interested in, and good information is essential.
Category: Snake Oil
June 30, 2014
OK, the GlaxoSmithKline/China business has officially crossed over into new territory. Over the weekend, the company confirmed reports that Mark Reilly, the GSK executive in the country who's been in the middle of this affair from the beginning, was the object of a blackmail attempt by unknown parties. (The story was broken by the Sunday Times, and it's behind a paywall, but it's been picked up by every major news outlet).
Someone shot extensive footage of Reilly alone with his Chinese girlfriend, and mailed the resulting file to higher-ups at the company. The connection between all this and the corruption allegations has not been made clear, but the footage apparently accompanied some of the emails accusing the company of bribery. We may never know quite what's going on here, but I'll bet it's very interesting indeed. More on surveillance in China here.
Update: an excellent overview from the BBC.
Category: Business and Markets | The Dark Side
In keeping with the discussions around here about STEM jobs and education, I wanted to pass along this link from Coding Horror: "Please Don't Learn to Code". It's written by a programmer, as you might guess, and here's his main point:
To those who argue programming is an essential skill we should be teaching our children, right up there with reading, writing, and arithmetic: can you explain to me how Michael Bloomberg would be better at his day to day job of leading the largest city in the USA if he woke up one morning as a crack Java coder? It is obvious to me how being a skilled reader, a skilled writer, and at least high school level math are fundamental to performing the job of a politician. Or at any job, for that matter. But understanding variables and functions, pointers and recursion? I can't see it.
Look, I love programming. I also believe programming is important … in the right context, for some people. But so are a lot of skills. I would no more urge everyone to learn programming than I would urge everyone to learn plumbing. That'd be ridiculous, right?
I see his point. He goes on to say that more code is not necessarily what we need in the world, and that coding is not the proper solution to many problems. On a less philosophic level, the learn-to-code movement also makes it seem as if this is the short path to a job, which is not quite aligned with reality, either.
I suppose I can support learning a tiny bit about programming just so you can recognize what code is, and when code might be an appropriate way to approach a problem you have. But I can also recognize plumbing problems when I see them without any particular training in the area. The general populace (and its political leadership) could probably benefit most of all from a basic understanding of how computers, and the Internet, work. Being able to get around on the Internet is becoming a basic life skill, and we should worry about teaching that first and foremost, before we start jumping all the way into code.
Now let's apply that to learning about chemistry and biology. It's not going to be a very comfortable exercise, because I (and many of the people who read this site) have put a lot of time and effort into learning an awful lot of chemistry and biology. I've written before about the problem of how much science the "average" person should know, and the least controversial answer is "More than they do now". After that, the arguing starts.
It would be nice if everyone knew enough to make some of the ridiculous scams out there harder to work. "Eat whatever you want and still lose 10 pounds a week with this miracle fat-burning supplement!" would be greeted with "Hey, isn't that thermodynamically sort of impossible?". "New Super-Ionized Oxygenated Water Reverses Aging!" would meet with "How do you "super-ionize" water? And how much oxygen can it hold, anyway? And wouldn't that be, like, bleach?" It would be good if people had a slightly better idea of what causes cancer, how diabetes works, a bit better understanding of toxicology, and so on.
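For what it's worth, the thermodynamic objection is just arithmetic, and a quick sketch makes the point (using the standard ~3,500 kcal-per-pound-of-fat rule of thumb, which is a rough textbook figure, not anything from the ads):

    # Sanity check on "lose 10 pounds a week while eating whatever you want".
    KCAL_PER_LB_FAT = 3500                 # rough textbook conversion

    lbs_per_week = 10
    deficit_per_day = lbs_per_week * KCAL_PER_LB_FAT / 7
    print(f"required daily deficit: {deficit_per_day:,.0f} kcal")   # 5,000 kcal

    # That's roughly double a typical person's entire daily energy expenditure
    # (~2,000-2,500 kcal), supplement or no supplement.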
But then we're already supposed to be teaching everyone some of the basics, and it doesn't necessarily seem to be going all that well (evidence, both hopeful and not, can be found here and here). Everyone's supposedly exposed to some simple astronomy, but surveys always show a depressing amount of confusion about which of the earth, moon, and sun is going around which. Everyone's supposed to have been exposed to the idea of cells making up living organisms, to DNA, and so on, but you can still seemingly get away with all kinds of off-kilter claims about such things when talking to a lay audience.
Some readers will remember the "Why Are You Forcing My Son to Take Chemistry" guy from the Washington Post. I wish that I could argue that chemistry, and a good dose of it, is prima facie a requirement for any reasonably competent citizen, but I'm not quite there yet. But I'm also sure that being completely ignorant of chemistry is a good indicator of someone whose worldview is incomplete and could use some shoring up. You need some knowledge in these areas, but we could start with getting across the stuff we're trying to get across already.
What I am sure of, though, is that a certain amount of science and math really is necessary, and not just for the bare facts. My daughter, when she was learning the quadratic equation, asked me the classic question "Why am I learning this? When will I ever use it?" My response to her was that I, too, had rarely had recourse to the quadratic equation as it stood. But at the same time, learning these things was good for the mind. I told her that when I went to the gym, it wasn't because I was planning on having to do more repetitive squats with a weighted bar on my back any time soon. But strengthening my back and legs was a good thing in general, and helped out with a lot of other things in my day-to-day life, in both the short and long terms. The same with the mind. Memorizing the quadratic formula was not a great deal of use in and of itself. But that realization she had, in one of those thrown-ball problems - that the height of the ball was back at the origin at just two points (the beginning and the end of its flight), and that this was why solving for time at that height gave two solutions - that flash of understanding, I told her, was the feeling of mental muscle being built up, capacity that she would need for more than just her math homework. Everyone could do with some of that exercise.
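Her thrown-ball problem is easy to reconstruct, by the way. Here's a minimal version (generic physics-class numbers of my own choosing, not her actual homework):

    import math

    # Height of a thrown ball: h(t) = v0*t - (g/2)*t^2, a downward-opening
    # parabola. Solving h(t) = 0 with the quadratic formula gives exactly two
    # roots: the launch and the landing.
    g, v0 = 9.8, 20.0                      # m/s^2, m/s (illustrative values)
    a, b, c = -g / 2, v0, 0.0
    disc = b * b - 4 * a * c
    t1 = (-b + math.sqrt(disc)) / (2 * a)
    t2 = (-b - math.sqrt(disc)) / (2 * a)
    print(f"ball at launch height at t = {t1:.2f} s and t = {t2:.2f} s")  # 0 and ~4.08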
Category: General Scientific News
June 27, 2014
Some may remember a paper from 2011 on the "reverse click" reaction, in which triazoles were pulled apart with mechanical force. This was an interesting system, because we really know surprisingly little, down at the molecular level, about what happens when bonds are physically stressed in this way. What do molecular orbitals look like when you grab both ends of the molecule and tug hard? Which bonds break first, and why? Do you get the reverse of the forward reaction, or do different mechanisms kick in (free radical intermediates, etc.)? (Note that the principle of microscopic reversibility doesn't necessarily apply when the conditions change like this).
Unfortunately, there seems to be trouble associated with this example. Science has an editorial "expression of concern" on the paper now, and it appears that much of it is not, in fact, reproducible (see this report in C&E News).
The paper was from the Bielawski lab at UT-Austin, and Bielawski is reported as saying that a former group member has confessed to manipulating data. But he also says that the conclusions of the paper are unchanged, which is interesting. My guess is that the "unclick" does happen, then, but nowhere as smoothly as reported. Someone may have sweetened things to make it all look better. At any rate, a correction is coming soon in Science, so we should get more information at that point.
This reminds me of the scheme I use to rate political and economic corruption. Stage I is paying someone off to do something they wouldn't normally do (or aren't authorized to do) for you. This happens everywhere, to some extent. Stage II is when you're bribing them just to do the job they're supposed to be doing in the first place. Many countries suffer from institutional cases of this, and it's supremely annoying, and a terrible drag on the economy. And Stage III, the worst, is when you're paying them not to harm you - a protection racket with the force of law behind it. Cynics may adduce examples from the US, but I'm thinking about countries (Russia, among others) where the problem is far worse.
Similar levels apply to fakery in the scientific literature. Here's how I break it down:
Stage I is what we may have in this case: actual conclusions and effects are made to look cleaner and better than reality. Zapping solvent peaks in the NMR is a perfect small-scale example of this - for organic chemists, solvent peaks are sometimes the training wheels of fakery. The problem is, once you're used to altering data, at what point do you find it convenient to stop? It's far better not to take that first step into matters-of-degree territory.
Stage II is unfortunately common as well, and there's a nice slippery path from Stage I that can land you here. This is when you're convinced that your results are correct, but you're having such a hard time getting things to work that you decide to "fake it until you make it". That's a stupendously bad idea, of course, because a lot of great results were never real in the first place, which leaves you hung out to dry, and even the ones that do eventually come through don't have to do so in the way you faked them. So now a real result is tainted by deception, which will call the whole thing into doubt when the inconsistencies become clear. And faked results are faked results, even if they're produced in what you might think is a good cause. Many big cases of scientific fraud have started off this way, with someone just trying to fill in that one little gap, just for now.
Stage III, the bottom, is when something is faked from beginning to end. There was no question of it even working in the first place - it never did. Someone's just trying to get a paper, or a degree, or tenure, or fame, or something, and they're taking the shortcut. I think that there are two main classes of fakery in this category. In one group (IIIa?), you have people whipping up bogus results in low-profile cases where no one may notice for years, if ever, because no one cares. And you have IIIb, the famous high-profile cases (see Jan-Hendrik Schön, among too many others) where impressive, splashy, look-at-that stuff turns out to have been totally faked as well. Those cases are a study in human psychology. If you report a big result in superconductors, stem cells, cancer therapy or any other field where a lot of smart, competent people are paying very close attention, you will be found out at some point. How can you not be? We're in Bernie Madoff territory here, where someone comes into work every day of every week knowing that their whole reputation is a spray-painted scrim of deception that could have a hole punched through it any minute. How people can possibly live this way I don't really know, but people do. The self-confidence displayed by this sort of personality is a wonder of nature, in its way. IIIa cases are initiated by the desperate, stupid, and/or venal. IIIb cases, though, are brought on by people born to their task.
Update: as pointed out by several good comments, there are plenty of not-quite-fraud sins that neighbor these. Those are worth a separate post, partly because they're even more common than straight-up fraud.
Category: The Dark Side | The Scientific Literature
June 26, 2014
Something I've noticed for many years now is that I tend to get the most chemical ideas - bench chemistry relating to my current work - when I'm in a big conference room far removed from my actual lab. Take me off site, send me to a distant meeting, and I get all sorts of brainstorms about what I should be doing in front of my hood. Does anyone else have this problem (if it is a problem)? I write down all the things I'm thinking of, naturally, so I can actually get around to doing them. But it's funny how the ideas seem to come out of hiding once I'm not actually doing the work.
Category: Life in the Drug Labs
I wrote a couple of years ago about Andrew Lo of MIT, and his idea for securitization of drug discovery. For those of you who aren't financial engineers, that means raising funds by issuing securities (bonds and the like), and that's something that (as far as I know) has never been used to fund any specific drug development project.
Now Pharmalot has an update in an interview with Lo (who's recently published a paper on the idea in Science Translational Medicine). In particular, he's talking about issuing "Alzheimer's bonds", to pick a disease with no real therapies, a huge unmet need, and gigantic cost barriers to finding anything. Lo's concerned that the risks are too high for any one company to take on (and Eli Lilly might agree with him eventually), and wants to have some sort of public/private partnership floating the bonds.
We would create a fund that issues bonds. But if the private sector isn’t incentivized on its own, maybe the public sector can be incentivized to participate along with some members of the private sector. I will explain. But let’s look at the costs for a moment. The direct cost of treating the disease – never mind home care and lost wages – to Medicare and Medicaid for 2014 is estimated at $150 billion. We did a calculation and asked ourselves what kind of rate of return can we expect? We came up with $38.4 billion over 13 years. . .
. . .Originally, I thought it could come from the private sector. We’d create a fund – a mega fund of private investors, such as hedge funds, pension, various institutional investors. The question we asked ourselves is will they get a decent rate of return over a 13-year period? The answer, which is based on a best guess, given the risks of development and 64 projects, and we believed the answer was ‘no.’ It wouldn’t be like cancer or orphan diseases. It’s just not going to work. I come from that world. I talked to funds, philanthropists, medical experts. We did a reality check to see if we were off base. And it sounded like it would be difficult to create a fund to develop real drugs and still give investors a reasonable rate of return – 15% to 20%.
He's now going around to organizations like the Alzheimer's Association to see if there's some interest in giving this a try. I think that it's going to be a hard sell, but I'd actually like to see it happen. The difficulty is that there's no way to do this just a little bit to see if it works: you have to do it on a large scale to have any hope of success at all, and it's a big leap. In fact, the situation reminds one of. . .the situation with any given Alzheimer's drug idea. The clinical course of the disease, as we understand it now, does not give you any options other than a big, long, expensive path through the clinic (which is why it's the perfect example of an area where all the risk is concentrated on the expensive late stages). Lo is in the position of trying to address the go-big-or-go-home problem of Alzheimer's research with a remedy that requires investors to go big or go home.
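To see why that 15-to-20% hurdle is such a tall order, a toy compounding calculation helps (the rates and the 13-year horizon come from the interview; the fund size below is a made-up example of mine, not Lo's figure):

    # Compounding multiples implied by the target returns over 13 years.
    YEARS = 13
    for rate in (0.15, 0.20):
        print(f"{rate:.0%}/yr for {YEARS} years -> {(1 + rate) ** YEARS:.1f}x")

    stake = 5e9   # hypothetical $5B megafund, purely illustrative
    low, high = stake * 1.15 ** YEARS, stake * 1.20 ** YEARS
    print(f"a $5B fund would need to return ${low/1e9:.0f}B-${high/1e9:.0f}B")

That works out to roughly six to eleven times the initial stake, which gives a sense of why the private-sector-only version didn't pencil out.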
The hope is that you could learn enough along the way to change the risk equation in media res. There's an old science fiction story by A. E. van Vogt, "Far Centaurus", which featured (among other things - van Vogt stories generally had several kitchen sinks included) a multidecade suspended-animation expedition to Alpha Centauri. The crew arrive there to find the planets already covered with human-populated cities, settled by the faster-than-light spaceships that were invented in the interim. We don't need FTL to fix Alzheimer's (fortunately), but there could be advances that would speed up the later parts of Lo's fund. But will this particular expedition ever launch?
Category: Alzheimer's Disease | Business and Markets | Clinical Trials | Drug Development
June 25, 2014
A look through some of the medicinal chemistry literature this morning got me to thinking: does anyone have any idea of which drug target has the most different/diverse chemical matter that's been reported against it? I realize that different scaffolds are in the eye of the beholder, so it's going to be impossible to come up with any exact counts. But I think that all the sulfonamides that hit carbonic anhydrase, for example, should for this purpose be lumped together: that interaction with the zinc is crucial, and everything else follows after. Non-sulfonamide CA inhibitors would each form a new class for each new zinc-interacting motif, and any compounds that don't hit the zinc at all (are there any?) would add to the list, too. Then you have allosteric compounds, which are necessarily going to look different than active-site inhibitors.
My guess is that some of the nuclear receptors would turn out to win this competition. They can have large, flexible binding pockets that seem to recognize a variety of chemotypes. So maybe this question should be divided up a bit more:
1. What enzyme is known to have the widest chemical variety of active-site inhibitors?
2. Which GPCR has the widest chemical variety of agonists? Antagonists? (The antagonists are going to win this one, surely).
3. And the open-field question asked above: what drug target of any kind has had the widest variety of molecules reported to act on it, in any fashion?
I don't imagine that we'll come to any definitive answer to any of these, but some people may have interesting nominations.
Update: in response to a query in the comments, maybe we should exempt the drug-metabolizing enzymes from the competition, since their whole reason for living is to take on a wide variety of unknown chemical structures.
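For anyone who wants to attempt a semi-objective count along these lines, one approach is to collapse reported actives to their Bemis-Murcko scaffolds and count the unique frameworks. Here's a minimal sketch with RDKit (the SMILES are arbitrary placeholders I picked for illustration, not a curated set of actives):

    from rdkit import Chem
    from rdkit.Chem.Scaffolds import MurckoScaffold

    # Arbitrary example structures, not real reported actives.
    smiles = [
        "NS(=O)(=O)c1ccc(N)cc1",       # an arylsulfonamide
        "NS(=O)(=O)c1ccc(C(=O)O)cc1",  # same ring framework, new substituent
        "O=C1NC(=O)c2ccccc21",         # an unrelated ring system
    ]

    scaffolds = set()
    for smi in smiles:
        mol = Chem.MolFromSmiles(smi)
        core = MurckoScaffold.GetScaffoldForMol(mol)   # strip side chains
        scaffolds.add(Chem.MolToSmiles(core))          # canonical framework SMILES

    print(f"{len(scaffolds)} distinct scaffolds among {len(smiles)} compounds")

Of course, this lumps by ring framework rather than by binding motif (all those zinc-binding sulfonamides would still need hand-curation), which is exactly the eye-of-the-beholder problem mentioned above.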
Category: Chemical News | Drug Assays