Corante

About this Author
[Photo: College chemistry, 1983]

[Photo: Derek Lowe, the 2002 model]

[Photo: After 10 years of blogging. . .]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly: derekb.lowe@gmail.com. Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigilatta
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

July 28, 2014

Summer Blogging

Posted by Derek

I wanted to let everyone know that blogging will be irregular around here for the next week or so. I'll be taking some time off here and there, and while I'll surely get a few blog posts in, they won't show up at the usual times. (I started the whole time-off process over the weekend, with a trip to Stellafane, the big amateur astronomy gathering up in Vermont. Despite earlier weather forecasts, Saturday night was clear and dark, the best night skies I've seen in years. My wife and kids joined me (their first star party), and as we were getting pulled pork sandwiches from the food tent, my son looked around and said "Wow, there sure are a lot of barbecue-eating, telescope-owning guys with beards around here. You fit right in!")

Comments (0) + TrackBacks (0) | Category: Blog Housekeeping

July 25, 2014

The Antibiotic Gap: It's All of the Above

Posted by Derek

Here's a business-section column at the New York Times on the problem of antibiotic drug discovery. To those of us following the industry, the problems of antibiotic drug discovery are big pieces of furniture that we've lived with all our lives; we hardly even notice if we bump into them again. You'd think that readers of the Times or other such outlets would have come across the topic a few times before, too, but there must always be a group for which it's new, no matter how many books and newspaper articles and magazine covers and TV segments are done on it. It's certainly important enough - there's no doubt that we really are going to be in big trouble if we don't keep up the arms race against the bacteria.

This piece takes the tack of "If drug discovery is actually doing OK, where are the new antibiotics?" Here's a key section:

Antibiotics face a daunting proposition. They are not only becoming more difficult to develop, but they are also not obviously profitable. Unlike, say, cancer drugs, which can be spectacularly expensive and may need to be taken for life, antibiotics do not command top dollar from hospitals. What’s more, they tend to be prescribed for only short periods of time.

Importantly, any new breakthrough antibiotic is likely to be jealously guarded by doctors and health officials for as long as possible, and used only as a drug of last resort to prevent bacteria from developing resistance. By the time it became a mass-market drug, companies fear, it could be already off patent and subject to competition from generics that would drive its price down.

Antibiotics are not the only drugs getting the cold shoulder, however. Research on treatments to combat H.I.V./AIDS is also drying up, according to the research at Yale, mostly because the cost and time required for development are increasing. Research into new cardiovascular therapies has mostly stuck to less risky “me too” drugs.

This mixes several different issues, unfortunately, and if a reader doesn't follow the drug industry (or medical research in general), then they may well not realize this. (And that's the most likely sort of reader for this article - people who do follow such things have heard all of this before). The reason that cardiovascular drug research seems to have waned is that we already have a pretty good arsenal of drugs for the most common cardiovascular conditions. There are a huge number of options for managing high blood pressure, for example, and they're mostly generic drugs by now. The same goes for lowering LDL: it's going to be hard to beat the statins, especially generic Lipitor. But there is a new class coming along targeting PCSK9 that is going to try to do just that. This is a very hot area of drug development (as the author of the Times column could have found without much effort), although the only reason it's so big is that PCSK9 is the only pathway known that could actually be more effective at lowering LDL than the statins. (How well it does that in the long term, and what the accompanying safety profile might be, are the subject of ongoing billion-dollar efforts). The point is, the barriers to entry in cardiovascular are, by now, rather high: a lot of good drugs are known that address a lot of the common problems. If you want to go after a new drug in the space, you need a new mechanism, like PCSK9 (and those are thin on the ground), or you need to find something that works against some of the unmet needs that people have already tried to fix and failed (such as stroke, a notorious swamp of drug development which has swallowed many large expeditions without a trace).

To be honest, HIV is a smaller-scale version of the same thing. The existing suite of therapies is large and diverse, and keeps the disease in check in huge numbers of patients. All sorts of other mechanisms have been tried as well, and found wanting in the development stage. If you want to find a new drug for HIV, you have a very high entry barrier again, because pretty much all of the reasonable ways to attack the problem have already been tried. The focus now is on trying to "flush out" latent HIV from cells, which might actually lead to a cure. But no one knows yet if that's feasible, how well it will work when it's tried, or what the best way to do it might be. There were headlines on this just the other day.

The barriers to entry in the antibiotic field are similarly high, and that's what this article seems to have missed completely. All the known reasonable routes of antibiotic action have been thoroughly worked over by now. As mentioned here the other day, if you just start screening your million-compound libraries against bacteria to see what kills them, you will find a vast pile of stuff that will kill your own cells, too, which is not what you want, and once you've cleared those out, you will find a still-pretty-vast pile of compounds that work through mechanisms that we already have antibiotics targeting. Needles in haystacks have nothing on this.
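
As a rough illustration, here is a sketch of that triage logic in Python - not anyone's actual screening protocol, and the field names and example compounds are invented for the purpose:

    # Minimal sketch of the antibacterial screening triage described above.
    # The field names and example records are hypothetical, for illustration only.

    def triage_hits(hits):
        """Whittle a pile of whole-cell antibacterial hits down to the interesting leftovers."""
        survivors = []
        for h in hits:
            if not h["kills_bacteria"]:
                continue          # inactive in the primary screen
            if h["kills_mammalian_cells"]:
                continue          # the vast pile of generally cytotoxic stuff
            if h["known_mechanism"] is not None:
                continue          # the still-pretty-vast pile hitting mechanisms we already cover
            survivors.append(h["id"])
        return survivors

    example = [
        {"id": "CMPD-0001", "kills_bacteria": True, "kills_mammalian_cells": True,  "known_mechanism": None},
        {"id": "CMPD-0002", "kills_bacteria": True, "kills_mammalian_cells": False, "known_mechanism": "gyrase"},
        {"id": "CMPD-0003", "kills_bacteria": True, "kills_mammalian_cells": False, "known_mechanism": None},
    ]
    print(triage_hits(example))   # -> ['CMPD-0003']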

In fact, a lot of not-so-reasonable routes have been worked over, too. I keep sending people to this article, which is now seven years old and talks about research efforts even older than that. It's the story of GlaxoSmithKline's exhaustive antibiotics research efforts, and it also tells you how many drugs they got out of it all in the end: zip. Not a thing. From what I can see, the folks who worked on this over the last fifteen or twenty years at AstraZeneca could easily write the same sort of article - they've published all kinds of things against a wide variety of bacterial targets, and I don't think any of it has led to an actual drug.

This brings up another thing mentioned in the Times column. Here's the quote:

This is particularly striking at a time when the pharmaceutical industry is unusually optimistic about the future of medical innovation. Dr. Mikael Dolsten, who oversees worldwide research and development at Pfizer, points out that if progress in the 15 years until 2010 or so looked sluggish, it was just because it takes time to figure out how to turn breakthroughs like the map of the human genome into new drugs.

Ah, but bacterial genomes were sequenced before the human one was (and they're simpler, at that). Keep in mind also that proof-of-concept for new targets can be easier to obtain in bacteria (if you manage to find any chemical matter, that is). I well recall talking with a bunch of people in 1997 who were poring over the sequence data for a human pathogen, fresh off the presses, and their optimism about all the targets that they were going to find in there, and the great new approaches they were going to be able to take. They tried it. None of it worked. Over and over, none of it worked. People had a head start in this area, genomically speaking, with an easier development path than many other therapeutic areas, and still nothing worked.

So while many large drug companies have exited antibiotic research over the years, not all of them did. But the ones that stayed have poured effort and money, over and over, down a large drain. Nothing has come out of the work. There are a number of smaller companies in the space as well, for whom even a small success would mean a lot, but they haven't been having an easy time of it, either.

Now, one thing the Times article gets right is that the financial incentives for new antibiotics are a different thing entirely than the rest of the drug discovery world. Getting one of these new approaches in LDL or HIV to work would at least be highly profitable - the PCSK9 competitors certainly are working on that basis. Alzheimer's is another good example of an area that has yielded no useful drugs whatsoever despite ferocious amounts of effort, but people keep at it because the first company to find a real Alzheimer's drug will be very well rewarded indeed. (The Times article says that this hasn't been researched enough, either, which makes me wonder what areas have been). But any great new antibiotic would be shelved for emergencies, and rightly so.

But that by itself is not enough to explain the shortage of those great new antibiotics. It's everything at once: the traditional approaches are played out and the genomic-revolution stuff has been tried, so the unpromising economics makes the search for yet another approach that much harder.

Note: be sure to see the comments for perspectives from others who've also done antibiotic research, including some who disagree. I don't think we'll find anyone who says it's easy, though - but you never know.

Comments (46) + TrackBacks (0) | Category: Business and Markets | Drug Development | Drug Industry History | Infectious Diseases

July 24, 2014

Phenotypic Assays in Cancer Drug Discovery

Posted by Derek

The topic of phenotypic screening has come up around here many times, as indeed it comes up very often in drug discovery. Give your compounds to cells or to animals and look for the effect you want: what could be simpler? Well, a lot of things could, as anyone who's actually done this sort of screening will be glad to tell you, but done right, it's a very powerful technique.

It's also true that a huge amount of industrial effort is going into cancer drug discovery, so you'd think that there would be a natural overlap between these: see if your compounds kill or slow cancer cells, or tumors in an animal, and you're on track, right? But there's a huge disconnect here, and that's the subject of a new paper in Nature Reviews Drug Discovery. (Full disclosure: one of the authors is a former colleague, and I had a chance to look over the manuscript while it was being prepared). Here's the hard part:

Among the factors contributing to the growing interest in phenotypic screening in drug discovery in general is the perception that, by avoiding oversimplified reductionist assumptions regarding molecular targets and instead focusing on functional effects, compounds that are discovered in phenotypic assays may be more likely to show clinical efficacy. However, cancer presents a challenge to this perception as the cell-based models that are typically used in cancer drug discovery are poor surrogates of the actual disease. The definitive test of both target hypotheses and phenotypic models can only be carried out in the clinic. The challenge of cancer drug discovery is to maximize the probability that drugs discovered by either biochemical or phenotypic methods will translate into clinical efficacy and improved disease control.

Good models in living systems, which are vital to any phenotypic drug discovery effort, are very much lacking in oncology. It's not that you can't get plenty of cancer cells to grow in a dish - they'll take over your other cell cultures if they get a chance. But those aren't the cells that you're going to be dealing with in vivo, not any more. Cancer cells tend to be genetically unstable, constantly throwing off mutations, and the in vitro lines are adapted to living in cell culture. That's true even if you implant them back into immune-compromised mice (the xenograft models). The number of drugs that look great in xenograft models and failed out in the real world is too large to count.

So doing pure phenotypic drug discovery against cancer is very difficult - you go down a lot of blind alleys, which is what phenotypic screening is supposed to prevent. The explosion of knowledge about cellular pathways in tumor cells has led to uncountable numbers of target-driven approaches instead, but (as everyone has had a chance to find out), it's rare to find a real-world cancer patient who can be helped by a single-target drug. Gleevec is the example that everyone thinks of, but the cruel truth is that it's the exceptional exception. All those newspaper articles ten years ago that heralded a wonderful era of targeted wonder drugs for cancer? They were wrong.

So what to do? This paper suggests that the answer is a hybrid approach:

For the purpose of this article, we consider ‘pure’ phenotypic screening to be a discovery process that identifies chemical entities that have desirable biological (phenotypic) effects on cells or organisms without having prior knowledge of their biochemical activity or mode of action against a specific molecular target or targets. However, in practice, many phenotypically driven discovery projects are not target-agnostic; conversely, effective target-based discovery relies heavily on phenotypic assays. Determining the causal relationships between target inhibition and phenotypic effects may well open up new and unexpected avenues of cancer biology.

In light of these considerations, we propose that in practice a considerable proportion of cancer drug discovery falls between pure PDD and TDD, in a category that we term ‘mechanism-informed phenotypic drug discovery’ (MIPDD). This category includes inhibitors of known or hypothesized molecular targets that are identified and/or optimized by assessing their effects on a therapeutically relevant phenotype, as well as drug candidates that are identified by their effect on a mechanistically defined phenotype or phenotypic marker and subsequently optimized for a specific target-engagement MOA.

I've heard these referred to as "directed phenotypic screens", and while challenging, they can be a very fruitful way to go. Balancing the two ways of working is the tricky part: you don't want to slack up on the model just so it'll give you results, if those results aren't going to be meaningful. And you don't want to be so dogmatic about your target ideas that you walk away from something that could be useful, but doesn't fit your scheme. If you can keep all these factors in line, you're a real drug discovery scientist, and no mistake.

How hard this is can be seen from the paper's Table 1, where they look over the oncology approvals since 1999, and classify them by what approaches were used for lead discovery and lead optimization. There's a pile of 21 kinase inhibitors (and eight other compounds) over in the box where both phases were driven by inhibition of a known target. And there are ten compounds whose origins were in straight phenotypic screening, with various paths forward after that. But the "mechanism-informed phenotypic screen" category is the shortest list of the three lead discovery approaches: seven compounds, optimized in various ways. (The authors are upfront about the difficulties of assembling this sort of overview - it can be hard to say just what really happened during discovery and development, and we don't have the data on the failures).

Of those 29 pure-target-based drugs, 18 were follow-ons to mechanisms that had already been developed. At this point, you'd expect to hear that the phenotypic assays, by contrast, delivered a lot more new mechanisms. But this isn't the case: 14 follow-ons versus five first-in-class. This really isn't what phenotypic screening is supposed to deliver (and has delivered in the past), and I agree with the paper that this shows how difficult it has been to do real phenotypic discovery in this field. The few assays that translate to the clinic tend to keep discovering the same sorts of things. (And once again, the analogy to antibacterials comes to mind, because that's exactly what happens if you do a straight phenotypic screen for antibacterials. You find the same old stuff. That field, too, has been moving toward hybrid target/phenotypic approaches).
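
A quick back-of-the-envelope on the counts quoted above, just to make the comparison explicit (illustrative arithmetic only; the paper's Table 1 is the real source):

    # Follow-on vs. first-in-class fractions, using only the figures quoted above.
    # Illustrative arithmetic, not a re-analysis of the paper's Table 1.

    target_based = {"follow_on": 18, "first_in_class": 29 - 18}   # the 29 pure target-based approvals
    phenotypic   = {"follow_on": 14, "first_in_class": 5}          # as quoted in the text

    for label, counts in (("target-based", target_based), ("phenotypic", phenotypic)):
        total = sum(counts.values())
        share = counts["follow_on"] / total
        print(f"{label}: {counts['follow_on']} of {total} were follow-ons ({share:.0%})")
    # The follow-on fraction is no lower for the phenotypically derived drugs,
    # which is the point being made above.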

The situation might be changing a bit. If you look at the drugs in the clinic (Phase II and Phase III), as opposed to the older ones that have made it all the way through, there's still a vast pile of target-driven ones (mostly kinase inhibitors). But you can find more examples of phenotypic candidates, and among them an unusually high proportion of outright no-mechanism-known compounds. Those are tricky to develop in this field:

In cases where the efficacy arises from the engagement of a cryptic target (or mechanism) other than the nominally identified one, there is potential for substantial downside. One of the driving rationales of targeted discovery in cancer is that patients can be selected by predictive biomarkers. Therefore, if the nominal target is not responsible for the actions of the drug, an incorrect diagnostic hypothesis may result in the selection of patients who will — at best — not derive benefit. For example, multiple clinical trials of the nominal RAF inhibitor sorafenib in melanoma showed no benefit, regardless of the BRAF mutation status. This is consistent with the evidence that the primary target and pharmacodynamic driver of efficacy for sorafenib is actually VEGFR2. The more recent clinical success of the bona fide BRAF inhibitor vemurafenib in melanoma demonstrates that the target hypothesis of BRAF for melanoma was valid.

So, if you're going to do this mechanism-informed phenotypic screening, just how do you go about it? High-content screening techniques are one approach: get as much data as possible about the effects of your compounds, both at the molecular and cellular level (the latter by imaging). Using better cell assays is crucial: make them as realistic as you can (three-dimensional culture, co-culture with other cell types, etc.), and go for cells that are as close to primary tissue as possible. None of this is easy, or cheap, but the engineer's triangle is always in effect ("Fast, Cheap, Good: Pick Any Two").

Comments (20) + TrackBacks (0) | Category: Cancer | Drug Assays | Drug Development

July 23, 2014

Neratinib Comes Through For Puma

Posted by Derek

Yet another entry in the "Why do people keep investing in biopharma?" files. Take a look at the case of Puma Biotechnology. Their stock was as high as $140/share earlier in the year, and it gradually deflated to the high 50s/low 60s as time went on. But yesterday, after hours, they reported unexpectedly good Phase III results for neratinib in breast cancer, and as I write, they're at $228 or so, up about $167 per share.

It's another HER2/EGFR tyrosine kinase inhibitor (like Tykerb/lapatinib in the small molecule space, although neratinib is an irreversible inhibitor) and would be targeted at patients who are now taking Herceptin. Neratinib itself has not had a smooth path to this stage, though. Puma licensed the compound out from Pfizer, and took on the responsibility for all of the development. Pfizer ditched the compound a few years ago in a review of their oncology portfolio. I note that the two companies have reworked their licensing agreement on this news as well. Puma's entire business model is taking up oncology candidates that other companies have shed, and it certainly seems to have come through for them this time.

So chalk one up for irreversible kinase inhibitors, and (of course) for Puma. And for the patients who will be taking the drug, naturally, and lastly, for Puma's shareholders, who are having an excellent day indeed.

Comments (18) + TrackBacks (0) | Category: Business and Markets | Cancer | Clinical Trials

How Many Biopharma Employees Would Rather Be Working Somewhere Else?

Posted by Derek

How many people working in the biopharma industry would jump to another company if they could? According to this survey, it's just over half: well above the average set by other industry sectors.

The usual reasons are cited, in part (pay, opportunity for advancement). But two factors that seemed unusually prominent in our industry were high stress levels and "difficult relations with supervisors and co-workers". I found that last one interesting, because (like all science and engineering fields) we do have a certain number of people in this business who can be described, as the old British music hall song has it, as "E's all right when you know 'im, but you've got to know 'im first". And there's the ever-present disconnect between the scientists and any non-science-background upper managers, a clash of worldviews if ever there was one.

I've worked in some situations where most people seemed satisfied, but I've probably spent more of my career in those other workplaces described by this survey. I well recall an employee survey many years ago, given out with pencil and paper, yet, where people were calling out to each other before the start with helpful questions like "Is 'half-assed' hyphenated or not?" and "Do you capitalize 'Moron' when you're talking about a specific one?"

Some of what this survey found, though, is surely pent-up demand. There have been fewer and fewer opportunities to change companies over the years, and it used to be a fairly frequent career move. So I'd guess that there are plenty of people who would be glad to jump if there were anywhere much to jump to.

Comments (20) + TrackBacks (0) | Category: Business and Markets

July 22, 2014

The Broad Gets $650 Million For Psychiatric Research

Posted by Derek

The Broad Institute seems to have gone through a bit of a rough funding patch some months ago, but things are looking up: they've received a gift of $650 million to do basic research in psychiatric disorders. Believe it, that'll keep everyone busy, for sure.

I enjoyed Eric Lander's characterization of much of the 1990s work on the genetic basis of mental illness as "pretty much completely useless", and I don't disagree one bit. His challenge, as he and the rest of the folks at the Broad well know, is to keep someone from being able to say that about them in the year 2034. CNS work is the ultimate black box, which makes a person nervous, but on the other hand, anything solid that gets discovered will be a real advance. Good luck to them.

You might also be interested to know where the Stanley Foundation, the benefactors here, came up with over half a billion dollars to donate to basic medical research (and more to come, apparently). You'd never guess: selling collectibles. Sports figurines. Small replicas of classic cars, trucks, and tractors. Miniature porcelain versions of popular dolls. Leather-bound sets of great (public domain) novels. Order now for the complete set of Presidential Coins - that sort of thing. It looks to be a lot more lucrative than discovering drugs (!)

Comments (47) + TrackBacks (0) | Category: General Scientific News | The Central Nervous System

Put Them in Cells and Find Out

Posted by Derek

So, when you put some diverse small molecules into cellular assays, how many proteins are they really hitting? You may know a primary target or two that they're likely to interact with, or (if you're doing phenotypic screening), you may not have any idea at all. But how many proteins (or other targets) are there that bind small molecules at all?

This is a question that many people are interested in, but hard data to answer it are not easily obtained. There have been theoretical estimates via several techniques, but (understandably) not too much experimental evidence. Now comes this paper from Ben Cravatt's group, and it's one of the best attempts yet.

What they've done is to produce a library of compounds, via Ugi chemistry, containing both a photoaffinity handle and an alkyne (for later "click" tagging). They'd done something similar before, but the photoaffinity group in that case was a benzophenone, which is rather hefty. This time they used a diazirine, which is both small and the precursor to a very reactive carbene once it's irradiated. (My impression is that the diazirine is the first thing to try if you're doing photoaffinity work, for just those reasons). They made a small set of fairly diverse compounds (about 60), with no particular structural biases in mind, and set out to see what these things would label.

They treated PC-3 cells (human prostate-cancer derived) with each member of the library at 10 µM, then hit them with UV to do the photoaffinity reaction, labeled with a fluorescent tag via the alkyne, and fished for proteins. What they found was a pretty wide variety, all right, but not in the nonselective shotgun style. Most compounds showed distinct patterns of protein labeling, and most proteins picked out distinct SAR from the compound set. They picked out six members of the library for close study, and found that these labeled about 24 proteins (one compound only picked up one target, while the most promiscuous compound labeled nine). What's really interesting is that only about half of these were known to have any small-molecule ligands at all. There were proteins from a number of different classes, and some (9 out of 24) weren't even enzymes, but rather scaffolding and signaling proteins (which wouldn't be expected to have many small-molecule binding possibilities).
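
If you think of the readout as a compound-by-protein labeling matrix, the selectivity question on both axes is easy to tabulate. Here's a toy version of that bookkeeping, with invented labeling events rather than the paper's actual data:

    # Toy tabulation of photoaffinity labeling results: which probes hit which proteins.
    # The events below are invented placeholders, not data from the Cravatt paper.

    from collections import defaultdict

    labeling_events = [
        ("ugi_03", "NAMPT"), ("ugi_03", "PTGR2"),
        ("ugi_07", "VDAC1"),
        ("ugi_12", "NAMPT"), ("ugi_12", "PTGR2"), ("ugi_12", "VDAC1"),
    ]

    targets_per_probe = defaultdict(set)
    probes_per_protein = defaultdict(set)
    for probe, protein in labeling_events:
        targets_per_probe[probe].add(protein)
        probes_per_protein[protein].add(probe)

    # How selective is each probe, and how promiscuous is each protein?
    for probe, proteins in sorted(targets_per_probe.items()):
        print(f"{probe} labels {len(proteins)} protein(s): {sorted(proteins)}")
    for protein, probes in sorted(probes_per_protein.items()):
        print(f"{protein} is picked up by {len(probes)} probe(s)")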

A closer look at non-labeled versions of the probe compounds versus more highly purified proteins confirmed that the compounds really are binding as expected (in some cases, a bit better than the non-photoaffinity versions, in some cases worse). So even as small a probe as a diazirine is not silent, which is just what medicinal chemists would have anticipated. (Heck, even a single methyl or fluoro isn't always silent, and a good thing, too). But overall, what this study suggests is that most small molecules are going to hit a number of proteins (1 up to a dozen?) in any given cell with pretty good affinity. It also (encouragingly) suggests that there are more small-molecule binding sites than you'd think, with proteins that have not evolved for ligand responses still showing the ability to pick things up.

There was another interesting thing that turned up: while none of the Ugi compounds was a nonselective grab-everything compound, some of the proteins were. A subset of proteins tended to pick up a wide variety of the non-clickable probe compounds, and appear to be strong, promiscuous binders. Medicinal chemists already know a few of these things - CYP metabolizing enzymes, serum albumin, and so on. This post has some other suggestions. But there are plenty more of them out there, unguessable ones that we don't know about yet (in this case, PTGR and VDAC subtypes, along with NAMPT). There's a lot to find out.

Comments (7) + TrackBacks (0) | Category: Chemical Biology | Drug Assays

July 21, 2014

The Hep C Field Gets Nastier By the Minute

Posted by Derek

What a mess there is in the hepatitis C world. Gilead is, famously, dominating the market with Sovaldi, whose price has set off all sorts of cost/benefit debates. The companies competing with them are scrambling to claim positions, and the Wall Street Journal says that AbbVie is really pulling out all the stops. Try this strategy on for size:

In a lawsuit filed in February, AbbVie noted it patented the idea of combining two of Gilead's drugs—Sovaldi and an experimental drug called ledipasvir, which Gilead plans to combine into one treatment—and is therefore entitled to monetary damages if Gilead brings the combination pill to market. Legally, AbbVie can't market Sovaldi or ledipasvir because it doesn't have the patents on the underlying compounds. But it is legal for companies to seek and obtain patents describing a particular "method of use" of products that don't belong to them.

Gilead disputes the claims of AbbVie and the other companies. A spokeswoman said Gilead believes it has the sole right to commercialize Sovaldi and products containing Sovaldi's active ingredient, known as sofosbuvir. An AbbVie spokeswoman said the company believes Gilead infringes its patents, and that it stands behind the validity and enforceability of those patents.

You don't see that very often, and it's a good thing. Gilead is, naturally, suing AbbVie over this as well, saying that AbbVie has knowingly misrepresented to the USPTO that they invented the Gilead therapies. I'm not sure how that's going to play out: AbbVie didn't have to invent the drugs to get a method-of-use patent on them. At the same time, I don't know what sort of enablement AbbVie's patent claims might have behind them, given that these are, well, Gilead's compounds. The company is apparently claiming that a "sophisticated computer model" allows them to make a case that these combinations would be the effective ones, but I really don't know if that's going to cut it (and in fact, I sort of hope it doesn't). But even though I'm not enough of a patent-law guy to say either way, I'm enough of one to say, with great confidence, that this is going to be a very expensive mess to sort out. Gilead's also in court with Merck (and was with Idenix before Merck bought them), and with Roche, and will probably be in court with everyone else before all this is over.

This whole situation reminds me of one of those wildlife documentaries set around a shrinking African watering hole. A lot of lucrative drugs have gone off patent over the last few years, and a lot of them are heading that way soon. So any new therapeutic area with a lot of commercial promise is going to get a lot of attention, and start a lot of fighting. Legal battles aren't cheap on the absolute scale, but on the relative scale of the potential profits, they are. So why not? Claim this, claim that, sue everybody. It might work; you never know. Meanwhile, we have a line forming on the right of ticked-off insurance companies and government health plans, complaining about the Hep C prices, and while they wait they can watch the companies involved throwing buckets of slop on each other and hitting everyone over the head with lawsuits. What a spectacle.

Comments (42) + TrackBacks (0) | Category: Business and Markets | Infectious Diseases | Patents and IP | Why Everyone Loves Us

Allergan Twists and Turns

Posted by Derek

It's getting nasty over at Allergan. They're still trying to fight off a takeover attempt by Valeant, making the case that the company's R&D efforts are not a waste of money (which, only slightly simplified, is the Valeant position regarding every company they're taking over).

But Allergan's had a lot of trouble getting one of their drugs (Semprana) through the FDA. Semprana is an inhaled version of the classic dihydroergotamine therapy for migraine, and had been rejected last year when it was still known as Levadex. The recent further delay isn't helping Allergan make its case, and Valeant is using this news to peel off some more shareholders.

This morning comes word that Allergan is cutting back staff. That Fierce Biotech report says that it looks like a lot of the cuts will be hitting discovery R&D, which makes you wonder if Allergan will manage to escape Valeant's grip only by becoming what Valeant wanted to make them.

Comments (6) + TrackBacks (0) | Category: Business and Markets | Regulatory Affairs

July 18, 2014

Chemistry Class For the "Food Babe"?

Posted by Derek

I found this article from the Charlotte Observer on the "Food Babe" (Vani Hari) very interesting. A "menu consultant" for Chick-Fil-A, is she? Who knew?

I've come across a horribly long string of chemistry misapprehensions, mistakes, and blunders while looking at her site - she truly appears to know nothing whatsoever about chemistry, not that this would appear to bother her much. (Wavefunction has a good article on these). I noticed in the comments section of the newspaper's article that someone is apparently trying to crowdsource a fundraising drive to send her to some chemistry classes. I enjoy that idea very much, although (1) horse, water, drink, etc., and (2) she appears to have sufficient funds to do this already, were it of any possible interest to her. And more money coming in all the time. She may well make more money telling people that they're eating yoga mats than I do trying to discover drugs.

Comments (46) + TrackBacks (0) | Category: General Scientific News

Thalidomide, Bound to Its Target

Posted by Derek

There's a new report in the literature on the mechanism of thalidomide, so I thought I'd spend some time talking about the compound. Just mentioning the name to anyone familiar with its history is enough to bring on a shiver. The compound, administered as a sedative/morning sickness remedy to pregnant women in the 1950s and early 1960s, famously brought on a wave of severe birth defects. There's a lot of confusion about this event in the popular literature, though - some people don't even realize that the drug was never approved in the US, although this was a famous save by the (then much smaller) FDA and especially by Frances Oldham Kelsey. And even those who know a good amount about the case can be confused by the toxicology, because it's confusing: no phenotype in rats, but big reproductive tox trouble in mice and rabbits (and humans, of course). And as I mentioned here, the compound is often used as an example of the far different effects of different enantiomers. But practically speaking, that's not the case: thalidomide has a very easily racemized chiral center, which gets scrambled in vivo. It doesn't matter if you take the racemate or a pure enantiomer; you're going to get both of the isomers once it's in circulation.

The compound's horrific effects led to a great deal of research on its mechanism. Along the way, thalidomide itself was found to be useful in the treatment of leprosy, and in recent years it's been approved for use in multiple myeloma and other cancers. (This led to an unusual lawsuit claiming credit for the idea). It's a potent anti-angiogenic compound, among other things, although the precise mechanism is still a matter for debate - in vivo, the compound has effects on a number of wide-ranging growth factors (and these were long thought to be the mechanism underlying its effects on embryos). Those embryonic effects complicate the drug's use immensely - Celgene, who got it through trials and approval for myeloma, have to keep a very tight patient registry, among other things, and control its distribution carefully. Experience has shown that turning thalidomide loose will always end up with someone (i.e. a pregnant woman) getting exposed to it who shouldn't be - it's gotten to the point that the WHO no longer recommends it for use in leprosy treatment, despite its clear evidence of benefit, and it's down to just those problems of distribution and control.

But in 2010, it was reported that the drug binds to a protein called cereblon (CRBN), and this mechanism implicated the ubiquitin ligase system in the embryonic effects. That's an interesting and important pathway - ubiquitin is, as the name implies, ubiquitous, and addition of a string of ubiquitins to a protein is a universal disposal tag in cells: off to the proteosome, to be torn to bits. It gets stuck onto exposed lysine residues by the aforementioned ligase enzyme.

But less-thorough ubiquitination is part of other pathways. Other proteins can have ubiquitin recognition domains, so there are signaling events going on. Even poly-ubiquitin chains can be part of non-disposal processes - the usual oligomers are built up using a particular lysine residue on each ubiquitin in the chain, but there are other lysine possibilities, and these branch off into different functions. It's a mess, frankly, but it's an important mess, and it's been the subject of a lot of work over the years in both academia and industry.

The new paper has the crystal structure of thalidomide (and two of its analogs) bound to the ubiquitin ligase complex. It looks like they keep one set of protein-protein interactions from occurring while the ligase end of things is going after other transcription factors to tag them for degradation. Ubiquitination of various proteins could be either up- or downregulated by this route. Interestingly, the binding is indeed enantioselective, which suggests that the teratogenic effects may well be down to the (S) enantiomer, not that there's any way to test this in vivo (as mentioned above). But the effects of these compounds in myeloma appear to go through the cereblon pathway as well, so there's never going to be a thalidomide-like drug without reproductive tox. If you could take it a notch down the pathway and go for the relevant transcription factors instead, post-cereblon, you might have something, but selective targeting of transcription factors is a hard row to hoe.

Comments (8) + TrackBacks (0) | Category: Analytical Chemistry | Biological News | Cancer | Chemical News | Toxicology

July 17, 2014

TDP-43 and Alzheimer's

Posted by Derek

There are quite a few headlines today about a link between Alzheimer's and a protein called TDP-43. This is interesting stuff, but like everything else in the neurodegeneration field, it's going to be tough to unravel what's going on. This latest work, just presented at a conference in Copenhagen, found (in a large post mortem brain study of people with diagnosed Alzheimer's pathology) that aberrant forms of the protein seem to be strongly correlated with shrinkage of the hippocampus and accompanying memory loss.

80% of the cohort with normal TDP-43 (but still showing Alzheimer's histology) had cognitive impairment at death, but 98% of the ones with TDP-43 mutations had such signs. That says several things: (A) it's possible to have classic Alzheimer's without mutated TDP-43, (B) it's possible to have classic Alzheimer's tissue pathology (up to a point, no doubt) without apparent cognitive impairment, and (C) it's apparently possible (although very unlikely) to have mutated TDP-43, show Alzheimer's pathology as well, and still not be diagnosed as cognitively impaired. Welcome to neurodegeneration. Correlations and trends are mostly what you get in that field, and you have to make of them what you can.
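
Spelled out per hundred people with Alzheimer's-type pathology in each group (the study quotes percentages, not cohort sizes, so this is purely illustrative):

    # The complements behind points (B) and (C) above, per 100 people with
    # Alzheimer's-type pathology in each group. Illustrative only.

    impaired_normal_tdp43   = 0.80   # pathology present, TDP-43 normal
    impaired_aberrant_tdp43 = 0.98   # pathology present, TDP-43 aberrant

    print("unimpaired despite pathology, normal TDP-43:  ", round((1 - impaired_normal_tdp43) * 100), "per 100")
    print("unimpaired despite pathology, aberrant TDP-43:", round((1 - impaired_aberrant_tdp43) * 100), "per 100")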

TDP-43, though, has already been implicated, for some years now, in ALS and several other syndromes, so it really does make sense that it would be involved. It may be that it's disproportionately a feature of more severe Alzheimer's cases, piling on to some other pathology. Its mechanism of action is not clear yet - as mentioned, it's a transcription factor, so it could be involved in stuff from anywhere and everywhere. It does show aggregation in the disease state, but that Cell paper linked to above makes the case that it's not the aggregates per se that are the problem, but the loss of function behind them (for example, there are increased amounts of the mutant protein out in the cytoplasm, rather than in the nucleus). What those lost functions are, though, remains to be discovered.

Comments (2) + TrackBacks (0) | Category: Alzheimer's Disease | Biological News

Reversal of Type II Diabetes May Be Possible

Posted by Derek

Here's some big news: Ron Evans and co-workers at Salk report that treatment with the growth factor FGF1 appears to reverse type II diabetes in mice. (Article in Science Daily on this study here). Evans has been working in this field (diabetes, insulin sensitivity, and related areas like growth factors and nuclear receptors) for a long time, and I would definitely take this work seriously.

They reported a couple of years ago that FGF1 seemed to be involved in insulin sensitivity. It's induced in adipose tissue under high-fat diet conditions. FGF1 knockout mice, for their part, have a seemingly normal phenotype, but when they're put on high-fat diets they respond very poorly indeed, quickly showing abnormal glucose control and other defects.

This new paper shows that in normal mice with metabolic problems brought on by a high-fat diet, a single injection of recombinant FGF1 is sufficient to normalize glucose for up to 48 hours. Interestingly (and importantly), this mechanism doesn't seem to overshoot - you don't swing over to hypoglycemia, which is always a worry in this field. And repeated FGF1 therapy leads to increased insulin sensitivity, suppression of hepatic glucose production - basically, everything you'd want in a Type II diabetes therapy. It's great stuff, and the best candidate I've yet seen for the Real Mechanism behind the disease.

Now, FGF1 is a cellular growth factor, so there's a possibility for trouble. But the glucose/insulin effects seem to be mediated by one particular FGF receptor (FGFR1), which makes one hopeful that this axis can be separated out. I would expect to see a great deal of work coming on variants of the protein with longer plasma half-life and greater selectivity. In vivo, the protein seems to be secreted and used locally in specific tissues - it's not in wide circulation. But perhaps it should be - you can be sure that someone's going to try to find out. Overall, this is excellent, exciting news, and we're poised to learn a huge amount about type II diabetes and how to treat it.

Comments (23) + TrackBacks (0) | Category: Diabetes and Obesity

July 16, 2014

An Easy Way to Make Cyclic Peptides

Posted by Derek

If you ever find yourself needing to make large cyclic peptides, you now have a new option. This paper in Organic Letters describes a particularly clean way to do it: let glutathione-S-transferase (GST) do the work for you. Bradley Pentelute's group at MIT reports that if your protein has a glutathione attached at one end, and a pentafluoroaryl Cys at the other, that GST will step in and promote the nucleophilic aromatic substitution reaction to close the two ends together.
[Scheme: GST-promoted peptide cyclization]
This is an application of their earlier work on the uncatalyzed reaction and on the use of GST for ligation. Remarkably, the GST method seems to produce very high yields of cyclic peptides up to at least 40 residues, and at reasonable concentration (10 mM) of the starting material, under aqueous conditions. Cyclic peptides themselves are interesting beasts, often showing unusual properties compared to the regular variety, and this method looks as if it will provide plenty more of them for study.

Comments (7) + TrackBacks (0) | Category: Chemical Biology | Chemical News

What Structures Have Turned on You?

Posted by Derek

When you ask a bunch of medicinal chemists to look over a list of structures - screening hits, potential additions to the compound collection, that sort of thing - you'll find that everyone will cross some of them off. But the agreement between the chemists on which ones need to go, that's the tough part. It's been shown that we don't overlap very much in our preferences, at least when it comes to the structures we'd prefer not to try to advance. That's because we don't overlap as well as we think we do when it comes to the rules we're using.

So here's a question, which might illustrate the point: what compound classes or scaffolds have you been burned by? I think that's one big factor that we all use when we're evaluating one of those compound lists - which ones are in that "Fooled me once" category? For me, a recent experience with NH pyrroles has me reluctant to go there again. And I'm not interested in things with naphthalenes hanging off of them, naproxen notwithstanding. I'd also rather not see Mannich products, since I've personally seen a number of those misbehave.

So what's on your list? I think that everyone can agree on things like rhodanines, although even those have their partisans. But what semi-decent looking compounds will you go ahead and blackball, based on your own nasty experiences with them?

Comments (16) + TrackBacks (0) | Category: Life in the Drug Labs

July 15, 2014

K. C. Nicolaou on Drug Discovery

Posted by Derek

K. C. Nicolaou has an article in the latest Angewandte Chemie on the future of drug discovery, which may seem a bit surprising, considering that he's usually thought of as Mister Total Synthesis, rather than Mister Drug Development Project. But I can report that it's relentlessly sensible. Maybe too sensible. It's such a dose of the common wisdom that I don't think it's going to be of much use or interest to people who are actually doing drug discovery - you've already had all these thoughts yourself, and more than once.

But for someone catching up from outside the field, it's not a bad survey at all. It gets across how much we don't know, and how much work there is to be done. And one thing that writing this blog has taught me is that most people outside of drug discovery don't have an appreciation of either of those things. Nicolaou's article isn't aimed at a lay audience, of course, which makes it a little more problematic, since many of the people who can appreciate everything he's saying will already know what he's going to say. But it does round pretty much everything up into one place.

Comments (57) + TrackBacks (0) | Category: Drug Development | Drug Industry History

The Prospects of an Academic Job

Posted by Derek

Over the years, there have been more comments than anyone can count here on the often-grim employment picture for chemists and biologists in biopharma. Plenty of people here (myself included) can speak from experience. But we should also remember that the academic job market in the biomedical sciences is in awful shape, too, unfortunately. And since a disproportionate number of people start off grad school picturing themselves getting jobs in academia, a clear picture of what's going on is essential.

That's the point of this piece in Nature, in the Jobs section. The author, Jessica Polka (post-doc at Harvard Medical School), says that far too many of her colleagues don't have an accurate impression of the job market. She's created this graphic to get the point across. Some 16,000 students will start graduate school in biology in the US this fall. The best guess is that fewer than 10% of them will eventually become tenure-track faculty somewhere.

But at least half of them list that as their most preferred career path, which means that a lot of things are not going to work out as planned. Polka's right - the more people who understand this, and the earlier, the better.
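
The mismatch, in round numbers, using only the figures quoted above:

    # Rough arithmetic on the figures quoted above from the Nature piece.
    entering = 16_000              # students starting biology grad school in the US this fall
    tenure_track_fraction = 0.10   # "fewer than 10%" eventually get tenure-track jobs
    prefer_academia = 0.50         # "at least half" list it as the preferred path

    print("want tenure-track positions: at least", int(entering * prefer_academia))      # >= 8,000
    print("will eventually get one: fewer than", int(entering * tenure_track_fraction))  # <  1,600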

Comments (40) + TrackBacks (0) | Category: Academia (vs. Industry) | Business and Markets

July 14, 2014

Targacept Fumbles the Bad News on Alzheimer's

Posted by Derek

Targacept has been working on some very hard therapeutic areas over the years, and coming up dry - dramatically so. They may have just done it again.

They've been testing TC-1734, a partial agonist at nicotinic receptors, in Alzheimer's over the last year or so. That was a long-shot mechanism to start with, although to be sure, every Alzheimer's drug is a long-shot mechanism. This would be a stopgap compound even if it worked, like the existing acetylcholinesterase inhibitor donepezil.

And the company has apparently released the results of the clinical trial on its web site, inadvertently, you'd have to assume. The news first came out from BioRunUp on Twitter, and the text of the announcement was that the compound had failed to show superiority to donepezil. The company has made no official announcement (as I write, anyway), and the press release itself appears to have been taken down a little while ago. But here's a screen shot, if you're interested. The stock (TRGT) has already reacted to the news, as you'd imagine it would, suddenly dropping like a brick starting at just before 2:30 PM EST. Not a good way to get the news out, that's for sure. . .

Comments (10) + TrackBacks (0) | Category: Alzheimer's Disease | Clinical Trials

Modifying Red Blood Cells As Carriers

Posted by Derek

What's the best carrier to take some sort of therapeutic agent into the bloodstream? That's often a tricky question to work out in animal models or in the clinic - there are a lot of possibilities. But what about using red blood cells themselves?

That idea has been in the works for a few years now, but there's a recent paper in PNAS reporting on more progress (here's a press release). Many drug discovery scientists will have encountered the occasional compound that partitions into erythrocytes all by itself (those are usually spotted by their oddly long half-lives after in vivo dosing, mimicking the effect of plasma protein binding). One of the early ways that people tried to do this deliberately was forcing a compound into the cells, but this tends to damage them and make them quite a bit less useful. A potentially more controllable method would be to modify the surfaces of the RBCs themselves to serve as drug carriers, but that's quite a bit more complex, too. Antibodies have been tried for this, but with mixed success.

That's what this latest paper addresses. The authors (the Lodish and Ploegh groups at Whitehead/MIT) introduce modified surface proteins (such as glycophorin A) that are substrates for Ploegh's sortase technology (two recent overview papers), which allows for a wide variety of labeling.

Experiments using modified fetal cells in irradiated mice gave animals that had up to 50% of their RBCs modified in this way. Sortase modification of these was about 85% effective, so plenty of label can be introduced. The labeling process doesn't appear to affect the viability of the cells very much as compared to wild-type - the cells were shown to circulate for weeks, which certainly breaks the records held by the other modified-RBC methods.
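
Multiplying those two figures gives a rough upper bound on the fraction of circulating red cells that end up carrying a label (my arithmetic, not a number from the paper):

    # Back-of-the-envelope: fraction of circulating RBCs carrying the label,
    # combining the two efficiencies quoted above. An estimate, not the paper's figure.
    rbc_engineered  = 0.50   # up to 50% of circulating RBCs carried the modified surface protein
    sortase_labeled = 0.85   # ~85% of those were successfully sortase-tagged

    print(f"labeled RBCs overall: up to {rbc_engineered * sortase_labeled:.0%}")   # ~42%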

The team attached either biotin tags or specific antibodies to both mouse and human RBCs, which would appear to clear the way for a variety of very interesting experiments. (They also showed that simultaneous C- and N-terminal labeling is feasible, to put on two different tags at once). Here's the "coming attractions" section of the paper:

The approach presented here has many other possible applications; the wide variety of possible payloads, ranging from proteins and peptides to synthetic compounds and fluorescent probes, may serve as a guide. We have conjugated a single-domain antibody to the RBC surface with full retention of binding specificity, thus enabling the modified RBCs to be targeted to a specific cell type. We envision that sortase-engineered cells could be combined with established protocols of small-molecule encapsulation. In this scenario, engineered RBCs loaded with a therapeutic agent in the cytosol and modified on the surface with a cell type-specific recognition module could be used to deliver payloads to a precise tissue or location in the body. We also have demonstrated the attachment of two different functional probes to the surface of RBCs, exploiting the subtly different recognition specificities of two distinct sortases. Therefore it should be possible to attach both a therapeutic moiety and a targeting module to the RBC surface and thus direct the engineered RBCs to tumors or other diseased cells. Conjugation of an imaging probe (i.e., a radioisotope), together with such a targeting moiety also could be used for diagnostic purposes.

This will be worth keeping an eye on, for sure, both as a new delivery method for small (and not-so-small) molecules and biologics, and for its application to all the immunological work going on now in oncology. This should keep everyone involved busy for some time to come!

Comments (7) + TrackBacks (0) | Category: Biological News | Chemical Biology | Pharmacokinetics

How to Run a Drug Project: Are There Any Rules at All?

Posted by Derek

Here's an article from David Shaywitz at Forbes whose title says it all: "Should a Drug Discovery Team Ever Throw in the Towel?" The easy answer to that is "Sure". The hard part, naturally, is figuring out when.

You don’t have to be an expensive management consultant to realize that it would be helpful for the industry to kill doomed projects sooner (though all have said it).

There’s just the prickly little problem of figuring out how to do this. While it’s easy to point to expensive failures and criticize organizations for not pulling the plug sooner, it’s also true that just about every successful drug faced some legitimate existential crisis along the way — at some point during its development, there was a plausible reason to kill the program, and someone had to fight like hell to keep it going.

The question at the heart of the industry’s productivity struggles is the extent to which it’s even possible to pick the winners (or the losers), and figuring out better ways of managing this risk.

He goes on to contrast two approaches to this: one where you have a small company, focused on one thing, with the idea being that the experienced people involved will (A) be very motivated to find ways to get things to work, and (B) motivated to do something else if the writing ever does show up on the wall. The people doing the work should make the call. The other approach is to divide that up: you set things up with a project team whose mandate is to keep going, one way or another, dealing with all obstacles as best they can. Above them is a management team whose job it is to stay a bit distant from the trenches, and be ready to make the call of whether the project is still viable or not.

As Shaywitz goes on to say, quite correctly, both of these approaches can work, and both of them can run off the rails. In my view, the context of each drug discovery effort is so variable that it's probably impossible to say if one of these is truly better than the other. The people involved are a big part of that variability, too, and that makes generalizing very risky.

The big risk (in my experience) with having execution and decision-making in the same hands is that projects will run on for too long. You can always come up with more analogs to try, more experiments to run, more last-ditch efforts to take a crack at it. Coming up with those things is, I think, better than not coming up with them, because (as Shaywitz mentions) it's hard to think of a successful drug that hasn't come close to dying at least once during its development. Give up too easily, and nothing will ever work at all.

But it's a painful fact that not every project can work, no matter how gritty and determined the team. We're heading out into the unknown with these drug candidates, and we find out things that we didn't know were there to be found out. Sometimes there really is no way to get the selectivity you need with the compound series you've chosen - heck, sometimes there's no way to get it with any compound series you could possibly choose, although that takes a long time to become obvious. Sometimes the whole idea behind the project is flawed from the start: blocking Kinase X will not, in fact, alter the course of Disease Y. It just won't. The hypothesis was wrong. An execute-at-all-costs team will shrug off these fatal problems, or attempt to shrug them off, for as long as you give them money.

But there's another danger waiting when you split off the executive decision-makers. If those folks get too removed from the project (or projects), then their ability to make good decisions is impaired. Just as you can have a warped perspective when you're right on top of the problems, you can have one when you're far away from them, too. It's tempting to think that Distance = Clarity, but that's not a linear function, by any means. A little distance can certainly give you a lot of perspective, but if you keep moving out, things can start fuzzing back up again without anyone realizing what's going on.

That's true even if the managers are getting reasonably accurate reports, and we all know that that's not always the case in the real world. In many large organizations, there's a Big Monthly Meeting of some sort (or at some other regular time point) where projects are supposed to be reviewed by just those decision makers. These meetings are subject to terrible infections of Dog-And-Pony-itis. People get up to the front of the room and they tell everyone how great things are going. They minimize the flaws and paper over the mistakes. It's human nature. Anyone inclined to give a more accurate picture has a chance to see how that's going to look, when all the other projects are going Just Fine and everyone's Meeting Their Goals like it says on the form. Over time (and it may not take much time at all), the meeting floats away into its own bubble of altered reality. Managers who realize this can try to counteract it by going directly to the person running the project team in the labs, closing the office door, and asking for a verbal update on how things are really going, but sometimes people are so out of it that they mistake how things are going at the Big Monthly Meeting for what's really happening.

So yes indeed, you can (as is so often the case) screw things up in both directions. That's what makes it so hard to lay down the law about how to run a drug discovery project: there are several ways to succeed, and the ways to mess them up are beyond counting. My own bias? I prefer the small-company back-to-the-wall approach, of being ready to swerve hard and try anything to make a project work. But I'd only recommend applying that to projects with a big potential payoff - it seems silly to do that sort of thing for anything less. And I'd recommend having a few people watching the process, but from as close as they can get without quite being part of the project team themselves. Just enough to have some objectivity. Simple, eh? Getting this all balanced out is the hard part. Well, actually, the science is the hard part, but this is the hard part that we can actually do something about.

Comments (14) + TrackBacks (0) | Category: Drug Development | Drug Industry History | Life in the Drug Labs

July 11, 2014

Employment Among New Chemistry PhDs

Email This Entry

Posted by Derek

Another dose of reality for the "Terrible STEM Shortage!" folks, courtesy of Slate. Here's what author Jordan Weissmann has to say:

With a little cleaning up, however, the federal data do tell a pretty clear story: The market for new Ph.D.s in the much obsessed-about STEM fields—science, technology, engineering, and math—is stagnant. Over the last 20 years, employment rates are either flat or down in each major discipline, from computer science to chemistry. It’s not what you’d expect given the way companies like Microsoft talk about talent shortages.

Why no, it isn't, is it? There seems to be something a bit off. Weissmann is working with data from the NSF and its surveys of new doctorates in the sciences, and it shows several things. For one, the post-doc populations remain very high in every field, which isn't a good sign. The number of new doctorates who report being employed has not attained the levels seen in the late 1990s, for any field. And here's chemistry in particular:
[Figure: employment.jpg - employment status of new chemistry doctorates over time]
Not a very pretty picture, to be sure. It's true that the number of postdocs has been declining over the last few years, but the slack seems to be picked up, more or less equally, by people who are getting jobs and people falling into the flat-out unemployed category. And remember, this is a snapshot of new doctorates, so the numbers for more experienced people are going to be different (but ugly in their own way, to judge from the waves of layoffs over the last few years). It's notable that the new chemistry doctorate holders who are unemployed have outnumbered the ones with non-postdoc jobs for the last few years, which may well be unprecedented.

Weissmann's figures for computer science doctorates and engineers are telling, too, and I refer you to the article for them. Neither group has made it back to its heights of 2000 or so, although the 2011-2012 numbers have picked up a bit. Whether that's a blip or a real inflection point remains to be seen. It's safe to say, though, that none of these charts support the "Just can't find anybody to hire" narrative that you hear from so many companies.

Comments (43) + TrackBacks (0) | Category: Business and Markets

My Imaginary Friends Would Be Glad to Serve as Referees

Email This Entry

Posted by Derek

Here's the biggest fake-peer-review operation I've heard of yet. Retraction Watch, which does not seem to be in any danger of running out of material, reports that a researcher in Taiwan decided to not leave the review process at the Journal of Vibration and Control up to chance. He set up scores of fake identities in their online submission database, with as many as 130 fabricated e-mail addresses, and guess who got to review his manuscripts?

The journal has retracted sixty papers going back to 2010, and I'd like to know if that's the record. I haven't heard of anything better - well, worse, you know what I mean. The professor involved has been removed from his position in Taiwan, as well he might, and the editor of the journal has resigned. As well he might, too - that editor is not implicated in the publication scam, as far as I can tell, but what exactly were his editorial duties? Dozens of papers come pouring in every year from some obscure university in Taiwan, all of them with overlapping lead or co-authors, and you don't even so much as look up from your desk? Hardly a month goes by without another bulletin from the wildly productive engineers at Pingtung U, sometimes four or five of the damn things at once, and you think you're doing your job? And nobody else who reads this journal - assuming anyone ever does - wonders what's going on, either?

If the professor involved was really getting something out of this (tenure, promotion, grant money, what have you), then the people who awarded those to him were idiots, too. In fact, that's how I'd sum up the whole affair: a fool, faking papers for a bunch of incompetents, and rewarded for it by idiots. What a crew. You really cannot underestimate the low end of the scientific publishing industry, nor its customers.

Comments (14) + TrackBacks (0) | Category: The Scientific Literature

July 10, 2014

Biopharma Stock Events for the Rest of the Year

Email This Entry

Posted by Derek

We've had some big biopharma market events so far this year, but if you're wondering what's coming in the next few months, here's a handy rundown from Adam Feuerstein of what may be the top 14. There are a few regulatory events on there, but most of the list consists of highly anticipated clinical trial results, which is where the action is, for sure. That's what makes the sector so attractive to legitimate investors and cult-like lunatics alike. These people, many of whom would cross the street to avoid each other in the real world, come together to make a market - and anyone with enough nerve and a little cash can join right in.

Comments (1) + TrackBacks (0) | Category: Business and Markets

A Drug Candidate from NCATS

Email This Entry

Posted by Derek

I've written several times about the NIH's NCATS program, its foray into "translational medicine". Now comes this press release announcing that the first compound from this effort has been picked up for development by a biopharma company.

The company is AesRx (recently acquired by Baxter), and the compound is AES-103. This came from the rare-disease part of the initiative, and the compound is targeting sickle cell anemia - from what I've seen, it appears to have come out of a phenotypic screening effort to identify anti-sickling agents. It appears to work by stabilizing the mutant hemoglobin into a form where it can't polymerize, which is the molecular-level problem underlying the sickle-cell phenotype. For those who don't know the history behind it, Linus Pauling and co-workers were among the first to establish that a mutation in the hemoglobin protein was the key factor. Pauling coined the term "molecular disease" to describe it, and should be considered one of the founding fathers of molecular biology for that accomplishment, among others.

So what's AES-103? Well, you'll probably be surprised: it's hydroxymethyl furfural, which I would not have put high on my list of things to screen. That page says that the NIH screened "over 700 compounds" for this effort, which I hope is a typo, because that's an insanely small number. I would have thought that detecting the inhibition of sickling would be something that could be automated. If you were only screening 700 compounds, would this be one of them?

For those outside the business, I base that opinion on several things. Furans in general do not have a happy history in drug development. They're too electron-rich to play well in vivo, for the most part. This one does have an electron-withdrawing aldehyde on it, but aldehydes have their own problems. They're fairly reactive, and they tend to have poor pharmacokinetics. Aldehydes are, for example, well-known as protease inhibitors in vitro, but most attempts to develop them as drugs have ended in failure. And the only thing that's left on the molecule, that hydroxymethyl, is problematic, too. Having a group like that next to an aromatic ring has also traditionally been an invitation to trouble - they tend to get oxidized pretty quickly. So overall, no, I wouldn't have bet on this compound. There must be a story about why it was tested, and I'd certainly like to know what it is.

But for all I know, those very properties are what are making it work. It may well be reacting with some residue on hemoglobin and stabilizing its structure in that way. The compound went into Phase I in 2011, and into Phase II last year, so it does have real clinical data backing it up at this point, and real clinical data can shut me right up. The main worry I'd have at this point is idiosyncratic tox in Phase III, which is always a worry, and more so, I'd think, with a compound that looks like this. We'll see how it goes.

Comments (18) + TrackBacks (0) | Category: Clinical Trials | Drug Development

July 9, 2014

No Scripps/USC

Email This Entry

Posted by Derek

The proposed Scripps/USC deal is off, according to reporters Gary Robbins and Bradley Fikes at the San Diego Union-Tribune. No details on what comes next, though - but something presumably does come next.

Comments (15) + TrackBacks (0) | Category: General Scientific News

Studies Show? Not So Fast.

Email This Entry

Posted by Derek

Yesterday's post on yet another possible Alzheimer's blood test illustrates, yet again, that understanding statistics is not a strength of most headline writers (or most headline readers). I'm no statistician myself, but I have a healthy mistrust of numbers, since I deal with the little rotters all day long in one form or another. Working in science will do that to you: every result, ideally, is greeted with the hearty welcoming phrase of "Hmm. I wonder if that's real?"

A constant source for the medical headline folks is the constant flow of observational studies. Eating broccoli is associated with this. Chocolate is associated with that. Standing on your head is associated with something else. When you see these sorts of stories in the news, you can bet, quite safely, that you're not looking at the result of a controlled trial - one cohort eating broccoli while hanging upside down from their ankles, another group eating it while being whipped around on a carousel, while the control group gets broccoli-shaped rice puffs or eats the real stuff while being duct-taped to the wall. No, it's hard to get funding for that sort of thing, and it's not so easy to round up subjects who will stay the course, either. Those news stories are generated from people who've combed through large piles of data, from other studies, looking for correlations.

And those correlations are, as far as anyone can tell, usually spurious. Have a look at the 2011 paper by Young and Karr to that effect (here's a PDF). If you go back and look at the instances where observational effects in nutritional studies have been tested by randomized, controlled trials, the track record is not good. In fact, it's so horrendous that the authors state baldly that "There is now enough evidence to say what many have long thought: that any claim coming from an observational study is most likely to be wrong."

They draw the analogy between scientific publications and manufacturing lines, in terms of quality control. If you just inspect the final product rolling off the line for defects, you're doing it the expensive way. You're far better off breaking the whole flow into processes and considering each of those in turn, isolating problems early and fixing them, so you don't make so many defective products in the first place. In the same way, Young and Karr have this to say about the observational study papers:

Consider the production of an observational study: Workers – that is, researchers – do data collection, data cleaning, statistical analysis, interpretation, writing a report/paper. It is a craft with essentially no managerial control at each step of the process. In contrast, management dictates control at multiple steps in the manufacture of computer chips, to name only one process control example. But journal editors and referees inspect only the final product of the observational study production process and they release a lot of bad product. The consumer is left to sort it all out. No amount of educating the consumer will fix the process. No amount of teaching – or of blaming – the worker will materially change the group behaviour.

They propose a process control for any proposed observational study that looks like this:

Step 0: Data are made publicly available. Anyone can go in and check it if they like.

Step 1: The people doing the data collection should be totally separate from the ones doing the analysis.

Step 2: All the data should be split, right at the start, into a modeling group and a group used for testing the hypothesis that the modeling suggests.

Step 3: A plan is drawn up for the statistical treatment of the data, but using only the modeling data set, and without the response that's being predicted.

Step 4: This plan is written down, agreed on, and not modified as the data start to come in. That way lies madness.

Step 5: The analysis is done according to the protocol, and a paper is written up if there's one to be written. Note that we still haven't seen the other data set.

Step 6: The journal reviews the paper as is, based on the modeling data set, and they agree to do this without knowing what will happen when the second data set gets looked at.

Step 7: The second data set gets analyzed according to the same protocol, and the results of this are attached to the paper in its published form.

Now that's a hard-core way of doing it, to be sure, but wouldn't we all be better off if something like this were the norm? How many people would have the nerve, do you think, to put their hypothesis up on the chopping block in public like this? But shouldn't we all?
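
To make the split-and-holdout idea in Steps 2 through 7 concrete, here's a minimal sketch in Python. Everything in it is invented for illustration - the file name, the column names, and the choice of a logistic regression - and it is not anything from the Young and Karr paper; the point is just that the confirmatory half of the data gets touched exactly once, by an analysis that was fixed in advance.

```python
# Minimal sketch of Steps 2-7: freeze a 50/50 split up front, build the model on
# one half, then run the locked-down protocol exactly once on the held-out half.
# The file name, column names, and choice of model are all placeholders.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

data = pd.read_csv("records.csv")                          # Step 0: the shared data set
modeling_half, confirm_half = train_test_split(            # Step 2: split once,
    data, test_size=0.5, random_state=2014)                # with a fixed seed

predictors = ["broccoli_servings", "age", "bmi"]           # Steps 3-4: the protocol,
outcome = "heart_disease"                                  # written down in advance

# Step 5: analysis on the modeling half only; the paper gets written from this.
model = LogisticRegression().fit(modeling_half[predictors], modeling_half[outcome])

# Step 7: one look, and only one look, at the confirmatory half.
auc = roc_auc_score(confirm_half[outcome],
                    model.predict_proba(confirm_half[predictors])[:, 1])
print(f"Held-out AUC: {auc:.2f}")
```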

Comments (20) + TrackBacks (0) | Category: Clinical Trials | Press Coverage

Outsourced Assays, Now a Cause For Wonder?

Email This Entry

Posted by Derek

Here's a look at Emerald Biotherapeutics (a name that's unfortunately easy to confuse with several other former Emeralds in this space). They're engaged in their own drug research, but they also have lab services for sale, using a proprietary system that they say generates fast, reproducible assays.

On July 1 the company unveiled a service that lets other labs send it instructions for their experiments via the Web. Robots then complete the work. The idea is a variation on the cloud-computing model, in which companies rent computers by the hour from Amazon.com, Google, and Microsoft instead of buying and managing their own equipment. In this case, biotech startups could offload some of their basic tasks—counting cells one at a time or isolating proteins—freeing their researchers to work on more complex jobs and analyze results. To control the myriad lab machines, Emerald has developed its own computer language and management software. The company is charging clients $1 to $100 per experiment and has vowed to return results within a day.

The Bloomberg Businessweek piece profiling them does a reasonable job, but I can't tell if its author knows that there's already a good amount of outsourcing of this type. Emerald's system does indeed sound fast, though. But rarely does the quickness of an assay turn out to be the real bottleneck in any drug discovery effort, so I'm not sure how much of a selling point that is. The harder parts are the ones that can't be automated: figuring out what sort of assay to run, and troubleshooting it so that it can be run reliably on high-throughput machines, are not trivial processes, and they can take a lot of time and effort. Even more difficult is the step before any of that: figuring out what you're going to be assaying at all. What's your target? What are you screening for? What's the great idea behind the whole project? That stuff is never going to be automated at all, and it's the key to the whole game.

But when I read things like this, I wonder a bit:

While pursuing the antiviral therapy, Emerald began developing tools to work faster. Each piece of lab equipment, made by companies including Agilent Technologies (A) and Thermo Fisher Scientific (TMO), had its own often-rudimentary software. Emerald’s solution was to write management software that centralized control of all the machines, with consistent ways to specify what type of experiment to run, what order to mix the chemicals in, how long to heat something, and so on. “There are about 100 knobs you can turn with the software,” says Frezza. Crucially, Emerald can store all the information the machines collect in a single database, where scientists can analyze it. This was a major advance over the still common practice of pasting printed reports into lab notebooks.

Well, that may be common in some places, but in my own experience, that paste-the-printed-report stuff went out a long time ago. Talking up the ability to have all the assay data collected in one place sounds like something from about fifteen or twenty years ago, although the situation can be different for the small startups who would be using Emerald (or their competitors) for outsourced assay work. But I would still expect any CRO shop to provide something better than a bunch of paper printouts!

Emerald may well have something worth selling, and I wish them success with it. Reproducible assays with fast turnaround are always welcome. But this article's "Gosh everything's gone virtual now wow" take on it isn't quite in line with reality.

Comments (13) + TrackBacks (0) | Category: Drug Assays

July 8, 2014

An Alzheimer's Blood Test? Not So Fast.

Email This Entry

Posted by Derek

There are all sorts of headlines today about how there's going to be a simple blood test for Alzheimer's soon. Don't believe them.

This all comes from a recent publication in the journal Alzheimer's and Dementia, from a team at King's College (London) and the company Proteome Sciences. It's a perfectly good paper, and it does what you'd think: they quantified a set of proteins in a cohort of potential Alzheimer's patients and checked to see if any of them were associated with progression of the disease. From 26 initial protein candidates (all of them previously implicated in Alzheimer's), they found that a panel of ten seemed to give a prediction that was about 87% accurate.

That figure was enough for a lot of major news outlets, who have run with headlines like "Blood test breakthrough" and "Blood test can predict Alzheimer's". Better ones said something more like "Closer to blood test" or "Progress towards blood test", but that's not so exciting and clickable, is it? This paper may well represent progress towards a blood test, but as its own authors, to their credit, are at pains to say, a lot more work needs to be done. 87%, for starters, is interesting, but not as good as it needs to be - that's still a lot of false negatives, and who knows how many false positives.

That all depends on what the rate of Alzheimer's is in the population you're screening. As Andy Extance pointed out on Twitter, these sorts of calculations are misunderstood by almost everyone, even by people who should know better. A 90 per cent accurate test on a general population whose Alzheimer's incidence rate is 1% would, in fact, give positive results that are wrong 92% of the time. Here's a more detailed writeup I did in 2007, spurred by reports of a similar Alzheimer's diagnostic back then. And if you have a vague feeling that you heard about all these issues (and another blood test) just a few months ago, you're right.
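
For anyone who wants to check that arithmetic, here's a quick sketch of the calculation. Using 90% for both sensitivity and specificity is my simplifying assumption for illustration, not a figure from the paper:

```python
# Base-rate sketch: a test with 90% sensitivity and 90% specificity (illustrative
# numbers) applied to a population with a 1% prevalence of the disease.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Fraction of positive test results that are true positives."""
    true_positives = sensitivity * prevalence
    false_positives = (1.0 - specificity) * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(sensitivity=0.90, specificity=0.90, prevalence=0.01)
print(f"Positive results that are correct: {ppv:.1%}")      # about 8%
print(f"Positive results that are wrong:   {1 - ppv:.1%}")  # about 92%
```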

Even after that statistical problem, things are not as simple as the headlines would have you believe. This new work is a multivariate model, because a number of factors were found to affect the levels of these proteins. The age and gender of the patient were two real covariates, as you'd expect, but the duration of plasma storage before testing also had an effect, as did, apparently, the center where the collection was done. That does not sound like a test that's ready to be rolled out to every doctor's office (which is again what the authors have been saying themselves). There were also different groups of proteins that could be used for a prediction model using the set of Mild Cognitive Impairment (MCI) patients, versus the ones that already appeared to show real Alzheimer's signs, which also tells you that this is not a simple turn-the-dial-on-the-disease setup. Interestingly, they also looked at whether adding brain imaging data (such as hippocampus volume) helped the prediction model. This, though, either had no real effect on the prediction accuracy, or even reduced it somewhat.
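
As a purely illustrative sketch - not the authors' actual model - this is roughly what "multivariate" means in practice: the protein panel can't be read off on its own, because covariates like age, gender, storage time, and collection site have to enter the model alongside it. Every file and column name below is invented.

```python
# Illustrative only. The ten-protein panel enters a classifier alongside the
# covariates (age, gender, plasma storage time, collection site), so the "blood
# test" is really a fitted model that needs all of those inputs.

import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("plasma_panel.csv")                        # hypothetical data file

proteins = [f"protein_{i}" for i in range(1, 11)]           # the ten-protein panel
covariates = ["age", "is_male", "storage_months"]
site = pd.get_dummies(df["collection_site"], prefix="site") # center-to-center effects

X = pd.concat([df[proteins + covariates], site], axis=1)
y = df["progressed_to_alzheimers"]

model = LogisticRegression(max_iter=1000).fit(X, y)
# A prediction for a new patient requires every one of these inputs, which is part
# of what separates a research model from a kit any clinic could run tomorrow.
```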

So the thing to do here is to run this on larger patient cohorts to get a more real-world idea of what the false negative and false positive rates are, which is the sort of obvious suggestion that is appearing in about the sixth or seventh paragraph of the popular press writeups. This is just what the authors are planning, naturally - they're not the ones who wrote the newspaper stories, after all. This same collaboration has been working on this problem for years now, I should add, and they've had ample opportunity to see their hopes not quite pan out. Here, for example, is a prediction of an Alzheimer's blood test entering the clinic in "12 to 18 months", from . . .well, 2009.

Update: here's a critique of the statistical approaches used in this paper - are there more problems with it than were first apparent?

Comments (30) + TrackBacks (0) | Category: Alzheimer's Disease | Analytical Chemistry | Biological News

AbbVie and Shire, Quietly

Email This Entry

Posted by Derek

Pfizer's bid for AstraZeneca made headlines for weeks on both sides of the Atlantic. But there's another US drug company trying to buy a British one right now - AbbVie for Shire - and it's going on very quietly indeed.

Over at FierceBiotech, they're wondering why that should be so, after an article in the Telegraph. There are several reasons, some better than others. For one thing, the whole deal is a smaller affair than the Pfizer saga. Most importantly, it would involve fewer UK jobs, because Shire itself doesn't really have all that many employees in the UK (91% of them are elsewhere!). Some years ago, they reworked themselves into an Irish-domiciled company, anyway, for (you guessed it) tax purposes. But there's not much noise in Ireland about this deal, either.

Fewer politicians have an interest in what's going on. If names change on pieces of paper, and it hardly affects anyone in their constituencies, then they have other things to worry about. The financial reasons behind the deal are similar to Pfizer's - relatively generous corporate tax policies - but the principles behind those, and behind deals predicated on them, were never the primary political concern. You might have gotten a different impression from some of the speechmaking that went on during the Pfizer/AZ business, but that's what comes from listening to politicians, rather than watching their actions with the sound off. I recommend that technique; it improves the signal/noise immensely.

Comments (7) + TrackBacks (0) | Category: Business and Markets

July 7, 2014

Catalyst Voodoo, Yielding to Spectroscopy?

Email This Entry

Posted by Derek

Catalysts are absolutely vital to almost every field of chemistry. And catalysis, way too often, is voodoo or a close approximation thereof. A lot of progress has been made over the years, and in some systems we have a fairly good idea of what the important factors are. But even in the comparatively well-worked-out areas one finds surprises and hard-to-explain patterns of reactivity, and when it comes to optimizing turnover, stability, side reactions, and substrate scope, there's really no substitute for good old empirical experimentation most of the time.

The heterogeneous catalysts are especially sorcerous, because the reactions are usually taking place on a poorly characterized particle surface. Nanoscale effects (and even downright quantum mechanical effects) can be important, but these things are not at all easy to get a handle on. Think of the differences between a lump of, say, iron and small particles of the same. The surface area involved (and the surface/volume ratio) is extremely different, just for starters. And when you get down to very small particles (or bits of a rough surface), you find very different behaviors because these things are no longer a bulk material. Each atom becomes important, and can perhaps behave differently.

Now imagine dealing with a heterogeneous catalyst that's not a single pure substance, but is perhaps an alloy of two or more metals, or is some metal complex that itself is adsorbed onto the surface of another finely divided solid, or needs small amounts of some other additive to perform well, etc. It's no mystery why so much time and effort goes into finding good catalysts, because there's plenty of mystery built into them already.

Here's a new short review article in Angewandte Chemie on some of the current attempts to lift some of the veils. A paper earlier this year in Science illustrated a new way of characterizing surfaces with X-ray diffraction, and at short time scales (seconds) for such a technique. Another recent report in Nature Communications describes a new X-ray tomography system to try to characterize catalyst particles.

None of these are easy techniques, and at the moment they require substantial computing power, very close attention to sample preparation, and (in many cases) the brightest X-ray synchrotron sources you can round up. But they're providing information that no one has ever had before about (in these examples) palladium surfaces and nanoparticle characteristics, with more on the way.

Comments (2) + TrackBacks (0) | Category: Analytical Chemistry | Chemical News

That Retracted Stressed Stem Cell Work

Email This Entry

Posted by Derek

This article from David Cyranoski at Nature News is an excellent behind-the-scenes look at all the problems with the "STAP" stem-cell work, now retracted and apparently without any foundation at all. There were indeed problems with all of it from the start, and one of the key questions is whether these things could have been caught:

The committee was more vexed by instances of manipulated and duplicated images in the STAP papers. Obokata had spliced together gel lanes from different experiments to appear as one. And she had used an image of cells in a teratoma — a tumorous growth that includes multiple types of tissue — that had also appeared in her PhD dissertation. The captions indicated that the image was being used to represent different types of cell in each case. The committee judged that in both instances, although she might not have intended to mislead, she should have been “aware of the danger” and therefore found her guilty of misconduct. Obokata claimed that they were mistakes and has denied wrongdoing. . .

. . .Philip Campbell, editor-in-chief of Nature, says: “We have concluded that we and the referees could not have detected the problems that fatally undermined the papers.” But scientists and publishers say that catching even the less egregious mistakes raises alarm bells that, on further investigation, can lead to more serious problems being discovered.

Many say that the tests should be carried out on all papers. Christopher says that it takes about one-third of her working week to check all accepted manuscripts for the four journals published by EMBO Press. At Nature and the Nature research journals, papers are subjected to random spot-checking of images during the production process. Alice Henchley, a spokeswoman for Nature, says that the journal does not check the images in all papers because of limitations in resources, and that the STAP papers were not checked. But she adds that as one outcome of this episode, editors “have decided to increase the number of checks that we undertake on Nature’s papers. The exact number or proportion of papers that will be checked is still being decided.”

A complication is that some of the common image manipulations (splicing gel lanes, for example) are done in honest attempts to present the data more clearly, or just to save space in a figure. My guess is that admitting this up front, along with submitting copies of the original figures to the editors (and for inclusion in the Supplementary Material?) would help to clear that up. The article raises another good point - that editors are actually worried about confronting every example of image manipulation that they see, for fear of raising the competence of the average image manipulator. There's an evolutionary-arms-race aspect to all this that can't be ignored.

In the end, one gets the impression that Nature's editorial staff (a separate organization from the News people) very much regret ever having accepted the work, as well they might. Opinion seems divided about whether they could have caught the problems with the papers themselves - this was one of those cases where a number of reputable co-authors, at reputable institutions, all screwed up simultaneously when it came to cross-checking and verification. What remains is a portrait of how eager people can be to send in groundbreaking results for publication, and how eager editors can be to publish it. Neither of those are going to change any time soon, are they?

Update: from the comments, see also this timeline of events for a look at the whole story.

Comments (13) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

July 4, 2014

Happy Fourth of July, 2014

Email This Entry

Posted by Derek

This, at least, I have observed in forty-five years: that there are men who search for it [truth], whatever it is, wherever it may lie, patiently, honestly, with due humility, and that there are other men who battle endlessly to put it down, even though they don't know what it is. To the first class belong the scientists, the experimenters, the men of curiosity. To the second belong politicians, bishops, professors, mullahs, tin pot messiahs, frauds and exploiters of all sorts - in brief, the men of authority. . .All I find there is a vast enmity to the free functioning of the spirit of man. There may be, for all I know, some truth there, but it is truth made into whips, rolled into bitter pills. . .

I find myself out of sympathy with such men. I shall keep on challenging them until the last galoot's ashore.

- H. L. Mencken, "Off the Grand Banks", 1925

In those days the New York dockers were renowned for their truculence, inefficiency and sheer slowness. Four hours was supposed to be the standard and we got the standard. Nevertheless, the difference from the British equivalent did not strike me as very marked, and by the time we sailed out into the dusk. . .among the wondrous multi-colored lights of the New Jersey Turnpike, at that time utterly unparalleled at home - by then I knew. . .that this was my second country and always would be.

. . .I only ever spent a few nights in (New York City), but made a lot of day and evening trips and saw quite enough of the place to convince me that anyone who makes a business of hating it or being superior to it, and there were plenty then, home-grown and foreign, is a creep, and that anyone who walks up Fifth Avenue (say) on a sunny morning without feeling his spirits lift is an ***hole.

- Kingsley Amis, "Memoirs", 1991

There must be no barriers to freedom of inquiry. There is no place for dogma in science. The scientist is free, and must be free to ask any question, to doubt any assertion, to seek for any evidence, to correct any errors. ... Our political life is also predicated on openness. We know that the only way to avoid error is to detect it and that the only way to detect it is to be free to inquire. And we know that as long as men are free to ask what they must, free to say what they think, free to think what they will, freedom can never be lost, and science can never regress.

- J. Robert Oppenheimer, 1949

Comments (13) + TrackBacks (0) | Category: Blog Housekeeping

July 3, 2014

An Early Day Off

Email This Entry

Posted by Derek

Since the Fourth of July looks to be a rainy washout around here, I'm taking a day off to get a head start on the holiday weekend here. So instead of advancing the cause of science in the lab today, I'm home with a big pork shoulder, which I rubbed down with salt and dry spices last night. It's now cooking over a low charcoal fire with plenty of green hickory wood (from a small shagbark hickory tree I spotted over in the forest near the back yard). That will cook the rest of the day, and be ready for dinner. Here's a prep for barbecued pork ribs from a couple of years ago.

Dessert will be the lime sorbet I mentioned here last year. If you're at all into lemon or lime as a flavor, give that one a try - it's easy, and the results from fresh lime juice are spectacular. Two large-scale, tested preparations, then, for today - I just have to keep an eye on things to make sure that everything is going according to plan.

Comments (12) + TrackBacks (0) | Category: Blog Housekeeping

July 2, 2014

All Natural And Chemical Free

Email This Entry

Posted by Derek

Yesterday's link to the comprehensive list of chemical-free products led to some smiles, but also to some accusations of preaching to the choir, both on my part and on the part of the paper's authors. A manuscript mentioned in the blog section of Nature Chemistry is certainly going to be noticed mostly by chemists, naturally, so I think that everyone responsible knows that this is mainly for some comic relief, rather than any sort of serious attempt to educate the general public. Given the constant barrage of "chemical-free" claims, and what that does to the mood of most chemists who see them, some comedy is welcome once in a while.

But the larger point stands. The commenters here who said, several times, that chemists and the public mean completely different things by the word "chemical" have a point. But let's take a closer look at this for a minute. What this implies (and implies accurately, I'd say) is that for many nonscientists, "chemical" means "something bad or poisonous". And that puts chemists in the position of sounding like they're arguing from the "No True Scotsman" fallacy. We're trying to say that everything is a chemical, and that they range from vital to harmless to poisonous (at some dose) and everything in between. But this can sound like special pleading to someone who's not a scientist, as if we're claiming all the good stuff for our side and disavowing the nasty ones as "Not the kind of chemical we were talking about". (Of course, the lay definition of chemical does this, with the sign flipped: the nasty things are "chemicals", and the non-nasty ones are. . .well, something else. Food, natural stuff, something, but not a chemical, because chemicals are nasty).

So I think it's true that approaches that start off by arguing the definition of "chemical" are doomed. It reminds me of something you see in online political arguments once in a while - someone will say something about anti-Semitism in an Arab country, and likely as not, some other genius will step in with the utterly useless point that it's definitionally impossible, you see, for an Arab to be an anti-Semite, because technically the Arabs are also a Semitic people! Ah-hah! What that's supposed to accomplish has always been a mystery to me, but I fear that attempts to redefine that word "chemical" are in the same category, no matter how teeth-grinding I find that situation to be.

The only thing I've done in this line, when discussing this sort of thing one-on-one, is to go ahead and mention that to a chemist, everything that's made out of atoms is pretty much a "chemical", and that we don't use the word to distinguish between the ones that we like and the ones that we don't. I've used that to bring up the circular nature of some of the arguments on the opposite side: someone's against a chemical ingredient because it's toxic, and they know it's toxic because it's a chemical ingredient. If it were "natural", things would be different.

That's the point to drop in the classic line about cyanide and botulism being all-natural, too. You don't do that just to score some sort of debating point, though, satisfying though that may be - I try not to introduce that one with a flourish of the sword point. No, I think you want to come in with a slightly regretful "Well, here's the problem. . ." approach. The idea, I'd say, is to introduce the concept of there being a continuum of toxicity out there, one that doesn't distinguish between man-made compounds and natural ones.

The next step after that is the fundamental toxicological idea that the dose makes the poison, but I think it's only effective to bring that up after this earlier point has been made. Otherwise, it sounds like special pleading again: "Oh, well, yeah, that's a deadly poison, but a little bit of it probably won't hurt you. Much." My favorite example in this line is selenium. It's simultaneously a vital trace nutrient and a poison, all depending on the dose, and I think a lot of people might improve their thinking on these topics if they tried to integrate that possibility into their views of the world.

Because it's clear that a lot of people don't have room for it right now. The common view is that the world is divided into two categories of stuff: the natural, made by living things, and the unnatural, made by humans (mostly chemists, dang them). You even see this scheme applied to inorganic chemistry: you can find people out there selling makeup and nutritional supplements who charge a premium for things like calcium carbonate when it's a "natural mineral", as opposed (apparently) to that nasty sludge that comes out of the vats down at the chemical plant. (This is also one of the reasons why arguing about the chemist's definition of "organic" is even more of a losing position than arguing about the word "chemical").

There's a religious (or at least quasi-religious) aspect to all this, which makes the arguments emotional and hard to win by appeals to reason. That worldview I describe is a dualist, Manichean one: there are forces of good, and there are forces of evil, and you have to choose sides, don't you? It's sort of assumed that the "natural" world is all of a piece: living creatures are always better off with natural things. They're better; they're what living creatures are meant to consume and be surrounded by. Anything else is ersatz, a defective substitute for the real thing, and quite possibly an outright work of evil by those forces on the other side.

Note that we're heading into some very deep things in many human cultures here, which is another reason that this is never an easy or simple argument to have. That split between natural and unnatural means that there was a time, before all this industrial horror, when people lived in the natural state. They never encountered anything artificial, because there was no such thing in the world. Now, a great number of cultures have a "Golden Age" myth, that distant time when everything was so much better - more pure, somehow, before things became corrupted into their present regrettable state. The Garden of Eden is the aspect this takes in the Christian religion, but you find similar things in many other traditions. (Interestingly, this often takes the form of an ancient age when humans spoke directly with the gods, in whatever form they took, which is one of the things that led Julian Jaynes to his fascinating, although probably unprovable hypotheses in The Origin of Consciousness in the Breakdown of the Bicameral Mind).

This Prelapsarian strain of thinking permeates the all-natural chemical-free worldview. There was a time when food and human health were so much better, and industrial civilization has messed it all up. We're surrounded by man-made toxins and horrible substitutes for real food, and we've lost the true path. It's no wonder that there's all this cancer and diabetes and autism and everything: no one ever used to get those things. Note the followup to this line of thought: someone did this to us. The more hard-core believers in this worldview are actually furious at what they see as the casual, deliberate poisoning of the entire population. The forces of evil, indeed.

And there are enough small reinforcing bars of truth to make all of this hold together quite well. There's no doubt that industrial poisons have sickened vast numbers of people in the past: mercury is just the first one that's come to mind. (I'm tempted to point out that mercury and its salts, by the standards of the cosmetics and supplements industries, are most certainly some of those all-natural minerals, but let that pass for now). We've learned more about waste disposal, occupational exposure, and what can go into food, but there have been horrible incidents that live on vividly in the imagination. And civilization itself didn't necessarily go about increasing health and lifespan for quite a while, as the statistics assembled in Gregory Clark's A Farewell to Alms make clear. In fact, for centuries, living in cities was associated with shorter lifespans and higher mortality. We've turned a lot of corners, but it's been comparatively recently.

And on the topic of "comparatively recently", there's one more factor at work that I'd like to bring up. The "chemical free" view of the world has the virtue of simplicity (and indeed, sees simplicity as a virtue itself). Want to stay healthy? Simple. Don't eat things with chemicals in them. Want to know if something is the right thing to eat, drink, wear, etc.? Simple: is it natural or not? This is another thing that makes some people who argue for this view so vehement - it's not hard, it's right in front of you, and why can't you see the right way of living when it's so, so. . .simple? Arguing against that, from a scientific point of view, puts a person at several disadvantages. You necessarily have to come in with all these complications and qualifying statements, trying to show how things are actually different than they look. That sounds like more special pleading, for one thing, and it's especially ineffective against a way of thinking that often leans toward thinking that the more direct, simple, and obvious something is, the more likely it is to be correct.

That's actually the default way of human thinking, when you get down to it, which is the problem. Science, and the scientific worldview, are unnatural things, and I don't mean that just in the whole-grain no-additives sense of "natural". I mean that they do not come to most people as a normal consequence of their experience and habits of thought. A bit of it does: "Hey, every time I do X, Y seems to happen". But where that line of thinking takes you starts to feel very odd very quickly. You start finding out that the physical world is a lot more complicated than it looks, that "after" does not necessarily mean "because", and that all rules of thumb break down eventually (and usually without warning). You find that math, of all things, seems to be the language that the universe is written in (or at least a very good approximation to it), and that's not exactly an obvious concept, either. You find that many of the most important things in that physical world are invisible to our senses, and not necessarily in a reassuring way, or in a way that even makes much sense at all at first. (Magical explanations of invisible forces at least follow human intuitions). It's no wonder that scientific thinking took such a long, long time to ever catch on in human history. I still sometimes think that it's only tolerated because it brings results.

So there are plenty of reasons why it's hard to effectively argue against the all-natural chemical-free worldview. You're asking your audience to accept a number of things that don't make much sense to them, and what's worse, many of these things look like rhetorical tricks at best and active (even actively evil) attempts to mislead them at worst. And all in the service of something that many of them are predisposed to regard as suspicious even from the start. It's uphill all the way.

Comments (52) + TrackBacks (0) | Category: General Scientific News | Snake Oil | Toxicology

July 1, 2014

Scientific Journals: Who Pays What?

Email This Entry

Posted by Derek

If you've ever wondered about those deals where the large scientific publishers offer bundled discounts to libraries, wonder no more. There's a paper in PNAS whose authors used Freedom of Information Act requests to track down what various university libraries really paid for these deals, and it reveals that everyone paid something different.

Here's a comment in Nature on the study, which they can do with a straight face, since the Nature Publishing Group wasn't included in the study (although the authors seem to think, in retrospect, that they should have done so). These deals are always secret - the publishers make it a requirement not to disclose the terms. And that, as you might easily expect, benefits the publishers, since the library systems don't have a good way of finding out what the market price might be. The PNAS study reveals some odd discrepancies, with some universities getting noticeably better (and worse) deals than others. Wisconsin and Texas bargained hard, it appears, while BYU and Georgia could have done better for themselves.

As the article details, publishers used site licenses to take care of arbitrage opportunities, and the "Big Deal" bundles were used as incentives for the library systems and as tools for the publishers to figure out how much each customer might be willing to pay (using the print-based subscription data as a starting point). As you might have guessed, Elsevier comes out at the top of the pricing list when you just look at the dollar figures. On a cost-per-citation basis, though, they don't look so bad - in fact, they're the most cost-effective of the big publishers by that metric. (Sage and Taylor & Francis both look pretty bad in that table). For reference, the ACS bundle looks pretty decent, and it turns out that nearly 60% of the libraries that deal with the ACS choose the whole package (a high percentage compared to many other publishers). Interestingly, it turns out that some very wealthy schools (Harvard, MIT, Caltech) still don't take the full Elsevier bundle.

And the bundles are, naturally, a mixed bag. It's their whole purpose to be a mixed bag:

It would cost about $3.1 million at 2009 à la carte prices to buy all of the journals in Elsevier’s bundle, the “Freedom Collection.” The average research 1 university paid roughly $1.2 million, or 40% of the summed title-by-title prices, for access to the Freedom Collection. However, this bundle price is by no means equivalent to a 60% discount from journal-by-journal pricing. The Freedom Collection includes about 2,200 journals, many of which are expensive but rarely cited. The least cost-effective 1,100 journals contained in this collection supply fewer than 5% of the citations, but their prices add to more than 25% of the total of à la carte prices. A library that spent $1.2 million on Elsevier journals at listed catalog prices, selecting journals for cost-effectiveness, could obtain access to journals providing 79% of the citations to journals found in the Freedom Collection. Thus, for the average research 1 institution, the citation-scaled discount obtained from the Freedom Collection is about 21%.

Elsevier, though, drops its prices for smaller universities more quickly than many other publishers, and for Master's-level schools it's actually a better deal than many of the nonprofit publishers. We wouldn't know this, though, if these authors hadn't dug up all the info from FOIA requests, and I guess that's the take-home here: scientific publishing is a very opaque, inefficient market. And the publishers like it that way.
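
Here's a rough sketch of that "citation-scaled discount" arithmetic, with made-up journal numbers. The idea is to ask what share of the bundle's citations the same money would have bought if it were spent à la carte on the most cost-effective titles first:

```python
# Toy version of the citation-scaled discount. All journal prices and citation
# counts below are invented; the pattern (a few heavily cited titles plus a long
# tail of pricey, rarely cited ones) is the one the PNAS authors describe.

def citation_scaled_discount(journals, bundle_price):
    """journals: list of (price, citations) pairs for every title in the bundle."""
    total_citations = sum(c for _, c in journals)
    spent = bought = 0.0
    # Buy in order of citations per dollar until the bundle price is used up.
    for price, citations in sorted(journals, key=lambda j: j[1] / j[0], reverse=True):
        if spent + price > bundle_price:
            break
        spent += price
        bought += citations
    # The bundle delivers all of the citations for bundle_price, so the scaled
    # discount is whatever fraction the a la carte shopper would have missed.
    return 1.0 - bought / total_citations

toy = [(2000, 5000), (2500, 4000), (1500, 3000)] + [(2500, 30)] * 40
list_total = sum(p for p, _ in toy)

print(f"Naive discount vs. list price: {1 - 20000 / list_total:.0%}")    # looks huge
print(f"Citation-scaled discount:      {citation_scaled_discount(toy, 20000):.0%}")
```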

Comments (6) + TrackBacks (0) | Category: The Scientific Literature

Corrosion Using Selectfluor?

Email This Entry

Posted by Derek

Here's a question for those of you who've used Selectfluor (Air Products trademark), the well-known fluorinating reagent. I've had an email from someone at Sigma-Aldrich, wondering if people have noticed corrosion problems with either glass or stainless steel when using or storing the reagent. I've hardly used it myself, so I don't have much to offer, but I figured that there was a lot of chemical experience out in the blog's readership, and someone may have something to add.

Comments (11) + TrackBacks (0) | Category: Chemical News