Corante

About this Author
College chemistry, 1983

Derek Lowe The 2002 Model

After 10 years of blogging. . .

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly: derekb.lowe@gmail.com Twitter: Dereklowe

Chemistry and Drug Data: Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

Monthly Archives

February 28, 2013

IBM's Watson Does Drug Discovery?

Posted by Derek

I saw this story this morning, about IBM looking for more markets for its Watson information-sifting system (the one that performed so publicly on "Jeopardy!"). And this caught my eye for sure:

John Baldoni, senior vice president for technology and science at GlaxoSmithKline, got in touch with I.B.M. shortly after watching Watson’s “Jeopardy” triumph. He was struck that Watson frequently had the right answer, he said, “but what really impressed me was that it so quickly sifted out so many wrong answers.”

That is a huge challenge in drug discovery, which amounts to making a high-stakes bet, over years of testing, on the success of a chemical compound. The failure rate is high. Improving the odds, Mr. Baldoni said, could have a huge payoff economically and medically.

Glaxo and I.B.M. researchers put Watson through a test run. They fed it all the literature on malaria, known anti-malarial drugs and other chemical compounds. Watson correctly identified known anti-malarial drugs, and suggested 15 other compounds as potential drugs to combat malaria. The two companies are now discussing other projects.

“It doesn’t just answer questions, it encourages you to think more widely,” said Catherine E. Peishoff, vice president for computational and structural chemistry at Glaxo. “It essentially says, ‘Look over here, think about this.’ That’s one of the exciting things about this technology.”

Now, without seeing some structures and naming some names, it's completely impossible to say how valuable the Watson suggestions were. But I would very much like to know on what basis these other compounds were suggested: structural similarity? Mechanisms in common? Mechanisms that are in the same pathway, but hadn't been specifically looked at for malaria? Something else entirely? Unfortunately, we're probably not going to be able to find out, unless GSK is forthcoming with more details.
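
Just to make the first of those possibilities concrete: a plain structural-similarity ranking is something any cheminformatics group could run in an afternoon. Here's a minimal sketch using RDKit - the candidate structures and the whole setup are invented for the example, and there's no suggestion that this is what Watson actually does under the hood.

```python
# A minimal sketch of a structural-similarity screen (one possible "basis" for
# suggesting new antimalarials). Emphatically NOT how Watson works - IBM hasn't said -
# just the garden-variety fingerprint comparison a cheminformatics group would run.
# Requires RDKit; the candidate SMILES are invented for the example.
from rdkit import Chem
from rdkit.Chem import AllChem
from rdkit import DataStructs

known_antimalarials = {
    "chloroquine": "CCN(CC)CCCC(C)Nc1ccnc2cc(Cl)ccc12",
    "amodiaquine": "CCN(CC)Cc1cc(Nc2ccnc3cc(Cl)ccc23)ccc1O",
}

candidates = {
    "candidate_A": "CCN(CC)CCCC(C)Nc1ccnc2cc(F)ccc12",  # a close chloroquine analog
    "candidate_B": "CC(=O)Oc1ccccc1C(=O)O",              # aspirin, as a negative control
}

def fingerprint(smiles):
    """Morgan (ECFP4-like) bit-vector fingerprint for a SMILES string."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

reference_fps = {name: fingerprint(smi) for name, smi in known_antimalarials.items()}

for name, smi in candidates.items():
    fp = fingerprint(smi)
    # best Tanimoto similarity to any known antimalarial
    best = max(DataStructs.TanimotoSimilarity(fp, ref) for ref in reference_fps.values())
    print(f"{name}: best Tanimoto similarity to a known antimalarial = {best:.2f}")
```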

Eventually, there's going to be another, somewhat more disturbing answer to that "what basis?" question. As this Slate article says, we could well get to the point where such systems make discoveries or correlations that are correct, but beyond our ability to figure out. Watson is most certainly not there yet. I don't think anything is, or is really all that close. But that doesn't mean it won't happen.

For a look at what this might be like, see Ted Chiang's story "Catching Crumbs From the Table", which appeared first in Nature and then, retitled "The Evolution of Human Science", in his collection Stories of Your Life and Others, which I highly recommend.

Comments (32) + TrackBacks (0) | Category: In Silico | Infectious Diseases

The Industrial Diels Alder, Revisited in Detail

Posted by Derek

Last year I mentioned the "good ol' Diels-Alder reaction", and talked about how it doesn't get used as much in drug discovery and industrial chemistry as one might think.

Now Stefan Abele from Actelion (in Switzerland) sends along this new paper, which will tell you pretty much all you need to know about the reaction's industrial side. The scarcity of D-A chemistry on scale that I'd noticed was no illusion (links below added by me):

According to a survey by Dugger et al. in 2005 of the type of reaction scaled in a research facility at Pfizer, and an analysis of the reactions used for the preparation of drug candidate molecules by Carey et al. in 2006, the DA reaction falls into the “miscellaneous” category that accounts for only 5 to 11 % of C-C bond-forming reactions performed under Good Manufacturing Practice. This observation mirrors the finding that C-C bond-forming reactions account for 11.5% of the entire reaction repertoire used by medicinal chemists in the pursuit of drug candidates. In this group, palladium-catalyzed reactions represent about 60% of the occurrences, while the “other” category, into which the DA reaction falls, represents only 1.8% of the total number of reactions. Careful examination of the top 200 pharmaceutical products by US retail sales in 2010 revealed that only one marketed drug, namely Buprenorphine, is produced industrially by using the DA reaction. Two other drugs were identified in the top 200 generic drugs of US retail sales in 2008: Calcitriol and its precursor Calciferol. Since 2002, Liu and co-workers have been compiling the new drugs introduced each year to the market. From 2002 to 2010, 174 new chemical entities were reported. Among them, two examples (Varenicline from Pfizer in 2006 and Peramivir by Shionogi in 2010) have been explicitly manufactured through a DA reaction. Similarly, and not surprisingly, our consultation with a large corpus of peers, colleagues, and experts in industry and academia worldwide revealed that the knowledge of such examples of the DA reaction run on a large scale is scarce, except perhaps in the field of fragrance chemistry.

But pretty much every reaction that has been run on large scale is in this review, so if you're leaning that way, this is the place to go. It doesn't shy away from the potential problems (chief among them being potential polymerization of one or both of the starting materials, which would really ruin your afternoon). But it's a powerful enough reaction that it really would seem to have more use than it gets.

Comments (11) + TrackBacks (0) | Category: Chemical News

February 27, 2013

A Nobel Follow-Up

Posted by Derek

Those of you who remember the Green Fluorescent Protein Nobel story will likely recall Douglas Prasher. He did crucial early work on GFP, cloning its gene, and Roger Tsien has said that he has no idea why Prasher didn't share in the Nobel. But Prasher, after a series of career and personal reverses, ended up driving a shuttle bus in Huntsville by the time the prize was announced.

Well, he's back in science again - and working in the Tsien lab. Here's the story, which I was very glad to read. Prasher's clearly smart and talented, and I hope that he can put all that to good use. A happy ending?

Comments (15) + TrackBacks (0) | Category: General Scientific News

Not What It Says On the Label, Though

Posted by Derek

The topic of compound purity has come up here before, as well it should. Every experienced medicinal chemist knows that when you have an interesting new hit compound, one of the first things to do is go back and make sure that it really is what it says on the label. Re-order it from the archive (in both powder and DMSO stock), or re-order it from the vendor if it came from a commercial source, and run it through the LC/MS and the NMR. (And as one of those links above says, if you have any thought that metal reagents were used to make the compound, check for those, too - they can be transparent to LC and NMR).

So when you do this, how many compounds flunk? Here are some interesting statistics from the folks at Emerald:

Recently, we selected a random set of commercial fragment compounds for analysis, and closely examined those that failed to better understand the reasons behind it. The most common reason for QC failure was insolubility (47%), followed by degradation or impurities (39%), and then spectral mismatch (17%) [Note: Compounds can acquire multiple QC designations, hence total incidences > 100% ]. Less than 4% of all compounds assayed failed due to solvent peak overlap or lack of non-exchangeable protons, both requirements for NMR screening. Failure rates were as high as 33% per individual vendor, with an overall average of 16%. . .

I very much wish that they'd identified that 33% failure rate vendor. But overall, they're suggesting that 10 to 15% of compounds will wipe out, regardless of source. Now, you may not feel that solubility is a key criterion for your work, because you're not doing NMR assays. (That's one that will only get worse as you move out of fragment-sized space, too). But that "degradation or impurities" category is still pretty significant. What are your estimates for commercial-crap-in-a-vial rates?
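
One arithmetic note on the numbers quoted above, since they can look odd at first glance: the per-category percentages are counted over the failed compounds, and a single bad compound can pick up more than one flag, so the categories legitimately add up to more than 100%. A toy sketch, with made-up numbers:

```python
# Toy illustration (made-up data) of why per-category QC "incidences" can sum past
# 100%: a single failed compound can carry several flags at once.
failed_compounds = {
    # compound id -> set of QC flags it picked up
    "cmpd_001": {"insoluble"},
    "cmpd_002": {"insoluble", "degraded_or_impure"},
    "cmpd_003": {"degraded_or_impure", "spectral_mismatch"},
    "cmpd_004": {"insoluble", "spectral_mismatch"},
    "cmpd_005": {"degraded_or_impure"},
}
n_failed = len(failed_compounds)

# Per-category incidence is counted over the *failed* compounds, so overlaps inflate the total.
for category in ("insoluble", "degraded_or_impure", "spectral_mismatch"):
    hits = sum(category in flags for flags in failed_compounds.values())
    print(f"{category}: {100 * hits / n_failed:.0f}% of failures")

total_library = 40  # pretend we QC'd 40 purchased fragments
print(f"overall failure rate: {100 * n_failed / total_library:.0f}%")
```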

Comments (11) + TrackBacks (0) | Category: Chemical News | Drug Assays

Selective Inhibitor, The Catalog Says

Posted by Derek

There's an interesting addendum to yesterday's post about natural product fragments. [Structure shown: bAP15] Dan Erlanson was pointing out that many of the proposed fragments were PAINS, and that prompted Jonathan Baell (author of the original PAINS paper) to leave a comment there mentioning this compound. Yep, you can buy that beast from Millipore, and it's being sold as a selective inhibitor of two particular enzymes. (Here's the original paper describing it). If it's really that selective, I will join one of those Greek monasteries where they eat raw onions and dry bread, and spend my time in atonement for ever thinking that a double nitrophenyl Schiff base enone with an acrylamide on it might be trouble.

Honestly, guys. Do a Ben Cravatt-style experiment across a proteome with that thing, and see what you get. I'm not saying that it's going to absolutely label everything it comes across, but it's surely going to stick to more than two things, and have more effects than you can ascribe to those "selective" actions.

Comments (20) + TrackBacks (0) | Category: Chemical Biology | Drug Assays

February 26, 2013

Standard of Care? Not So Fast, Not in the United Kingdom

Posted by Derek

Did you know that in the UK, patent law says that using a competitor's compound as a comparison in a clinical trial is an infringement? I sure didn't. The government has realized that this rule is much stricter than most other countries, and is moving to change it in a bid to keep more clinical research in the country. Thanks to FierceBiotech for the heads-up on this.

Comments (11) + TrackBacks (0) | Category: Patents and IP

Natural Product Fragments: Get Rid of the Ugly Ones Now

Posted by Derek

Here's a paper at the intersection of two useful areas: natural products and fragments. Dan Erlanson over at Practical Fragments has a good, detailed look at the work. What the authors have done is try to break down known natural product structures into fragment-sized pieces, and cluster those together for guidance in assembling new screening libraries.

I'm sympathetic to that goal. I like fragment-based techniques, and I think that too many fragment libraries tend to be top-heavy with aromatic and heteroaromatic groups. Something with more polarity, more hydrogen-bonding character, and more three-dimensional structures would be useful, and natural products certainly fit that space. (Some of you may be familiar with a similar approach, the deCODE/Emerald "Fragments of Life", which Dan blogged about here). Synthetically, these fragments turn out to be a mixed bag, which is either a bug or a feature depending on your point of view (and what you have funding for or a mandate to pursue):

The natural-product-derived fragments are often far less complex structurally than the guiding natural products themselves. However, their synthesis will often still require considerable synthetic effort, and for widespread access to the full set of natural-product-derived fragments, the development of novel, efficient synthesis methodologies is required. However, the syntheses of natural-product-derived fragments will by no means have to meet the level of difficulty encountered in the multi-step synthesis of genuine natural products.

But take a look at Dan's post for the real downside:

Looking at the structures of some of the phosphatase inhibitors, however, I started to worry. One strong point of the paper is that it is very complete: the chemical structures of all 193 tested fragments are provided in the supplementary information. Unfortunately, the list contains some truly dreadful members; 17 of the worst are shown here, with the nasty bits shown in red. All of these are PAINS that will nonspecifically interfere with many different assays.

Boy, is he right about that, as you'll see when you take a look at the structures. They remind me of this beast, blogged about here back last fall. These structures should not be allowed into a fragment screening library; there are a lot of other things one could use instead, and their chances of leading only to heartbreak are just too high.

Comments (9) + TrackBacks (0) | Category: Chemical News | Drug Assays | Natural Products

Phil Baran at Blog Syn

Posted by Derek

I linked recently to the latest reaction check at Blog Syn, benzylic oxidation by IBX. Now Prof. Baran (a co-author on the original paper, from his Nicolaou days) has written to See Arr Oh with a detailed repeat of the experiment. He gets it to work, so I think it's fair to say that (1) the reaction is doable, but (2) it's not as easy to reproduce right out of the box as it might be.

I'd like to congratulate him for responding like this. The whole idea of publicly rechecking literature reactions is still fairly new, and (as the comments here have shown), there's a wide range of opinion on it. Getting a detailed, prompt, and civil response from the Baran lab is the best outcome, I think. After all, the point of a published procedure - the point of science - is reproducibility. The IBX reaction is now better known than it was, the details that could make it hard to run are now there for people who want to try it, and Prof. Baran's already high reputation as a scientist actually goes up a bit among the people who've been following this story.

Public reproducibility is an idea whose time, I think, has come, and Blog Syn is only one part of it. When you think about the increasingly well-known problems with reproducing big new biological discoveries, things that could lead to tens and hundreds of millions being spent on clinical research, reproducing organic chemistry reactions shouldn't be controversial at all. As they say to novelists, if you're afraid of bad reviews, there's only one solution: don't show anyone your book.

Comments (59) + TrackBacks (0) | Category: Chemical News | The Scientific Literature

February 25, 2013

An Interview

Posted by Derek

For those of you with subscriptions to Trends in Pharmacological Sciences, they have an interview with me up at the journal's site. It's in the "Scientific Life" section, and I was very happy to be asked to do it.

Comments (8) + TrackBacks (0) | Category: Blog Housekeeping

ENCODE: The Nastiest Dissent I've Seen in Quite Some Time

Posted by Derek

Last fall we had the landslide of data from the ENCODE project, along with a similar landslide of headlines proclaiming that 80% of the human genome was functional. That link shows that many people (myself included) were skeptical of this conclusion at the time, and since then others have weighed in with their own doubts.

A new paper, from Dan Graur at Houston (and co-authors from Houston and Johns Hopkins) is really stirring things up. And whether you agree with its authors or not, it's well worth reading - you just don't see thunderous dissents like this one in the scientific literature very often. Here, try this out:

Thus, according to the ENCODE Consortium, a biological function can be maintained indefinitely without selection, which implies that (at least 70%) of the genome is perfectly invulnerable to deleterious mutations, either because no mutation can ever occur in these “functional” regions, or because no mutation in these regions can ever be deleterious. This absurd conclusion was reached through various means, chiefly (1) by employing the seldom used “causal role” definition of biological function and then applying it inconsistently to different biochemical properties, (2) by committing a logical fallacy known as “affirming the consequent,” (3) by failing to appreciate the crucial difference between “junk DNA” and “garbage DNA,” (4) by using analytical methods that yield biased errors and inflate estimates of functionality, (5) by favoring statistical sensitivity over specificity, and (6) by emphasizing statistical significance rather than the magnitude of the effect.

Other than that, things are fine. The paper goes on to detail objections in each of those categories, and the tone does not moderate. One of the biggest objections is around the use of the word "function". The authors are at pains to distinguish selected effect functions from causal role functions, and claim that one of the biggest shortcomings of the ENCODE claims is that they blur this boundary. "Selected effects" are what most of us think about as well-proven functions: a TATAAA sequence in the genome binds a transcription factor, with effects on the gene(s) downstream of it. If there is a mutation in this sequence, there will almost certainly be functional consequences (and these will almost certainly be bad). Imagine, however, a random sequence of nucleotides that's close enough to TATAAA to bind a transcription factor. In this case, there are no functional consequences - genes aren't transcribed differently, and nothing really happens other than the transcription factor parking there once in a while. That's a "causal role" function, and the whopping majority of the ENCODE functions appear to be in this class. "It looks sort of like something that has a function, therefore it has one". And while this can lead to discoveries, you have to be careful:

The causal role concept of function can lead to bizarre outcomes in the biological sciences. For example, while the selected effect function of the heart can be stated unambiguously to be the pumping of blood, the heart may be assigned many additional causal role functions, such as adding 300 grams to body weight, producing sounds, and preventing the pericardium from deflating onto itself. As a result, most biologists use the selected effect concept of function. . .

A mutation in that random TATAAA-like sequence would be expected to be silent compared to what would happen in a real binding motif. So one would want to know what percent of the genome is under selection pressure - that is, what part of it can't be mutated without something happening. Those studies are where we get the figures of perhaps 10% of the DNA sequence being functional. Almost all of what ENCODE has declared to be functional, though, can show mutations with relative impunity:

From an evolutionary viewpoint, a function can be assigned to a DNA sequence if and only if it is possible to destroy it. All functional entities in the universe can be rendered nonfunctional by the ravages of time, entropy, mutation, and what have you. Unless a genomic functionality is actively protected by selection, it will accumulate deleterious mutations and will cease to be functional. The absurd alternative, which unfortunately was adopted by ENCODE, is to assume that no deleterious mutations can ever occur in the regions they have deemed to be functional. Such an assumption is akin to claiming that a television set left on and unattended will still be in working condition after a million years because no natural events, such as rust, erosion, static electricity, and earthquakes can affect it. The convoluted rationale for the decision to discard evolutionary conservation and constraint as the arbiters of functionality put forward by a lead ENCODE author (Stamatoyannopoulos 2012) is groundless and self-serving.

Basically, if you can't destroy a function by mutation, then there is no function to destroy. Even the most liberal definitions take this principle to apply to about 15% of the genome at most, so the 80%-or-more figure really does stand out. But this paper has more than philosophical objections to the ENCODE work. They point out that the consortium used tumor cell lines for its work, and that these are notoriously permissive in their transcription. One of the principles behind the 80% figure is that "if it gets transcribed, it must have a function", but you can't say that about HeLa cells and the like, which read off all sorts of pseudogenes and such (introns, mobile DNA elements, etc.)

One of the other criteria the ENCODE studies used for assigning function was histone modification. Now, this bears on a lot of hot topics in drug discovery these days, because an awful lot of time and effort is going into such epigenetic mechanisms. But (as this paper notes), this recent study illustrated that all histone modifications are not equal - there may, in fact, be a large number of silent ones. Another ENCODE criterion had to do with open (accessible) regions of chromatin, but there's a potential problem here, too:

They also found that more than 80% of the transcription start sites were contained within open chromatin regions. In yet another breathtaking example of affirming the consequent, ENCODE makes the reverse claim, and adds all open chromatin regions to the “functional” pile, turning the mostly true statement “most transcription start sites are found within open chromatin regions” into the entirely false statement “most open chromatin regions are functional transcription start sites.”
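
That reversal is the old base-rate problem in conditional probability: P(open chromatin | transcription start site) can be high while P(transcription start site | open chromatin) is tiny, simply because open chromatin covers vastly more sequence than transcription start sites do. A back-of-the-envelope sketch, with invented numbers chosen only to show the shape of the fallacy:

```python
# Back-of-the-envelope illustration (numbers invented) of the reversal Graur et al.
# object to: P(open chromatin | TSS) can be high while P(TSS | open chromatin) is tiny.
genome_bp         = 3_000_000_000   # rough human genome size
open_chromatin_bp = 450_000_000     # pretend 15% of the genome is "open"
tss_bp            = 3_000_000       # pretend TSS-associated sequence totals ~0.1% of the genome
tss_in_open_bp    = 2_400_000       # pretend 80% of that TSS sequence falls in open regions

p_open_given_tss = tss_in_open_bp / tss_bp             # "most TSSs are in open chromatin"
p_tss_given_open = tss_in_open_bp / open_chromatin_bp  # fraction of open chromatin that is a TSS

print(f"P(open | TSS)  = {p_open_given_tss:.0%}")   # ~80%
print(f"P(TSS  | open) = {p_tss_given_open:.2%}")   # well under 1%
```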

Similar arguments apply to the 8.5% of the genome that ENCODE assigns to transcription factor binding sites. When you actually try to experimentally verify function for such things, the huge majority of them fall out. (It's also noted that there are some oddities in ENCODE's definitions here - for example, they seem to be annotating 500-base stretches as transcription factor binding sites, when most of the verified ones are below 15 bases in length).

Now, it's true that the ENCODE studies did try to address the idea of selection on all these functional sequences. But this new paper has a lot of very caustic things to say about the way this was done, and I'll refer you to it for the full picture. To give you some idea, though:

By choosing primate specific regions only, ENCODE effectively removed everything that is of interest functionally (e.g., protein coding and RNA-specifying genes as well as evolutionarily conserved regulatory regions). What was left consisted among others of dead transposable and retrotransposable elements. . .

. . .Because polymorphic sites were defined by using all three human samples, the removal of two samples had the unfortunate effect of turning some polymorphic sites into monomorphic ones. As a consequence, the ENCODE data includes 2,136 alleles each with a frequency of exactly 0. In a miraculous feat of “next generation” science, the ENCODE authors were able to determine the frequencies of nonexistent derived alleles.

That last part brings up one of the objections that many people may have to this paper - it does take on a rather bitter tone. I actually don't mind it - who am I to object, given some of the things I've said on this blog? But it could be counterproductive, leading to arguments over the insults rather than arguments over the things being insulted (and over whether they're worthy of the scorn). People could end up waving their hands and running around shouting in all the smoke, rather than figuring out how much fire there is and where it's burning. The last paragraph of the paper is a good illustration:

The ENCODE results were predicted by one of its authors to necessitate the rewriting of textbooks. We agree, many textbooks dealing with marketing, mass-media hype, and public relations may well have to be rewritten.

Well, maybe that was necessary. The amount of media hype was huge, and the only way to counter it might be to try to generate a similar amount of noise. It might be working, or starting to work - normally, a paper like this would get no popular press coverage at all. But will it make CNN? The Science section of the New York Times? ENCODE's results certainly did.

But what the general public thinks about this controversy is secondary. The real fight is going to be here in the sciences, and some of it is going to spill out of academia and into the drug industry. As mentioned above, a lot of companies are looking at epigenetic targets, and a lot of companies would (in general) very much like to hear that there are a lot more potential drug targets than we know about. That was what drove the genomics frenzy back in 1999-2000, an era that was not without its consequences. The coming of the ENCODE data was (for some people) the long-delayed vindication of the idea that gene sequencing was going to lead to a vast landscape of new disease targets. There was already a comment on my entry at the time suggesting that some industrial researchers were jumping on the ENCODE work as a new area to work in, and it wouldn't surprise me to see many others thinking similarly.

But we're going to have to be careful. Transcription factors and epigenetic mechanisms are hard enough to work on, even when they're carefully validated. Chasing after ephemeral ones would truly be a waste of time. . .

More reactions around the science blogging world: Wavefunction, Pharyngula, SciLogs, Openhelix. And there are (and will be) many more.

Comments (24) + TrackBacks (0) | Category: Biological News

February 22, 2013

What If the Journal Disappears?

Posted by Derek

Hmm, here's a question I hadn't considered. What happens when an online-only journal quits publishing and (apparently) deletes its archives? That's what seems to have happened with the "Journal of Advances in Developmental Research".

Now, to a first approximation, the loss of many of the papers in this journal will not, in all likelihood, be much of a setback. Here is (was?) its stated focus:

The Journal of Advances in Developmental Research is a peer-reviewed multidisciplinary journal that publishes research articles, general articles, research communications, review article and abstracts of theses from the fields of science, social sciences, sports science, humanities, medical, education, engineering, technology, biotechnology, home science, computer, history, arts and other fields which participates in overall development of society.

It provides a platform to discuss current and future trends of research and their role in development of society.

Now, that doesn't sound like anything anyone would want to read. But as long as your check cleared, you could publish in it - it was one of those bottom-of-the-barrel predatory publishing venues. What happens now, though? If there was something worthwhile in any of those papers, we'll never have any way of knowing, because they're all gone. Can (or should) the authors resubmit the papers somewhere else where they can be seen?

Here, for reference, are Jeffrey Beall's current criteria for a predatory publisher. One of them is that they "(Have) no policies or practices for digital preservation". Although these guys seem to have had a policy, if you count "wipe the hard drive" as a policy.

Tip via Ivan Oransky and Jeffrey Beall on Twitter.

Comments (14) + TrackBacks (0) | Category: The Scientific Literature

Nativis Returns

Posted by Derek

Well, since it's Friday, I thought I'd quickly revisit one of the favorite companies I've written about here: Nativis. You'll recall that this is the outfit that claimed "photonic signatures" of drugs were as effective as the physical molecules themselves. My comments (and those of the readership here) led to some public exchanges with the company's chief financial officer, but last I heard of them they had moved out of San Diego and back to Seattle. Readers mentioned that the company was developing some sort of cancer-treatment device based on their ideas.

A couple of alert readers have now sent along links to the latest news. Nativis has produced a device they're calling the "Voyager", which is being tested in veterinary applications. Here is a YouTube video from a clinic that's trying it out. I have no reason to think that the doctor being interviewed is anything but sincere, but I also tend to think that he may not realize just what the opinion of many observers is about the Nativis technology. The veterinarian says things in the clip about how "the healing energy is then emitted to the tumor from this coil" and "The radiofrequency signal is stored on this device and then played, if you will, through this coil, to the tumor itself".

He does not appear to be misrepresenting Nativis' claims. I believe that this is the relevant patent application. The first claim reads:

"1. An aqueous anti-tumor composition produced by treating an aqueous medium free of paclitaxel, a paclitaxel analog, or other cancer-cell inhibitory compound with a low-frequency, time-domain signal derived from paclitaxel or an analog thereof, until the aqueous medium acquires a detectable paclitaxel activity, as evidenced by the ability of the composition (i) to inhibit growth of human glioblastoma cells when the composition is added to the cells in culture, over a 24 hour culture period, under standard culture conditions, and/or (ii), to inhibit growth of a paclitaxel-responsive tumor when administered to a subject having such a tumor."

So yes, we're apparently still talking about turning a sample of water into a drug by playing some sort of radio frequency into it. And no, I still have no idea how this is physically possible, and to the extent that I understand the company's explanations, I do not find them convincing. Here's some more language out of the patent application:

[0151] In one exemplary method, paclitaxel time-domain signals were obtained by recording low-frequency signals from a sample of paclitaxel suspended in CremophorEL™ 529 ml and anhydrous ethanol 69.74 ml to a final concentration of 8 mg/ml. The signals were recorded with injected DC offset, at noise level settings between 10 and 241 mV and in increments of 1 mV. A total of 241 time-domain signals over this injected-noise level range were obtained, and these were analyzed by an enhanced autocorrelation algorithm detailed above, yielding 8 time-domain paclitaxel-derived signals for further in vitro testing. One of these, designated signal M2(3), was selected as an exemplary paclitaxel signal effective in producing taxol-specific effects in biological response systems (described below), and when used for producing paclitaxel-specific aqueous compositions in accordance with the invention, also as described below.

[0152] Figs. 9A-9C show frequency-domain spectra of two paclitaxel signals with noise removed by Fourier subtraction (Figs. 9A and 9B), and a cross-correlation of the two signals (Fig. 9C), showing agent-specific spectral features over a portion of the frequency spectrum from 3510 to 3650 Hz. As can be seen from Fig. 9C, when a noise threshold corresponding to an ordinate value of about 3 is imposed, the paclitaxel signal in this region is characterized by 7 peaks. The spectra shown in Figs. 9A-9C, but expanded to show spectral features over the entire region between 0-20kHz, illustrate how optimal time-domain signals can be selected, by examining the frequency spectrum of the signal for unique, agent-specific peaks, and selecting a time-domain signal that contains a number of such peaks.

[0153] The time-domain signals recorded, processed, and selected as above may be stored on a compact disc or any other suitable storage media for analog or digital signals and supplied to the transduction system during a signal transduction operation. The signal carried on the compact disc is representative, more generally, of a tangible data storage medium having stored thereon, a low-frequency time domain signal effective to produce a magnetic field capable of transducing a chemical or biological system, or in producing an agent-specific aqueous composition in accordance with the invention, when the signal is supplied to electromagnetic transduction coil(s) at a signal current calculated to produce a magnetic field strength in the range between 1 G and 10^-8 G. Although the specific signal tested was derived from a paclitaxel sample, it will be appreciated that any taxane-like compound should generate a signal having the same mechanism of action in transduced form.
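
For readers trying to parse the jargon in those paragraphs: the individual signal-processing steps named there (Fourier-domain noise subtraction, comparing the spectra of two recordings) are perfectly ordinary operations on their own. Here's a minimal sketch on synthetic data, just to show what those steps look like - the patent's "enhanced autocorrelation algorithm" isn't specified, so a crude point-by-point comparison of the two denoised spectra stands in for its cross-correlation figure, and none of this bears on whether such a recording can do anything at all to a vial of water:

```python
# Synthetic stand-ins: two noisy "recordings" sharing a made-up peak near 3550 Hz,
# plus a noise-only trace. Purely illustrative of the signal-processing vocabulary.
import numpy as np

rate = 20_000                       # 20 kHz sampling, to match the 0-20 kHz band discussed
t = np.arange(0, 1.0, 1.0 / rate)   # one second of signal

def recording(seed, with_tone=True):
    rng = np.random.default_rng(seed)
    sig = rng.normal(scale=2.0, size=t.size)
    if with_tone:
        sig = sig + np.sin(2 * np.pi * 3550 * t)   # the made-up "agent-specific" peak
    return sig

rec_a = recording(0)
rec_b = recording(1)
noise_only = recording(2, with_tone=False)

# "Fourier subtraction": subtract the noise-only magnitude spectrum from each recording's spectrum.
def denoised(signal):
    return np.maximum(np.abs(np.fft.rfft(signal)) - np.abs(np.fft.rfft(noise_only)), 0.0)

# Point-by-point comparison of the two denoised spectra (a crude stand-in for the patent's
# cross-correlation step): the shared 3550 Hz peak survives, the uncorrelated noise mostly doesn't.
freqs = np.fft.rfftfreq(t.size, d=1.0 / rate)
shared = denoised(rec_a) * denoised(rec_b)
print(f"strongest shared frequency: {freqs[np.argmax(shared)]:.0f} Hz")
```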

I just fail to see how recording "signals" from a drug preparation can then be used to turn water (or water/bubble mixtures, etc., as the patent goes on to claim) into something that acts like the original drug. All the objections I raised in my first post on this company are still in force as far as I'm concerned, and my suggestions for more convincing experimental data are still out there waiting to be fulfilled. Despite various mentions of publications and IND filings when I interacted with Nativis back in 2010, I am unaware of any evidence that has been divulged past their patent filings.

And no, I do not regard patent filings as sufficient evidence that anything actually works - here's one for a process of reincarnation leading to immortality, for example. Even issued patents have proven insufficient in the past: here's one for a faster-than-light radio antenna. If Nativis wants to end up in a different bin than those people, they are, in my opinion, taking an odd path to doing so.

Comments (96) + TrackBacks (0) | Category: Snake Oil

February 21, 2013

An Incentive For Hype

Email This Entry

Posted by Derek

Here's an article illustrating what goes into high-profile journal publications, and why you should always read past the title and the abstract. Björn Brembs noticed this paper coming out in Current Biology on fruit fly receptors and behavior, whose abstract claims that "blocking synaptic output from octopamine neurons inverts the valence assigned to CO2 and elicits an aversive response in flight". As Brembs puts it:

We currently have a few projects in our lab that target these octopamine neurons, so this was a potentially very important finding. It was my postdoc, Julien Colomb, who spotted the problem with this statement first. In fact, if it wasn't for Julien, I might have never looked at the data myself, as I know the technique and I know and trust the lab the paper was from. I probably would just have laid the contents of the abstract to my memory and cited the paper where appropriate, as the results confirmed our data and those in the literature (a clear case of confirmation bias on my part).

When you look harder, you find that yes, the genetically manipulated flies do seem averse to carbon dioxide plumes. But when you check the control experiments, you find that the two transgenes added to the flies (independent of the change to the octopamine system that's the subject of the paper) both decrease the tropism for CO2. So there's really no way of knowing what the effect of both of them might be, octopamine signaling or not, and you might well suspect that the two of them together could hose up the carbon dioxide response without invoking the receptor pathways at all.

As Brembs says, though, the authors aren't trying to hide this. It's in the body of their paper. Abstract be damned, the paper itself states:

"We note that the Tdc2-GAL4/+ driver line does not spend a significantly greater amount of time in the CO2 plume by comparison to air, but this line, as well as the UAS-TNT/+ parent line, spends significantly more time in the CO2 plume in comparison to their progeny. Therefore, this experimental result cannot be fully attributable to the genetic background."

No, not fully attributable at all, especially if the progeny show some sort of additive effect of the two transgenes. Of course, if you water down your conclusions too much, you might not get the paper into as good a journal as you'd like. I'll let Brembs sum up:

To make this unambiguously clear: I can't find any misconduct whatsoever in this paper, only clever marketing of the sort that occurs in almost every 'top-journal' paper these days and is definitely common practice. On the contrary, this is exactly the behavior incentivized by the current system, it's what the system demands, so this is what we get. It's precisely this kind of marketing we refer to in our manuscript, that is selected for in the current evolution of the scientific community. If you don't do it, you'll end up unemployed. It's what we do to stay alive.

If there's anyone out there who thinks that this doesn't go on in the chemistry literature, my advice is to please look around you a bit. This sort of thing goes on all the time, and I'd guess that most of us automatically dial down the statements in paper titles and abstracts as we read them, without even realizing any more that we're doing so. But in a case like this (and there are many others), even that process will still let erroneous conclusions into your head. And we all have enough of those already.

Comments (6) + TrackBacks (0) | Category: The Scientific Literature

The Hard Targets: How Far Along Are We?

Posted by Derek

I wrote here about whole classes of potential drug targets that we really don't know how to deal with. It's been several years since then, and I don't think that the situation has improved all that much. (In 2011 I reviewed a book that advocated attacking these as a way forward for drug discovery).

Protein-protein interactions are still the biggest of these "undruggable targets", and there has been some progress made there. But I think we still don't have much in the way of general knowledge in this area. Every PPI target is its own beast, and you get your leads where you can, if you can. Transcription factors are the bridge between these and the protein-nucleic acid targets, which have been even harder to get a handle on (accounting for their appearance on lists like this one).

There are several chicken-and-egg questions in these areas. Getting chemical matter seems to be hard (that's something we can all agree on). Is that because we don't have compound collections that are biased the right way? If so, what the heck would the right way look like? Is it because we have trouble coming up with good screening techniques for some of these targets? (And if so, what are we lacking?) How much of the slower progress in these areas has been because of their intrinsic difficulty, and how much has been because people tend to avoid them (because of their, well, intrinsic difficulty)? If we all had our backs to the wall, could we do better, or would we generate just a lot more of the same?

I ask these questions because for years now, a lot of people in the industry have been saying that we need to get more of a handle on these things, because the good ol' small-molecule binding sites are getting scarcer. Am I right to think that we're still at the stage of telling each other this, or are there advances that I haven't kept up with?

Comments (14) + TrackBacks (0) | Category: Drug Assays | Drug Industry History

An Anniversary

Posted by Derek

I wanted to repost an old entry of mine, from back in 2002 (!). It's appropriate this week, and just as I was in 2002, I'm a couple of days late with the commemoration:

I missed a chance yesterday to note an anniversary. Giordano Bruno was something of a mystic and something of a crank, not normally the sort of person I'd be commemorating. But in his time, it didn't take very much to be considered either of those, or worse, and we have to make allowances.

He was headstrong. We can see now that he was sometimes eerily right, other times totally wrong. Either way, many of these strongly held positions were sure sources of trouble for anyone who advocated them. All living things were made up of matter, and that matter was the same across the universe - that one was not going to go over well in the late 16th century.

There was more. The stars, he said, were nothing more than other suns, and our sun was nothing more than a nearby star. He saw no reason why these other suns should not have planets around them, and no reason why those planets should not have life: "Innumerable suns exist; innumerable earths revolve around these suns in a manner similar to the way the seven planets revolve around our sun. Living beings inhabit these worlds."

He went on at length. And as I said, much of it was, by scientific standards, mystical rot. His personality was no help whatsoever in getting his points across. He appears to have eventually gotten on the nerves of everyone he dealt with. But no one deserves to pay what he did for it all.

Bruno was excommunicated and hauled off in chains. He spent the next several years in prison, and was given chances to recant up until the very end. He refused. On February 19th, 1600, he was led into the Campo dei Fiori plaza in Rome, tied to a post, and burned to death in front of a crowd.

Mystic, fool, pain in the neck. I went out tonight to see Saturn disappear behind the dark edge of the moon, putting the telescope out on the driveway and calling my wife out to see. Then I came inside, sat down at my computer, wrote exactly what I thought, and put it out for anyone who wanted to read it around the world. While I did all that, I remembered that things haven't always been this way, haven't been this way for long at all, actually. And resolved to remember to enjoy it all as much as I can, and to remember those who never got to see it.

Comments (6) + TrackBacks (0) | Category: Who Discovers and Why

February 20, 2013

A New Old Diabetes and Obesity Drug Candidate

Posted by Derek

Obesity is a therapeutic area that has broken a lot of hearts (and wallets) over the years. A scroll back through this category will show some of the wreckage, and there's plenty more out there. But hope does that springing-eternal thing that it does, and there's an intriguing new possibility for a target in this area. Alan Saltiel of Michigan (whose group has had a long presence in this sort of research), along with a number of other well-known collaborators, report work on the inflammation connection between diabetes and obesity:

Although the molecular events underlying the relationship between obesity and insulin resistance remain uncertain, numerous studies have implicated an inflammatory link. Obesity produces a state of chronic, low-grade inflammation in liver and fat accompanied by the local secretion of cytokines and chemokines that attenuate insulin action. Knockout or pharmacological inhibition of inflammatory pathways can disrupt the link between genetic- or diet-induced obesity and insulin resistance, suggesting that local inflammation is a key step in the generation of cellular resistance to important hormones that regulate metabolism.

Saltiel's lab had already implicated IKK-epsilon as a kinase involved in this pathway in obese mouse models, and they've been searching for small-molecule inhibitors of it. As it turns out, a known compound (amlexanox) with an uncertain mechanism of action is such an inhibitor. It's best-known, if it's known at all, as a topical canker sore treatment, and has been around since at least the early 1990s.

Administration of this selective TBK1 and IKK-ε inhibitor to obese mice produces reversible weight loss and improved insulin sensitivity, reduced inflammation and attenuated hepatic steatosis without affecting food intake. These data suggest that IKK-ε and TBK1 are part of a counterinflammatory process that sustains energy storage in the context of insulin resistance. Disruption of this process by amlexanox thus increases adaptive energy expenditure and restores insulin sensitivity. Because of the apparent safety of this drug in patients, we propose that it undergo study for the treatment of obesity, type 2 diabetes and nonalcoholic fatty liver disease in patients.

I don't see why not. The compound does seem to be absorbed after oral dosing (most of the topical paste ends up going down into the stomach and intestines), and about 17% is excreted unchanged in the urine. You'd think some sort of oral formulation could be worked out, given those numbers. It looks like a low-micromolar inhibitor, and is selective against a kinase panel, which is good news. And treatment of mice on a high fat diet prevented weight gain, while not altering food intake. Their insulin sensitivity improved, as did the amount of fat in the liver tissue. Giving the compound to already-obese mice (either through diet or genetically predisposed (ob/ob) animals) caused the same effect. Metabolic cage studies showed that increased energy expenditure seemed to be the mechanism (as you'd think - thermodynamics will only give you so many ways of losing weight while eating the same amount of food, and the obvious alternative mechanism might not be very popular).

Just how the compound does all this is somewhat mysterious:

The precise mechanisms by which amlexanox produces these beneficial effects in obese rodents have not yet been completely elucidated. Although amlexanox is known to be a mast cell stabilizer of unknown mechanism [20], and depletion of mast cells may have beneficial metabolic effects [59], most of the in vivo and in vitro evidence points to a role for the drug in increasing expenditure of energy while reducing its storage in adipocytes and hepatocytes. Furthermore, the lack of a phenotype in wild-type mice reconstituted with Ikbke knockout bone marrow indicates that the role of IKK-ε in bone marrow-derived cells such as mast cells and macrophages is less important than its role in other cell types such as adipocytes and hepatocytes. Although IKK-ε and TBK1 expression is elevated as part of the inflammatory program downstream of NF-κB, the kinase targets of the drug do not seem to be direct participants in the increased inflammatory program. In fact, the reduced inflammation observed in vivo with amlexanox treatment may be an indirect effect of improved metabolic disease or, perhaps, of elimination of a feedback pathway that maintains inflammation at low levels such that inflammation is permitted to resolve. Moreover, despite the fact that administration of amlexanox to obese mice restores insulin sensitivity, these compounds are not direct insulin sensitizers in vitro.

This level of unworkedoutness will surely interest some companies in taking a look at this, and if proof-of-concept can be found with amlexanox itself, a more potent inhibitor would also be something to search for. I have just one worry, though (he said, in his Peter Falk voice).

We were just talking around here about how mouse models of inflammation are probably useless, were we not? So it would be good news if, as speculated above, the inflammation component of this mechanism were to be an effect, not a cause. A direct attack on metabolic syndrome inflammation in mouse models is something that I'd be quite wary of, given the recent reports. But this might well escape the curse. Worth keeping an eye on!

Comments (12) + TrackBacks (0) | Category: Diabetes and Obesity

February 19, 2013

The Wages of Copy-Pasting

Posted by Derek

A few weeks ago I mentioned this situation regarding work by Prof. Xi Yan. Two recent papers seem to have been substantially copy-pasted from earlier work published by completely different groups. Now See Arr Oh has some details on what happens to you when you have the nerve to do that in a journal of the Royal Society of Chemistry: why, you have to publish a note regretting that you didn't cite the paper you copied from, that's what. "The authors apologize for this oversight."

There, that should square things up. Right? See Arr Oh is not very happy about this response, and I don't blame him for a minute. The RSC editors seem to be ignoring the word-for-word aspect of a substantial part of the new paper; it really is a paste job, and you're not supposed to do that. And the only problem they have is that the paper being stolen from wasn't cited? Oversight, my various body parts.

Comments (20) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

More From Blog Syn

Posted by Derek

I wanted to mention that there are two more entries up on Blog Syn: one of them covering this paper on alkenylation of pyridines. (It's sort of like a Heck reaction, only you don't have to have an iodo or triflate on the pyridine; it just goes right into the CH bond). The short answer: the reaction works, but there are variables that seem crucial for its success that were under-reported in the original paper (and have been supplied, in part, by responses from the original author to the Blog Syn post). Anyone thinking about running this reaction definitely needs to be aware of this information.

The latest is a re-evaluation of an older paper on the use of IBX to (among many other things) oxidize arylmethane centers. It's notable for a couple of reasons: it's been claimed that this particular reaction completely fails across multiple substrates, and the reaction itself is from the Nicolaou lab (with Phil Baran as a co-author). Here's the current literature situation:

A day in the library can save you a week in the lab, so let’s examine this paper’s impact using SciFinder: it's been cited 179 times from 2002-2013. Using the “Get Reactions" tool, coupled with SciFinder’s convenient new “Group by Transformation” feature, we identified 54 reactions from the citing articles that can be classified as “Oxidations of Arylmethanes to Aldehydes/Ketones" (the original reaction's designation). Of these 54 reactions, only four (4) use the conditions reported in this paper, and all four of those come from one article: Binder, J. T.; Kirsch, S. F. Org. Lett. 2006, 8, 2151–2153, which describes IBX as “an excellent reagent for the selective oxidation to generate synthetically useful 5-formylpyrroles.” Kirsch's yields range from 53-79% for relatively complex substrates, not too shabby.

I'll send you over to Blog Syn for the further details, but let's just say that not many NMR peaks are being observed around 10 ppm. Phil Baran himself makes an appearance with more details about his recollection of the work (to his credit). Several issues remain, well, unresolved. (If any readers here have ever tried the reaction, or have experience with IBX in general, I'm sure comments would be very welcome over there as well).

Comments (73) + TrackBacks (0) | Category: Chemical News | The Scientific Literature

February 18, 2013

What Would Go Into the Chemistry Museum Displays, Anyway?

Posted by Derek

Well, Cambridge is quiet today, as are many workplaces across the US. My plan is to go out for some good Chinese food and then spend the afternoon in here with my family; my kids haven't been there for at least a couple of years now.

And that brings up a thought that I know many chemists have had: how ill-served chemistry is by museums, science centers, and so on. Physics has a better time of it, or at least some parts of it. You can demo Newtonian mechanics with a lot of hands-on stuff, and there's plenty to do with light, electricity and magnetism and so on. (Quantum mechanics and particle physics, well, not so much). Biology at least can have some live creatures (large and small), and natural-history type exhibits, but its problems for public display really kick in when it shades over to biochemistry.

Chemistry, though, is a tough sell. Displays of the elements aren't bad, but many of them are silvery metals that can't be told apart by the naked eye. Crystals are always good, so perhaps we can claim some of the mineral displays for our own. But physical chemistry, organic chemistry, and analytical chemistry are difficult to show off. The time scales tend to be either too fast or too slow for human perception, or the changes aren't noticeable except with the help of instrumentation. There are still some good demonstrations, but many of these have to be run with freshly prepared materials, and by a single trained person. You can't just turn everyone loose with the stuff, and it's hard to come up with an automated, foolproof display that can run behind glass (and still attract anyone's interest). An interactive "add potassium to water to see what happens" display would be very popular, but rather hard to stage, both practically and from an insurance standpoint. You'd also run through a lot of potassium, come to think of it.

Another problem is that chemistry tends to deal with topics that people either don't see, or don't notice. Cooking food, for example, is sheer chemistry, but no one thinks of it like that - well, except Harold McGee and now the molecular gastronomy people. (Speaking of which, if any of you are crazy enough to order this from Amazon, I'll be very impressed indeed). Washing with soap or detergent, starting a fire, using paint or dye - there are plenty of everyday processes that illustrate chemistry, but they're so familiar that it's hard to use them as demonstrations. Products as various as distilled liquor, plastic containers, gasoline, and (of course) drugs of all sorts are pure examples of all sorts of chemical ideas, but again, it's hard to show them as such. They're either too well-known (think of Dustin Hoffman being advised to go into plastics), or too esoteric (medicinal chemistry, for most people).

So I started asking myself, what would I do if I had to put up some chemistry exhibits in a museum? How would I make them interesting? For med-chem, I'm imagining some big video display that starts out with a molecule and lets people choose from some changes they can make to it (oxidation, adding a fluorine, changing a carbon to nitrogen, etc.) The parts of the molecule where these changes are allowed could glow or something when an option is chosen, then when you make the change, the structure snazzily shifts and the display tells you if you made a better drug, a worse one, something inactive, or a flat-out poison. You'd have to choose your options and structures carefully, but you might be able to come up with something.
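
The core of that display logic is only a few lines, for what it's worth - here's a toy sketch, with entirely made-up edits and verdicts (a real exhibit would need real structures with well-documented SAR behind each canned outcome):

```python
# A toy sketch of the exhibit logic: a starting "molecule", a menu of med-chem style
# edits, and a canned verdict for each. The edits and outcomes are entirely made up.
import random

MOLECULE = "lead compound (imaginary)"
EDITS = {
    "add a fluorine":             "better drug: metabolism slowed, potency kept",
    "oxidize that methyl group":  "worse drug: it gets cleared far too quickly now",
    "swap a carbon for nitrogen": "inactive: you just broke a key binding interaction",
    "bolt on a nitro group":      "flat-out poison: that one lights up the tox panel",
}

def play_round():
    print(f"Starting point: {MOLECULE}")
    for i, edit in enumerate(EDITS, start=1):
        print(f"  {i}. {edit}")
    choice = random.choice(list(EDITS))  # stand-in for the visitor's touchscreen pick
    print(f"Visitor chose: {choice}\n  -> {EDITS[choice]}")

if __name__ == "__main__":
    play_round()
```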

But other things would just have to be seen and demonstrated, which is tricky. Seen on a screen, the Belousov-Zhabotinskii reaction just looks like a special effect, and a rather cheap one at that. Seeing it done by mixing up real chemicals and solutions right in front of you is much more impressive, but it's hard for me to think of a way to do that which would be done often enough (and on large enough scale) for people to see it, and wouldn't cost too much to do (supplies, staff, flippin' insurance, etc.)

If you had to build out the chemistry hallway at the museum, then, what would you fill it with? Suggestions welcome.

Comments (70) + TrackBacks (0) | Category: Chemical News

February 15, 2013

Pfizer's CovX Closing?

Email This Entry

Posted by Derek

A Friday night blog entry is a rare event around here, but I've had a report that Pfizer has been closing down their CovX unit in San Diego today. It is (or was) the peptide therapeutics arm of the company. This makes this part of the Pfizer web site a bit. . .inoperative:

CovX and Rinat are two biotechnology companies acquired by Pfizer that are currently operating as independent units within Worldwide R&D. This operating model allows CovX and Rinat to maintain their unique cultures and scientific approaches while having full access to Pfizer's world-class capabilities and resources.

So much for that. Can anyone confirm the report?

Comments (35) + TrackBacks (0) | Category: Business and Markets

The Finest Blue in the Lab

Email This Entry

Posted by Derek

CuSO4.jpg
For Friday afternoon, a bit of chem-geekery. I recently had occasion to use some copper sulfate, and the bottle I had was marked "large crystals" of the pentahydrate. I have loved the color of that stuff since I was a kid, and still do. Powdered, you lose a lot of the effect, but the chunks of crystalline stuff are the very definition of blue. (Photo from egeorge96 on Flickr).

Does anyone know a better one? That's my candidate for the solid phase. In solution, the complex of copper(II) and pyridine is a good one, a bit more towards royal blue/purple. You can definitely see the change when the pyridine hits it. I can't find a photo of that one on the web; if anyone has one, I'll be glad to post it. More colors to come on other slow Friday afternoons.

Update: a rare gas-phase blue (!) from the comments. Never seen that before!

And another one from the comments: here's someone who really, really, really likes copper sulfate. Here's how it was done.

Comments (42) + TrackBacks (0) | Category: Life in the Drug Labs

ABT-199 Clinical Trial Suspended (Updated)

Email This Entry

Posted by Derek

Abbott - whoops, pardon me, I mean AbbVie, damn that name - has been developing ABT-199, a selective Bcl-2-targeted oncology compound for CLL (chronic lymphocytic leukemia). Unlike some earlier shots in this area (ABT-263, navitoclax), it appeared to spare platelet function, and was considered a promising drug candidate in the mid-stage clinical pipeline.

Not any more, perhaps. Clinical work has been suspended after a patient death due to tumor lysis syndrome. This is a group of effects caused by the sudden breakdown of the excess cells associated with leukemia. You get too much potassium, too much phosphate, too much uric acid, all sorts of things at once, which lead to many nasty downstream events, among them irreversible kidney damage and death. So yes, this can be caused by a drug candidate working too well and too suddenly.

The problem is, as the Biotech Strategy Blog says in that link above, that this would be more understandable in some sort of acute leukemia, as opposed to CLL, which is the form that ABT-199 is being tested against. So there's going to be some difficulty figuring out how to proceed. My guess is that they'll be able to restart testing, but that they'll be creeping up on the dosages, with a lot of blood monitoring along the way, until they get a better handle on this problem - if a better handle is available, that is. ABT-199 looks too promising to abandon, and after all, we're talking about a fatal disease. But this is going to slow things down, for sure.

Update: I've had email from the company, clarifying things a bit: "While AbbVie has voluntarily suspended enrollment in Phase 1 trials evaluating ABT-199 as a single agent and in combination with other agents such as rituximab, dosing of active patients in ABT-199 trials is continuing. Previous and current trials have shown that dose escalation methods can control tumor lysis syndrome and we have every expectation that the trials will come off of clinical hold and that we will be able to initiate Phase 3 trials in 2013, as planned."

Comments (18) + TrackBacks (0) | Category: Cancer | Clinical Trials | Toxicology

Merck Finally Settles Over Vytorin

Email This Entry

Posted by Derek

You may remember that Merck and Schering-Plough took a lot of fire for the way that they released the clinical data for one of the key Vytorin trials (ENHANCE). The numbers were delayed for months, and when they were finally released, they were. . .problematic for the drug. And for the companies' stocks.

The institutional shareholders did not take that one well, and a number of them filed suit. This week it was announced that Merck has settled for $688 million, while admitting no wrongdoing. This settles the suit, but it isn't going to settle anyone's nerves, as Matthew Herper rightly observes:

Merck admitted no liability or wrongdoing in the decision, and continues to believe its handling of the study was proper. But the settlement could make investors nervous anyway. One of the reasons Vytorin has never recovered (sales of the pill are $1.5 billion, $1 billion less than before the results were released, but that partly reflects a price increase) is that Merck’s other clinical trials, so far, have never again compared Vytorin to Zocor to look for differences in real cardiovascular problems like heart attack and stroke. Instead, the other big trial of Vytorin compared it to placebo in patients who had a heart valve that did not close fully.

But Merck is doing that big Vytorin versus Zocor study, a giant clinical trial called IMPROVE-IT. Results have been delayed several times, and probably won’t come until next year. But the company has said that the independent board that is monitoring the results of the trial will meet in March. They could decide to stop the trial if it has already proved more effective, if Vytorin appears more dangerous than Zocor, or if there is no hope that Vytorin will prove more effective.

I doubt that the trial will be stopped, but at this point I'll be surprised if it yields strong enough data to vindicate Vytorin, either. The delays seen in the trial so far make that look like a very outside chance. My guess is "beneficial effect, but not as much as you'd want", which won't satisfy anyone.

Comments (3) + TrackBacks (0) | Category: Cardiovascular Disease

February 14, 2013

How Can There Be a Shortage of Scientists And An Excess At The Same Time?

Email This Entry

Posted by Derek

I wanted to come back to the topic of whether we have (1) too many unemployed (or underemployed) scientists and technology people in the US, or (2) a critical shortage of qualified people that's leading companies to complain that they can't fill positions. Can we really have both at the same time? All this bears on (3): should we revise the visa rules to let in more technically qualified immigrants?

The other day I wrote about a PricewaterhouseCoopers (PwC, as they would have it) report on this very issue. I'll pick up where that post left off. One thing to notice about the PwC report is that it's aimed at HR departments, and it tells them some of the things they want to hear - that they're important, that they're unappreciated, and that they have a crucial role to play in today's hiring environment. This is not just flattery; this is advertising - aspirational advertising, to be more accurate. That's the technique (used since forever) of pitching an ad to a slightly more elevated group (socioeconomically) than the one it's actually aimed at. Think of mail-order catalogues and credit-card offers; that's where you see this in the crudest form. The idea is to make the recipients think "Wow, they must think I'm one of those people", or (even better) "Wow, I must really be one of those people". That is, the sort of people who shop for this pricey merchandise, or who think nothing of paying the annual fee for a MatteBlackAnodizedPlatinum Card, what have you, because that's the high-end life they lead.

What's PwC selling, then? Why, consulting services to all these HR departments, to help them navigate their extremely important, critical-like-never-before jobs in this extraordinary environment. The HR people have their morale improved, PwC gets some new accounts, and everyone's happy. But the report is still a pure example of the "critical lack of good candidates" idea, being put to more immediate use by a company that sees an opportunity to trade on what's saturating the air right now.

But how can there be a shortage and an excess at the same time? Part of the answer might be found in the work of Peter Cappelli of the Wharton School at Penn. A reader sent that link along to me the other day, and it's well worth a look. Cappelli is the author of Why Good People Can't Get Jobs, and his take is that employers are largely to blame for this situation:

. . .Today’s CEOs regularly blame schools and colleges for their difficulties in finding adequately prepared employees. The complaint shows up in survey after survey, as Cappelli shows in his book, and it is substantially more common among American employers than their peers in most other developed and developing economies.

But do these surveys “show that the United States is among the world leaders in skills gaps,” Cappelli asks, “or simply in employer whining and easy media acceptance of employer complaints?”

He thinks a body of lesser-reported studies contains the answer. “If you look at the studies of hiring managers and what they want, they’re not complaining about academic skills,” Cappelli says. “You hear the business spokespeople saying this, but the actual hiring managers are not saying this now. And in fact they’ve never, in modern times, said that.”

And Cappelli also has pointed out that this view of the world is appealing to several constituencies at the same time, among them, people who advocate school reform and changes in research funding, social reformers of several different kinds, and employers who would rather place the blame for some of their problems on outside factors. There's a reason this idea keeps circulating around - there are a lot of extraneous reasons to keep believing it.

He goes on to decry what he calls the "Home Depot" approach to hiring:

In a 2011 op-ed article for The Wall Street Journal, Cappelli remarked on a telling statistic from the Silicon Valley tech boom of the 1990s: only 10 percent of the people in IT jobs had IT-related degrees. But a lot of the same people would probably have a hard time landing similar jobs today, because employers have increasingly adopted what Cappelli calls “a Home Depot view of the hiring process, in which filling a job vacancy is seen as akin to replacing a part in a washing machine.

“We go down to the store to get that part,” he explains, “and once we find it, we put it in place and get the machine going again. Like a replacement part, job requirements have very precise specifications. Job candidates must fit them perfectly or the job won’t be filled and business can’t operate.”

He lays some of the blame for this on software-based hiring practices, the CV-scanning programs that look for the keywords that supposedly have to be present for a candidate to be considered. (Many readers here may have run into this problem; chemistry and its associated disciplines are an unfortunately good fit for this approach.) And here's where some sympathy for the HR people might be appropriate: these sorts of "solutions" are often used when there aren't enough people (or enough time, or money) to do a good job of screening applicants. That's not to say that there aren't some HR people who truly believe that this is the best way to do things, but some of them also have their backs to their own walls.
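
To show how crude that kind of screen can be, here's a toy sketch - not any real vendor's software, and the job keywords and CV snippets are invented - of verbatim keyword matching at work. The experienced candidate describes the same skills in different words and never reaches a human being.

    # Toy illustration of naive keyword-based CV screening (all content invented).
    REQUIRED_KEYWORDS = {"suzuki coupling", "hplc", "scale-up"}   # hypothetical job spec

    def passes_screen(cv_text: str) -> bool:
        """Naive screen: every required keyword must appear verbatim in the CV."""
        text = cv_text.lower()
        return all(keyword in text for keyword in REQUIRED_KEYWORDS)

    experienced_chemist = (
        "Ten years of Pd-catalyzed cross-coupling, chiral chromatography, "
        "and kilo-lab process development."
    )
    keyword_stuffer = "Suzuki coupling, HPLC, scale-up, Suzuki coupling, HPLC."

    print(passes_screen(experienced_chemist))   # False -- filtered out
    print(passes_screen(keyword_stuffer))       # True  -- goes to the hiring manager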

There's another part of that article on Cappelli that takes us to the H1B visa issue:

When there are three or four job-seekers for every vacancy—and some postings draw applicants by the hundred—firms have an understandable incentive to wait for a dream candidate to show up. And ideally, a dream candidate who expresses a low salary requirement.

In (a recent) Manpower survey, 11 percent of the employers reporting skill shortages chalked it up to applicants unwilling to accept job offers at the wages companies were willing to pay.

I have the impression that much of the push to open up the technical-worker visas is coming from Silicon Valley and the IT world in general. (Someone correct me if I'm wrong). And it's also my impression that there are already a lot of people in that job market looking for work - again, if I'm mistaken about this, I'll revise this post. So one (not very charitable) explanation for a drive to bring in more job candidates from abroad is that they will be cheaper to hire, and that employers will have more leverage over them because of their visa situation. Plausible, or not? Update: apparently all too plausible - see this New York Times piece.

Now, it pains me to write that sort of thing, because we could head right off into the whole immigration-reform swamp, which is concerned with a lot of issues that are peripheral to this discussion. (Undocumented workers from Central America, for example, are not a big factor in IT or chemistry hiring). And I think that the US should indeed admit immigrants, that doing so has been one of the big factors in making us the nation we are (the good parts, I mean), and that if we're going to let people in, we should strongly, strongly bias the process towards smart, entrepreneurial, hard-working ones. So I have a natural sympathy towards the idea of bringing in technically and scientifically trained people.

But not to use them as a source of cheap labor that can be leaned on