About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
In the Pipeline:
Don't miss Derek Lowe's excellent commentary on drug discovery and the pharma industry in general at In the Pipeline
January 28, 2015
Anyone who expresses and purifies proteins for a living has had to do plenty of refolding over the years. Some proteins are rock-solid rock stars - carbonic anhydrase, say. You can do most anything to them and they just shrug it off. But those are exceptions. A lot of interesting proteins are prima donnas - they have to be expressed in the right system, be run through the right buffers in the right way (and down the right sizing columns, etc.), or they just curl up and refuse to perform. Worse, there are plenty of proteins that aggregate on themselves and deposit a lump in the bottom of the sample vial, often never to be seen in active form again.
You can rescue these things, sometimes, by a whole range of techniques to kick their conformations around for another pass. Now a group at UC-Irvine says that they have a new refolding protocol. It looks like a combination of urea and some sort of mechanical treatment, creating shear in the fluid to untangle things.
Here's the paper. I'll be interested to see how this works for other people. I'm under the impression that there are several protocols out there that already rely on mechanical stress in solution, but that they're not everyone's favorite way to do it. I'll defer to the protein people out there for more informed opinions, though. Overall, I'd say that the world needs some more good protein-rescue techniques, and I hope that this is one.
Category: Biological News
Chemjobber picks up here on an interesting letter to C&E News. What do you do, on a job interview, when your interviewer says something that's obviously wrong?
In my case, however, there was not an explicit question being asked. Rather, while discussing some topic, the interviewer would say something that was clearly and obviously (and even blatantly) false. For example, the interviewer might say something that violated one of the laws of thermodynamics. In retrospect, it is clear that interviewers were not testing the knowledge of thermodynamics (or whatever the subject of the false statement was about). More likely they wanted to see how the interviewee handled suddenly being placed in a potentially awkward situation.
Has anyone out there had this happen to them? I don't recall ever having this technique tried on me - or who knows, maybe someone did, and I didn't pick up on it and bombed out right there. (Just kidding about that one - a bad interview, like a bad date, is usually obvious to everyone concerned).
But I can imagine some interviewers doing this sort of thing, and some organizations being fine with it. It sounds more like something they'd pull at a Wall Street firm than at a science/technical outfit, but I guess you never know.
Category: How To Get a Pharma Job
January 27, 2015
I mentioned last fall that the Federal Trade Commission had gone after an egregiously faked-up "study" that was being used to sell green coffee bean extract as some sort of miracle weight-loss drug on the Dr. Oz show. The agency has now gone after "Dr." Lindsey Duncan, the guest who was pushing the stuff. Update: here's the complaint.
That link will make you want to spit on the floor. Duncan is accurately described as a snake-oil salesman - his "degree" is from an unaccredited mail-order diploma mill, and he'd never heard of green coffee bean extract before the Dr. Oz show staff contacted him to see if he could be a guest to talk about the stuff. He assured them that yes indeed, he was just the man for the job, and got his web site lined up to sell bushels of the stuff. It does not seem to have occurred to Dr. Oz or his production staff that the guy who was confidently recommending a particular brand of green coffee bean extract as the stuff that would take off 16% of your body weight without dieting or exercise might just possibly have a financial stake in the business. Or if that thought did cross anyone's mind, they certainly didn't let it disturb them.
My contempt for Dr. Oz has actually increased, which I hadn't thought likely. But think of the contempt that the show's staff and its guests must have for their viewers - now that's impressive.
Category: Snake Oil
I've already been hearing from people out on the West Coast about how it's seventy-odd Fahrenheit out there. My first thought was "It's five AM out in San Diego; why aren't you asleep rather than looking at the thermometer?" But as for the Cambridge/Boston biopharma world, yeah, we're mostly at home today. New York City doesn't seem to have had much of an impressive snowfall, but it's lived up to the billing here around Boston. There was just a thin coating of snow when I went to bed last night, but now it looks like someone contracted to have a fleet of dumptrucks come by and bury the place. So we can rule out beautiful weather as a key factor for starting a booming biotech cluster.
Can we rule out vile weather as one? This is a question that keeps coming up with regard to picking graduate schools. People go off to Rochester or Wisconsin, and the usual joke is that "Well, you'll just have to stay in the lab and get more work done", as opposed to Santa Barbara or Hawaii, where the standard joke is not to spend too much time on the beach. (Those lines ignore the second-order effects, though. When the weather's always fine (La Jolla or Irvine), you take it for granted. When it finally stops snowing in a place like Ann Arbor or Ithaca, people want to take advantage of it!)
When I was at Duke, we had a post-doc in the group who came from Hawaii's department (natural products chemistry, of course), and when he showed us his PhD seminar, we wanted to throw stuff at him. The Duke labs were (at the time) windowless prison cells, and here was this guy talking about their three-week coral collecting trip to French Polynesia. The problem for us wasn't so much the weather in North Carolina, as the fact that we didn't even know what it was doing outside to start with.
I've never known quite what to think about the climate/productivity question. The most important weather, for a scientist, is the weather between the ears. That's where the big discoveries get made, when you get down to it, and whether they're made while sunshine is streaming in through the windows or while sleet is pelting down outside doesn't seem as if it would matter so much. There's a big overlay of historical accident when you consider the distribution of large, well-stocked research universities, which confuses the issue as well. And then there's another confounding factor, which is that some people truly do get draggy and depressed in cold, low-sunlight conditions. Any thoughts from the crowd?
Category: Life in the Drug Labs
January 26, 2015
This does not look good at all. The European Medicines Agency (EMA) has accused the large Indian generic company (and outsourcing contractor) GVK of widespread systematic fraud. According to this press release, the agency investigated about 1000 generic formulations of various drugs from GVK, and found that 300 of them had enough data (from other sources) to support them. But the other 700 (representing 10 to 15 separate drug substances) don't:
The inspection of GVK that led to the CHMP's recommendation was carried out by the French medicines agency (ANSM). The inspection revealed data manipulations of electrocardiograms (ECGs) during the conduct of some studies of generic medicines. These manipulations appeared to have taken place over a period of at least five years. Their systematic nature, the extended period of time during which they took place and the number of members of staff involved cast doubt on the integrity of the way trials were performed at the site generally and on the reliability of data generated at that site.
And there you see falsus in uno, falsus in omnibus in action. How can the rest of the studies be trusted, when you know that at least one important one has been faked? And faked with care and attention? European countries are in the process of pulling marketing authorizations for many GVK drugs. For its part, the FDA says that it conducted its own inspection after the French one and found no irregularities, but one might assume that GVK made sure that things didn't go quite so catastrophically that second time. We'll see if they have anything more to add.
Category: Regulatory Affairs | The Dark Side
Like a lot of other scientists in the Northeast today, I'm getting things in the lab ready for me not being there tomorrow. (I just ran into a colleague who didn't know that we're set to get two to three feet of snow, distributed by 50 mile-per-hour winds, over the next 36 hours, and he was not happy to get the news from me). I don't have much equipment to worry about, but I do have a number of reactions in progress.
Fortunately, they're the heat-them-up-and-make-some-product kind, which is good, since there are a lot of possible analogs in this series and I don't have the time to spend handcrafting each one of them. These things will probably be done overnight, sitting at 80C, but it won't do any real harm if they go another day or two like that. The products are pretty robust, and there's not much more that can happen in there. One is not always so fortunate as to have a bunch of boring reactions going on, though. There are certainly other transformations that have to be watched more closely, and you'd rather not set those up right before a massive blizzard.
Overall, though, chemists have it lucky in this regard. Think of the people doing finicky cell culture work, or the ones helping to run an animal facility. Some of those tasks are going to have to be done no matter what the weather is, or what a pain it is to get in and do them. You can arrange things to minimize the problem, but you can't get away from it completely. My SNAr displacements could sit around until next Monday without any attention if I had to do that with them, but some transient-expression cell line is probably not going to be quite that hardy.
Category: Life in the Drug Labs
So we've been having an extraordinary bull market for new biopharma companies - just look at the number that have gone public the last few months. (The past two years have beaten the numbers put up in the previous ten!) And it's not just the new ones, either; the number of follow-on offerings has increased dramatically, and why not? People are lining up to throw cash at this sector, and there are plenty of companies ready to receive some of it. Might as well go out there with a couple of laundry baskets while it's raining money. These weather fronts move on through eventually, y'know. Here's BioCentury's count of those secondary offerings, via AndyBiotech on Twitter.
But there are some gaps. Over on his blog, Bruce Booth has been wondering about the early-stage companies. You don't just form a new biotech and go public, of course. The job of company formation has already taken place by then, and just how is that going under the present conditions? As he notes, VC firms have had many more exit opportunities recently, which is no bad thing. And those returns have brought in plenty of new money looking for something similar. With interest rates about as low as they can possibly be - and even lower, if you're buying Swiss bonds, some of which are currently at negative yields - a hot equity market is going to stand out even more than usual.
And that's what the numbers show. A lot of the money coming into the area is from funds that normally would be investing in later-stage public companies, not from traditional VC folks. Only about half the startup money, or perhaps even less, is coming from those groups-of-limited-partner operations. In fact, Booth wonders if classic VC firms have increased their presence much at all over the last few years. Newer figures may change that; we'll have to see.
He then asks "Who's getting all this cash, anyway?" He has figures showing that "the same number of private biotech companies got funded in venture rounds during the lean bunker-mode of 2009 as got funded in the bubblicious climate of 2014." And that's a surprise. You'd have thought that more companies would be taking off in this environment. But what's happening is that the same number of companies are just raising more money. In fact, as he puts it, there's a "staggering and glaring disconnect between the demand for innovation in biotech today and the formation of new startups to address that demand." Considering the number of companies that have been exiting the private-financing world and hitting the public markets, there seems to be a real vacuum forming behind them.
I hope that this is a temporary situation, that supply will rise to meet demand, and for that matter, that the supply of suppliers will rise as well. We need a variety of views and approaches in early stage R&D, and that goes for the firms that are creating companies as well. There is no one perfect way to do this stuff (if there were, it would have taken over the world by now). Worth keeping an eye on in the next year or two. Stock market conditions are going to play a big role, though - what if someone decides to start up a big new limited partnership, rounds up some people and some money, and launches just as the biotech market window starts closing? That shouldn't, in theory, matter as much to an early-stage outfit, but it'll have a big psychological effect at the very least.
Category: Business and Markets
January 23, 2015
Elizabeth Warren has come out with a proposal for what she's calling a "pharmaceutical swear jar". Once a drug company has been fined, this bill would earmark a percentage of its profits over a multiyear period for use in NIH funding. Like many of Warren's ideas, this one has a lot of populist appeal - but does it make any sense?
It actually reminds me of that "Take back the grant money in case of fraud" idea that I vacillated about earlier this week. That one, though, at least has the benefit of having a pretty direct connection between the people causing harm and some of the ones getting harmed. In this idea, the tie is pretty indirect - unless, of course, Warren is one of those people who believe that the drug industry mostly makes its money by ripping off NIH-funded discoveries. And she seems to be: she says that such companies are "generating enormous profits as a result of federal research investments". So she probably sees this as a perfectly logical quid pro quo. This idea makes it seem as if drug companies have harmed the NIH when they're fined for over-promotion of a drug or for manufacturing problems, but that connection doesn't exist.
Note: Elizabeth Warren is not alone in her belief. There are a number of people who passionately believe that the bulk of drug research is done with NIH money, and that drug companies exist just to skim off the cream. I never knew this before I started this blog, I have to tell you, and if you try this idea out on anyone who actually works in biopharma they stare at you as if you've lost your mind, but there it is. The topic has come up at length around here several times: try here, here, here, here, and here. And this is a nice perspective from a former director of the NIH.
In the same way that I was wondering about incentives (appropriate ones and perverse ones) in that grant money idea, it's worth looking at the incentives here. One good thing is that it wouldn't be the NIH, the interested party, bringing the accusations against pharma. In such situations, the potential for abuse, the moral hazard, is too high. You end up with problems like the ones seen with police department seizures of assets, or (in a less dramatic, but still pernicious way) with stoplight cameras. What's supposed to be a public service ends up as a revenue source.
But what this does remind me of are the various states that use lottery income to fund education. The trouble with that is that the education budget itself doesn't necessarily stay the same when such a system goes into effect. The lottery money gets sent in that direction, while some (or damn near all) of the money that was earmarked for education gets moved over to other worthy vote-getting projects. In the end, the education budget doesn't really go up. So one problem with this idea is that it might not end up providing much extra money to the NIH, in the end.
So where does the money go when a drug company is fined under the current system? Into the general fund, as far as I can tell. I don't think it goes right to the Department of Justice, the FDA, or the FTC (that would set up that moral hazard problem mentioned above). No, it seems to go right back into the big pile, to be divided up by the House during budget negotiations. Some of that money (although not a very large per cent) finds its way to the NIH as it is; the rest gets spread out to service on the debt, "entitlement" programs, defense, and so on. (Actually, those three categories alone are a pretty good proxy for the entire federal budget - the NIH, by contrast, gets less than one per cent of the total).
Warren's plan would earmark money, then, for one agency. My sense is that in general lawmakers are not keen on doing that. They'd rather fund the various parts of the government by dividing up a general fund, rather than by a maze of bespoke conduits from Source A to Recipient B. (As the lottery example shows, even when something like this is set up, money gets dragged back into that general fund anyway). And as distasteful as the budgetary process is, I have to agree with the current system. I think the potential for flexibility is worth more than the sense of fitness that one might get from knowing that some (more or less) appropriate source is funding some particular program. To pick a large example, it's not like Social Security or Medicare taxes all go off to some special place where they get spent only on Social Security or Medicare (much less, as some people apparently believe, that "their money" is going into "their account").
The current NIH budget is about $30 billion. Warren's proposal, by her own reckoning, would have increased the agency's budget by $6 billion a year over the last five years, but the last five years have also seen the biggest fines ever. Historically, that isn't the kind of cash that comes in all the time (although if Elizabeth Warren were in charge of the process, no doubt things would pick up). The agency's funding has been flat, and flat at a lower level than its peak during George W. Bush's first term, so I'm sure that they'd be glad of the money.
But if Congress wants to give more money to the NIH - and they could do worse, and often do - then they should give more money to the NIH the old-fashioned way: by voting on a budgetary resolution that does this. That way, the agency itself knows what's coming, and when it's showing up, and can figure out what best to do with the money. The "Pharma Swear Jar" idea seems to me to be a stunt, designed to make people feel better and to give Elizabeth Warren a chance to campaign on having helped to stick it to the big drug companies.
Category: Business and Markets | Regulatory Affairs
Here's a report that GSK is cutting back severely in China - 1,000 workers during the first part of this year. Caixin.com seems to be first out with this story; others are quoting them. They refer to "sources close to the company", apparently employees who have already had word of the cutbacks. According to these sources, GSK has been experiencing a lot of difficulties in the country in the aftermath of the long, complex, and messy scandal of the past couple of years.
So this would be the Chinese component of GSK's overall restructuring, and they're apparently not alone:
Other foreign drug companies are trimming their number of employees in China. In November, the U.S. company Bristol-Myers Squibb started laying off around 1,000 workers, including around 10 senior managers, the 21st Century Business Herald newspaper reported, citing sources with knowledge of the matter.
It wasn't that long ago that these companies (and others) were ramping up spectacularly in China, which was going to be the big new frontier for the industry. That doesn't seem to be working out in quite the way everyone had anticipated. Some of this is just one part of some broader cutbacks, but some of it may well be a retrenchment of some Chinese dreams.
Update: the company is denying the Caixin report.
Category: Business and Markets
January 22, 2015
If you do early-stage medicinal chemistry, you'll probably be interested in this overview of spirocyclic scaffolds. It has examples from the recent literature, and an update on synthetic methods to get into this chemical space.
I've made several compounds like this over the years, without much success in the assays so far. But as the paper shows, there are plenty of active compounds out there, and the spiro ring fusion gives you access to fixed conformations that you're probably not going to get to any other way. Like any other tied-back series, it's sort of a death-or-glory move, as far as your SAR goes, but when it works, it really works.
Category: Chemical News | Life in the Drug Labs
This profile of Actavis CEO Brent Saunders (by Matthew Herper) makes interesting reading, if your definition of "interesting" is roomy enough to take in various shades of "unnerving" and "enraging". They're the company that saved Allergan from the clammy embrace of Valeant, but reading this, you'll wonder how big a difference it makes.
Saunders is being completely up front about some of the issues that many of us in the industry have been worrying about, but it's strange to hear these things out in the open:
Saunders insists that Actavis-Allergan is more than just a short-term trade. It’s the springboard for a revolutionary new kind of drug company: “growth pharma,” he calls it. Actavis-Allergan will have the scale in marketing and clinical trials of a global powerhouse like Eli Lilly or Bristol-Myers Squibb, but it will eschew the core mission of most drug companies–inventing drugs–preferring to buy them from universities or biotechs all the time. The new company will be the first big pharma that doesn’t even pretend to invent medicines.
“The idea that to play in the big leagues you have to do drug discovery is really a fallacy,” says Saunders. “You have to do research, you have to be committed to innovation. I strongly believe that, but discovery has not returned its cost of capital.”
We have Fred Hassan to thank for Brent Saunders, and depending on your dealings with Hassan, this could give you a strong opinion about him pretty quickly. After coming up under Hassan at Schering-Plough, Saunders became CEO at Bausch & Lomb, and ended up selling them to. . .Valeant. And Michael Pearson of Valeant did what he does: siphon out the gas from the tank, rummage through the glove compartment for loose change, and sell the rest for scrap. We also have Carl Icahn to thank for Saunders, since Icahn next installed him as head of Forest Labs, which was shortly thereafter sold to Actavis. (And if the idea of a Hassan/Icahn hybrid doesn't give you pause, perhaps it should).
Within two weeks of becoming the CEO at Actavis, Saunders was involved in the Valeant/Allergan deal:
On July 30 Saunders called (Allergan CEO David) Pyott and offered to be a white knight. Over months of phone calls, he portrayed himself as the anti-Pearson, despite the fact that he agreed with much of Pearson’s thesis on the drug business. No, Saunders told Pyott, he would not strip the company like Pearson. Allergan would continue to do crucial research on things like dry-eye drugs and successors to Botox. Yes, the business could stay largely intact. . .
For some values of "intact", that is. As Herper's article makes clear, so far in his career, Brent Saunders has been a deal-making guy. And if he were to turn around and sell the combined Actavis/Allergan to, say, Pfizer, that would not be out of character in the slightest. It's worth noting that the company's tax domicile is Irish, in case that sort of thing interests Pfizer (or some other large US-based company) at all.
My own hope is that the rest of the industry can prove that the "research is a waste of cash" idea is a delusion. The numbers over the last ten or fifteen years, unfortunately, make a prima facie case that it isn't, but that attitude doesn't have to be correct, either. Even if the largest drug companies have been having trouble with their own return on investment for R&D (and not all of them have), many of the smaller ones have been able to make it work. (On the other hand, at that end of the market the less successful examples just disappear, so you have to allow for survivorship bias). But overall, we keep trying to get better at drug discovery, and there's no reason that we shouldn't be able to make it work if we can keep doing that. (If we don't get better, on the other hand, we are in fact doomed).
So my own take is what I would call "hard-headed optimism". There's nothing that says that we have to fail, but there's nothing that guarantees our success, either. The future is unwritten. If that future leaves the likes of Valeant and Actavis falling behind the R&D-driven companies, though, I will be quite happy.
Category: Business and Markets | Drug Industry History
January 21, 2015
Here's a look at the state of medical research in the US versus other developed countries (open-access article at JAMA).
Some things to note from that chart: (1) research funding has been pretty flat the last few years, with the only exception being the stimulus-package burst of cash. (2) The share of the total put up by biotechnology companies seems to have gone up a bit over the twenty-year span. (3) The money spent by industry is now up to 58% of the total in the US, and has been increasing over time. That's partly due to increased spending by industry, and partly due to lower-than-historical increases in spending by government sources. One thing to note is that these numbers have been inflation-adjusted, but by the Biomedical R&D Price Index, not the CPI (see the comments to this post for more on this).
Here comes a section with some interesting numbers, derived from PhRMA annual overviews:
The distribution of investments across the types of medical research changed from 2004 to 2011. Pharmaceutical companies shifted funding to late-phase clinical trials and away from discovery activity such as target identification and validation. The share of pharmaceutical industry funding (including that by US companies outside of the United States) spent on phase 3 trials increased by 36% (5%/year growth rate) from 2004 to 2011 (Figure 4), and the share of investment in prehuman/preclinical activities decreased by 4% (2%/year average decline). This shift toward clinical research and development reflects the increasing costs, complexity, and length of clinical trials but may also reflect a deemphasis of early discovery efforts by the US pharmaceutical industry. While industry has shifted funding to clinical trials, the share of NIH contributions dedicated to basic science and clinical research was unchanged (eTable 2 in the Supplement), with the majority of funds still focused on basic research. These data may not accurately reflect the true division of NIH investment for basic science vs disease-focused research, as a growing proportion of NIH expenditures is for projects having potential clinical application in many diseases or organ systems.
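Those quoted growth figures can be cross-checked with a little arithmetic (a back-of-the-envelope sketch of my own; the `cagr` helper below is not from the paper): a 36% total increase in the phase 3 share over the seven annual steps from 2004 to 2011 implies roughly 4.5% compounded per year, in line with the paper's stated ~5%/year rate.

```python
def cagr(total_growth: float, years: int) -> float:
    """Annualized growth rate implied by a total fractional increase."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

# Phase 3 share of industry funding: +36% total over 2004-2011 (7 annual steps)
rate = cagr(0.36, 7)
print(f"{rate:.1%}")  # prints 4.5%
```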
I wonder if some of the shift has been away from what gets defined as "pharmaceutical companies" and toward what gets defined as "biotechnology companies". A lot of smaller outfits are not members of PhRMA, of course, and I think that early-stage research has been heading towards their end of the industry. As for those small companies, here's a look at venture funding across this period:
In real terms, venture capital investment in biotechnology companies steadily increased from $1.5 billion in 1995 to a peak of $7.0 billion in 2007 (eFigure 3 in the Supplement). During that period, investment in biotechnology companies as a share of total venture capital investment increased from 10% to 18%, and the number of investments increased from 176 to 538. Investment levels and the number of transactions of biotechnology decreased following the financial crisis in 2008-2009, declining to a low of $4.3 billion in 2009. Venture capital investment still has not recovered to its pre-2008 levels, with only $4.5 billion invested in 2013. Size of investment per transaction (median, $11 million, inflation adjusted) has remained unchanged for 2 decades.
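For a rough sense of scale on those venture numbers (again my own calculation, not from the paper), the climb from $1.5 billion in 1995 to $7.0 billion in 2007, and the subsequent two-year slide to $4.3 billion, annualize like this:

```python
def annualized(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two inflation-adjusted dollar figures."""
    return (end / start) ** (1.0 / years) - 1.0

# Biotech venture investment, $billions (figures quoted above from the JAMA paper)
print(f"1995-2007 run-up:  {annualized(1.5, 7.0, 12):.1%}")  # ~13.7% per year
print(f"2007-2009 decline: {annualized(7.0, 4.3, 2):.1%}")   # ~-21.6% per year
```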
I wonder if the current boom times in biopharma startups are changing these numbers (the JAMA article only goes up to 2011). We'll have to take a look at these figures again in a couple of years and see if that's happened (my guess is that it has). Interestingly, the paper goes on to talk about funding levels by disease, versus disease burden in the US. Cancer and HIV are funded at well above the levels that this measure would predict, but (as the study notes) there are many other factors in play (scientific opportunity, for one). Underfunded, by this measure, are migraine and COPD.
The comparisons to worldwide research funding then come up. The US share has been declining as Asia ramps up, but Asia was ramping up from a very small percentage twenty years ago. As it stands, the US is still the source of 44% of the world's medical research funds, with Europe at another 33%. In terms of single countries, the US is still by far the largest contributor. When you look at the number of people doing the work, China comes out ahead in raw numbers, but is quite low in the percentage of its population so engaged (and this isn't the only area where they're an outlier in this fashion!)
I'll discuss patent and publication data in another post; there's enough to talk about at more length there. Overall, the authors of this paper conclude that the US, while still leading in most categories, has been standing relatively still or slipping a bit during the period reviewed:
Medical research in the United States remains the primary source of new discoveries, drugs, devices, and clinical procedures for the world, although the US lead in these categories is declining. For example, whereas the United States funded 57% of medical research in 2004, in 2011 that had declined to 44%. Basic research and product development are central to the health of countries’ economies. However, changes in the pattern of investment, particularly level funding by US government and foundation sponsors, with a decline in real terms, combined with companies’ focus on late-stage products (with diminished discovery-level investment) indicate that difficulties may soon appear in the ability of clinicians to fully realize the value of past investments in basic biology.
My hope is that this has turned around somewhat in the last two or three years. There has been a notable upswing in small-company formation and funding, from what I can see, and many of those companies are jumping on some of that basic biology mentioned above (chimeric antigen receptor-based therapy in cancer, for example, which is one of the hottest biopharma investment areas going right now). So this could be a snapshot taken at the gloomiest point (I hope so), or it could be picking up on a long-term trend that's continuing despite any recent news. I opt for the former, but I'm an optimistic person.
+ TrackBacks (0) | Category: Academia (vs. Industry) | Drug Industry History | Who Discovers and Why
Setting what is nearly a new personal record, I'm backtracking on my approval yesterday of the idea of granting agencies getting a refund for retracted papers. The problems I mentioned in that post, along with the arguments of many in the comments section, have swung me back around.
What I kept thinking about is how this might have applied to some recent cases of fraud and misconduct. Would anyone have retrieved any of the funding yet? I doubt it - there would still be all sorts of fighting going on about intention (carelessness versus fraud) and the like. And the perverse incentives brought on by this policy are very likely to be worse than the problem that it's trying to cure. This would, unfortunately, drive honest retractions from the world (and there really are some). It would also put a lot of pressure on journals and their staff, since they are so involved in whether or not a paper is retracted at all. And what do we do about lousy journals that never retract a thing?
No, while the current system has plenty of opportunities for abuse, I think that putting in this option would not improve the situation. I've thought of Ambrose Bierce's Devil's Dictionary, where he defines a conservative as "A statesman who is enamored of existing evils, as distinguished from the Liberal who wishes to replace them with others." This grant-clawback scheme would replace existing evils with new ones.
Category: The Dark Side | The Scientific Literature
January 20, 2015
Now, here's a proposal that would shake a lot of academic research up: what if granting agencies could force a refund of the awarded funds if the resulting work has to be retracted? That's the idea of this post at Retraction Watch:
How to do this? I propose that funding agencies demand a refund clause in every grant application, signed by the applicant’s current or potential host institution. This clause would specify the institutional responsibility in case the earlier publications by the applicant on which the proposal is based should ever be retracted, regardless of the reasons. Grant proposals without such signed clauses would not be processed upon submission. The percentage of refund can be further specified to consider the reasons for retraction, degree of how each cited publication by the applicant is to be weighted, the extent and transparency of the institutional investigation, the perpetrators’ stance, recidivism and so on.
Importantly, the clause should apply to every retraction, not just those in which misconduct is proven. Otherwise, the institutions would have an incentive to cover up misconduct, simply to avoid being fined for it. Even in the very rare cases of artifacts, initially misinterpreted in good faith, what would be the point of funding a further investigation into these artifacts as originally proposed in the grant application? This may appear unfair to the unlucky honest authors of artifactual papers, but supported by a thorough investigation, any serious grant agency would be inclined to find a compromise solution.
I like this plan. Update: I've had second thoughts! There are probably ways to game it that I haven't yet considered, but I like the incentive structure that it's trying to provide. This would, or should, give the truly dishonest pause, and give honest researchers even more reason to keep their eyes open. The main problems I can see are these two: first, there would suddenly be a new category of suspicious paper that didn't quite rise to the level of a retraction, so as not to trigger the refund clause. Second is the one mentioned above - that it would give entire institutions still more reason to try to make sure that bogus work, once published, is never investigated at all. But perhaps that could be covered a bit by whistleblower law - perhaps a percentage of the recovered grant money to go to whoever uncovers the trouble? Yes, that would change things quite a bit.
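Just to make the moving parts concrete, here's a toy sketch of how such a clause might be parameterized. Every number here, and the weighting scheme itself, is hypothetical - the proposal above doesn't actually specify a formula:

```python
def grant_refund(award, retracted_weights, misconduct_factor, whistleblower_cut=0.0):
    """Toy model of the proposed clawback clause.

    award              -- total grant amount
    retracted_weights  -- fraction of the proposal's case resting on each
                          retracted paper (a hypothetical weighting)
    misconduct_factor  -- 0.0 (honest artifact) up to 1.0 (proven fraud)
    whistleblower_cut  -- share of the refund paid to whoever uncovered it
    """
    base = award * sum(retracted_weights)   # portion of the grant tied to retracted work
    refund = base * misconduct_factor       # scaled by the reason for retraction
    bounty = refund * whistleblower_cut     # the incentive payment floated above
    return refund, bounty

# A $1M grant, two retracted papers carrying 30% and 20% of the proposal,
# clear misconduct (factor 0.8), 15% whistleblower share:
refund, bounty = grant_refund(1_000_000, [0.30, 0.20], 0.8, 0.15)
# refund == 400000.0; bounty comes out around 60000
```

Even this cartoon version shows where the fights would start: who sets the weights, and who decides the misconduct factor?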
Category: The Dark Side
I'm always looking out for new assays that might tell us what the heck is going on inside cells, so this paper caught my eye. The authors describe a new luciferase-based complementation assay for detecting protein-protein interactions. There are several things like this in the literature already (and for sale, too), but this one has what looks like a robust way to get the split-luciferase proteins expressed, and it seems to be picking up weaker and more transient interactions than most. For example, it's shown to pick up a specific ubiquitin ligase PPI that had been demonstrated by yeast two-hybrid assays, but never in living cells. (Depending on the signal/noise, this sensitivity could either be a bug or a feature!)
They also used this system on interactions of p53 (which has a good number of them), and found something interesting. The only-a-mother-could-love-it small molecule Nutlin 3a is believed to be an inhibitor of the p53-MDM2 interaction, but (as the current paper points out), this hasn't been conclusively demonstrated in living cells. This assay, though, confirmed that ". . .small-molecule PPI antagonists such as Nutlin-3a can selectively and rapidly disrupt preformed p53-Mdm2 complexes in living cells." But another reported PPI compound, SJ-172550, failed to show activity (its mechanism had already been reported as not just straight inhibition of the protein-protein interaction). RO-5963, another compound in this space, fared a bit better, but had a noticeably different profile than Nutlin-3a, which does argue for the ability of this assay to pick up fine details.
Stapled peptides have been used to target some of these p53 interactions as well, but conflicting data exist about just how well those work in this case. And the conflict continues: this assay showed some activity for ATSP-7041, but two other stapled peptides from the literature, SAHp53-8 and sMTide-02 (from that same ACS Chem Bio paper linked above) "exhibited no detectable ability to disrupt p53-Mdm2 or p53-Mdm4 complexes in living cells."
What they found, on further study, was that these stapled peptides seem to be cytotoxic, via some mechanism that has nothing to do with p53, and that this activity is inhibited by the presence of serum in the assay conditions. A cell-free assay system was developed, which indicated that the two problematic peptides were indeed able to disrupt the p53/MDM interactions as advertised - when they can get to it, that is. Adding serum to these assays did not disrupt things, though, which takes care of the possibility that something in serum just binds the peptides and keeps them from doing their thing.
So that leaves the serum as doing something else to keep the stapled species from actually entering the cells. What seems to be happening, from further experiments, is that the compounds are actually damaging cell membranes, which gives them a chance to get in and show activity under serum-free conditions. Adding 10% serum to the assay, though, seemed to protect the membranes from disruption (and thus makes the compounds show as inactive in the resulting cell assay). This effect was seen on plain old fibroblast cells as well, so it's not specific to cancer cells. And it wasn't seen with all stapled peptides, either - mutant forms of these very ones, for example, didn't have the same effect. Nutlin-3a didn't have it, either.
The authors suggest that this might be the source of some of the conflicting data in the literature on the effects of stapled peptide compounds, especially in this p53 area. People had noted some serum effects and cytotoxicity before, but much of this was explained via p53-dependent mechanisms. What this work shows is that the membrane damage is intrinsic to some of these peptides, and that this is going to have to be taken into account in future cell assays across the field. There are, of course, plenty of nonstapled peptides that are capable of causing membrane damage (some of them on purpose, as in natural antibiotic peptides), so this doesn't mean that stapled peptides are universally trouble. What it does mean, though, is that measurements of their penetration into cells have just gotten more complicated.
Category: Biological News | Chemical Biology | Drug Assays | Toxicology
January 19, 2015
I'd like to recommend this review by Steve Ley and co-workers on the "march of the machines" in synthetic organic chemistry. Prof. Ley is well-known as an advocate of cutting-edge flow chemistry, but this article is about more than that. There's a lot of flow in it (and it's an excellent summary of current technology in the area), but it's also trying to convey the opportunities that modern instrumentation gives organic chemistry in general. We can try a lot of things out now, on smaller and faster scales than ever, and with far better characterization on the fly than ever before, and we should be taking advantage of that.
In general, ". . .while people are always more important than machines, increasingly we think that it is foolish to do something machines can do better than ourselves". And what machines can do better, in many cases, is the vast amount of grunt work that goes into a lot of chemistry research. (And we shouldn't be afraid to keep redefining what's "grunt work" and what isn't.) That should free us up to do more interesting work, and it's up to us to see that it does.
Category: Chemical News
Constellation Pharmaceuticals of Cambridge, a big player in the epigenetics area, struck a deal with Genentech in 2012. And (faster than anyone thought), it's 2015 already, and the collaboration stage of that partnership has come to an end. Roche has picked up three targets from the work, and Constellation has quietly laid off a significant fraction of its staff.
Three years seems like a lot longer time at the beginning than it does at the end. In order to deliver on several simultaneous exploratory drug research projects in that time frame, you really have to watch every detail. That time will run through your fingers like sand if you don't pick your experiments and approaches carefully, and given the vagaries of early-stage research, sometimes no amount of care can make up for what reality has ready for you. Epigenetics is a very interesting field, but a wildly complicated one, with a lot of unknowns, so I hope that it works out in this case.
Genentech/Roche has the option to acquire Constellation by the end of the year, but that's clearly not a sure thing, so the company is preparing to go on alone if need be. The exit-by-acquisition has long been the most likely happy ending for a lot of VC-backed small biopharma companies (the most likely sad ending, of course, is a quiet unprofitable evaporation). There's always going public, too, but the Genentech deal would seem to have ruled that out as a possibility, and I don't think that in 2012 anyone could have guessed just how lively the biopharma IPO market would become. There are surely companies of Constellation's size and with their prospects who have floated stock by now, for better or worse, but Constellation will apparently not be among their number until at least late in 2016 (and who can say what conditions will be like by then?)
Category: Business and Markets
January 16, 2015
Unworkable compounds are one thing. Unworkable processes and reactions are just as big a problem, though. You don't see as many papers proposing those as you do ones advancing squirrely chemical matter, but they're out there. Here's an example from Quintus, who takes a look at this paper's route to some prostaglandin intermediates.
Unfortunately, he finds several reasons to wonder if this could ever be a viable process route, and I agree with his points (see his post for details). I think that process chemistry is one of the widest gaps between industry and academia. Traditionally, university labs have never had to pay attention to the factors that industrial scale-up labs have had to. Why should they? When you get down to it, the drug discovery labs at the other end of the industrial hallway often don't pay much attention to those factors themselves. If other chemists at the same company don't have that mindset, what chance do you have to find it in an academic lab?
The things I'm referring to are reproducibility (in yield, in impurity profile, in reaction setup, course, and workup), safety (no exotherms, no bad decomposition profiles), waste handling (limited number of acceptable solvents), cost (no horrible yields, no wildly expensive stoichiometric reagents), sourcing (solid, reliable suppliers for everything), and ease of operation (no chromatography, no fractional vacuum distillation, etc.) You can break one or more of those constraints (well, some of them!), but you'd better have a really, really good reason, and you'd better be prepared to demonstrate that you considered all the alternatives. The paper Quintus discusses contains several deal-breakers, unfortunately.
It's an orthogonal state of mind to the standard one for a lot of academic organic synthesis ("I have to make this exact compound in high yield, and damn the cost and inconvenience"). And it's orthogonal, in another direction, to the mindset of a lot of early-stage drug discovery chemistry ("I have to make a big variety of compounds, and I don't care how as long as it's fast"). Process chemistry is all about caring how things are made, down to every last detail, and it's a tricky and interesting way to make a living.
Category: Life in the Drug Labs
January 15, 2015
I can agree with this one: "Promiscuous 2-aminothiazoles: A Frequent Hitting Scaffold". The authors (the med-chem group from Monash University in Australia, with a collaborator from AstraZeneca) make the case that the title compounds are usually more trouble than they're worth. They noticed that the 4-phenyl derivative went fourteen for fourteen in a set of diverse fragment screens, which certainly makes a person wonder.
I've seen some similar behavior in that series, and I think that a lot of people have the impression that aminothiazoles hit pretty often in HTS campaigns. Indeed, there are several varieties of them in the canonical PAINs list. This paper investigates a range of substitution patterns, and finds that, far too often, they tend to hit a wide variety of protein targets. There's no clear SAR to get you out of that problem, either, which mirrors the general experience with PAIN-type compounds: flat SAR with no obvious trends. The binding is not due to impurities, nor is it covalent - it seems to be legitimate interaction with the protein, but there's just too damn much of it. In fact, several of these compounds seem to be binding at multiple sites per protein, which will certainly make things hard to interpret.
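To put that fourteen-for-fourteen result in perspective, a quick binomial calculation shows how unlikely it would be by chance alone. The 5% baseline hit rate here is my own illustrative assumption, not a figure from the paper:

```python
from math import comb

def prob_at_least_k_hits(n_screens, k_hits, hit_rate):
    """Binomial tail probability: chance of scoring >= k_hits across
    n_screens if hits were random at the given per-screen hit rate."""
    return sum(
        comb(n_screens, i) * hit_rate**i * (1 - hit_rate)**(n_screens - i)
        for i in range(k_hits, n_screens + 1)
    )

# 14/14 fragment screens at an assumed 5% background hit rate:
p = prob_at_least_k_hits(14, 14, 0.05)  # = 0.05**14, around 6e-19
```

Whatever the true background rate, a compound that hits every single screen is telling you about itself, not about the targets.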
But at the same time, you can find aminothiazoles in several marketed drugs, so you can't just rule the entire class out by saying that it never leads anywhere. As with many of the PAIN scaffolds, the best way to characterize the problem might be to say that although such motifs may appear in marketed drugs, your chances of success are likely to diminish if you try to follow their lead. The binding modes can make assay and SAR data hard to interpret during the early stage of optimization, and the over-friendly nature of the binding groups will always be a worry as you go on to cell assays or into animal models. You can press on, but your life has become more complicated. If you have something else equally interesting to work on that doesn't have these issues around it, you might be better off heading in that direction.
And to reiterate something that's come up a lot around here recently, and no doubt will again, the key thing about any new papers that report such structures as hits is to see whether the authors recognize any of this. Too often, such compounds get reported as if they were just happy HTS hits with nothing to hide - perfectly legitimate tool compounds, or even candidates for preclinical optimization, why not? But if the people doing such work have missed out on the known issues around their chemical matter, what else have they missed?
So here's a friendly competition: let's see who can spot the first 2-aminothiazole paper of 2015 that reports these compounds as hits in some assay, and let's see if it mentions anything about their promiscuity. The real acid test will be in about a year, when everyone now writing a manuscript has had time to see this one - but people have had time to see the other PAINs papers, and what good has that done them?
Category: Chemical News | Drug Assays
Here's a very worthwhile review of kinase inhibitors by Oliver Hantschel, specifically focusing on the things that you might not otherwise notice. There are plenty of kinase inhibitors available in the catalogs as research tools, and they're listed as "Inhibitor of XYZ Kinase", as if that's all that they do. But what's really going on?
Quite a bit, in some cases. Making a truly selective kinase inhibitor has never been easy, especially in some parts of the kinase landscape. Hantschel makes a point that is worth thinking about - that it was perhaps fortunate that Gleevec (imatinib) was the first approved drug in this class. This is one of the more selective compounds, and its side-effect profile is quite manageable. If some of the later compounds had instead come first, the field might have developed differently. (Keep in mind that for some time kinases were considered basically undruggable, since the binding sites were considered too similar and the likelihood of trouble too high. The only kinase inhibitors known tended to have funky structures and poor selectivity, which was a self-reinforcing situation. I think, though, that people were still surprised that some of the broader-spectrum inhibitors were tolerated, well after the field was an accepted area for drug discovery).
The paper goes over the modern techniques for profiling kinase selectivity, which now can run to assays using hundreds of kinases. At this level of detail, most of the marketed kinase inhibitors still inhibit a dozen different enzymes or so, which is worth thinking about when such compounds are used as research tools - plenty of wrong assignments have been proposed over the years, and plenty more are surely waiting to be made. Modeling can tell you which kinases have the most similar binding sites to the main target, which is a good start. But there are several compounds that are known to strongly inhibit different kinases by binding in completely different fashions, and throwing open a computational search to pick up changes of that size is problematic. Better to do the actual screening, when possible, while remembering that in vitro ranking may not completely translate to what's actually going on in the cells.
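One standard way to boil down such panel data is a single-concentration selectivity score: the fraction of the panel inhibited past some cutoff. Here's a minimal sketch - the panel values are invented for illustration, and real profiling panels run to hundreds of kinases:

```python
def selectivity_score(inhibition_pct, threshold=65.0):
    """S-score style summary: fraction of the kinase panel inhibited
    at or above `threshold` percent at a single test concentration.
    Lower scores mean a more selective compound."""
    hits = sorted(k for k, v in inhibition_pct.items() if v >= threshold)
    return len(hits) / len(inhibition_pct), hits

# Hypothetical single-point data for one inhibitor across a toy panel:
panel = {"ABL1": 98, "KIT": 91, "PDGFRA": 88, "SRC": 40, "EGFR": 12, "JAK2": 5}
score, hits = selectivity_score(panel)
# score == 0.5; hits == ['ABL1', 'KIT', 'PDGFRA']
```

A number like this compresses away a lot (binding mode, potency rank order), which is exactly why the full profiles in reviews like this one are worth reading.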
As experiences in the clinic have demonstrated, there are an awful lot of kinase mechanisms in vivo that we don't yet understand. Dosing these inhibitors in animals and humans has helped to uncover a lot of new biology, but people should keep in mind that this is what they're probably going to be doing with each new compound. Hantschel goes into a couple of good examples of this (RAF and JAK2) where the initial rationale for the inhibitors as drugs turned out to be only part of some much larger and more complicated stories.
So there's plenty to think about just inside the kinase universe. The review goes on, though, to detail the many and various off-target effects that have been seen. A variety of non-kinase proteins can be picked up, including bromodomains, nucleotide-processing enzymes, oxidoreductases, and more. Some of these could be relevant to the cellular phenotypes that could otherwise be ascribed to kinase inhibition. It's important to keep in mind that kinase inhibitors are, in many cases, targeting binding motifs that have been re-used a number of times in living systems. The worries from thirty years ago about the whole field being untouchable were overblown, but it doesn't mean that there aren't plenty of surprises waiting.
From a drug development standpoint, those surprises are a bug. From a basic research standpoint, though, they're a feature. There's really no way that we could have learned so much about all these pathways other than by developing all these compounds and trying to untangle the results they've produced, and the untangling is going to continue for quite some time to come.
Category: Cancer | Drug Assays | Drug Industry History
January 14, 2015
Moderna Therapeutics, the mRNA-based company, is staying busy. They've got so much work on their hands that they're spinning out different therapeutic units - Valera, for example, for infectious diseases, and Onkaido for oncology.
And the Valera unit is part of a new deal with Merck, which brings in another $100 million in funding. That, as Luke Timmerman notes here, means that Moderna has raised over a billion dollars in the last four years or so, five hundred million of that in the last month. They must have an awfully impressive set of PowerPoint slides - that's a walloping amount of cash for a technology that hasn't proven itself in the clinic yet.
But I can see where people are coming from. If Moderna's stuff does work out, the biologics market will be upended, and a lot of untouchable drug targets will suddenly come into play. Using artificial mRNAs to cause your body to make proteins on demand is an ambitious goal, but there must be some pretty eye-opening data coming along to get Merck, AstraZeneca, and many others to put up this much money this early.
Category: Business and Markets | Infectious Diseases
You may remember Anil Potti, the cancer researcher at Duke whose biomarker-driven therapies turned out to be so poorly designed as to be useless. (Or you might recall the bizarrely clumsy firm that he hired to try to burnish his online reputation).
But what you probably don't know (I certainly didn't) was that someone in Potti's own research group, a third-year med student named Bradford Perez, had figured out that things were going wrong and had reported his concerns to the university. You wouldn't know it from Duke, though, which has stated that it received no such whistleblower reports. The Cancer Letter, however, has the memos and e-mails, which flatly contradict the university's statements. This has come to light via a lawsuit from the families of some of the affected patients, and will no doubt make interesting reading at the upcoming trial.
Whatever its legal significance, the memo and the flurry of emails it touched off provide new insight into Duke’s handling of the Potti controversy:
• The memo shows that, by ignoring the content of the Perez memo, Duke’s deans allowed Nevins to investigate his protégé himself.
• Responding to Perez’ memo, Nevins and Potti promised to conduct a review of the data in April 2008. A thorough, unbiased review of this sort would have produced evidence of fraud, statisticians say.
• Emails demonstrate, step-by-step, how Duke officials convinced Perez to present his principled stance as a difference of opinion between him and two senior scientists.
Perez started to realize the situation he was in during the review of a paper he was publishing with Potti in the Journal of Clinical Oncology. Reviewers had noted the questions raised by Keith Baggerly and colleagues about early work in the Potti group, and were asking for more details about the statistics in this manuscript. And when he started digging into that, he found (as he put it in an e-mail to a third party) that the lab's techniques for validating its methods amounted to "erasing the samples that don’t fit the cross validation from the figure and then reporting the cross validation as meaningful and justification for a good predictor".
Perez, after several months of trouble, ended up writing a detailed memo on all this to a director at the medical school, Joseph Nevins. He laid out exactly what had been going wrong, in detail, and went on to say:
“At this point, I believe the situation is serious enough that all further analysis should be stopped to evaluate what is known about each predictor and it should be reconsidered which are appropriate to continue using and under what circumstances.
“By continuing to work in this manner, we are doing a great disservice to ourselves, to the field of genomic medicine and to our patients. I would argue that at this point nothing should be taken for granted. All claims of predictor validations should be independently and blindly performed. Unfortunately, since validation databases on the supplementary website have been shown to be misrepresented in multiple situations, those datasets should be obtained from their respective sources through channels that bypass the researchers.”
As things turned out, he was completely correct. What was the reaction from Nevins and from Duke? To ask him not to bring these complaints forward to anyone else, and to promise an internal investigation. This was still two years before all the trouble came to light, and before another round of suspect trials had even started - despite promises that all the data would be re-evaluated. Perez left the Potti lab (understandably), but the university presented this situation to the Howard Hughes Medical Institute (the source of funding) as a "difference of opinion" between a student and a professor, and stated that "It is important to note that there have been no allegations of scientific misconduct". But that wasn't the case. As the various emails show, the phrase "research fraud" had already come up, and not for the last time, either.
Bradford Perez's part in exposing all these problems has been unknown until now - well, unknown to everyone, apparently, except a long list of higher-ups at Duke. I'm glad to see him getting his due. The article quotes Donald Berry of MD Anderson, a guy who knows his clinical research statistics, saying:
"Brad Perez is a hero. . .(but) there is more to this story than the heroic and principled actions of an erudite young man and the shame that has befallen a great university in blindly and selfishly defending its own. It is indicative of a lack of understanding of the scientific method among many scientists.
“The Duke scandal is extreme, to be sure. But irreproducibility in academic research is common. And the reward structure and complacency of universities is to blame. . ."
Quite so. (And yes, it's not like there are no problems with the reward system in industrial research, either). But Duke did this to themselves, and let Anil Potti do it to them, despite (as is now clear) numerous opportunities to have caught things earlier. (They only really got into gear once Potti's CV turned out to have been enhanced with things like a nonexistent Rhodes Scholarship). The Potti scandal was and is disgraceful, and so was the university's handling of it. But faculty and administrators at other universities shouldn't kid themselves into thinking that this was just a Duke problem. Things like this can happen all over the place - the opportunities and the incentives are there. There is a constant supply of people like Anil Potti, and a constant supply of administrators who don't want to hear about their conduct, and who are willing to stall and obfuscate in the hopes that such problems will just go away quietly. I'm not so sure if there's such a constant supply of people like Brad Perez, but we can hope.
Category: Cancer | Clinical Trials | The Dark Side
January 13, 2015
Here's a paper in Nature Chemistry on computational simulation of GPCR activation, using the beta-2 receptor as a model. I'm writing as someone who's worked on GPCRs, who is interested in such mechanisms, but who is not a computational chemist. And as such, I have some real reservations about the paper. Are my misgivings well-founded or not?
What this team at Stanford has done is a massive amount of molecular dynamics work, attempting to capture details of the conformational changes that have to be taking place during receptor activation. They present plots illustrating the movements of key residues versus time for agonists, antagonists, and inverse agonists, at a very fine level of detail. I am not competent to judge the fitness of their MD software, their use of Markov state models, the methods by which they reduce 3,000 of those MSMs down to a ten-state model of the receptor dynamics, and so on. I'd be glad to hear from people who are - for now, I'll assume that all this has been done to a high standard.
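For those who (like me) haven't worked with Markov state models: the basic object is a matrix of transition probabilities between conformational states, from which long-time behavior such as equilibrium state populations falls out. Here's a toy three-state sketch - the matrix is invented and bears no relation to the actual model in the paper:

```python
def stationary(T, n_steps=500):
    """Equilibrium populations of a row-stochastic transition matrix,
    found by repeatedly propagating a uniform starting distribution."""
    n = len(T)
    pi = [1.0 / n] * n
    for _ in range(n_steps):
        pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
    return pi

# Toy model: an "inactive" state that's sticky, an "active" state that's
# less so, and an "intermediate" that exchanges with both.
T = [
    [0.90, 0.08, 0.02],  # from inactive
    [0.20, 0.60, 0.20],  # from intermediate
    [0.05, 0.15, 0.80],  # from active
]
pops = stationary(T)  # sums to 1.0; "inactive" ends up most populated
```

The real work, of course, is in estimating those transition probabilities from mountains of simulation data, and then lumping thousands of microstates down to a handful of interpretable ones - which is the part I'll leave to the computational folks to judge.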
But the paper does get into some areas that I feel able to question. Fellow chem-blogger Wavefunction's Twitter account first alerted me to the feature that I find most disturbing. Take a look at these structures. They're supposed to be from the GPCR Ligand and Decoy Database, published here, which this paper used as a source of more agonists, antagonists, and comparison compounds. But when I search the beta-2 files from that server (both agonists and antagonists, ligands and decoys), I can't find any compounds with the left-hand sides of compounds 2 or 3. They're not in the ZINC database, either, as far as I can tell. (A minor point is that the current paper refers to them as catecholamines, but that's not right, either: a catechol is a dihydroxybenzene, and there's an extra methylene in these. Such structures can be found in beta-adrenergic ligands, but they're not catecholamines).
Then there's the more serious matter of that hemiaminal. That's not the usual pharmacophore for an adrenergic receptor, which has amine and OH on adjacent carbons, and it's not even really a stable group under most conditions. That was the first thing that struck me when I saw the structures - how are these things GPCR ligands, when I wouldn't even be sure that they're stable in buffer?
So I don't know where these compounds come from, or if some mistake has been made along the way, but that's how it looks to me after a bit of digging around. I took a look through the literature for structures like these, and I did find some in an old Theravance patent, WO03042164 (see compounds 16 through 26). Looking over the patent procedures, though, I think that those structures are a mistake. The claimed chemical matter (and the synthetic procedures used in the rest of the patent) are all directed to making traditional beta-hydroxy-amino structures. The patent has a run of these hemiaminal things, supposedly made by the same coupling procedures that make the real beta-receptor ligands in the rest of the patent, and that's not going to work. My guess at the moment is that some beta-receptor ligand database has been corrupted by inclusion of these structures, which may well have propagated from this application or others in the series. At any rate, I don't see, at the moment, how these things are beta-receptor agonists, and I would have to say that running molecular dynamics simulations on them is not the best use of computing cycles.
And that brings up one last problem I have with this paper, which may be a minor one (or maybe not). The title is "Cloud-based simulations on Google Exacycle reveal ligand modulation of GPCR activation pathways". I can't help but thinking of something Bill James (the baseball statistics guy) wrote back in the 1980s. He was doing one of his exercises to try to predict performance, and said that he'd done a "computer projection". But then he backed up, and said that he figured that this phrase was probably going to disappear from use in the coming years, because the only way to do these things was with a computer, and you didn't say that you'd done a "pencil projection" or something. The tool would become so common that it would disappear into the background. To a large extent, that's just what's happened - but it was a sufficiently novel thought back then that I found it striking.
So I have to wonder, perhaps unfairly, if the "Google Exacycle" part is there to bring in some more attention. It's true that the cloud-computing aspect of this work did allow the authors to do a lot more than the usual MD simulation, and it may well be a difference of kind rather than just a difference of degree (again, I'd be glad to hear from computational folks on this point). But it can't help sounding cutting-edge, can it?
Update: Both Stuart Cantrill of Nature Chemistry and lead author Vijay Pande have shown up in the comments section, and I appreciate both of them coming by. These issues are being looked at - more later as details become clear.
Second Update: Prof. Pande says in the comments that this is indeed a drawing error. The correct structures are more reasonable beta-receptor ligands, and those are the ones that were docked. The paper is being corrected.
+ TrackBacks (0) | Category: In Silico
January 12, 2015
Bad things get said around here about rhodanines. That chemical class tends to hit in a lot of assays, especially assays against hard targets, leading to thoughts like the ones I was expressing the other day and in this post as well. The problems with these compounds are many (and are mentioned in those blog posts linked above). They include the meta-problem of the sheer number of targets that these compounds can hit (making them problematic as leads or tools). That promiscuity is due to a number of mechanisms - direct binding, photoreactivity, and now we can add hydrolysis to the list.
Here's a new paper from Nature that shows that some rhodanine-based activity against penicillin-binding proteins and beta-lactamases comes, not from the parent compound, but from a thioenolate hydrolysis product.
This was the unexpected result of X-ray crystallographic work on the lead series, and it makes things rather hard to interpret:
". . .there are concerns regarding the promiscuity of rhodanines. Some consider rhodanines to be promiscuous inhibitors due to their frequent appearance in high-throughput screening. However, rhodanines have also been described as ‘privileged scaffolds’ for inhibitor development. In cellular/biological contexts, promiscuity can make it difficult to correlate biochemical and biological outcomes with rhodanines. An important outcome of our work is the addition of another layer of complexity to the interpretation of rhodanine-based inhibition. . ."
So if you see a screening paper cheerfully reporting a rhodanine-based hit, and they haven't checked this issue, or the other known ones, then you know that you are reading inadequate work. And you can wonder what else has been missed.
+ TrackBacks (0) | Category: Chemical News | Drug Assays
The J. P. Morgan Healthcare Conference is going on right now, so there's a lot of information coming out from the companies attending. One thing I noticed this morning was another blast of successful data for nivolumab from Bristol-Myers Squibb. A trial of this anti-PD-1 antibody was stopped early due to efficacy (not something that most of us get to experience very often!), and this follows up on encouraging data reported last fall. From all indications, both cancer patients and BMS are going to do very well indeed with this compound - the number of clinical trials and combinations being studied with it must be hard to keep track of by now.
+ TrackBacks (0) | Category: Cancer | Clinical Trials
January 9, 2015
In case you were wondering, Chuck Norris does not know what he's talking about when he talks about drug discovery and development. In other news, I don't know what I'm talking about when I talk about martial arts or TV/movie production. But I'm not trying to tell Chuck Norris about those things, now, am I?
Update: Chemjobber finds that these are not only borrowed ideas on the part of Norris, but borrowed words as well.
+ TrackBacks (0) | Category: Press Coverage
I don't want people to get the impression that I automatically scorn academic drug discovery efforts. That paper yesterday was solid stuff, for example. And this new paper was pointed out to me last week by someone who wanted an opinion from industry. It's from an Australian team (Monash and Queensland), looking at fragment-based drug discovery against the bacterial enzyme DsbA. That's a potential virulence target, a therapeutic option that hasn't made it as far as the bactericidal or bacteriostatic ones. But there are certainly infections for which decreasing virulence would be desirable, and might at the very least get patients out of some very nasty acute situations. Since we're having trouble coming up with new agents to kill gram-negative bacteria, perhaps we could defang some of them instead. Full disclosure: it turns out that I've worked with one of the authors on the paper, although I didn't realize that until after I'd read most of it!
These bacterial oxidoreductases are challenging targets - as is the case for many prokaryotic enzymes, there are few small molecule inhibitors known, and in this case, the binding pocket is more of a binding groove. But fragment-based drug discovery is actually a good way to answer that question of how druggable a given site is: if you run a good-sized diverse set of fragments across it and get nothing at all, you are in for a rough time. In this case, a set of >1000 compounds (from Maybridge) was screened, and there were, in fact, some hits. Several structural classes fell out after NMR screening, but SAR follow-ups showed (as is always the case) that some of them had stronger legs than others.
A set of phenylthiazole acids performed the best, and cocrystallization experiments actually yielded a high-resolution structure of one of these compounds in the DsbA substrate-binding region. This structure suggested a region to extend the structure, so amides of the carboxylic acid were prepared. Sure enough, several of these were active, more potent than the parent, and one gave another X-ray structure. This one had its odd features - the tyrosine amide wasn't aligned where they expected it to be - but that's a common phenomenon in fragment-based work. Compounds do what they want to do, not what we think they should.
So what do I think of this work? It seems fine to me. They have two different NMR assays, SPR as an orthogonal method, and X-ray structures. That's a classic setup for fragment work, and this seems to be a straight-down-the-middle illustration of how the technique is supposed to work. There aren't very many analogs reported, which makes me think that chemistry resources might have been the limiting factor. If you're expanding via an amide linkage into unknown territory, that's the "Tally Ho!" signal in an industrial lab to crank out a big diverse set and take advantage of some easy chemistry. The best compound in this paper is still only about 200 micromolar, and its ligand efficiency is not that impressive. But this is a snapshot of a very early point in a project; there's a lot more to be done.
This brings up another point: I think that fragment-based work is actually quite suitable for a lot of academic drug discovery efforts. People already have some of the most important instruments (NMR, for one), and the compound collections have a low barrier to entry, since they're smaller and can be bought into pretty easily. You can't just stumble into the area and expect to crank out good work (what area of drug discovery, or science in general, is that true for?). But if you want to start finding ligands or inhibitors against a given target, fragment-based discovery is probably the quickest way to get into the game.
Even at the high-micromolar potencies seen here, the final compound in the paper shows effects on bacterial cultures that are consistent with its mechanism of action. Inside a drug company, this would be an example of hit-to-lead work, with the result ready to be turned over to a med-chem team for heavy-duty optimization. I've seen far shakier project packages than this one, for sure, and based on what's in this paper, I would have no problem recommending it for further work. (I'm assuming that the decision of "Do we want to go after virulence targets?" has already been made, of course - you have to clear up those sorts of questions before you even start). This is not one of those spectacular, jaw-dropping fragment papers where people report what looks like about ten minutes of work to zoom down to a nanomolar compound. No one doing industrial anti-infective drug discovery is going to be knocked over by this, but no one's going to bury their head in their hands, either, and that's an all-too-common reaction. It's fragment-based drug discovery just as I've experienced it myself - a solid piece of work, and the team reporting it clearly know what they're doing.
+ TrackBacks (0) | Category: Infectious Diseases
January 8, 2015
Here's a good article on Scientific Crackpottery, from someone who got to experience it first-hand. The author distinguishes three classes that might be confused with each other. First are the mountebanks, who play to the general public, and who, he's convinced, know deep inside that they're fooling their audience, and enjoy it. His example, which I would endorse completely, is Deepak Chopra. I'm tempted to add Dr. Oz to the list, although I think he's got enough capacity for self-deception (and enough self-regard) that he imagines that he believes his own press releases and thinks that he's out there as a force for good in the world.
Then you have the scientific con men, who arrive with what looks like a world-shaking result that turns out not to exist. Their motivation, at least in large part, is a too-ambitious reach for glory. And finally, you have "heretic-heroes", who take odd, unheard-of, or unpopular positions (like the other two groups might), but are still approaching them like scientists. They are willing to be proven wrong, and sometimes they're proven right.
That's a useful razor to slice away the cranks and the crackpots from those who deserve a hearing. Ask them what experimental result might make them change their minds, what the strictest test of their ideas might be. If someone is holding on to their views no matter what the evidence, they've gone off the rails.
+ TrackBacks (0) | Category: Snake Oil
A team from Northeastern University/Bonn/Novobiotic (and Selcia) has published a very worthwhile paper in Nature on a new antibiotic, with a new mechanism of action, via a new discovery technology. The compound itself is not the world-changing new last line of defense that everyone's hoping for, but it's nothing to sneeze at, either. And the platform used to find it is worth keeping an eye on.
A lot of people have had similar ideas to this one, based on the fact that the overwhelming majority of bacteria in any given environmental sample can't be readily cultured. These organisms may well be able to produce useful antibiotics and other natural products, but how will you ever be able to tell if you can't fish any of them out? In this work, Kim Lewis and Slava Epstein at Northeastern came up with a gizmo (the "iChip", a name that you'd think would have been taken several times by now), that tries to get around this problem. After taking a soil sample and diluting it into media, you dispense aliquots into the wells of this chip. Then the chip is placed back into the soil, in the same place the sample was taken from, where semipermeable membranes allow environmental factors (whatever they may be) to diffuse across. This gives, apparently, a much higher culture success rate.
Using this on a soil sample from Maine and leaving the chip in situ for a month, a number of colonies formed. These were tested for their ability to grow outside the device in fermentation broth, and extracts of these were tested against pre-grown lawns of an S. aureus strain to look for useful antibiotic activity. Lo and behold, one extract cleared out a large spot - it turned out to come from a newly described bacterium (Eleftheria terrae, provisionally). The compound present has been named teixobactin, and here it is.
Purification of the active broth extracts showed increasing amounts of a compound with MW 1242 by LC/MS. Isolation of this substance (after clearing out the endotoxin that came along with it) allowed a good deal of NMR spectral assignment work, and coupling this with the identification of the biosynthetic gene cluster and degradation/derivatization analysis (advanced Marfey's method) has allowed the structure to be assigned. It's a pretty chewy one, with two rare amino acids and four D-amino acids. That would account, you'd think, for its properties: teixobactin has reasonable microsomal stability and shows useful PK after an i.v. dose in mice. (The authors also checked it for hERG, CYP inhibition, and plasma protein binding - the sort of thorough med-chem workup that you'd like to see more often, honestly).
And let's pause a moment to reflect on the world we live in these days. This new bacterium's genome was totally sequenced, as a matter of course, because we have the tools to do that without a second thought. Not all that long ago, just figuring out what this organism might be related to would have been a whole project of its own. Its biosynthetic pathway was thus laid bare, and our accumulated knowledge of nonribosomal peptide synthetase proteins made clear how this compound is produced, which is another thing that would have taken a big chunk of someone's life in the old days. And then modern LC/MS and NMR techniques made comparatively short work of the structure, and any organic chemist should realize what that would have been like. Had anyone stumbled on teixobactin in the 1950s, back in the golden age of antibiotic discovery, they could have easily spent their career on it.
So how useful is the compound? It's active only against gram-positive organisms, which is too bad, because we could really use some new gram-negative killers (their cell membranes make them a tougher breed). But the mechanism of action turns out to be interesting: studies of S. aureus with labeled precursors showed that teixobactin is a peptidoglycan synthesis inhibitor, but extended exposure and passaging did not yield any resistant strains. That's close to impossible if an antibiotic is binding a particular protein target - that kind of selection pressure will usually turn up something that evades the drug. When you don't see that, it's often because there's some nonspecific non-protein-targeted mechanism, which can be problematic, but teixobactin isn't toxic to eukaryotic cells in culture (and has a favorable tox profile in mice as well). It turns out that it binds to some of the peptidoglycan precursors, lipid II and lipid III. Vancomycin has a similar mechanism (binding to lipid II), but teixobactin has a wider spectrum of activity against lipid II variants (and lipid III as well). This mechanism makes developing resistance not so straightforward - the selection pressure is more of a bounce shot than a direct hit.
So overall, I'd say that the compound could be a promising alternative to vancomycin, and that's no bad thing. There are vancomycin-resistant strains of S. aureus out there, and if you get infected by one, you're going to need all the help you can get. I would assume that Novobiotic is working on the development, and I hope it goes well. There are plenty of challenges ahead (a reproducible scale-up fermentation being one that comes immediately to mind), but the compound has a good preclinical start on things. Teixobactin itself is not going to save the world from the oncoming problem of bacterial resistance, but it represents a promising line of attack.
This idea of going after cryptic bacterial strains has been around for a long time, but getting it to work has been another thing entirely. This is the most solid example that I'm aware of, and I hope that it's just the opening of a new platform for antibiotic drug discovery. The traditional search for natural product antibiotics has pretty well come to a shuddering halt over the years - no matter how much effort you put into increasingly exotic soil samples and the like, you keep finding the same things (if you find anything at all). Unculturable organisms are the new frontier, and it's safe to say that the iChip is going to be nowhere near the last word in exploring it. And at the same time, you have outfits like Warp Drive Bio trying to get organisms to express unusual compounds that aren't normally seen, so the hope is that there are a lot of useful things out there that we have never heard of.
As an aside, if you're outside the field, you might wonder why it's worth working so hard to find natural products when we have so many synthetic organic chemists in the world cranking out new compounds. One big reason is the ridiculous, insane hugeness of chemical space: the number of possible compounds at or under the molecular weight of teixobactin defies description, and I mean that in a completely literal sense. There are not enough resources on Earth, or in our entire solar system, to do enough organic synthesis to make any noticeable dent in that array. The idea of having compounds that bind to things like Lipid II is a good one, but they're going to have to be large compounds, and exploring that space is daunting.
And then there's the evolutionary factor. Bacteria are out there elbowing each other for space and nutrients every minute of the day, and they've been doing it for billions of years. They've had the time and motivation to come up with molecules that work to kill off their rivals, and we should take advantage of that legacy as much as we can. These molecules have not only had rigorous real-world tests of their mechanisms of action, they've also (perforce) had their properties optimized against their target organisms as well. Believe it, most large peptidic-looking things like teixobactin would have awful stability and pharmacokinetics as drugs. Finding the ones that don't is nontrivial indeed.
+ TrackBacks (0) | Category: Infectious Diseases
January 7, 2015
I don't often note world events here. But since a blog is the embodiment of freedom of speech and freedom of opinion, I'm taking some time to remember the journalists in France who were killed today over nothing more than that. As Stéphane Charbonnier, one of the victims, put it himself a couple of years ago:
"Je n'ai pas peur des représailles. Je n'ai pas de gosses, pas de femme, pas de voiture, pas de crédit. ça fait sûrement un peu pompeux, mais je préfère mourir debout que vivre à genoux."
("I am not afraid of reprisals. I have no kids, no wife, no car, no mortgage. It surely is a bit pompous, but I'd rather die standing than live on my knees")
Australian cartoonist David Pope has it right.
Update: the comments section on this post is now closed.
+ TrackBacks (0) | Category: Blog Housekeeping
Since I've been following the progress of Zafgen and their unusual drug candidate on the blog, I wanted to note that it just resoundingly passed another Phase II trial. This one was in patients who've gained weight due to hypothalamic injury. The Phase III is in the works.
Zafgen's compound is one that most of us would have put a red X through as soon as we saw its structure. Since I was writing earlier today about odd structures and the problems they can represent, it's important to keep the other side of the argument in mind: if you have solid data, no structure is too weird.
+ TrackBacks (0) | Category: Clinical Trials | Diabetes and Obesity
When you look into the literature on small-molecule agents for really tricky targets, something stands out to medicinal chemists immediately: the structures start to get strange. Examples of this sort of thing are beyond counting, but this recent paper will serve as well as any. It's from a large multicenter academic team, and proposes several compounds as ligands for Bax, a protein of the Bcl family that's involved in apoptosis and is a potential target for a range of lung cancers. There's a particular serine residue whose phosphorylation has been shown to alter Bax function significantly, and the present paper notes that there may well be a small-molecule-sized pocket nearby.
The work starts off by computationally docking a large collection of molecules from the NCI database to a model of this pocket. 36 out of 300,000 were found to score well, so the team exposed Bax-expressing cells to all 36 of them and looked for effects. Three of the compounds showed apoptotic effects, which were less marked in lung cancer cell lines that expressed less Bax. The compounds were able to compete with a fluorescent protein that also binds to Bax, and this assay gave affinities of around 50 nanomolar. Update: see the comments. These compounds give weirdly similar data.
These results sound promising. But let's look at the structures of the compounds. SMBA1 is not very attractive, for sure. You'd want to make sure that any fluorescence assay involving it doesn't suffer from some sort of interference, because it sure looks like a compound that would make its presence known in the UV/Vis spectrum. I have no experience with these sorts of fluorenylidene phenols, but some stability and reactivity checks would not be out of place.
The problem is, SMBA2 and SMBA3 make that first compound look like ibuprofen. I know that the NCI compound collection has a lot of wooly stuff in it, but come on. These are formaldehyde/amine condensation products. In aqueous solution, they exist in various equilibria depending on pH, and such structures are known to be reactive reagents in organic chemistry. (See the Delépine reaction, among others). SMBA3, in fact, is an intermediate structure from that very reaction, as if someone were trying to synthesize allylamine. I would be very worried about exposing such compounds to cells - there are no guarantees that they're going to remain the way that they're drawn, and the species that form can be quite reactive. Figuring out what's really going on with them would be quite a job.
I can see no indication from the paper that any of this bothered anyone. The only characterization that seems to have been done with these compounds was some DLS, dynamic light scattering, to look for aggregation. Nothing wrong with checking that, but there's no other chemical characterization or purity assessment of the compounds at all, or at least none that I can see. I mean, they're probably what they say on the label, but who can tell? And even if they're pure, how do they behave under the assay conditions?
The problems don't end there. SMBA2 is said to have 57 nM affinity to Bax. But that's an extraordinary value for a compound with a molecular weight of 168 - that kind of ligand efficiency should make a person suspect covalency. Covalent compounds are not necessarily bad, but you do want to examine them closely to see if they're working on the target you think they're hitting. I don't think the paper has any tests for that, such as checking for time-dependent inhibition or looking at the isolated Bax protein from the assay by mass spec. Meanwhile, SMBA3 has 54 nM affinity, and is shown as having a molecular weight of 308. But that includes the iodide counterion, and that doesn't count: this thing is not floating around in cells, nor binding to its putative target, with its iodide partner. Its real molecular weight is about 181, and that also represents a wildly high ligand efficiency. (There's also the question of how a charged quaternary compound like this gets into the cells so easily, but that's another issue).
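To see why those numbers raise eyebrows, here's a back-of-the-envelope ligand efficiency calculation - my own illustration, not anything from the paper. The standard yardstick is binding free energy per heavy atom, and the 13-heavy-atom count I use for the iodide-free SMBA3 cation is my estimate from the drawn structure:

```python
import math

def ligand_efficiency(kd_molar: float, heavy_atoms: int, temp_k: float = 298.0) -> float:
    """Ligand efficiency in kcal/mol per heavy atom: LE = -RT ln(Kd) / N_heavy."""
    R = 0.001987  # gas constant in kcal/(mol*K)
    delta_g = -R * temp_k * math.log(kd_molar)  # magnitude of binding free energy
    return delta_g / heavy_atoms

# SMBA3 without its iodide counterion: MW ~181, roughly 13 heavy atoms
# (my estimate, for illustration); reported affinity 54 nM
le = ligand_efficiency(54e-9, 13)
print(round(le, 2))  # → 0.76
```

That's about 0.76 kcal/mol per heavy atom, against the commonly quoted rule of thumb that ~0.3 is already a very efficient lead - exactly the sort of number that should make a person go looking for covalency or an assay artifact.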
The docked structures of these three compounds with Bax are shown in the paper. The two formaldehyde/amine compounds are shown interacting with two Asp residues, and there had certainly better be some strong interactions if you're going to pull out potencies like these. The positively charged nitrogen in SMBA3, though, would surely seek out a negatively charged Asp in a classic salt bridge if this were the case - and that's not what the docking shows. Meanwhile, SMBA1, the fluorenyl compound, is shown just sort of floating there in space, not doing anything with the Asp residues, and not showing any strong interactions that I can see from the figure at all. Even the phenol doesn't seem to be doing anything, unless that's a pi-stack with Phe176. These docking structures do not, unfortunately, inspire confidence.
So in light of all these objections and complications, let me get back to the point from the first paragraph of this post. When you get weird-looking structures out of a screen for a difficult target, you can explain them two ways. One possibility is that such targets, and such binding sites, are not evolutionarily optimized to bind any particular small molecules, and that regular "drug-like" chemical matter should not then be expected to hit them any more than anything else. Odd sites, in this view, will generate odd molecules. The other possibility, though, is that these things are false positives. The ways in which compounds can fool you are legion - in this case, I've mentioned some of the possibilities already, but there are more where those came from.
One thing that's for sure about targets like this one is that their intrinsic screening hit rates are very low. So that means, necessarily, that if the false positive rate is at any realistic level, then the bulk of what you get out of a screen will be just that: false positives. The challenge in screening these things is to dig through the garbage heap in search of the few hits that might be real. It's not a lot of fun, in some cases, because the list can be long and playing endless whack-a-mole with the compounds on it can be wearying. But you have to give all your hits the same tough love, or you run a significant risk of wasting your time.
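The arithmetic behind that statement is just Bayes' rule, and a minimal sketch makes the point (the numbers here are made up for illustration, not taken from any real screen):

```python
def screen_ppv(true_hit_rate: float, sensitivity: float, false_positive_rate: float) -> float:
    """Fraction of screening 'hits' that are real (the positive predictive value)."""
    true_pos = true_hit_rate * sensitivity
    false_pos = (1.0 - true_hit_rate) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Illustrative numbers: 0.1% of the library truly binds, the assay catches
# 90% of the real binders, and 1% of the inactives light up anyway
ppv = screen_ppv(0.001, 0.90, 0.01)
print(round(ppv, 3))  # → 0.083
```

So with a 0.1% true hit rate and even a modest 1% false-positive rate, more than ninety percent of the hit list is garbage - which is the whack-a-mole situation described above.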
So yes, I think that the odds are good that the compounds reported in this paper are false positives. They clearly seem to have cellular effects, and these may well be mediated by the Bax pathway. But it's a big leap, at least for me, to believe that everything lines up the way it's supposed to here. Odds are that anything reported for this Bax binding site is a false positive, and (although I hate to say it) this paper hasn't convinced me that its authors have given this problem sufficient attention.
+ TrackBacks (0) | Category: Academia (vs. Industry) | Cancer | Chemical Biology | Drug Assays
January 6, 2015
Something to send along to any autism conspiracy theory advocate you might know - especially if you're not interested in ever speaking to them again. And you may well not be!
+ TrackBacks (0) | Category: Snake Oil
I've conveyed my dislike of wide-open office plans several times, and my suspicions of the motives of those who promote them. Here's an article at the Washington Post that confirms my own biases (and is therefore stunningly accurate):
As the new space intended, I’ve formed interesting, unexpected bonds with my cohorts. But my personal performance at work has hit an all-time low. Each day, my associates and I are seated at a table staring at each other, having an ongoing 12-person conversation from 9 a.m. to 5 p.m. It’s like being in middle school with a bunch of adults. Those who have worked in private offices for decades have proven to be the most vociferous and rowdy. They haven’t had to consider how their loud habits affect others, so they shout ideas at each other across the table and rehash jokes of yore. As a result, I can only work effectively during times when no one else is around, or if I isolate myself in one of the small, constantly sought-after, glass-windowed meeting rooms around the perimeter.
That does sound hideous, and I'm very glad that I haven't had to experience anything of the kind. My take remains that radically open plans are beloved by architects, because they give them a freer hand and allow them to sell the latest, hottest thing to their clients. And some high-level people in companies like it, because they want to believe that this trendy stuff will do what it says on the label - make people innovative and productive. And although many employees dislike these setups, some do like the feeling that they're in a new forward-thinking world. But I have yet to see anything convincing, with any hard data behind it at all, that says that open offices are a good idea or do what they allegedly do for a workplace. Mostly, I've seen the opposite.
The only inarguable hard data I've seen on open plans is that they're cheaper. A search for further explanations may not always be necessary.
+ TrackBacks (0) | Category: Life in the Drug Labs
A longtime reader sent along these two items as indications of just how high-end it's getting in some parts of the scientific publishing game. First off, we have King Abdulaziz University (of Saudi Arabia) aggressively recruiting for their "International Affiliate Program". What might this be? Well, here's the deal: you, as a reasonably highly cited academic in some other country, sign up for a salary from KAU (as fans of its sports teams call it) and you only have to show up in Jeddah itself for three one-week visits a year. Did I mention that the salary is $6000/month? And that they pick up business-class airline tickets for you, and that you stay in a five-star hotel while in residence? Well, they do, you know. And what do you have to do in return?
Why, you just have to partner with some Saudi faculty member, work with them on some project or another, and make absolutely sure to publish papers with them. And you also have to make sure that you change your affiliation, in listings like ISI's and other citation-tracking services, to show that you're now part of the KAU team. What could be simpler? King Abdulaziz University gets to ratchet itself up the rankings, you get to sell your good name, and everyone's happy - right?
So from now on, I will assume the worst: if I see any publications from this institution that have a more-highly-cited co-author from outside Saudi Arabia, I will conclude that this person has prostituted him- or herself for cash and a few nights in some Saudi Arabian hotel. And I will also assume that the research itself is likely of little use or interest, other than in the service of boosting citation counts. You can also get a dog to play with you if you loop a piece of steak around your neck, while we're on the subject.
We now turn over another slimy rock, to find ways to build your publication record. Maybe even to the point where King Abdulaziz University might want to grease you with loot - the sky's the limit here. A close look at phrasing across a wide variety of published papers has revealed that certain sentences and paragraph structures seem to appear far more often than one might think. (The most suggestive of these refers to a statistical test that turns out not to exist). All of these seem to go back to teams of Chinese authors, interestingly, which suggests that either these authors, from different fields and institutions, are somehow finding ways to plagiarize each other, or that these are the fingerprints of a common work-for-hire source.
In November Scientific American asked a Chinese-speaking reporter to contact MedChina, which offers dozens of scientific "topics for sale" and scientific journal "article transfer" agreements. Posing as a person shopping for a scientific authorship, the reporter spoke with a MedChina representative who explained that the papers were already more or less accepted to peer-reviewed journals; apparently, all that was needed was a little editing and revising. The price depends, in part, on the impact factor of the target journal and whether the paper is experimental or meta-analytic. In this case, the MedChina rep offered authorship of a meta-analysis linking a protein to papillary thyroid cancer slated to be published in a journal with an impact factor of 3.353. The cost: 93,000 RMB—about $15,000.
Such a manuscript did indeed show up at a likely journal (whose editors had been tipped off by Scientific American), and it was rejected. I wonder if there's a refund policy? Some sort of sliding scale, prorated by the impact factor of where the paper lands? At any rate, these people are not taking such rejections lying down. The Chinese paper mills are actively working to remove the element of chance:
Within two weeks of being contacted by Scientific American, BioMed Central announced that it had identified roughly 50 manuscripts that had been assessed by phony peer reviewers. The publisher told the Retraction Watch blog that "a third party may be involved, and influencing the peer review process." It is possible that these manuscripts came from paper mills. We were able to look at the titles and authors of about half a dozen of those papers. All appear very similar in style and subject matter to other paper mill-written meta-analyses, and all were from groups of Chinese authors.
So let's run the numbers: how many papers do you have to publish, at ten to fifteen thousand dollars a whack, before King Abdulaziz University offers to pay you off at $72,000 per year? Keep in mind that those papers are also bringing in grant money and positions or tenure along the way. The people paying up don't seem to think that they're wasting their money, and the people on the sell side don't seem to think that they're undercharging. Isn't it an inspiring scene?
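One way to actually run those numbers is a break-even sketch: given the $10,000 to $15,000 per-paper prices quoted above and the $72,000-a-year figure from the King Abdulaziz story, how many purchased papers would the payoff cover in the first year alone? (This is a rough illustration only, ignoring the grant money and tenure value that make the real economics even more favorable.)

```python
import math

# Figures quoted in the post (USD)
payoff_per_year = 72_000                 # reported annual payment offer
paper_cost_low, paper_cost_high = 10_000, 15_000  # paper-mill price range

# Papers whose purchase price the first year's payoff would cover,
# at the cheap end and the expensive end of the range
papers_at_low_price = math.ceil(payoff_per_year / paper_cost_low)
papers_at_high_price = math.ceil(payoff_per_year / paper_cost_high)

print(f"One year's payoff covers roughly {papers_at_high_price} to "
      f"{papers_at_low_price} purchased papers")
```

In other words, a handful of bought authorships would pay for themselves within the first year of such an arrangement, before counting any of the other career benefits.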
Category: The Dark Side | The Scientific Literature
January 5, 2015
Via Adam Feuerstein on Twitter, here's an odd story about what goes on at Pharmacyclics. CEO Bob Duggan is apparently encouraging the staff there to take a course in how to "become geniuses".
Duggan, 70, believes so deeply in Barrios' research that he has arranged for employees at Pharmacyclics to take a self-directed program to acquire the 24 genius traits. They might spend about one hour a week learning the characteristics — which include drive, courage and honesty — and complete the course of study over six months.
"It's not compulsory," he explained. "But it's offered to every individual here, and I don't know anyone who hasn't taken it."
He says that employees give rave reviews of the course material. They even share it with their families and friends.
I would be having a fit, personally. Duggan is a major donor to the Church of Scientology, and if you detect the same aroma coming off this stuff, you're right on target. A little Googling shows that this 24-traits-of-a-genius material overlaps with the writings of L. Ron Hubbard himself, back in the days when he was on roughly the same plane of existence as the rest of us. I would be very unhappy working for a Scientology enthusiast, but if extracurricular course work got added to the mix, I'd be out the door so fast you could hear the air clap behind me. Sheesh - I'd rather do a week of Six Sigma training, and there's not much that makes me say that.
Category: Business and Markets