About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany as a post-doc on a Humboldt Fellowship. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek: Twitter: Dereklowe


In the Pipeline


May 3, 2010

The Collapse of Complexity

Posted by Derek

Here's something a bit out of our field, but it might be disturbingly relevant to the drug industry's current situation: Clay Shirky on the collapse of complex societies. He's drawing on Joseph Tainter's archaeological study of that name:

The answer he arrived at was that (these societies) hadn’t collapsed despite their cultural sophistication, they’d collapsed because of it. Subject to violent compression, Tainter’s story goes like this: a group of people, through a combination of social organization and environmental luck, finds itself with a surplus of resources. Managing this surplus makes society more complex—agriculture rewards mathematical skill, granaries require new forms of construction, and so on.

Early on, the marginal value of this complexity is positive—each additional bit of complexity more than pays for itself in improved output—but over time, the law of diminishing returns reduces the marginal value, until it disappears completely. At this point, any additional complexity is pure cost.

Tainter’s thesis is that when society’s elite members add one layer of bureaucracy or demand one tribute too many, they end up extracting all the value from their environment it is possible to extract and then some.
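Tainter's diminishing-returns argument can be put in concrete terms with a toy model (my illustration, not anything from Tainter or Shirky, and the parameters are arbitrary): suppose each new layer of complexity carries a fixed cost, while total output grows only logarithmically with the number of layers. The marginal value of the next layer then starts out positive and eventually turns negative:

```python
import math

def output(layers, k=10.0):
    # Toy assumption: total output grows logarithmically with complexity.
    return k * math.log(1 + layers)

def marginal_value(layers, k=10.0, cost_per_layer=1.0):
    # Payoff of adding one more layer, net of its fixed cost.
    return output(layers + 1, k) - output(layers, k) - cost_per_layer

# Early layers more than pay for themselves...
assert marginal_value(1) > 0

# ...but past some point, any additional complexity is pure cost.
first_unprofitable = next(n for n in range(1, 100) if marginal_value(n) < 0)
print(first_unprofitable)  # → 9: adding a layer on top of nine no longer pays
```

The logarithmic payoff curve and the unit cost are assumptions chosen only to make the crossover visible; the point is the shape of the curve, not the particular numbers.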

Readers who work in the industry - particularly those at the larger companies - will probably have just shivered a bit. To my mind, that's an eerily precise summation of what's gone wrong in some R&D organizations. Shirky talks about internet hosting companies and the current dilemmas of the large media organizations, but there's plenty of room to include the drug industry in there, too. Look at the way research has been conducted over the past thirty years or so: we keep adding layers of complexity, basically because we have to - more and more assays and screens. It used to be (so I hear) all about dosing animals. Then you had cell cultures, then cloned receptors and enzymes came along (we're heading out of the 1970s and well into the 1980s now, if you're keeping score at home). Outside of target assays, the Ames test came along in the 1970s, and there were liver microsomes and isolated P450 enzymes for stability, Caco-2 cells for permeability, hERG assays to look out for cardiac tox, et cetera. You can do the same thing for the development of animal models - normal rodents, then natural inbred mutations, then knockouts, humanized transgenics. . .you get the picture.

As I say, we have very little choice but to get more complicated, because our knowledge of biology keeps expanding. But while this is going on, everyone keeps thinking that all this new knowledge is (at some point) going to start making things easier - a future era known, informally, as "when we really start figuring all this stuff out". It hasn't happened yet. If you're someone like Ray Kurzweil, you expect this pretty soon. I don't, although I hold out eventual long-term hope.

Shirky's message for the media companies is that their high-value-added lifestyles are being fatally undermined. We're not facing the same situation in this industry - there's no equivalent of free YouTube stuff eating our lunch, and I'm not expecting anything in that line for a long time, if ever. But the complexity-piling-on-complexity problem is real for us, nonetheless. If the burden gets too heavy, we could be in trouble even without someone coming along to push us over.

Comments (35) + TrackBacks (0) | Category: Business and Markets | Drug Development | Drug Industry History


1. Discoking on May 3, 2010 9:01 AM writes...

Fantastic post!

2. Cartesian on May 3, 2010 9:38 AM writes...

It is true that complexity keeps growing, but sometimes people come along with new theories that reorganize things (make them more compatible) and thereby simplify them.

3. Laurit on May 3, 2010 9:39 AM writes...

Of course, if you pile up enough complexity you might also get emergent behaviour... which is pretty much what Kurzweil is hoping for.

4. barry on May 3, 2010 9:47 AM writes...

thanks for bringing this one in--it's beyond my usual reading horizon.
Big Pharma now claims it costs a billion dollars (USD) to bring a small-molecule drug to market. I'm sure no small company ever spent that, because no small company ever had a billion dollars.
We're a long way from Ehrlich's first anti-microbials, when one made a compound, crystallized it, dosed it in a guinea-pig model and went straight to man. Few diseases have such good correlation of animal model to human disease. Having said that, we're spending too much money (i.e. time) trying to ensure that nothing will fail in the clinic. When only the clinical trial can give the next answer, it's time to launch the clinical trial or sell the candidate.

5. emjeff on May 3, 2010 10:11 AM writes...

#4, Barry - it must be nice to live in your world. Here in the real world, the reason Pharma is trying so hard to make sure that the next trial does not fail is that the cost of failure is now unsustainable. It used to be that failed Phase 3 trials were the cost of doing business; today, however, failure in Phase 3 means layoffs and disaster for thousands (remember torcetrapib?). Big Pharma can be (and should be) berated for many things it does wrong, but berating us for trying to avoid failure is bizarre.

6. barry on May 3, 2010 10:25 AM writes...

re: #5
The cost of a failed clinical trial is so high because Big Pharma only plays for the blockbuster. Merck et al. were the most profitable corporations in the US for decades while they were doing small-molecule research and running clinical trials. Some of those trials failed, and that was "the cost of doing business". The current regime, in which every clinical failure means a bankruptcy or a layoff, is not a constant of nature; it's a bad business model.

7. RandDChemist on May 3, 2010 12:06 PM writes...

Necessity is the mother of invention.

There are so many tools available to the biologist and chemist, but (as the post points out) they will not make things easier any time soon - barring some sort of breakthrough, I suspect.

A discerning mind is required in the drug business. Using the right tool for the job is critical. That means not being afraid to fail, rewarding risk-taking, and avoiding fads. Do what needs to be done and avoid being side-tracked.

When there is a seemingly limitless number of things to consider, it can overwhelm the scientist. It is all too easy to become distracted and lose sight of the true purpose: bringing a medicine to those who need it.

It's not simple, it is not trivial and it is certainly not for the faint of heart. Looking for guaranteed returns? Not here. High risk, high reward stuff here.

Could microdosing work? It could help. All the animal models in the world won't guarantee success once the drug gets into people.

A good drug does not need to be perfect. It just needs to work well and as expected.

8. qetzal on May 3, 2010 12:32 PM writes...


That means not being afraid to fail, rewarding risk-taking and avoiding fads. Do what needs to be done and avoid being side-tracked.

That can be a tall order if you have a regulatory agency that expects you to do X, Y, and Z, regardless of your judgement that they may be fads, side-tracks, or unnecessary.

The things Derek mentions (cloned receptors, Ames, liver microsomes, Caco-2, hERG, etc.) are all things that FDA expects to see now, right? If someone with a new drug candidate thinks one or more of these is unnecessary, but FDA disagrees, they have two choices: 1) do them anyway, which adds cost, or 2) spend extra time convincing FDA otherwise, which adds cost.

No doubt there are side-tracks and fads that pharma should avoid, but I think the regulatory environment still virtually guarantees increasing complexity over time. Unfortunately, I've no easy solution for that.

9. barry on May 3, 2010 12:58 PM writes...

re: #7
I was totally enamored of microdosing a few years ago, until I talked with some people who had used it. It turns out that for most drugs there are a large number of compartments into which the drug substance partitions. Most of these can be ignored or lumped together at therapeutic drug levels, but that's not so at the microdose. A high-affinity compartment of very low capacity can totally dominate PK at the microdose yet mean nothing at the therapeutic level.

re:#5 Torcetrapib was a Hail Mary pass from an organization that wasn't bringing up enough clinical candidates to sustain their burn rate. It was an interesting clinical test of a wholly novel biological target. Even more conservative candidates aimed at well-proven disease targets can fail in Phase III (ask Abbott). If you can't afford to lose some of your bets, you're playing over your head. That's as true in the Clinic as it is in the stock market or the casino.

10. Medicamenta vera on May 3, 2010 1:25 PM writes...

Thought-provoking article by Clay Shirky (especially the bit about the phone conference with AT&T).

I take exception, however, to Derek's examples of drug discovery complexity that is sinking the ship. Microsomal assays, Caco-2, and other assays (e.g., micro-Ames, photosensitivity testing, etc.) have raised the bar, but by-and-large are advances in guiding SAR and identifying better lead compounds.

I propose that "organizational complexity" (i.e., multi-layered management) is a more important culprit in the decrease of new chemical entities per research dollar spent per annum. A number of years ago, when I worked at Pfizer, I asked my department head (in Cardiovascular) who was at the top of the organizational structure for our therapeutic area, and what criteria they were using to evaluate the TA. At that time Cardiovascular Therapeutics was spread across three sites, and my recollection is that there were three or four management groups above the TA level. The department head did not know who or what was happening beyond two levels.

Fantasize, if you will, how liberating it would be not to have to devote time to committees that are chartered by management and then ignored by management; not to have to spend 15%-plus of your time doing human resources crap; not to suffer from initiatives that are insufficiently beta-tested so that some higher-level functionary can make the target date for their performance appraisal; and not to make the same presentation multiple times to different management groups.

There was an initiative at one BigPharma I worked at to reduce the number of meetings, with the goal of creating more time for drug discovery. As a result, the quarterly portfolio review with upper management was reduced to four meetings per year - tremendous savings there. Another upper-level management initiative, at another BigPharma, was to have more spontaneous and real-time updates of discovery program progress (translation: more slides went into backup, and more backup slides were presented).

11. Jose on May 3, 2010 3:05 PM writes...

"Microsomal assays, Caco-2, and other assays (e.g., micro-Ames, photosensitivity testing, etc.) have raised the bar, but by-and-large are advances in guiding SAR and identifying better lead compounds."

Please, could you point me to the empirical evidence base to support this statement?

12. AlchemX on May 3, 2010 3:46 PM writes...

A nice complement to this post would be George Whitesides' TED talk on science and simplicity.

13. anonymous on May 3, 2010 8:43 PM writes...

"We're not facing the same situation in this industry - there's no equivalent of free YouTube stuff eating our lunch, ..."

There happens to be this little company from Israel called Teva. You may have heard of them?

And don't forget all the trial lawyers who suck the blood of the industry.

14. milkshake on May 4, 2010 12:42 AM writes...

I think the fatal flaw of big pharma is the change of culture from research-based to finance- and marketing-based. The story goes like this: Pharma in the 80s was phenomenally successful; there was double-digit stock appreciation year after year, it looked like a safe bet for investors big and small, and the demand for pharma stock drove the price still higher. Giant pharma apparently attracted more people with corporate-law and business backgrounds, who then populated the upper echelons of management, brought in business productivity consultants, instigated megamergers, and above all fucked up the free-wheeling research culture with all kinds of management fads and reorganizations. While they set everyone up for failure and made totally unrealistic promises, they awarded themselves eight-figure bonuses and retirement packages. They could afford to do all this because the research-to-development-to-approval-to-market cycle is so slow that management is decoupled from responsibility for pernicious decisions about research.

The Soviet Union had famines and industrial disasters, lived in a state of permanent crisis and disrepair, and finally collapsed - not because it was more complex than Western societies: quite the opposite, in fact. It's just that Soviet ideology and management were rotten, and it takes forever for a behemoth like that to fall on its own.

15. processchemist on May 4, 2010 4:17 AM writes...

I totally agree with milkshake. From manufacturing all the way up to R&D, resources and focus have shifted from the real thing to the financial paper.
In the current way of thinking, a PowerPoint presentation of a new value chain is much more important than the underlying reality.

16. John on May 4, 2010 9:39 AM writes...

The Evil MBA theory is comforting, but I think what we are really looking at is the normal process of technology aging, in which more and more complex processes are brought to bear to achieve a smaller and smaller return on investment.

When cell phones first came out it was a society-changing event. Lots of features have been added over the ensuing years, each requiring more complex technology and arguably having less impact on society than the previous one. I've heard that one company is in negotiations with the auto companies to create a feature in which you can start your car with your cell phone. We are probably reaching the limits of truly innovative and useful features that can be added to a device with a 3 inch screen. But my friends who are software engineers all believe they are going to retire in 30 years having worked their whole lives developing innovative cell phone technology.

In medicinal chemistry we've gone from curing syphilis and bacterial infections in the early 20th century to creating 4th-generation antidepressants. In the 1990s we converted AIDS into a chronic illness, but this was only possible using technology that was not available 30 years earlier. And I suspect a pharmacological cure for most cancer patients is just as improbable as a pharmacological cure for broken bones. In the future, a disproportionate share of the real healthcare game-changers will come from other technologies, possibly including medical devices, stem cells, cancer vaccines, RNAi and stuff that hasn't even been thought up yet.

17. Hap on May 4, 2010 10:07 AM writes...

The "Evil MBA" theory is flawed, but fits a couple of data points:

1) Emphasis on quantitative measures of productivity (which don't seem to correlate with end results). Measuring things is necessary to change them, but you have to have some idea that what you measure affects the outcome - in the absence of that, adherence to numerical measures of worth is not rational. It does, however, seem to be the altar at the Church of the Business Manager.

2) Outsourcing and other ways of lowering head count help if you know what the problem is, but don't help if you have no idea why you can't produce enough products. If the new products are likely to be something other than what exists now, investment in research in those areas would seem to be merited, but that doesn't seem to be happening - instead, companies are counting on other people to find those technologies, and then on being able to buy them relatively cheaply when their utility is clear. In both cases, the actions seem to fit a "cash out today, leave the difficult decisions for someone else" methodology, which doesn't seem particularly sane.

3) Emphasis on sales - in the absence of enough products you have to sell what you have, but the overselling has destroyed pharma's goodwill (one of its intangible long-term resources) and probably helped to increase the demands for drug safety (because in the wider market, cost-benefit requires lower side effects to be worth the lower benefits), thus increasing the cost of drug approvals and lowering their probabilities. The destruction of goodwill in particular is not consistent with a long-term view. (Scientists or other management neophytes could have done this as well, though - they simply have to not care about the long term.)

4) Large payouts for CEOs and execs at companies that produce neither sufficient products nor stock-price increases.

18. John on May 4, 2010 10:20 AM writes...

These are all good points Hap, but I would argue that they are secondary to declining R&D productivity rather than to the influence of professional managers. Everybody is running around trying to put a Band Aid over a cut artery.

I think I could probably go toe to toe with you on trading stories of bad management decisions by MBAs for stories of bad management decisions by scientists, but probably not appropriate here.

19. processchemist on May 4, 2010 10:22 AM writes...


I'm not supporting the "evil MBA" theory. I'm saying something different: the substance (in the Aristotelian sense of the word) of our business is inherently technical; if the decision makers in industry
1) don't have a strong technical background, and
2) value hype or the financial aspects of the business over science,
then we're going to see a sector-wide replay of the subprime crisis.
The "small molecules are dead" mantra is not new to me: I heard it in the middle of the 90s (no peptide, no party), and I hear it now (no biologics, no game).
It's cyclic. Time to take "Thus Spoke Zarathustra" down from the bookshelf and refresh some eternal-return concepts.

20. Hap on May 4, 2010 10:30 AM writes...

I'm pretty sure you'd win the bad management stories race, and the "Evil MBA" theory is probably as deprecated as the "Trilateral Commission" theories. It just seems that none of the techniques being employed by companies deal with declining productivity - they seem, instead, to make it harder to deal with the problem. Destroying current value and crippling your ability to make future products aren't winning strategies unless your idea of winning is a golden parachute. Wishing productivity issues away, just as with growing government debt and promised benefits, takes a (potentially) solvable problem and makes it completely intractable.

If you have a gushing wound, sticking a knife into it to fiddle things around isn't generally an effective treatment.

21. Anonymous on May 4, 2010 11:06 AM writes...


This time, small molecules may indeed be dead for good.
This Pharmalot link can make many med chemists a bit depressed:

22. John on May 4, 2010 11:12 AM writes...

Agree with all of the above except that I do not believe we are dealing with a cyclic process.

What is going on in the industry today is like a page out of my corporate strategy textbook. The return on investment of any new field of technology tends to grow initially, peak, and then fall off. When the ROI begins to fall off, the initial response of management is usually to throw more and more money into R&D. Eventually this state of denial ends and there is a period of very rapid cutbacks in R&D budgets, followed by an increasing emphasis on cost efficiency and competition based on price rather than on innovation. Obviously, with third-party payers this model will not fit the pharma industry exactly, but one does notice the increased industry interest in generics.

The shift to biologicals does not solve the issue above, except in the odd sense that there is no such thing as a biogeneric. So by shifting into biologicals, companies can generate perpetual income streams (rather than 15-year ones) and eventually reduce the level of R&D expenditure needed to maintain their revenues.

23. Hap on May 4, 2010 11:22 AM writes...

If you don't have the technologies that are likely to form the basis for drugs in the future, how does cutting R+D help? Cutting R+D on current technologies would make sense, but not overall (because you still have no future product line). Transferring R+D (outsourcing) makes sense if you know what the people to whom you're outsourcing can do, but their effectiveness and productivity are an open question.

Also, while biologics may currently have an infinite product life (no generics), that isn't likely to continue. It also assumes that the costs and research associated with them are less than those of small molecules, and that the prices that make them attractive targets can be maintained - neither of which seems like a reasonable assumption.

24. John on May 4, 2010 11:45 AM writes...

These are good points; I can't address all of them, but I'll try to explain my thinking on the perpetual (actually "semi-perpetual") income idea.

There will clearly be "follow-on biologics", but I don't think there will ever be biogenerics in the sense of pharmacy substitution or companies bringing biologics to market without significant clinical trial data. This idea has been considered, and to my understanding has been categorically rejected in both the US and the EU. In Europe, the follow-on biologics legislation has led to a process that is intermediate in expense and risk compared to the US NDA and ANDA processes, and I believe that US legislation anticipates a similar process, Henry Waxman notwithstanding.

My assumption is that the follow-ons will not be much cheaper, due to a limited number of companies with the capabilities and risk tolerance to bring these to market (on average in the EU they currently sell for about 75% of the innovator's price). I am also assuming that, not being much cheaper and having less clinical (safety, efficacy) data/experience associated with them, they will not achieve anywhere close to the market penetration of small-molecule generics (but I don't know what the actual experience in Europe has been on this issue).

25. Hap on May 4, 2010 11:59 AM writes...

True, but doesn't that limit the market? It's a whole lot easier for people to consider buying Viagra on their own if they feel it could help them than, say, erythropoietin, because the cost is so high. "Lifestyle" drugs are particularly susceptible to that - it's going to be hard to afford T2 diabetes drugs that cost $30K (or even $3K) a year. Limiting your market might reduce some of the safety issues (by improving cost-benefit), but it also limits your money (though with a longer product life you can make the costs back eventually). Someone else commented that the costs of biologics mean the market can only bear so many of them - there's only so much money floating about.

26. John on May 4, 2010 12:11 PM writes...


With third-party payers and the near-universal belief that everyone is entitled to access to the best healthcare available, I think that aggregate pharmaceutical company revenues as a percentage of GDP are probably controlled both by the number of products and by what is politically acceptable to the electorate.

That being said, I think that if our $40,000 drugs were curing people of cancer instead of adding a few months, the public would be willing to let us take a larger bite.

27. John David Galt on May 4, 2010 1:40 PM writes...

The cause isn't complexity per se, it is what economists call "rent seeking": the expenditure of large amounts of time and effort, not to create new wealth but to move existing wealth from your pile into my pile. Wikipedia has a good entry on the topic, and Mises and Bastiat wrote about it.

When a government grows as large as ours, chances are it's full of incentive traps that lead to more and more rent seeking.

I hope that someday, there will be a serious scholarly effort to design and implement a constitution that doesn't have this problem. The first requirement is that the legislature must no longer have the power to arbitrarily hand out unlimited "pork" to friends and campaign contributors.

28. RandDChemist on May 4, 2010 2:35 PM writes...

#8 Point taken, but in the end solid, sound science needs to be what we advocate to the agencies, not adherence to outdated modalities. That is tough, but needed. Part of that is adequate staffing for our regulatory agencies: if we have the regs, then we need the people to make sure they are properly followed, rather than relying on the quick opinion of an overworked individual.

#9 Thanks for the info. It's something that I run across from time to time. Like anything, if microdosing even works, it will have limitations.

Regarding the "evil MBA"...that's only a part of the problem, sometimes. A popular meme here though.

Milkshake is right that pharma is incredibly short-sighted. Meeting quarterly targets is not the way to go.

#21 The demise of the small molecule is greatly exaggerated. This list shows that we need to look beyond blockbusters. Biologics are here to stay and are a big part of pharma.

#26 Check out the NYT piece on cancer therapies covered here a while ago. At what price life? How much would I pay for six more months of quality life to spend with my wife and children? $40,000 is a pittance. The thing is, six months might be enough to get someone to the next treatment. The treatment of cancer is not trivial, nor is survival and quality of life.

One thing we are going to need is a better way of transferring technology from academia to industry if we are going to bring medicines to market.

The way of doing things in pharma is changing, and it is not going to be painless.

Resources are needed, waste is not.

29. John on May 4, 2010 5:22 PM writes...

The value of a year of life is a pretty interesting number. Britain uses a value of about $50,000. US health economics calculations commonly use twice that. If we assume that the Iraq war will end tomorrow, and that it prevented one terrorist attack that would have killed 14,000 people (i.e., 10,000 more than have died in the war itself), you get a number of around $2,500,000.

If people could buy an unlimited number of extra years of life for a fixed price, I would guess that most people would spend two-thirds or less of their income this way, in order to have something left over for food and their kids' education. Given the US per capita income of about $50K, this would put the "average" value of a year of life at about $30K - about 1% of the value assigned by the Iraq calculation.

30. barry on May 5, 2010 1:12 PM writes...

Surely the Pharmalot analysis predicting the ascendancy of biologics and the death of med chem rests on the pricing structure? Many of the likely outcomes of the reforms of the American healthcare system won't pay the costs of the current generation of biological treatments. Small-molecule drugs (along with vaccines, potable water and sanitary sewers) may prove to be the best values if/when we align our healthcare expenditures with our values.

31. oldtimer on May 5, 2010 1:29 PM writes...

In the 60s, R&D was seen as an expense; from the 70s through the 90s, as an investment. In the 21st century it's back to being an expense. Will the wheel turn again?

32. srp on May 5, 2010 10:37 PM writes...

I would like to know if there are strong empirical data showing that adding these earlier layers of testing actually reduces the overall cost of developing new drugs, or improves safety in human trials.

From earlier discussions of this topic, it seems that the evidence is pretty thin that animal models are predictive either way - not just that animal successes are often human failures, but that we don't know whether animal failures would have been less likely to be human successes. Now let's think about the same question for going from cells to animals. Multiply the uncertainties. What the heck are we doing?

33. barry on May 6, 2010 10:26 AM writes...

re: srp #32

Medicinal chemistry started with anti-infectives. Grow a pathogen in a test tube, see if your candidate compound can keep the broth clear. Inoculate a guinea pig or rodent, see if your candidate can keep the animal alive. The extrapolation from in vitro to in vivo to human trials was pretty clear.
We're a long way from that now. To argue, e.g., that a muricidal rat model reproduces human psychosis, or that a xenograft of cultured cells in a nude mouse reproduces human cancers, is to extrapolate several steps beyond what is provable. Eventually, only a well-designed clinical trial can answer whether we have a drug or not. The opportunity cost of putting off that trial is too often discounted.

34. Kaleberg on May 8, 2010 10:07 PM writes...

Shirky's argument ignores the increased survival value of complex societies, which balances out their vulnerabilities. Remember, the Roman Empire was still operating in the 15th century. It might help to read A Journal of the Plague Year, about the 17th-century plague in London, and compare it to what accounts one can find of the 14th-century plague there. The 17th-century plague was devastating, but survival rates were much higher and the societal impact was smaller. Much of this was because of increased societal complexity, which created more resilient institutions. It is hard to find a comparable modern plague. Most people have never even heard of the flu epidemic of the early 20th century. As for future plagues, we are hedging our bets, so we've put a few chips down on the pharmaceutical industry, complexity and all.

P.S. Don't get completely discouraged; complexity often melts away. In the 19th century, motorizing the Erie Canal to replace the mules was considered an impossible solution, since it would require training a huge number of engineers to replace the mule skinners. Nowadays almost everyone learns how to drive, and you can motorboat along the canal. Sure, modern cars and boats have lots more complicated parts and systems, but they are actually easier to use.

35. Anonymous on May 12, 2010 4:45 AM writes...

""Microsomal assays, Caco-2, and other assays (e.g., micro-Ames, photosensitivity testing, etc.) have raised the bar, but by-and-large are advances in guiding SAR and identifying better lead compounds."

Please, could you point me to the empirical evidence base to support this statement?"

Actually, the compelling evidence for me has been the shift of failures in my environment from Phase I to Phase II. We rarely lose any drug candidate now during FIH (first-in-human) testing - we've usually got the PK and safety sorted using all those nice in vitro tests. Unfortunately we're still hitting a brick wall in Phase II and sometimes Phase III, generally because we fail to show proof of concept. Now you can argue that one all sorts of ways - it may be down to poor animal models, or it may be due to an inadequate understanding of the disease.


