Corante

About this Author

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis, and other diseases. To contact Derek, email him directly (derekb.lowe@gmail.com) or find him on Twitter: Dereklowe


In the Pipeline


July 9, 2014

Outsourced Assays, Now a Cause For Wonder?


Posted by Derek

Here's a look at Emerald Biotherapeutics (a name that's unfortunately easy to confuse with several other former Emeralds in this space). They're engaged in their own drug research, but they also have lab services for sale, using a proprietary system that they say generates fast, reproducible assays.

On July 1 the company unveiled a service that lets other labs send it instructions for their experiments via the Web. Robots then complete the work. The idea is a variation on the cloud-computing model, in which companies rent computers by the hour from Amazon.com, Google, and Microsoft instead of buying and managing their own equipment. In this case, biotech startups could offload some of their basic tasks—counting cells one at a time or isolating proteins—freeing their researchers to work on more complex jobs and analyze results. To control the myriad lab machines, Emerald has developed its own computer language and management software. The company is charging clients $1 to $100 per experiment and has vowed to return results within a day.
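The request-and-response model described in the quoted passage might look something like the following purely hypothetical sketch. The article does not document Emerald's actual protocol, so every field name and value here is invented for illustration:

```python
import json

# Hypothetical experiment specification for a cloud-lab service.
# None of these field names come from Emerald; they are invented
# to show the general shape of a web-submitted experiment request.
experiment = {
    "assay": "cell_count",        # one of the "common experiments" on offer
    "sample_id": "plate-0042",    # which physical sample to run it on
    "parameters": {               # the "knobs" mentioned in the article
        "temperature_c": 37.0,
        "incubation_min": 30,
    },
    "return_by": "24h",           # the promised one-day turnaround
}

# Serialize for submission; in practice this would be POSTed to the
# service's web endpoint and the robots would take it from there.
request_body = json.dumps(experiment)
print(request_body)
```

The appeal of such a scheme is that the client never touches an instrument driver: the specification is pure data, and the service maps it onto whatever hardware it has.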

The Bloomberg Businessweek piece profiling them does a reasonable job, but I can't tell if its author knows that a good amount of outsourcing of this type goes on already. Emerald's system does indeed sound fast, though. But the speed of an assay is rarely the real bottleneck in a drug discovery effort, so I'm not sure how much of a selling point that is. The harder parts are the ones that can't be automated: figuring out what sort of assay to run, and then troubleshooting it until it can be run reliably on high-throughput machines, are not trivial processes, and they can take a lot of time and effort. Even more difficult is the step before any of that: figuring out what you're going to be assaying at all. What's your target? What are you screening for? What's the great idea behind the whole project? That stuff is never going to be automated, and it's the key to the whole game.

But when I read things like this, I wonder a bit:

While pursuing the antiviral therapy, Emerald began developing tools to work faster. Each piece of lab equipment, made by companies including Agilent Technologies (A) and Thermo Fisher Scientific (TMO), had its own often-rudimentary software. Emerald’s solution was to write management software that centralized control of all the machines, with consistent ways to specify what type of experiment to run, what order to mix the chemicals in, how long to heat something, and so on. “There are about 100 knobs you can turn with the software,” says Frezza. Crucially, Emerald can store all the information the machines collect in a single database, where scientists can analyze it. This was a major advance over the still common practice of pasting printed reports into lab notebooks.

Well, that may be common in some places, but in my own experience, that paste-the-printed-report stuff went out a long time ago. Talking up the ability to have all the assay data collected in one place sounds like something from about fifteen or twenty years ago, although the situation can be different for the small startups who would be using Emerald (or their competitors) for outsourced assay work. But I would still expect any CRO shop to provide something better than a bunch of paper printouts!
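The centralized-control idea from the quoted passage, one uniform interface over heterogeneous instruments, with every result landing in a single store, can be sketched in a few lines. To be clear, every class, method, and field name below is invented; this is a toy illustration, not Emerald's actual software:

```python
# Toy sketch of centralized instrument control: each vendor's machine is
# wrapped behind one uniform interface, and all results go to one store
# instead of per-machine printouts. All names here are invented.

class Instrument:
    """Uniform interface over heterogeneous lab equipment."""
    def run(self, step: dict) -> dict:
        raise NotImplementedError

class MockThermocycler(Instrument):
    def run(self, step):
        # e.g. step = {"action": "heat", "temp_c": 95, "minutes": 2}
        return {"instrument": "thermocycler", "ok": True, **step}

class MockLiquidHandler(Instrument):
    def run(self, step):
        # e.g. step = {"action": "mix", "wells": ["A1", "A2"]}
        return {"instrument": "liquid_handler", "ok": True, **step}

class LabController:
    """Runs a protocol (an ordered list of steps) and collects every
    result in one place, the stand-in for a single shared database."""
    def __init__(self, instruments):
        self.instruments = instruments
        self.results = []

    def execute(self, protocol):
        for name, step in protocol:
            self.results.append(self.instruments[name].run(step))
        return self.results

controller = LabController({
    "thermocycler": MockThermocycler(),
    "liquid_handler": MockLiquidHandler(),
})
controller.execute([
    ("liquid_handler", {"action": "mix", "wells": ["A1", "A2"]}),
    ("thermocycler", {"action": "heat", "temp_c": 95, "minutes": 2}),
])
print(len(controller.results))  # prints 2: both results in one queryable store
```

The design point is simply that a protocol becomes data rather than a sequence of button-pushes on different vendors' consoles, which is what makes the "100 knobs" and the single database possible.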

Emerald may well have something worth selling, and I wish them success with it. Reproducible assays with fast turnaround are always welcome. But this article's "Gosh everything's gone virtual now wow" take on it isn't quite in line with reality.

Comments (11) + TrackBacks (0) | Category: Drug Assays


COMMENTS

1. pgwu on July 9, 2014 10:03 AM writes...

Sounds like a systems integrator doing some extra assays. Folks buying Zymark systems hoped to do this in the '80s and '90s (minus the internet part), and a few more iterations followed. Each time things got better, but I am not sure it's there yet. There is always some trivial manual step that trips you up in a robotic system. I do see it being used for some robust assays.


2. Anonymous on July 9, 2014 10:12 AM writes...

I have to say it's a good way for ex-scientists and potential entrepreneurs to test their ideas when they no longer have access to a lab.


3. Chrispy on July 9, 2014 10:29 AM writes...

Lotsa luck with this plan. It turns out that biological assays need constant tweaking and a vigilant eye. As much as people have tried, they have proven to be stubbornly difficult to outsource because the data you get back is crap. Some assays could be run this way (like simple enzyme inhibition assays), but unlike chemistry, where LC/MS can tell you a lot about the quality of the product, if the product is data it can be very hard to QC after the fact.

One could see a certain demand for expensive equipment (Cellomics, FLIPR, patch clamping, LC/MS/MS) but this is not the kind of thing you'd want to "fly by wire." And routine assays (HERG, liver microsomes, etc.) are already outsourced.

Anyone who guarantees results in a day is fiercely expensive, doing next to nothing, and/or generating garbage (pick two). Caveat emptor.


4. db on July 9, 2014 10:54 AM writes...

"Even more difficult is the step before any of that: figuring out what you're going to be assaying at all. What's your target? What are you screening for? What's the great idea behind the whole project? That stuff is never going to be automated at all, and it's the key to the whole game."

Sadly though, that's what a lot of Big Data proponents see as the endgame--throwing enough at the wall to take a fundamental advance out of what manages to stick.

The problem, obviously, is that no matter what trappings it is dressed in, that isn't *science*, it's tinkering. In addition, without designed experiments and a carefully collected data set made up of deliberately chosen types of data, it is somewhere between hard and impossible to really sort out meaning.


5. Dr Octopus on July 9, 2014 3:03 PM writes...

Whoa, whoa, wait a minute Barry... I've heard that some of these so-called "Zeolites" are actually made of CHEMICALS!!!


6. Sisyphus on July 9, 2014 8:34 PM writes...

I read this at Businessweek before Derek posted and found it to be one of the most vacuous and uninformative articles I have read. This is my take on it. Why did the antiviral idea not work out? Because it was a half-baked, highly academic ruse concocted by two individuals with zero experience in drug development and the inverse of zero in arrogance and self-confidence. If they really had something, they would be talking about it - not claiming it is a "stealth" project. So on to another half-baked idea. So what exactly are these guys doing now? Creating a new LIMS system? Or running assays? Or making molecules (as a rotovap would suggest)? Despite all the words and sentences in the article, I still don't know what they are doing. What is the business model? What are the "experiments", and what numbers of assays, clients, experiments, and revenue are required to make the business model sustainable? $1-$100/experiment? Is this one LCMS run or one gel or one XRD or one sequence? For a $100 experiment, how much profit is there?

Reading this made me think of the combichem era when there were many promises made and even more money wasted. See this article for a flashback: http://www.forbes.com/forbes/1998/0126/6102076a.html. Symyx made similar promises with a slight variation on the business model and where is Symyx now?


7. John Wayne on July 10, 2014 8:22 AM writes...

I'm still cringing because neither of the guys in the picture (first link) is wearing safety glasses in the lab.

The strength of this idea is in the weakness of most operating systems for laboratory equipment. It has been my experience that they range from average to terrible; any improvements could be heartily welcomed by the low bar maintained by the market.


8. anchor on July 10, 2014 8:33 AM writes...

@8-I am on board with you on this. Nothing intellectual here other than enough resources ($$$) to buy equipment and then asking for profit sharing if something pans out. People like these are a dime a dozen, and I have seen many in my lifetime!


9. newnickname on July 10, 2014 3:39 PM writes...

You can automate the squirting, mixing, heating, cooling and even the outcome measuring (spectro, mass, etc). You can't automate the THINKING to know if any of the above was even done correctly or makes any sense without seeing it or being there.

Today, I see grad students and post-docs who are nothing but "button pushers" who have no clue what their instruments are doing, let alone measuring, and they just wait for an answer to be spit out.

I have asked, "What value did you use for parameter-X?" "I don't know. I let the machine do it." [Default setting is the wrong value.] And "Does that result even make any sense to you???" "Well, that's what the machine says it is." "Do you think the machine might be WRONG??" [crickets chirping]

More worthless data and worthless publications and worthless investments when someone realizes that the Emerald -- ooops, I mean the Emperor -- has no clothes.


10. hellooooooo on July 11, 2014 7:47 AM writes...

They have some serious Flash horsepower in the background. I don't see how that fits into their bio-cloud model. Those are significantly "hands-on" in my experience.


11. rumtscho on July 15, 2014 4:12 AM writes...

> To control the myriad lab machines, Emerald has developed its own computer language and management software.

I'm not a chemist, I'm a computer scientist. "Developed its own computer language" raises a big red flag for me. How would you chemists react to an article about somebody synthesizing a compound and then analyzing its properties using a spectrometer they built for themselves?

And then, are the tasks really uniform enough to be automated? They are talking about supporting "40 common experiments". As always with automation, the devil is in the details. If each of the experiments runs the same way every time, it can indeed be automated. But what if there are subtle differences? What happens when a researcher gives them spindly cells for counting instead of round ones? Even if they have thought to cover that particular example, there will be many other such gotchas. My experience with life science is that the number of variations is usually too high for automation to make sense. Computers are only good when you want the same process repeated perfectly time after time on standardized input, no deviations. Biological material and scientific processes are both poorly suited for automation.
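[Ed. note: the spindly-cells gotcha can be made concrete with a toy sketch. Every name below is invented for illustration; an automated pipeline only handles the input categories its authors anticipated, and the honest failure mode is an explicit rejection rather than a silently wrong count.]

```python
# Toy stand-in for an automated cell-counting step. The set of supported
# morphologies is exactly the set the pipeline's authors anticipated;
# anything else must be rejected rather than silently miscounted.
SUPPORTED_MORPHOLOGIES = {"round", "oval"}

def count_cells(sample):
    if sample["morphology"] not in SUPPORTED_MORPHOLOGIES:
        raise ValueError(f"unsupported morphology: {sample['morphology']}")
    return len(sample["cells"])

print(count_cells({"morphology": "round", "cells": [1, 2, 3]}))  # prints 3
# count_cells({"morphology": "spindly", "cells": []}) would raise ValueError
```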


There are some amazing things happening right now in bioinformatics. They usually involve heavy information processing at some point after we've converted analog information to digital data. If these people can pull it off at the analog-to-digital conversion point, hats off to them. But it sounds too complex to work out well.

