About this Author
College chemistry, 1983

Derek Lowe, the 2002 model

After 10 years of blogging. . .

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany as a Humboldt Fellow during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis, and other diseases. To contact Derek, email him directly or find him on Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
Not Voodoo

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
Realizations in Biostatistics
ChemSpider Blog
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Eye on FDA
Chemical Forums
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa

Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
Gene Expression (I)
Gene Expression (II)
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net

Medical Blogs
DB's Medical Rants
Science-Based Medicine
Respectful Insolence
Diabetes Mine

Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem

Politics / Current Events
Virginia Postrel
Belmont Club
Mickey Kaus

Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

Category Archives


July 27, 2015

Oliver Sacks on Turning Back to Chemistry

Posted by Derek

If you haven't seen it, Oliver Sacks has written a sort of self-elegy in the New York Times. As he announced some months ago, he has been diagnosed with metastatic liver cancer, which as most people know has an extremely poor survival rate at almost any time point you look at.

I have tended since early boyhood to deal with loss — losing people dear to me — by turning to the nonhuman. When I was sent away to a boarding school as a child of 6, at the outset of the Second World War, numbers became my friends; when I returned to London at 10, the elements and the periodic table became my companions. Times of stress throughout my life have led me to turn, or return, to the physical sciences, a world where there is no life, but also no death.

And now, at this juncture, when death is no longer an abstract concept, but a presence — an all-too-close, not-to-be-denied presence — I am again surrounding myself, as I did when I was a boy, with metals and minerals, little emblems of eternity. . .

Readers of his memoir, Uncle Tungsten, will recognize some of the themes of that book. Sacks himself gave me some early encouragement in the first years of this blog, in a letter I'm very happy to have (we talked about how much we each enjoyed copper sulfate). I know just what he means about seeking companionship in the nonhuman, too.

Different people react to the physical sciences in different ways, as I'm reminded these days while I go over the copy-editing for The Chemistry Book. What one person finds beautiful and timeless, another can find sterile and uninteresting. (The book's actually going well, by the way). Perhaps a proxy for this might be to look at the sorts of photographs someone takes on vacation, or out on a camping trip: how many of them feature people, versus objects and scenes? My own photography, I can say pretty definitively, tends towards the abstract: cloudscapes, algae, lichens, closeups of things like weathered rock faces and tree bark. That's not to say that I haven't taken many people-pictures over the years, too, but given a choice between (say) four people grinning in front of a tree and just the tree itself, I'm always going to work in a couple of shots of the latter, too.

Stanley Kubrick once said in an interview that "The most terrifying fact about the universe is not that it is hostile but that it is indifferent". I like that about it. There are bigger and older things out there than people.

Comments (7) + TrackBacks (0) | Category: General Scientific News

June 24, 2015

ChemDraw's Anniversary

Posted by Derek

If you have a chance to stop by, Thursday the 25th is the "30th Anniversary of ChemDraw" event in Cambridge (MA). Here's the link - I'm going to reminisce a bit in the morning's program about the pre- and early post-ChemDraw days (as I have here on occasion). If you'd told me about this event back in 1985, I don't think I would have believed you.

Update: Prof. Dave Evans will be on hand to talk about the early days - here's his memoir of that period, in Angewandte Chemie.

Comments (11) + TrackBacks (0) | Category: General Scientific News

April 6, 2015

Levels of Data

Posted by Derek

Here's a brief article in Science that a lot of us should keep a copy of. Plenty of journalists and investors should do the same. It's a summary of what sort of questions get asked of data sets, and the differences between them. There are six broad data analysis categories:

1. Descriptive. This is the simplest case, where you're just summarizing a data set and describing the totals in it.

2. Exploratory. The next step - you search through the descriptive analysis looking for trends or relationships, with which to develop new hypotheses. No guarantees, of course - you'll have to confirm these with more work.

3. Inferential. This one looks at an exploratory treatment and tries to determine whether those trends are likely to hold up. As the authors say, this is probably the most common statistical workup in the literature - better than random chance, or not? But it can't tell you why something is happening, of course.

4. Predictive. An inferential study is necessarily done on a large sample (well, it had better be, at any rate, if you're going to infer with much confidence). A predictive analysis uses some subset of the data to predict how individual cases will go. The example from drug development would be the use of biomarkers to predict whether a given patient in a trial will respond to some new investigational drug.

5. Causal. At this level, you're trying to see what the magnitude of changes are across the system when you start changing things - what often gets called the "tone" of the system. What are the most important variables, and what has little effect on the outcome?

6. Mechanistic. With the information at the causal level available, now you can really get down to the nuts and bolts. Change A causes effect B, through this detailed mechanism. We don't see this as much with anything involving biology - there always seem to be exceptions. This is more the realm of engineering and physics, although a lot of time and money is going into trying to change that.
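As a toy illustration of the first three of those levels (my own made-up numbers, not anything from the Science paper), here's how they might look on a pair of hypothetical assay readouts:

```python
import statistics

# Hypothetical assay readouts, invented purely for illustration.
control = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0]
treated = [4.6, 4.4, 4.8, 4.5, 4.7, 4.3, 4.6, 4.5]

# 1. Descriptive: just summarize what's in the data set.
print(statistics.mean(control), statistics.mean(treated))

# 2. Exploratory: the treated group looks higher - a hypothesis to
#    confirm with more work, not a guarantee of anything.
diff = statistics.mean(treated) - statistics.mean(control)

# 3. Inferential: is the trend likely to hold up, or is it random
#    chance? A crude two-sample t statistic (a real analysis would
#    use a proper statistics package):
se = (statistics.variance(control) / len(control)
      + statistics.variance(treated) / len(treated)) ** 0.5
print(f"difference = {diff:.2f}, t = {diff / se:.1f}")
```

Everything past that point - prediction, causation, mechanism - needs more than a bigger t statistic; it needs interventions and models, which is rather the point of the taxonomy.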

It's only at the causal and mechanistic levels that you can start doing detailed modeling with confidence. That's where everyone would like to be with computational binding predictions, but we don't understand them well enough yet. And think how far we have to go to get predictive toxicology to those levels! We can do that sort of thing on a small scale - for example, saying that a compound that (say) inhibits angiotensin-converting enzyme, to this degree, and with that average half-life in vivo, will be expected to lower blood pressure by at least Y% in X% of a random population's members. That's after decades of experience and data-gathering, keep in mind.

But that's not aeronautical engineering. Those folks don't tell you that wing design A will provide at least so much lift on a certain percentage of the airframes it gets bolted on to. Nope, those folks get to build their airframes to the same exact specifications, not just take whatever shows up at the factory needing wings, and those airframe/wing combinations had better perform within some very tight tolerances or something has gone seriously wrong. This is just another way of stating the "built by humans" difference I was talking about the other day.

So some of that data analysis hierarchy above is, well, aspirational for those of us doing drug research. The authors of the Science article are well aware of this themselves, saying that "Outside of engineering, mechanistic data analysis is extremely challenging and rarely achievable". But that level is where many people expect science to be, most of the time, which leads to a lot of frustration: "Look, is this pill going to help me or not?" We should remember where we are on the scale and try to work our way up.

Comments (12) + TrackBacks (0) | Category: Clinical Trials | Drug Development | General Scientific News | In Silico

February 17, 2015

Chemical Illiteracy, Again

Posted by Derek

The BBC really should know better than this. Shown is a screen shot from a current science TV series, "Wonders of Life", taken from this review in The Guardian. Every chemist reading this will have noticed by now that this so-called "peptide structure" is laughably insane. It's poorly drawn, too - what's with all those changing bond lengths? - but the big problem is that it represents something that could never exist.

If a BBC show were to include a world map where Italy had been relabeled Argentina, they would probably be shamed throughout the British press. This structure is more stupid than that, though, since it's at least physically possible for the citizens of Italy to vote to change the name of their country. Oxygen, though, cannot decide to be a neutral trivalent species, and that's only one of several stupidities in that "structure". This seems to be yet another case of some graphic design person getting their hands on some chemical drawing software and making something that looks pretty.

Prof. David Smith at York has a good response to all this on YouTube. What always irritates me about these mistakes is that they're so avoidable, and accomplish so little. Would putting down the right structure really have made it less visually interesting? Is the gibberish shown really so much more compelling that someone had to come along and mess things up just for that purpose? This isn't the first example of these ridiculous errors I've highlighted around here, and I'm sure it won't be the last. I just wish that they were less frequent, and less prominent.

Comments (41) + TrackBacks (0) | Category: General Scientific News

February 6, 2015

English As the Language of Science

Posted by Derek

For some Friday afternoon reading, here's an essay on how English became the language of science. We seem, it says, to have gone from polyglot-with-a-bridging language (Latin) during the Renaissance and up until about 1700, then through a period of completely polyglot science up until around 1850. Around then, science settled into a triumvirate of English, German, and French.

What broke that apart? What you'd think: World War I and World War II. In fact, whenever you're looking for a root cause of something that changed profoundly during the 20th century, you should always start by checking the First World War. All by itself, it was a catastrophe, but it set off a chain of even bigger catastrophes, and the world we live in is the world created by that sequence.

It's easy to think that if only that particular fuse had not been lit and set off that particular detonation, that everything would have been much better. It might well have been. When my children were younger and I was reading (or re-reading) the Tintin books with them, it struck me that their setting was just that world, the one where World War One had never happened. Europe was peaceful and prosperous, and America was a smaller part of the picture. (P. G. Wodehouse's books are largely set in that world, too). But perhaps not. A giant European war was probably on the way in some form, and maybe it would have been even worse the longer it waited to arrive.

At any rate, the wars of the 20th century demolished the standing of German as a language of science, and the collateral damage from the blast took out French as well. English stepped into the gap, and has done nothing but widen its lead ever since. I can even see the change since my own graduate school days in the 1980s, back when Chemische Berichte and the Recueil des Travaux Chimiques des Pays-Bas still existed, just to name two.

Comments (47) + TrackBacks (0) | Category: General Scientific News

February 3, 2015

That Pew Survey and Its Problems

Posted by Derek

A week or so ago, the Pew Center people released a poll on the differences in opinion between the general public and the members of the AAAS. It got quite a bit of attention, and one of the highlighted splits was on the safety of genetically modified organisms in food. (Less than 50% of the general public regarded these as safe to eat, whereas nearly 90% of the scientists did).

But before making too much of this - or indeed, any poll at all - I would recommend reading this from Dan Kahan at Yale Law School's Cultural Cognition Project. His point is that the general public has demonstrated, by both word and action, that it does not actually understand what's going on with GMOs. So asking a broad question like the Pew Survey's is pretty much meaningless. It would be like taking a nationwide poll on what the American public thought about the usefulness of rhodanines as screening hits in drug discovery, or asking everyone what their opinions were on cell penetration of stapled peptides.

This had occurred to me years ago, in a completely different context. Back in grad school, I had walked back to my apartment for lunch (I was close enough to the chemistry building, although it wasn't something I did often). I had the TV on, where the House was debating funding for the Nicaraguan Contras. (I was in grad school quite a while ago, apparently). And the C-SPAN broadcast mentioned the results of an opinion poll on the subject - this many people thought we should fund the Contras, and this many thought we shouldn't. But they went on to mention the results of some of the other questions on the poll, and one of those was something like "Who are we trying to fund - the government of Nicaragua or the people trying to overthrow it?" And some people said one, and some people said the other, and there was a pretty substantial "Don't Know" response. At that point, I realized that the results from the other question were meaningless: the opinion of someone who thought that this money was going to the government of Nicaragua, or of someone who just didn't know enough to say either way, was worth nothing in this context.

That same thought should have occurred to me when I read the Pew results. As the link above shows, people also report all sorts of other attitudes and behavior around GMO food that show that they don't actually understand what's going on. And if you ask people what "GMO" itself stands for, you don't even do as well as you'd hope on that question. Some people know the acronym, some people know that the "G" stands for "genetic", some people just know that it means "Something Icky", and others haven't apparently come across it and have no idea. So when you ask a general sample of people about this topic, the results you get are the results from asking people about something that they don't understand. You really would do as well asking everyone about rhodanines, with the main difference being that more people have heard of the term "GMO", even if they don't know anything about it.

For instance, half the survey respondents said that they "always" or "sometimes" check the labels on the food they buy to see if they're made with GMOs. Problem is, there's no such label in the US. Over 70% of the foods in a typical grocery store have some sort of GMO ingredient, by some estimates, and many of these have been used for years and years now. People buy them without a second thought. There are several ways to interpret this. You might see it as revealed preference: people say one thing (that they're concerned about GMO food), but act another way, because they actually don't care that much. Or, as some activists believe, if you could just get it across to people how much GMO food they're consuming, they'd react with shock and horror and stop buying it. Or the opposite might happen - people might figure that since hundreds of millions of customers have been eating these things for years without anyone noticing at all, that they must not be so fearful.

I'm not going to speculate on which of these possibilities might be accurate. What I did want to emphasize, though, is what that Yale Law link does. When you see surveys like this, you're very likely not looking at what the headline on the survey says you're looking at. Ask people about something they don't know much about, and you act on the results at your peril.

Comments (54) + TrackBacks (0) | Category: General Scientific News

February 2, 2015

Carl Djerassi, 1923-2015

Posted by Derek

Carl Djerassi's obituaries are all over the press, and well they should be. The history of early steroid chemistry is wild and tangled, with many players who now seem larger than life. But there's no doubt that Djerassi was one of the biggest names in that era, and there's no doubt that he was one of the major forces (along with John Rock and Gregory Pincus, not to mention Margaret Sanger and Katharine McCormick) behind the development of the first oral contraceptive pill. And therefore, without any doubt, he had a major influence on the modern world, and on human history itself.

What sometimes gets lost in all this is that Djerassi was a very accomplished chemist. By his own admission, though, his involvement with "The Pill" realigned his interests somewhat from being a hard-science chemist to understanding more about the social impact of his work. (And that, in turn, likely helped lead to his second career as an author and dramatist). My own opinion is that Djerassi's chemistry is a secure part of chemical history, but his fiction, which I've never been able to warm up to, is unlikely to long survive him. But as a world historical figure, for those who understand the technological underpinnings of society, his name will live on and never be erased.

Inevitably, one of the topics that comes up when Djerassi is mentioned is why he never won the Nobel Prize. Paul Bracher has much more on this at ChemBark, and it's a very good question. The only thing I'll note for now is that Gilbert Lewis never won, either - and for that matter, the early Nobel Committee had a chance to honor Mendeleev, and never did. Every field has a list of people who should have achieved its top honors and didn't, and Djerassi is one of ours. Wavefunction's thoughts are here, and his post includes the famous photo with Woodward and Prelog.

Two recent histories of the efforts behind the development of the birth control pill are Sexual Chemistry by Lara Marks and The Birth of the Pill by Jonathan Eig. Djerassi's own reflections are captured in his 2004 work, This Man's Pill, and Wavefunction himself recommends Djerassi's chemical autobiography, Steroids Made it Possible.

Update: added more links.

Comments (9) + TrackBacks (0) | Category: General Scientific News

January 30, 2015

Underpowered And Overinterpreted

Posted by Derek

Time for another "watch those statistics" post. I did one about this time last year, and I could do one every couple of months, to be honest. Here's a good open-access paper from the Royal Society on the problem of p-values, and why there are so many lousy studies out there in the literature. The point is summed up here:

If you use p=0.05 to suggest that you have made a discovery, you will be wrong at least 30% of the time. If, as is often the case, experiments are underpowered, you will be wrong most of the time.

True, true, and true. If you want to keep the false discovery rate down to below 5%, the paper says, you should be going for p<0.001. And just how many studies, of all kinds, across all fields, hit that standard? Not too damn many, which means that the level of false discovery out there is way north of 5%.
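The arithmetic behind that "30% of the time" claim is easy to check with a quick simulation. The numbers below are illustrative assumptions on my part, not taken from the paper: suppose only 10% of the hypotheses being tested are actually true, and experiments have a respectable 80% power.

```python
import random

random.seed(0)

# Assumed, illustrative numbers: 10% of tested hypotheses are actually
# true, experiments have 80% power, and the threshold is p < 0.05.
P_TRUE = 0.10
POWER = 0.80
ALPHA = 0.05

true_hits = false_hits = 0
for _ in range(1_000_000):
    if random.random() < P_TRUE:       # a real effect exists...
        if random.random() < POWER:    # ...and the test detects it
            true_hits += 1
    elif random.random() < ALPHA:      # no effect, but p < 0.05 anyway
        false_hits += 1

# Of all the "discoveries", what fraction are false positives?
fdr = false_hits / (true_hits + false_hits)
print(f"False discovery rate at p < 0.05: {fdr:.0%}")
```

With these assumptions the false discovery rate comes out at around a third, even with decent power and honest statistics; drop the power (or test longer-shot hypotheses), and it only gets worse.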

(This paper) deals only with the very simplest ideal case. We ask how to interpret a single p-value, the outcome of a test of significance. All of the assumptions of the test are true. The distributions of errors are precisely Gaussian and randomization of treatment allocations was done perfectly. The experiment has a single pre-defined outcome. The fact that, even in this ideal case, the false discovery rate can be alarmingly high means that there is a real problem for experimenters. Any real experiment can only be less perfect than the simulations discussed here, and the possibility of making a fool of yourself by claiming falsely to have made a discovery can only be even greater than we find in this paper.

The author of this piece is David Colquhoun, a fact that some people will have guessed already, because he's been beating on this topic for many years now. (I've linked to some of his prickly opinion pieces before). He's not saying something that a lot of people want to hear, but I think it's something that more people should realize. A 95% chance of being right, across the board, would be a high standard to aim for, possibly too high for research to continue at a useful pace. But current standards are almost certainly too low, and we especially need to look out for this problem in studies of large medical significance.

Update: what this post needed was this graphic from XKCD!

Comments (50) + TrackBacks (0) | Category: Clinical Trials | Drug Assays | General Scientific News | The Scientific Literature

November 3, 2014

Difficult to Measure

Posted by Derek

Here's a link that has nothing to do with chemistry: a profile of Bill James. Or does it? As many out there will know, James pioneered a data-driven, from-the-ground-up way of looking at baseball, and writing about it. Back in the 1980s, I bought his Baseball Abstract every year, and enjoyed them thoroughly, but not just as a baseball fan. I liked his approach to the subject: does some piece of received wisdom make sense? Can we find out if it does? If not, why did we all believe it for so long? And so on. As I wrote here, reading James probably changed my life to some extent. I was already inclined this way, but seeing the Jamesian worldview in action was inspiring. And I still have a lot of what he wrote, up in my head, well-used parts of my mental furniture.

A line from this new profile really stands out:

“I have to take my share of responsibility for promoting skepticism about things that I didn’t understand as well as I might have,” he says. “What I would say NOW is that skepticism should be directed at things that are actually untrue rather than things that are difficult to measure."

And boy, do we ever have that problem in drug discovery. A lot of our most important concepts and processes fall into that category. And there are (as usual) opposite errors that we can make about them. We can decide that everything that's said and done about them is crap, because they're so difficult (and difficult to quantify). Or we can mess up in the other direction by embracing any quantitative approach that comes along, because it finally promises to give us some clarity. Steering between these two is not easy, not at all. But we have to try.

Comments (8) + TrackBacks (0) | Category: General Scientific News

October 31, 2014

Science And Dramatic Art: Any Intersections?

Posted by Derek

Someone asked me this the other day, and I've never had a good answer for it. Has there ever been a realistic, compelling depiction of what it's like to do scientific research in any movie, TV show, or play? My own answer is that while the number of dramatic works that have mishandled the subject are beyond counting, there are still many that manage to deal with scientific topics in a reasonable way. But how many of them are able to depict what it's actually like to be a scientist?

Perhaps the problem is that "realistic" and "compelling" are almost impossible to achieve at the same time for a wider audience. Without the background, you're at one remove (at least) from the real topic, and from the real experience. What happens is that the authors have to start out by saying, implicitly, "We'll stipulate that this is important. . .", or "Trust me, the protagonist finds this interesting. . .", and that's a big handicap to start off with. At least it is compared to the standard materials of human drama, which range from Shakespeare, Chekhov, etc. at the top to some sort of "Babies in Danger!" movie on the Lifetime channel at the bottom.

So most of the well-regarded depictions of science and scientists tend to borrow human interest in order to spice things up. There has to be conflict between strong personalities, a love story entanglement, or flat-out good guys and bad guys to give the audience someone to root for. It's true that there are episodes in scientific history that involve these things, but the science part of the story isn't about them. It's about ideas, and the manipulation of ideas is very, very hard to get across. (Montherlant said that "Happiness writes white", but scientific abstraction writes whiter than that).

It would be a wonderful thing if anyone could really make a reader or viewer feel what it was like to be an obscure patent clerk in Switzerland, surpassing Newton in your spare time. Or what it was really like for Watson and Crick when they heard that Linus Pauling had a DNA structure coming out, and when they then realized that it was wrong. Or what it must have been like for Alvarez and co-workers as the meaning of the iridium anomaly crept up on them. Or just what it's like for a grad student in chemistry, getting the result that means the definite end of their project and realizing that yes, this really is it.

But these are mental states, intimately tangled with a wealth of obscure details and high-level reasoning. They aren't built out of words, and their key steps aren't physical actions, so conveying them with words and physical actions is a major challenge. It's not possible to really get across what it must have been like to be Einstein. What sort of writer would you have to be? But just explaining things at one remove is hard, too: what sort of writer do you have to be to get across (say) the feeling of that self-referential tail-biting mathematical move that's at the heart of Gödel's incompleteness proof? To a reader who doesn't have much of any math?

And I've made it this far without even mentioning the other tough part, the pace at which these things move. Watson and Crick looped in and around the question of DNA structure for a couple of years before hitting on the solution - try turning that into a faithful hour-long TV special. Even without the mental gymnastics problem, those sorts of time courses are hard to dramatize.

But who's come closest? Nominations for artistic works of all kinds in the comments (TV, movies, books, plays), and in all fields of science. We're going to have to cast the net as widely as possible to come up with some good ones.

Comments (124) + TrackBacks (0) | Category: General Scientific News

October 10, 2014

More on Fluorescent Microscopy Chemistry Prizes

Posted by Derek

I wanted to note (with surprise!) that one of this year's Nobel laureates actually showed up in the comments section of the post I wrote about him. You'd think his schedule would be busier at the moment (!), but here's what he had to say:

A friend pointed this site/thread out to me. I apologize if I was unclear in the interview. #3 and #32 have it right -- I have too much respect for you guys, and don't deserve to be considered a chemist. My field is entirely dependent upon your good works, and I suspect I'll be personally more dependent upon your work as I age.

Cheers, Eric Betzig

And it's for sure that most of the readers around here are not physicists or optical engineers, either! I think science is too important for food fights about whose part of it is where - we're all working on Francis Bacon's program of "the effecting of all things possible", and there's plenty for everyone to do. Thanks very much to Betzig for taking the time to leave the clarification.

With that in mind, I was looking this morning at the various tabs I have open on my browser for blogging subjects, and noticed that one of them (from a week or so back) was a paper on super-resolution fluorescent probes. And it's from one of the other chemistry Nobel winners this year, William Moerner at Stanford! Shown is the rhodamine structure that they're using, which can switch from a nonfluorescent state to a highly fluorescent one. Moerner and his collaborators at Kent State investigated a series of substituted variants of this scaffold, and found one that seems to be nontoxic, very capable of surface labeling of bacterial cells, and is photoswitchable at a convenient wavelength. (Many other photoswitchable probes need UV wavelengths to work, which bacteria understandably don't care for very much).

Shown below the structure drawing is an example of the resolution this probe can provide, using Moerner's double-helix point-spread-function, which despite its name is not an elaborate football betting scheme. That's a single cell of Caulobacter crescentus, and you can see that the dye is almost entirely localized on the cell surface, and that ridiculously high resolutions can be obtained. Being able to resolve features inside and around bacterial cells is going to be very interesting in antibiotic development, and this is the kind of work that's making it possible.

Oh, and just a note: this is a JACS paper. A chemistry Nobel laureate's most recent paper shows up in a chemistry journal - that should make people happy!

Comments (8) + TrackBacks (0) | Category: General Scientific News

You'd Think That This Can't Be Correct

Posted by Derek

Well, here's something to think about over the weekend. I last wrote here in 2011 about the "E-cat", a supposed alternative energy source being touted/developed by Italian inventor Andrea Rossi. Odd and not all that plausible claims of low-energy fusion reactions of nickel isotopes have been made for the device (see the comments section to that post above for more on this), and the whole thing definitely has been staying in my "Probably not real" file. Just to add one complication, Rossi's own past does not appear to be above reproach. And his conduct (and that of his coworker Sergio Focardi) would seem to be a bit strange during this whole affair.

But today there is a preprint (PDF) of another outside-opinion test of the device (thanks to Alex Tabarrok of Marginal Revolution on Twitter for the heads-up). It has several Swedish co-authors (three from Uppsala and one from the Royal Institute of Technology in Stockholm), and the language is mostly pretty measured. But what it has to say is quite unusual - if it's true.

The device itself is no longer surrounded by lead shielding, for one thing. No radiation of any kind appears to be emitted. The test went on for 32 days of continuous operation, and here's the take-home:

The quantity of heat emitted constantly by the reactor and the length of time during which the reactor was operating rule out, beyond any reasonable doubt, a chemical reaction as underlying its operation. This is emphasized by the fact that we stand considerably more than two order of magnitudes from the region of the Ragone plot occupied by conventional energy sources.

The fuel generating the excessive heat was analyzed with several methods before and after the experimental run. It was found that the Lithium and Nickel content in the fuel had the natural isotopic composition before the run, but after the 32 days run the isotopic composition has changed dramatically both for Lithium and Nickel. Such a change can only take place via nuclear reactions. It is thus clear that nuclear reactions have taken place in the burning process. This is also what can be suspected from the excessive heat being generated in the process.

Although we have good knowledge of the composition of the fuel we presently lack detailed information on the internal components of the reactor, and of the methods by which the reaction is primed. Since we are presently not in possession of this information, we think that any attempt to explain the E-Cat heating process would be too much hampered by the lack of this information, and thus we refrain from such discussions.

In summary, the performance of the E-Cat reactor is remarkable. We have a device giving heat energy compatible with nuclear transformations, but it operates at low energy and gives neither nuclear radioactive waste nor emits radiation. From basic general knowledge in nuclear physics this should not be possible. . .

Told you it was interesting. But I'm waiting for more independent verification. As long as Rossi et al. are so secretive about this device, the smell of fraud will continue to cling to it. I truly am wondering just what's going on here, though.

Update: Elforsk, the R&D arm of Sweden's power utility, has said that they want to investigate this further. Several professors from Uppsala reply that the whole thing is likely a scam, and that Elforsk shouldn't be taken in. Thanks to reader HL in the comments section, who notes that Google Translate does pretty well with Swedish-English.

Comments (40) + TrackBacks (0) | Category: General Scientific News

October 9, 2014

Eric Betzig Is Not a Chemist, And I Don't Much Care

Posted by Derek

Update: Betzig himself has shown up in the comments to this post, which just makes my day.

Yesterday's Nobel in chemistry set off the traditional "But it's not chemistry!" arguments, which I largely try to stay out of. For one thing, I don't think that the borders between the sciences are too clear - you can certainly distinguish the home territories of each, but not the stuff out on the edge. And I'm also not that worked up about it, partly because it's nowhere near a new phenomenon. Ernest Rutherford got his Nobel in chemistry, and he was an experimental physicist's experimental physicist. I'm just glad that a lot of cutting-edge work in a lot of important fields (nanotechnology, energy, medicine, materials science) has to have a lot of chemistry in it.

With this in mind, I thought this telephone interview with Eric Betzig, one of the three laureates in yesterday's award, was quite interesting:

This is a chemistry prize, do you consider yourself a chemist, a physicist, what?

[EB] Ha! I already said to my son, you know, chemistry, I know no chemistry. [Laughs] Chemistry was always my weakest subject in high school and college. I mean, you know, it's ironic in a way because, you know, trained as a physicist, when I was a young man I would look down on chemists. And then as I started to get into super-resolution and, which is really all about the probes, I came to realise that it was my karma because instead I was on my knees begging the chemists to come up with better probes for me all the time. So, it's just poetic justice but I'm happy to get it wherever it is. But I would be embarrassed to call myself a chemist.

Some people are going to be upset by that, but you know, if you do good enough work to be recognized with a Nobel, it doesn't really matter much what it says on the top of the page. "OK, that's fine for the recipients", comes one answer, "but what about the committee? Shouldn't the chemistry prize recognize people who call themselves chemists?" One way to think about that is that it's not the Nobel Chemist prize, earmarked for whatever chemists have done the best work that can be recognized. (The baseball Hall of Fame, similarly, has no requirement that one-ninth of its members be shortstops). It's for chemistry, the subject, and chemistry can be pretty broadly defined. "But not that broadly!" is the usual cry.

That always worries me. It seems dangerous, in a way - "Oh no, we're not such a broad science as that. We're much smaller - none of those big discoveries have anything to do with us. Won't the Nobel committee come over to our little slice of science and recognize someone who's right in the middle of it, for once?" The usual reply to that is that there are, too, worthy discoveries that are pure chemistry, and they're getting crowded out by all this biology and physics. But the pattern of awards suggests that a crowd of intelligent, knowledgeable, careful observers can disagree with that. I think that the science Nobels should be taken as a whole, and that there's almost always going to be some blending and crossover. It's true that this year's physics and chemistry awards could have been reversed, and no one would have complained (or at least, not any more than people are complaining now). But that's a feature, not a bug.

Comments (38) + TrackBacks (0) | Category: Chemical News | General Scientific News

September 19, 2014

Peter Thiel's Uncomplimentary Views of Big Pharma

Posted by Derek

See what you think of Peter Thiel's characterization of the drug industry in this piece for Technology Review. Thiel's a very intelligent guy, and his larger points about technology stalling out make uncomfortable reading, in the best sense. (The famous quote is "We wanted flying cars; instead we got 140 characters"). But take a look at this (emphasis added):

You have to think of companies like Microsoft or Oracle or Hewlett-Packard as fundamentally bets against technology. They keep throwing off profits as long as nothing changes. Microsoft was a technology company in the ’80s and ’90s; in this decade you invest because you’re betting on the world not changing. Pharma companies are bets against innovation because they’re mostly just figuring out ways to extend the lifetime of patents and block small companies. All these companies that start as technological companies become antitechnological in character. Whether the world changes or not might vary from company to company, but if it turns out that these antitechnology companies are going to be good investments, that’s quite bad for our society.

I'd be interested in hearing him revise and extend those remarks, as they say in Washington. My initial reaction was to sit down and write an angry refutation, but I'm having second thoughts. The point about larger companies becoming more cautious is certainly true, and I've complained here about drug companies turning to M&A and share buybacks instead of putting that money back into research. I'd say, though, that the big drug companies aren't so much anti-technology as they are indifferent to it (or as indifferent as they can afford to be).

Even that still sounds harsh - what I mean is that they'd much rather maximize what they have, as opposed to coming up with something else. Line extensions and patent strategies are the most obvious forms of this. Buying someone else's innovations comes next, because it still avoids the pain and uncertainty of coming up with your own. There's no big drug company that does only these things, but they all do them to some degree. Share buybacks are probably the most galling form of this, because that's money that could, in theory, be applied directly to R&D, but is instead being used to prop up the share price.

But Thiel mentions elsewhere in his interview that we could, for example, be finding cures for Alzheimer's, and we're not. Eli Lilly, though, is coming close to betting the company on the disease, taking one huge swing after another at it. Thiel's larger point stands, about how more of the money that's going into making newer, splashier ways to exchange cat pictures and one-liners over the mobile phone networks could perhaps be applied better (to Alzheimer's and other things). But it's not that the industry hasn't been beating away on these itself.

I worry that the Andy Grove fallacy might be making an appearance again, given Thiel's background (PayPal, Facebook, LinkedIn). That link has a lot more on that idea, but briefly, it's the tendency for some people from the computing/IT end of the tech world to ask what the problem is with biomedical research, because it doesn't improve like computing hardware does. It's a good day to reference the "No True Scotsman" fallacy, too: sometimes people seem to identify "technology" with computing, and if something doesn't double in speed and halve in cost every time you turn around, well, that's not "real" technology. At the very least, it's not living up to its potential, and there must be something wrong with it.

I also worry that Thiel adduces the Manhattan project, the interstate highway system, and the Apollo program as examples of the sort of thing he'd like to see more of. Not that I have anything against any of those - it's just that they're all engineering projects, rather than discovery ones. The interstate system, especially: we know how to build roads, so build bigger ones. The big leap there was the idea that we needed large, standardized ones across the whole country, with limited entrances and exits. (And that was born out of Eisenhower's experiences driving across the country as the road network formed, and seeing Germany's autobahns during the war).

But you can say similar things about Apollo: we know that rockets can exist, so build bigger ones that can take people to the moon and back. There were a huge number of challenges along the way, in concept, design, and execution, but the problem was fundamentally different than, say, curing Alzheimer's. We don't even know that Alzheimer's can be cured - we're just assuming that it can. I really tend to think it can be cured, myself, but since we don't even know what causes it, that's a bit of a leap of faith. We're still making fundamental "who knew?" type discoveries in biochemistry and molecular biology, of the sort that would totally derail most big engineering projects. The Manhattan project is the closest analog of the three mentioned, I'd say, because atomic physics was such a new field (and Oppenheimer had to make some massive changes in direction along the way because of that). But I've long felt that the Manhattan project is a poor model, since it's difficult to reproduce its "Throw unlimited amounts of money and talent at the problem" mode, not to mention the fight-for-the-survival-of-your-civilization aspect.

But all that said, I do have to congratulate Peter Thiel on putting his money down on his ideas, through his investment fund. One of the things I'm happiest about in today's economy, actually, is the way that some of the internet billionaires are spending their money. Overall, I'd say that many of them agree with Thiel that we haven't discovered a lot of things that we could have, and they're trying to jump-start that. Good luck to them, and to us.

Comments (58) + TrackBacks (0) | Category: Business and Markets | General Scientific News | Who Discovers and Why

July 22, 2014

The Broad Gets $650 Million For Psychiatric Research

Posted by Derek

The Broad Institute seems to have gone through a bit of a rough funding patch some months ago, but things are looking up: they've received a gift of $650 million to do basic research in psychiatric disorders. That'll keep everyone busy, for sure.

I enjoyed Eric Lander's characterization of much of the 1990s work on the genetic basis of mental illness as "pretty much completely useless", and I don't disagree one bit. His challenge, as he and the rest of the folks at the Broad well know, is to keep someone from being able to say that about them in the year 2034. CNS work is the ultimate black box, which makes a person nervous, but on the other hand, anything solid that gets discovered will be a real advance. Good luck to them.

You might also be interested to know where the Stanley Foundation, the benefactors here, came up with over half a billion dollars to donate to basic medical research (and more to come, apparently). You'd never guess: selling collectibles. Sports figurines. Small replicas of classic cars, trucks, and tractors. Miniature porcelain versions of popular dolls. Leather-bound sets of great (public domain) novels. Order now for the complete set of Presidential Coins - that sort of thing. It looks to be a lot more lucrative than discovering drugs (!)

Comments (49) + TrackBacks (0) | Category: General Scientific News | The Central Nervous System

July 18, 2014

Chemistry Class For the "Food Babe"?

Posted by Derek

I found this article from the Charlotte Observer on the "Food Babe" (Vani Hari) very interesting. A "menu consultant" for Chick-Fil-A, is she? Who knew?

I've come across a horribly long string of chemistry misapprehensions, mistakes, and blunders while looking at her site - she truly appears to know nothing whatsoever about chemistry, not that this seems to bother her much. (Wavefunction has a good article on these). I noticed in the comments section of the newspaper's article that someone is apparently trying to crowdsource a fundraising drive to send her to some chemistry classes. I enjoy that idea very much, although (1) horse, water, drink, etc., and (2) she appears to have sufficient funds to do this already, were it of any possible interest to her. And more money coming in all the time. She may well make more money telling people that they're eating yoga mats than I do trying to discover drugs.

Comments (46) + TrackBacks (0) | Category: General Scientific News

July 9, 2014

No Scripps/USC

Posted by Derek

The proposed Scripps/USC deal is off, according to reporters Gary Robbins and Bradley Fikes at the San Diego Union-Tribune. No details on what comes next, though - but something presumably does come next.

Comments (14) + TrackBacks (0) | Category: General Scientific News

July 2, 2014

All Natural And Chemical Free

Posted by Derek

Yesterday's link to the comprehensive list of chemical-free products led to some smiles, but also to some accusations of preaching to the choir, both on my part and on the part of the paper's authors. A manuscript mentioned in the blog section of Nature Chemistry is certainly going to be noticed mostly by chemists, naturally, so I think that everyone responsible knows that this is mainly for some comic relief, rather than any sort of serious attempt to educate the general public. Given the constant barrage of "chemical-free" claims, and what that does to the mood of most chemists who see them, some comedy is welcome once in a while.

But the larger point stands. The commenters here who said, several times, that chemists and the public mean completely different things by the word "chemical" have a point. But let's take a closer look at this for a minute. What this implies (and implies accurately, I'd say) is that for many nonscientists, "chemical" means "something bad or poisonous". And that puts chemists in the position of sounding like they're arguing from the "No True Scotsman" fallacy. We're trying to say that everything is a chemical, and that they range from vital to harmless to poisonous (at some dose) and everything in between. But this can sound like special pleading to someone who's not a scientist, as if we're claiming all the good stuff for our side and disavowing the nasty ones as "Not the kind of chemical we were talking about". (Of course, the lay definition of chemical does this, with the sign flipped: the nasty things are "chemicals", and the non-nasty ones are. . .well, something else. Food, natural stuff, something, but not a chemical, because chemicals are nasty).

So I think it's true that approaches that start off by arguing the definition of "chemical" are doomed. It reminds me of something you see in online political arguments once in a while - someone will say something about anti-Semitism in an Arab country, and likely as not, some other genius will step in with the utterly useless point that it's definitionally impossible, you see, for an Arab to be an anti-Semite, because technically the Arabs are also a Semitic people! Ah-hah! What that's supposed to accomplish has always been a mystery to me, but I fear that attempts to redefine that word "chemical" are in the same category, no matter how teeth-grinding I find that situation to be.

The only thing I've done in this line, when discussing this sort of thing one-on-one, is to go ahead and mention that to a chemist, everything that's made out of atoms is pretty much a "chemical", and that we don't use the word to distinguish between the ones that we like and the ones that we don't. I've used that to bring up the circular nature of some of the arguments on the opposite side: someone's against a chemical ingredient because it's toxic, and they know it's toxic because it's a chemical ingredient. If it were "natural", things would be different.

That's the point to drop in the classic line about cyanide and botulism being all-natural, too. You don't do that just to score some sort of debating point, though, satisfying though that may be - I try not to introduce that one with a flourish of the sword point. No, I think you want to come in with a slightly regretful "Well, here's the problem. . ." approach. The idea, I'd say, is to introduce the concept of there being a continuum of toxicity out there, one that doesn't distinguish between man-made compounds and natural ones.

The next step after that is the fundamental toxicological idea that the dose makes the poison, but I think it's only effective to bring that up after this earlier point has been made. Otherwise, it sounds like special pleading again: "Oh, well, yeah, that's a deadly poison, but a little bit of it probably won't hurt you. Much." My favorite example in this line is selenium. It's simultaneously a vital trace nutrient and a poison, all depending on the dose, and I think a lot of people might improve their thinking on these topics if they tried to integrate that possibility into their views of the world.

Because it's clear that a lot of people don't have room for it right now. The common view is that the world is divided into two categories of stuff: the natural, made by living things, and the unnatural, made by humans (mostly chemists, dang them). You even see this scheme applied to inorganic chemistry: you can find people out there selling makeup and nutritional supplements who charge a premium for things like calcium carbonate when it's a "natural mineral", as opposed (apparently) to that nasty sludge that comes out of the vats down at the chemical plant. (This is also one of the reasons why arguing about the chemist's definition of "organic" is even more of a losing position than arguing about the word "chemical").

There's a religious (or at least quasi-religious) aspect to all this, which makes the arguments emotional and hard to win by appeals to reason. That worldview I describe is a dualist, Manichean one: there are forces of good, and there are forces of evil, and you have to choose sides, don't you? It's sort of assumed that the "natural" world is all of a piece: living creatures are always better off with natural things. They're better; they're what living creatures are meant to consume and be surrounded by. Anything else is ersatz, a defective substitute for the real thing, and quite possibly an outright work of evil by those forces on the other side.

Note that we're heading into some very deep things in many human cultures here, which is another reason that this is never an easy or simple argument to have. That split between natural and unnatural means that there was a time, before all this industrial horror, when people lived in the natural state. They never encountered anything artificial, because there was no such thing in the world. Now, a great number of cultures have a "Golden Age" myth, that distant time when everything was so much better - more pure, somehow, before things became corrupted into their present regrettable state. The Garden of Eden is the aspect this takes in the Christian religion, but you find similar things in many other traditions. (Interestingly, this often takes the form of an ancient age when humans spoke directly with the gods, in whatever form they took, which is one of the things that led Julian Jaynes to his fascinating, although probably unprovable hypotheses in The Origin of Consciousness in the Breakdown of the Bicameral Mind).

This Prelapsarian strain of thinking permeates the all-natural chemical-free worldview. There was a time when food and human health were so much better, and industrial civilization has messed it all up. We're surrounded by man-made toxins and horrible substitutes for real food, and we've lost the true path. It's no wonder that there's all this cancer and diabetes and autism and everything: no one ever used to get those things. Note the followup to this line of thought: someone did this to us. The more hard-core believers in this worldview are actually furious at what they see as the casual, deliberate poisoning of the entire population. The forces of evil, indeed.

And there are enough small reinforcing bars of truth to make all of this hold together quite well. There's no doubt that industrial poisons have sickened vast numbers of people in the past: mercury is just the first one that's come to mind. (I'm tempted to point out that mercury and its salts, by the standards of the cosmetics and supplements industries, are most certainly some of those all-natural minerals, but let that pass for now). We've learned more about waste disposal, occupational exposure, and what can go into food, but there have been horrible incidents that live on vividly in the imagination. And civilization itself didn't necessarily go about increasing health and lifespan for quite a while, as the statistics assembled in Gregory Clark's A Farewell to Alms make clear. In fact, for centuries, living in cities was associated with shorter lifespans and higher mortality. We've turned a lot of corners, but it's been comparatively recently.

And on the topic of "comparatively recently", there's one more factor at work that I'd like to bring up. The "chemical free" view of the world has the virtue of simplicity (and indeed, sees simplicity as a virtue itself). Want to stay healthy? Simple. Don't eat things with chemicals in them. Want to know if something is the right thing to eat, drink, wear, etc.? Simple: is it natural or not? This is another thing that makes some people who argue for this view so vehement - it's not hard, it's right in front of you, and why can't you see the right way of living when it's so, so. . .simple? Arguing against that, from a scientific point of view, puts a person at several disadvantages. You necessarily have to come in with all these complications and qualifying statements, trying to show how things are actually different than they look. That sounds like more special pleading, for one thing, and it's especially ineffective against a way of thinking that often leans toward thinking that the more direct, simple, and obvious something is, the more likely it is to be correct.

That's actually the default way of human thinking, when you get down to it, which is the problem. Science, and the scientific worldview, are unnatural things, and I don't mean that just in the whole-grain no-additives sense of "natural". I mean that they do not come to most people as a normal consequence of their experience and habits of thought. A bit of it does: "Hey, every time I do X, Y seems to happen". But where that line of thinking takes you starts to feel very odd very quickly. You start finding out that the physical world is a lot more complicated than it looks, that "after" does not necessarily mean "because", and that all rules of thumb break down eventually (and usually without warning). You find that math, of all things, seems to be the language that the universe is written in (or at least a very good approximation to it), and that's not exactly an obvious concept, either. You find that many of the most important things in that physical world are invisible to our senses, and not necessarily in a reassuring way, or in a way that even makes much sense at all at first. (Magical explanations of invisible forces at least follow human intuitions). It's no wonder that scientific thinking took such a long, long time to ever catch on in human history. I still sometimes think that it's only tolerated because it brings results.

So there are plenty of reasons why it's hard to effectively argue against the all-natural chemical-free worldview. You're asking your audience to accept a number of things that don't make much sense to them, and what's worse, many of these things look like rhetorical tricks at best and active (even actively evil) attempts to mislead them at worst. And all in the service of something that many of them are predisposed to regard as suspicious even from the start. It's uphill all the way.

Comments (53) + TrackBacks (0) | Category: General Scientific News | Snake Oil | Toxicology

June 30, 2014

Don't Learn to Science?

Posted by Derek

In keeping with the discussions around here about STEM jobs and education, I wanted to pass along this link from Coding Horror: "Please Don't Learn to Code". It's written by a programmer, as you might guess, and here's his main point:

To those who argue programming is an essential skill we should be teaching our children, right up there with reading, writing, and arithmetic: can you explain to me how Michael Bloomberg would be better at his day to day job of leading the largest city in the USA if he woke up one morning as a crack Java coder? It is obvious to me how being a skilled reader, a skilled writer, and at least high school level math are fundamental to performing the job of a politician. Or at any job, for that matter. But understanding variables and functions, pointers and recursion? I can't see it.
Look, I love programming. I also believe programming is important … in the right context, for some people. But so are a lot of skills. I would no more urge everyone to learn programming than I would urge everyone to learn plumbing. That'd be ridiculous, right?

I see his point. He goes on to say that more code is not necessarily what we need in the world, and that coding is not the proper solution to many problems. On a less philosophic level, the learn-to-code movement also makes it seem as if this is the short path to a job, which is not quite aligned with reality, either.

I suppose I can support learning a tiny bit about programming just so you can recognize what code is, and when code might be an appropriate way to approach a problem you have. But I can also recognize plumbing problems when I see them without any particular training in the area. The general populace (and its political leadership) could probably benefit most of all from a basic understanding of how computers, and the Internet, work. Being able to get around on the Internet is becoming a basic life skill, and that's the gap we should worry about closing first and most of all, before we start jumping all the way into code.

Now let's apply that to learning about chemistry and biology. It's not going to be a very comfortable exercise, because I (and many of the people who read this site) have put a lot of time and effort into learning an awful lot of chemistry and biology. I've written before about the problem of how much science the "average" person should know, and the least controversial answer is "More than they do now". After that, the arguing starts.

It would be nice if everyone knew enough to make some of the ridiculous scams out there harder to work. "Eat whatever you want and still lose 10 pounds a week with this miracle fat-burning supplement!" would be greeted with "Hey, isn't that thermodynamically sort of impossible?" "New Super-Ionized Oxygenated Water Reverses Aging!" would meet with "How do you 'super-ionize' water? And how much oxygen can it hold, anyway? And wouldn't that be, like, bleach?" It would be good if people had a slightly better idea of what causes cancer, how diabetes works, a bit better understanding of toxicology, and so on.

But then we're already supposed to be teaching everyone some of the basics, and it doesn't necessarily seem to be going all that well (evidence, both hopeful and not, can be found here and here). Everyone's supposedly exposed to some simple astronomy, but surveys always show a depressing amount of confusion about the earth, moon, and sun: which one of them is going around which. Everyone's supposed to have been exposed to the idea of cells making up living organisms, to DNA, and so on, but you can still seemingly get away with all kinds of off-kilter claims about such things when talking to a lay audience.

Some readers will remember the "Why Are You Forcing My Son to Take Chemistry" guy from the Washington Post. I wish that I could argue that chemistry, and a good dose of it, is prima facie a requirement for any reasonably competent citizen, but I'm not quite there yet. But I'm also sure that being completely ignorant of chemistry is a good indicator of someone whose worldview is incomplete and could use some shoring up. You need some knowledge in these areas, but we could start with getting across the stuff we're trying to get across already.

What I am sure of, though, is that a certain amount of science and math really is necessary, and not just for the bare facts. My daughter, when she was learning the quadratic equation, asked me the classic question "Why am I learning this? When will I ever use it?" My response to her was that I, too, had rarely had recourse to the quadratic equation as it stood. But at the same time, learning these things was good for the mind. I told her that when I went to the gym, it wasn't because I was planning on having to do more repetitive squats with a weighted bar on my back any time soon. But strengthening my back and legs was a good thing in general, and helped out with a lot of other things in my day-to-day life, in both the short and long terms. The same with the mind. Memorizing the quadratic formula was not a great deal of use in and of itself, but that realization she had, in one of those thrown-ball problems, that the height of the ball was zero at just two points (at the beginning and the end of its flight), and that this was why solving for time at that height gave you two solutions - that flash of understanding, I told her, was the feeling of mental muscle being built up, capacity that she would need for more than just her math homework. Everyone could do with some of that exercise.
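That thrown-ball insight is easy to check for yourself. Here's a minimal sketch (the function name, the launch speed, and the choice to apply the quadratic formula directly are all my own illustration, not anything from the math homework in question):

```python
import math

def height_zero_times(v0, g=9.8, h0=0.0):
    """Solve h0 + v0*t - 0.5*g*t**2 = 0 for t with the quadratic formula.

    For a ball thrown upward from ground level, the two roots are the
    two moments when its height is zero: launch and landing.
    """
    a, b, c = -0.5 * g, v0, h0
    disc = b * b - 4 * a * c          # discriminant; positive here, so two real roots
    t1 = (-b + math.sqrt(disc)) / (2 * a)
    t2 = (-b - math.sqrt(disc)) / (2 * a)
    return sorted((t1, t2))

# A ball thrown straight up at 9.8 m/s is at height zero at t = 0
# (launch) and again at t = 2*v0/g (landing).
launch, landing = height_zero_times(9.8)
```

The two solutions the formula spits out aren't an algebraic quirk; they're the beginning and the end of the flight.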

Comments (35) + TrackBacks (0) | Category: General Scientific News

June 21, 2014

Scripps Update

Posted by Derek

I'm told now that, after a number of faculty at the Scripps Research Institute objected to the proposed deal with USC, UC-San Diego is ready to explore some sort of merger deal. Going down North Torrey Pines Road would seem to be logistically easier than going to Los Angeles, but there are other considerations as well. This could go on for a while!

Comments (28) + TrackBacks (0) | Category: General Scientific News

June 17, 2014

Scripps Merging With USC?

Email This Entry

Posted by Derek

This news broke last night: USC might be acquiring Scripps. It all looks to come down to tight federal money: that's where most of the funding comes from, and institutions that rely on grants (and the overhead from grants) to survive are having to cut back. (There were, for example, a number of layoffs earlier this year at the Broad Institute in Cambridge, for just that reason).

As you can see from that story, I was called last night for comment about this, and I have to say, I was very surprised (although maybe not as surprised as the reporter was when I started quoting Tennyson). The loss of several big names over the last few years has made it clear that there were difficulties inside Scripps, quite a contrast to the era when the institute burst onto the organic chemistry scene by making huge offers to a number of star professors. But I couldn't think of another example where one research institution more or less takes over another, and I especially couldn't think of anything happening at this level.

It doesn't look like a done deal yet, and even if it does go forward, how things will work is unclear. (It's also unclear what this means for the Scripps branch in Florida). More on this as it develops.

Comments (41) + TrackBacks (0) | Category: General Scientific News

April 1, 2014

Freeman Dyson on the PhD Degree

Email This Entry

Posted by Derek

From this interview:

"Oh, yes. I’m very proud of not having a Ph.D. I think the Ph.D. system is an abomination. It was invented as a system for educating German professors in the 19th century, and it works well under those conditions. It’s good for a very small number of people who are going to spend their lives being professors. But it has become now a kind of union card that you have to have in order to have a job, whether it’s being a professor or other things, and it’s quite inappropriate for that. It forces people to waste years and years of their lives sort of pretending to do research for which they’re not at all well-suited. In the end, they have this piece of paper which says they’re qualified, but it really doesn’t mean anything. The Ph.D. takes far too long and discourages women from becoming scientists, which I consider a great tragedy. So I have opposed it all my life without any success at all. . ."

Comments (73) + TrackBacks (0) | Category: General Scientific News

March 3, 2014

Sydney Brenner on the State of Science

Email This Entry

Posted by Derek

Via Retraction Watch, here's an outspoken interview with Sydney Brenner, who's never been the sort of person to keep his opinions bottled up inside him. Here, for example, are his views on graduate school in the US:

Today the Americans have developed a new culture in science based on the slavery of graduate students. Now graduate students of American institutions are afraid. He just performs. He’s got to perform. The post-doc is an indentured labourer. We now have labs that don’t work in the same way as the early labs where people were independent, where they could have their own ideas and could pursue them.

The most important thing today is for young people to take responsibility, to actually know how to formulate an idea and how to work on it. Not to buy into the so-called apprenticeship. I think you can only foster that by having sort of deviant studies. That is, you go on and do something really different. Then I think you will be able to foster it.

But today there is no way to do this without money. That’s the difficulty. In order to do science you have to have it supported. The supporters now, the bureaucrats of science, do not wish to take any risks. So in order to get it supported, they want to know from the start that it will work. This means you have to have preliminary information, which means that you are bound to follow the straight and narrow.

I can't argue with that. In academia these days, it seems to me that the main way that something really unusual or orthogonal gets done is by people doing something other than what they told the granting agencies they'd do with the money. Which has always been the case to some extent, but I get the impression it's more so than ever. The article also quotes from Brenner's appreciation of the late Fred Sanger, where he made a similar point:

A Fred Sanger would not survive today’s world of science. With continuous reporting and appraisals, some committee would note that he published little of import between insulin in 1952 and his first paper on RNA sequencing in 1967 with another long gap until DNA sequencing in 1977. He would be labelled as unproductive, and his modest personal support would be denied. We no longer have a culture that allows individuals to embark on long-term—and what would be considered today extremely risky—projects.

Here are Brenner's mild, temperate views on the peer-review system and its intersection with academic publishing:

. . .I don’t believe in peer review because I think it’s very distorted and as I’ve said, it’s simply a regression to the mean.

I think peer review is hindering science. In fact, I think it has become a completely corrupt system. It’s corrupt in many ways, in that scientists and academics have handed over to the editors of these journals the ability to make judgment on science and scientists. There are universities in America, and I’ve heard from many committees, that we won’t consider people’s publications in low impact factor journals.

Now I mean, people are trying to do something, but I think it’s not publish or perish, it’s publish in the okay places [or perish]. And this has assembled a most ridiculous group of people. I wrote a column for many years in the nineties, in a journal called Current Biology. In one article, “Hard Cases”, I campaigned against this [culture] because I think it is not only bad, it’s corrupt. In other words it puts the judgment in the hands of people who really have no reason to exercise judgment at all. And that’s all been done in the aid of commerce, because they are now giant organisations making money out of it.

I don't find a lot to disagree with there, either. The big scientific publishers have some good people working for them, but the entire enterprise is looking more and more suspect. There's a huge moral hazard involved, which we don't seem to be avoiding very well at all.

Comments (35) + TrackBacks (0) | Category: General Scientific News | The Scientific Literature

February 17, 2014

Down With P Values

Email This Entry

Posted by Derek

I'd like to recommend this article from Nature (which looks to be open access). It details the problems with using p-values for statistics, and it's simultaneously interesting and frustrating to read. The frustrating part is that the points it makes have been made many times before, but to little or no effect. P-values don't mean what a lot of people think they mean, and what meaning they do have can be obscured by circumstances. There really should be better ways for scientists to communicate the statistical strength of their results:

One result is an abundance of confusion about what the P value means. Consider Motyl's study about political extremists. Most scientists would look at his original P value of 0.01 and say that there was just a 1% chance of his result being a false alarm. But they would be wrong. The P value cannot say this: all it can do is summarize the data assuming a specific null hypothesis. It cannot work backwards and make statements about the underlying reality. That requires another piece of information: the odds that a real effect was there in the first place. To ignore this would be like waking up with a headache and concluding that you have a rare brain tumour — possible, but so unlikely that it requires a lot more evidence to supersede an everyday explanation such as an allergic reaction. The more implausible the hypothesis — telepathy, aliens, homeopathy — the greater the chance that an exciting finding is a false alarm, no matter what the P value is.

Critics also bemoan the way that P values can encourage muddled thinking. A prime example is their tendency to deflect attention from the actual size of an effect. Last year, for example, a study of more than 19,000 people showed that those who meet their spouses online are less likely to divorce (p < 0.002) and more likely to have high marital satisfaction (p < 0.001) than those who meet offline (see Nature; 2013). That might have sounded impressive, but the effects were actually tiny: meeting online nudged the divorce rate from 7.67% down to 5.96%, and barely budged happiness from 5.48 to 5.64 on a 7-point scale. To pounce on tiny P values and ignore the larger question is to fall prey to the “seductive certainty of significance”, says Geoff Cumming, an emeritus psychologist at La Trobe University in Melbourne, Australia. But significance is no indicator of practical relevance, he says: “We should be asking, 'How much of an effect is there?', not 'Is there an effect?'”

The article has some suggestions about what to do, but seems guardedly pessimistic about the likelihood of change. The closer you look at it, though, the more our current system looks like an artifact that was never meant to be used in the way we're using it.
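The "odds that a real effect was there in the first place" point is just Bayes' rule, and it's easy to play with. A minimal sketch, with made-up but conventional numbers (80% power, an alpha of 0.05):

```python
def prob_real_given_significant(prior, power, alpha):
    """P(effect is real | p < alpha), by Bayes' rule.

    prior: fraction of tested hypotheses that are actually true
    power: P(p < alpha | real effect)
    alpha: P(p < alpha | no effect), the false-positive rate
    """
    true_hits = prior * power
    false_alarms = (1 - prior) * alpha
    return true_hits / (true_hits + false_alarms)

# If 1 in 10 tested hypotheses is real, a "significant" result
# is itself real only about 64% of the time:
print(round(prob_real_given_significant(0.10, 0.80, 0.05), 2))

# For a far-fetched hypothesis (say, 1 in 100 real), most of the
# significant hits are false alarms, no matter what the p-value says:
print(round(prob_real_given_significant(0.01, 0.80, 0.05), 2))
```

That 1% false-alarm reading of p = 0.01 conflates P(data | no effect) with P(no effect | data), and the gap between the two is exactly what the prior odds control.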

Comments (30) + TrackBacks (0) | Category: General Scientific News

February 6, 2014

Crowdfunding Independent Research

Email This Entry

Posted by Derek

I've written about Ethan Perlstein's work here before, and now I note that the Wall Street Journal has an article about his crowdfunding research model.

Ethan O. Perlstein for years followed a traditional path as a scientist. He earned a Ph.D. in molecular biology from Harvard, spent five years doing postdoctoral research at Princeton and led a team that published two papers on pharmacology.

But last year, Dr. Perlstein was turned down by 27 universities when he sought a tenure-track position to set up his own lab. Hundreds of candidates had applied for a small number of positions, the universities said, a situation made worse by cuts in federal research funding. . .

. . .Still, Dr. Perlstein's approach is unusual because he isn't raising money to support a discrete project or product. "Ethan is doing basic research," said Jessica Richman, co-founder of Ubiome, a health and wellness startup that raised more than $350,000 through crowdfunding on a site called Indiegogo. "He is selling the idea that he is an independent scientist doing research."

Dr. Perlstein plans to launch his public appeal for Perlstein Lab this week on a site called AngelList. Perlstein Lab will focus on finding drugs to treat lysosomal storage diseases, in which cells fail to produce and recycle waste. The materials accumulate in cells and can cause a range of problems, including death.

Here's his profile page on AngelList, which seeks money from what the SEC calls "accredited" investors (high-net-worth individuals). I think that's probably a good idea - anyone who's done "angel"-type investments before will have a more realistic idea of the chances of any return (you'd hope). Crowdfunding research, in general, is something that interests me a great deal, although it's easy to think of potential problems.

Comments (12) + TrackBacks (0) | Category: Business and Markets | General Scientific News

October 29, 2013

Humble Enzyme Dodges Spotlight

Email This Entry

Posted by Derek

Here's the latest biochemical news from The Onion, which is at least as reliable as some journals. What I think I like the best is that the person who wrote this clearly understood some details about amylase and about enzyme function in general. An alternative science career?

Comments (10) + TrackBacks (0) | Category: General Scientific News

October 7, 2013

The 2013 Medicine/Physiology Nobel: Traffic

Email This Entry

Posted by Derek

This year's Medicine Nobel is one that's been anticipated for some time. James Rothman of Yale, Randy W. Schekman of Berkeley, and Thomas C. Südhof of Stanford are cited for their fundamental discoveries in vesicular trafficking, and I can't imagine anyone complaining that it wasn't deserved. (The only controversy would be thanks, once again, to the "Rule of Three" in Alfred Nobel's will. Richard Scheller of Genentech has won prizes with Südhof and with Rothman for his work in the same field).

Here's the Nobel Foundation's scientific summary, and as usual, it's a good one. Vesicles are membrane-enclosed bubbles that bud off from cellular compartments and transport cargo to other parts of the cell (or outside it entirely), where they merge with another membrane and release their contents. There's a lot of cellular machinery involved on both the sending and receiving ends, and that's what this year's winners worked out.

As it turns out, there are specific proteins (such as the SNAREs) embedded in intracellular membranes that work as an addressing system: "tie up the membrane around this point and send the resulting globule on its way", or "stick here and start the membrane fusion process". This sort of thing is going on constantly inside the cell, and the up-to-the-surface-and-out variation is particularly noticeable in neurons, since they're constantly secreting neurotransmitters into the synapse. That latter process turned out to be very closely tied to signals like local calcium levels, which allows it to be turned on and off quickly.

As the Nobel summary shows, a lot of solid cell biology had to be done to unravel all this. Schekman looked for yeast cells that showed obvious mutations in their vesicle transport and tracked down which proteins had been altered. Rothman started off with a viral infection system that produced a lot of an easily-trackable protein, and once he'd identified others that helped to move it around, he used these as affinity reagents to find what bound to them in turn. This work dovetailed very neatly with the proteins that Schekman's lab had identified, and suggested (as you'd figure) that this machinery is conserved across many living systems. Südhof then extended this work into the neurotransmitter area, discovering the proteins involved in the timing signals that are so critical in those cells, and demonstrating their function by generating mouse knockout models along the way.

The importance of all these processes to living systems can't be overstated. Eukaryotic cells have to be compartmentalized to function; there's too much going on for everything to be in the same stew pot all at the same time. So a system for "mailing" materials between those regions is vital. And in the same way, cells have to communicate with others, releasing packets of signaling molecules under very tight supervision, and that's done through many of the same mechanisms. You can trace the history of our understanding of these things through years of Nobel awards, and there will surely be more.

Comments (15) + TrackBacks (0) | Category: Biological News | General Scientific News

September 30, 2013

A War On Expertise?

Email This Entry

Posted by Derek

I see that Popular Science is shutting down the comments function on their web site. Like a lot of news organizations, I think that their signal/noise was pretty low in the comments. (And that prompts me to express, again, my appreciation for the commenters on this blog - one of the first questions I get when I talk to anyone else who runs a web site is how on Earth the comments section around here stays so readable and sane!)

They're citing some experiments that seem to show that fractious comments sections actually make the original posts above them seem less reliable, and that may be how it works. My own impression is that once a site has a lot of fist-waving in its comments section, most readers soon stop bothering with it at all, and the only ones who show up are there for the fights. I'll say this for the Popular Science folks - they're not doing this for monetary/traffic reasons, because wildly argumentative comments sections actually drive traffic, thanks to the people who just can't stay away (and hit "Refresh" over and over in the process).

Here's the key quote from their article:

A politically motivated, decades-long war on expertise has eroded the popular consensus on a wide variety of scientifically validated topics. Everything, from evolution to the origins of climate change, is mistakenly up for grabs again. Scientific certainty is just another thing for two people to "debate" on television. And because comments sections tend to be a grotesque reflection of the media culture surrounding them, the cynical work of undermining bedrock scientific doctrine is now being done beneath our own stories, within a website devoted to championing science.

I know where they're coming from, but I'm not so sure about that "again". My belief is that there has never been a time when evolution - or climate change/global warming - was not controversial with many people. The internet, it's true, gives everyone with a point of view a chance to ventilate, so it brings this sort of thing to the surface much more easily than in the past. (Look back a few decades, and ask yourself what was available to someone with a strong opinion. Letters to the editor? A soapbox in the park? Handing out flyers on the corner?)

And I don't think that there's been any big, coherent, "decades-long war on expertise". If there is, then there always has been. It makes a person feel better to believe these things, but that's the sort of self-congratulatory thinking that I believe one has to avoid. "I'm too smart for the crowd, the mob - a member of a persecuted minority just because I see the truth. . ." That doesn't do anything to help your own reasoning.

No, most people don't understand scientific topics, but most people never have. If anything, I'd be willing to bet that the population today is more literate in these matters than ever before. The sorts of people who go hunting through web sites looking for things to confirm their own opinions have always been with us. As have groups who'd rather obfuscate topics than debate them, for reasons of their own. We just have a better look at the whole process these days.

Comments (45) + TrackBacks (0) | Category: Blog Housekeeping | General Scientific News

September 26, 2013

Nobel Speculation Time

Email This Entry

Posted by Derek

As we approach October, Nobel Speculation Season is upon us again. And Ash is right at Curious Wavefunction - making the predictions gets easier every year, because you get to keep the lists you had from before, with maybe a name or two removed because they actually won. Paul at Chembark usually does a long post on the subject this time of year, but he seems to have his hands full (understandably!) getting his academic lab set up and teaching his courses for the first time.

The Thomson Reuters people have made their annual predictions, based on citation counts and such measures, and so far every other article I've seen in the press is based on their lists. Some years I speculate myself, so for what they're worth, here's my 2011 list, here's 2010, here's 2009, 2008, and 2006.

What's the landscape like this year? Thomson/Reuters have made a bold case for Sharpless/Fokin/Finn for the Huisgen-style "click" chemistry. I know that the thought has crossed my mind before (and shown up in the comments here before as well, several times). A second Nobel for Sharpless would be quite a feat, and it really says something that people consider it a possibility. John Bardeen won the Physics prize twice, and Sanger won Chemistry twice. Marie Curie is the only person to have won in two different sciences. (And yes, there's Pauling, the only winner of two unshared prizes, but one of them was Peace, which has too often been a real eye-roller of an award). R. B. Woodward, had he lived, would certainly have won Chemistry twice, since there was an award for the Woodward-Hoffmann rules after his death.

Thomson/Reuters also proposes Alivisatos/Mirkin/Seeman for a DNA nanotechnology prize in chemistry, and Bruce Ames for the Ames mutagenicity test. (Man, it feels strange to link to something that I blogged about eleven years ago!) Of those two, I think Ames is the better bet, since he really did change the field of toxicology forever. It would make more sense as a physiology/medicine prize, but we all know how things go. On the other hand, I think that the DNA nanoparticle work will be given a chance to have a greater real-world impact before it gets a prize.

I like the Wavefunction picks a lot, too. He's looking at single-molecule spectroscopy for a physical/analytical chemistry prize, and he points out that surface plasmon resonance and solid-state NMR (among other NMR methods) haven't won, either. Considering the 2002 prize to Fenn for electrospray MS, these are very plausible, although I have to say that LC/MS (considering the way it's taken over the analytical world in the last 20 years) made that one a clear choice. His other top picks are nuclear receptors, electron transfer in biological systems, chaperone proteins, cancer genetics, and some sort of chemical biology prize. I wouldn't be a bit surprised at any of those - like the palladium-coupling prize, the biggest problem will be figuring out who to award some of these to, not determining whether they're worthy of winning.

My own guess? From the Thomson/Reuters and Wavefunction lists, I like Ames, nuclear receptors, single-molecule spectroscopy, and chaperone proteins. Just behind those are electron transfer and then the triazole click work. I don't see an organic synthesis prize at all; I think this is going to be physical/analytical if it's going to be inside the traditional bounds of chemistry. And with that in mind, the committee might just use the chemistry prize for Venter et al. on DNA sequencing methods, or for Solomon Snyder et al. on neurotransmitters. We shall see!

Comments (64) + TrackBacks (0) | Category: General Scientific News

September 25, 2013

MacArthur Awards in Chemistry

Email This Entry

Posted by Derek

Congratulations to Phil Baran of Scripps for getting a MacArthur Foundation grant. There aren't many of those that have landed in the field of chemistry - a commenter here points out Carolyn Bertozzi at Berkeley, Laura Kiessling at Wisconsin, and Melanie Sanford at Michigan as the past winners. A worthy bunch!

Comments (31) + TrackBacks (0) | Category: General Scientific News

May 29, 2013

The Hydrogen Wave Function, Imaged

Email This Entry

Posted by Derek

Here's another one of those images that gives you a bit of a chill down the spine. You're looking at a hydrogen atom, and those spherical bands are the orbitals in which you can find its electron. Here, people, is the wave function. Yikes. Update: true, what you're seeing are the probability distributions defined by the wave function. But still. . .

This is from a new paper in Physical Review Letters (here's a commentary on it at the APS site). Technically, what we're seeing here are Stark states, which you get when the atom is exposed to an electric field. Here's more on how the experiment was done:

In their elegant experiment, Stodolna et al. observe the orbital density of the hydrogen atom by measuring a single interference pattern on a 2D detector. This avoids the complex reconstructions of indirect methods. The team starts with a beam of hydrogen atoms that they expose to a transverse laser pulse, which moves the population of atoms from the ground state to the 2s and 2p orbitals via two-photon excitation. A second tunable pulse moves the electron into a highly excited Rydberg state, in which the orbital is typically far from the central nucleus. By tuning the wavelength of the exciting pulse, the authors control the exact quantum numbers of the state they populate, thereby manipulating the number of nodes in the wave function. The laser pulses are tuned to excite those states with principal quantum number n equal to 30.

The presence of the dc field places the Rydberg electron above the classical ionization threshold but below the field-free ionization energy. The electron cannot exit against the dc field, but it is a free particle in many other directions. The outgoing electron wave accumulates a different phase, depending on the direction of its initial velocity. The portion of the electron wave initially directed toward the 2D detector (direct trajectories) interferes with the portion initially directed away from the detector (indirect trajectories). This produces an interference pattern on the detector. Stodolna et al. show convincing evidence that the number of nodes in the detected interference pattern exactly reproduces the nodal structure of the orbital populated by their excitation pulse. Thus the photoionization microscope provides the ability to directly visualize quantum orbital features using a macroscopic imaging device.

n=30 is a pretty excited atom, way off the ground state, so it's not like we're seeing a garden-variety hydrogen atom here. But the wave function for a hydrogen atom can be calculated for whatever state you want, and this is what it should look like. The closest thing I know of to this is the work with field emission electron microscopes, which measure the ease of moving electrons from a sample, and whose resolution has been taken down to alarming levels.
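That calculability is easy to demonstrate. Here's a minimal sketch in plain Python (atomic units, unnormalized wavefunctions - a back-of-the-envelope check, not anything resembling the paper's actual photoionization analysis) that counts the radial nodes of a hydrogen state, which quantum mechanics says should number n - l - 1:

```python
import math

def laguerre(k, a, x):
    """Generalized Laguerre polynomial L_k^a(x), via the standard recurrence."""
    if k == 0:
        return 1.0
    prev, cur = 1.0, 1.0 + a - x
    for i in range(2, k + 1):
        prev, cur = cur, ((2 * i - 1 + a - x) * cur - (i - 1 + a) * prev) / i
    return cur

def radial_wavefunction(n, l, r):
    """Unnormalized hydrogen R_{n,l}(r), with r in units of the Bohr radius."""
    x = 2.0 * r / n
    return math.exp(-r / n) * x**l * laguerre(n - l - 1, 2 * l + 1, x)

def radial_nodes(n, l, r_max=40.0, steps=4000):
    """Count sign changes of R_{n,l} on a grid between 0 and r_max."""
    rs = [r_max * (i + 0.5) / steps for i in range(steps)]
    vals = [radial_wavefunction(n, l, r) for r in rs]
    return sum(1 for v1, v2 in zip(vals, vals[1:]) if v1 * v2 < 0)

# Theory says n - l - 1 radial nodes: a 3s state has two, a 2p state has none.
print(radial_nodes(3, 0), radial_nodes(2, 1))  # prints: 2 0
```

The interference patterns in the paper track exactly this kind of nodal count, albeit for the Stark states of the field-exposed atom rather than the field-free states sketched here.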

So here we are - one thing after another that we've had to assume is really there, because the theory works out so well, turns out to be observable by direct physical means. And they are really there. Schoolchildren will eventually grow up with this sort of thing, but the rest of us are free to be weirded out. I am!

Comments (17) + TrackBacks (0) | Category: General Scientific News

May 14, 2013

A Specific Crowdfunding Example

Email This Entry

Posted by Derek

I mentioned Microryza in that last post. Here's Prof. Michael Pirrung, at UC Riverside, with an appeal there to fund the resynthesis of a compound for NCI testing against renal cell carcinoma. The money will pay for a month of an experienced post-doc's labor to prepare an interesting natural-product-derived proteasome inhibitor that the NCI would like to take to its next stage of evaluation. Have a look - you might be looking at the future of academic research funding, or at least a real part of it.

Comments (14) + TrackBacks (0) | Category: Cancer | General Scientific News

Crowdfunding Research

Email This Entry

Posted by Derek

Crowdfunding academic research might be changing from a near-stunt to a widely used method of filling gaps in a research group's money supply. At least, that's the impression this article at Nature Jobs gives:

The practice has exploded in recent years, especially as success rates for research-grant applications have fallen in many places. Although crowd-funding campaigns are no replacement for grants — they usually provide much smaller amounts of money, and basic research tends to be less popular with public donors than applied sciences or arts projects — they can be effective, especially if the appeals are poignant or personal, involving research into subjects such as disease treatments.

The article details several venues that have been used for this sort of fund-raising, including Indiegogo, Kickstarter, RocketHub, FundaGeek, and SciFund Challenge. I'd add Microryza to that list. And there's a lot of good advice for people thinking about trying it themselves, including how much money to try for (at least at first), the timelines one can expect, and how to get your message out to potential donors.

Overall, I'm in favor of this sort of thing, but there are some potential problems. This gives the general public a way to feel more connected to scientific research, and to understand more about what it's actually like, both of which are goals I feel a close connection to. But (as that quote above demonstrates), some kinds of research are going to be an easier sell than others. I worry about a slow (or maybe not-so-slow) race to the bottom, with lab heads overpromising what their research can deliver, exaggerating its importance to immediate human concerns, and overselling whatever results come out.

These problems have, of course, been noted. Ethan Perlstein, formerly of Princeton, used RocketHub for his crowdfunding experiment that I wrote about here. And he's written at Microryza with advice about how to get the word out to potential donors, but that very advice has prompted a worried response over at SciFund Challenge, where Jai Ranganathan had this to say:

His bottom line? The secret is to hustle, hustle, hustle during a crowdfunding campaign to get the word out and to get media attention. With all respect to Ethan, if all researchers running campaigns follow his advice, then that’s the end for science crowdfunding. And that would be a tragedy because science crowdfunding has the potential to solve one of the key problems of our time: the giant gap between science and society.

Up to a point, these two are talking about different things. Perlstein's advice is focused on how to run a successful crowdfunding campaign (based on his own experience, which is one of the better guides we have so far), while Ranganathan is looking at crowdfunding as part of something larger. Where they intersect, as he says, is that it's possible that we'll end up with a tragedy of the commons, where the strategy that's optimal for each individual's case turns out to be (very) suboptimal for everyone taken together. He's at pains to mention that Ethan Perlstein has himself done a great job with outreach to the public, but worries about those to follow:

Because, by only focusing on the mechanics of the campaign itself (and not talking about all of the necessary outreach), there lurks a danger that could sink science crowdfunding. Positive connections to an audience are important for crowdfunding success in any field, but they are especially important for scientists, since all we have to offer (basically) is a personal connection to the science. If scientists omit the outreach and just contact audiences when they want money, that will go a long way to poisoning the connections between science and the public. Science crowdfunding has barely gotten started and already I hear continuous complaints about audience exasperation with the nonstop fundraising appeals. The reason for this audience fatigue is that few scientists have done the necessary building of connections with an audience before they started banging the drum for cash. Imagine how poisonous the atmosphere will become if many more outreach-free scientists aggressively cold call (or cold e-mail or cold tweet) the universe about their fundraising pleas.

Now, when it comes to overpromising and overselling, a cynical observer might say that I've just described the current granting system. (And if we want even more of that sort of thing, all we have to do is pass a scheme like this one). But the general public will probably be a bit easier to fool than a review committee, at least, if you can find the right segment of the general public. Someone will probably buy your pitch, eventually, if you can throw away your pride long enough to keep on digging for them.

That same cynical observer might say that I've just described the way that we set up donations to charities, and indeed Ranganathan makes an analogy to NPR's fundraising appeals. That's the high end. The low end of the charitable-donation game is about as low as you can go - just run a search for the words "fake" and "charity" through Google News any day, any time, and you can find examples that will make you ashamed that you have the same number of chromosomes as the people you're reading about. (You probably do). Avoiding this state really is important, and I'm glad that people are raising the issue already.

What if, though, someone were to set up a science crowdfunding appeal, with hopes of generating something that could actually turn a profit, and portions of that to be turned over to the people who put up the original money? We have now arrived at the biopharma startup business, via a different road than usual. Angel investors, venture capital groups, shareholders in an IPO - all of these people are doing exactly that, at various levels of knowledge and participation. The pitch is not so much "Give us money for the good of science", but "Give us money, because here's our plan to make you even more". You will note that the scale of funds raised by the latter technique makes those raised by the former look like a roundoff error, which fits in pretty well with what I take as normal human motivations.

But academic science projects have no such pitch to make. They'll have to appeal to altruism, to curiosity, to mood affiliation, and other nonpecuniary motivations. Done well, that can be a very good thing, and done poorly, it could be a disaster.

Comments (20) + TrackBacks (0) | Category: Academia (vs. Industry) | Business and Markets | General Scientific News

April 25, 2013

More Single-Cell Magnetic Imaging

Email This Entry

Posted by Derek

Earlier this year, I wrote about a method to do NMR experiments at the cellular level or below. A new paper uses this same phenomenon (nitrogen-vacancy defects near the surface of diamond crystals) to do magnetic imaging of individual bacteria.

It's well known that many bacteria have "magnetosome" structures that allow them to sense and react to magnetic fields. If you let them wander over the surface of one of these altered diamond crystals, you can use the single-atom unpaired electrons as sensors. This team (several groups at Harvard and at Berkeley) was able to get sub-cellular resolution, and correlate that with real-time optical images of the bacteria (Magnetospirillum magneticum). It's very odd to see images of single bacteria with their field strengths looking like little bar magnets, but there they are. What we'll find by looking at magnetic fields inside individual cells, I have absolutely no idea, but I hope for all kinds of interesting and baffling things. I wonder what you'd get when mammalian cells take up magnetic nanoparticles, for example?

In other news, it's already late April, and things are already far enough along for me to talk about something on the blog as having happened "earlier this year". Sheesh.

Comments (0) + TrackBacks (0) | Category: General Scientific News

March 19, 2013

Scientific Snobbery

Email This Entry

Posted by Derek

Here's something that you don't see mentioned very often in science, but it's most certainly real: snobbery:

We all do it. Pressed for time at a meeting, you can only scan the presented abstracts and make snap judgements about what you are going to see. Ideally, these judgements would be based purely on what material is of most scientific interest to you. Instead, we often use other criteria, such as the name of the researchers presenting or their institution. I do it too, passing over abstracts that are more relevant to my work in favour of studies from star universities such as Stanford in California or Harvard in Massachusetts because I assume that these places produce the 'best' science.

As someone who is based at a less well-known institution, the University of South Dakota in Vermillion, I see other scientists doing the same to me and my students. In many cases, this is a loss: to my students and their projects, which could have benefited from the input, and to the investigators who might have missed information that could have been useful in their own work.

It's true. This carries over to industry, too, both in the ways that people look at others' academic backgrounds, and even in terms of industrial pedigrees. Working for a biopharma that's been successful, that everyone's heard of, does a lot more for your reputation than working for one that no one knows anything about. The unspoken supposition is that a really small obscure company must have had to reach lower down the ladder to hire people, even though this might not be the case at all.

I have no idea of what could be done about this, because I think it's sheer human nature. The best we can do, I think, is to realize that it happens and to try to consciously correct for it when we can. It's realistic to assume that some small school doesn't have the resources that a larger one has, or that a professor at one has more trouble attracting students. But beyond that, you have to be careful. Some very smart people have come out of some very obscure backgrounds, and you can't - and shouldn't - assume anything in that line.

Comments (30) + TrackBacks (0) | Category: General Scientific News

February 27, 2013

A Nobel Follow-Up

Email This Entry

Posted by Derek

Those of you who remember the Green Fluorescent Protein Nobel story will likely recall Douglas Prasher. He did the earliest work on cloning GFP, and Roger Tsien has said that he has no idea why Prasher didn't share the Nobel as well. But Prasher, after a series of career and personal reverses, ended up driving a shuttle bus in Huntsville by the time the prize was announced.

Well, he's back in science again - and working in the Tsien lab. Here's the story, which I was very glad to read. Prasher's clearly smart and talented, and I hope that he can put all that to good use. A happy ending?

Comments (15) + TrackBacks (0) | Category: General Scientific News

January 8, 2013

Overly Honest Experimental Methods

Email This Entry

Posted by Derek

If you'd like a look under the hood of a lot of research publications, go over to Twitter and check the #OverlyHonestMethods tag. You're sure to find your own sins on display, things like: "Mostly it goes 43%, but once it went 95%. We reported the 95%." And "We used [this program] because doesn't everybody else?". How about "We used a modified version of Dr. Ididitfirst's apparatus, because we couldn't figure out how to make an exact replica" or "For details see Supp. Mat. We put as much as possible in there because it doesn't have to be written as carefully".

There are dozens of them, and more coming all the time. I'm adding a few myself, not that I would ever do anything like these, though, you understand.

Comments (27) + TrackBacks (0) | Category: General Scientific News

January 4, 2013

Anti-GMO. Until This Week.

Email This Entry

Posted by Derek

I wanted to take a moment to highlight this speech, given recently by environmentalist and anti-genetically modified organism activist Mark Lynas.

Let's make that former anti-GMO activist. As the speech makes clear, he's had a complete change of heart:

I want to start with some apologies. For the record, here and upfront, I apologise for having spent several years ripping up GM crops. I am also sorry that I helped to start the anti-GM movement back in the mid 1990s, and that I thereby assisted in demonising an important technological option which can be used to benefit the environment.

As an environmentalist, and someone who believes that everyone in this world has a right to a healthy and nutritious diet of their choosing, I could not have chosen a more counter-productive path. I now regret it completely.

. . .(This was) explicitly an anti-science movement. We employed a lot of imagery about scientists in their labs cackling demonically as they tinkered with the very building blocks of life. Hence the Frankenstein food tag – this absolutely was about deep-seated fears of scientific powers being used secretly for unnatural ends. What we didn’t realise at the time was that the real Frankenstein’s monster was not GM technology, but our reaction against it. . .

. . .desperately-needed agricultural innovation is being strangled by a suffocating avalanche of regulations which are not based on any rational scientific assessment of risk. The risk today is not that anyone will be harmed by GM food, but that millions will be harmed by not having enough food, because a vocal minority of people in rich countries want their meals to be what they consider natural.

As this post and this one make clear, I agree with this point of view wholeheartedly. I'm very glad to see this change of heart, and I hope that Lynas is able to get more people to thinking about this issue. He should be ready for a rough ride, though. . .

Update: well, not quite just this week. Lynas' recent book The God Species, which is referred to in the speech, marks his public break with his former views. He's also recently come to the defense of nuclear power - a view I also support - and this interview gives some of the reactions he's had so far to these turnabouts.

Comments (51) + TrackBacks (0) | Category: Current Events | General Scientific News

October 24, 2012

Chem Coach Carnival: A Few Questions

Email This Entry

Posted by Derek

Over at Just Like Cooking, See Arr Oh has been organizing a "Chem Coach Carnival". He's asking chemists (blogging and otherwise) some questions about their work, especially for the benefit of people who don't do it (or not yet), and I'm glad to throw an entry into the pile:

Describe your current job
My current job is titled "Research Fellow", but titles like this are notoriously slippery in biotech/pharma. What I really do is work in very early-stage research, pretty much the earliest stage at which a medicinal chemist can get involved. I help to think up new targets and work with the biologists to get them screened, then work to evaluate what comes out of the screening. Is it real? Is it useful? Can it be advanced? If not, what other options do we have to find chemical matter for the target?

What do you do in a standard "work day?"
My work day divides between my office and my lab. In the office, I'm digging around in the new literature for interesting things that my company might be able to use (new targets, new chemistry, new technologies). And I'm also searching for more information on the early projects that we're prosecuting now: has anyone else reported work on these, or something like them? And there are the actual compound series that we're working on - what's known about things of those types (if anything)? Have they ever been reported as hits for other targets? Any interesting reactions known for them that we could tap into? There are broad project-specific issues to research as well - let's say that we're hoping to pick up some activity or selectivity in a current series by targeting a particular region of our target protein. So, how well has that worked out for other proteins with similar binding pockets? What sorts of structures have tended to hit?

In the lab, I actually make some of the new compounds for testing on these ongoing projects. At this stage in my career (I've been in the industry since 1989), my main purpose is not cranking out compounds at the bench. But I can certainly contribute, and I've always enjoyed the physical experience of making new compounds and trying new reactions. It's a good break from the office, and the office is a good break from the lab when I have a run of discovering new ways to produce sticky maroon gunk. (Happens to everyone).

This being industry, there are also meetings. But I try to keep those down to a minimum - when my calendar shows a day full of them, I despair a bit. Most of the time, my feelings when leaving a meeting are those of Samuel Johnson on Paradise Lost: "None ever wished it longer".

Note: I've already described what happens downstream of me - here's one overview.

What kind of schooling / training / experience helped you get there?
I have a B.A. and a Ph.D., along with a post-doc. But by now, those are getting alarmingly far back in the past. What really counts these days is my industrial experience, which is now up to 23 years, at several different companies. Over that time, I don't think I've missed out on a single large therapeutic area or class of targets. And I've seen projects fail in all sorts of ways (and succeed in a few as well) - my worth largely depends on what I've learned from all of them, and applying it to the new stuff that's coming down the chute.

That can be tricky. The failings of inexperience are well known, but experience has its problems, too. There can be a tendency to assume that you really have seen everything before, and that you know how things are going to turn out. This isn't true. You can help to avoid some of the pitfalls you've tumbled into in the past, but drug research is big enough and varied enough that new ones are always out there. And things can work out, too, for reasons that are not clear and not predictable. My experience is worth a lot - it had better be - but that value has limits, and I need to be the first person to keep that in mind.

How does chemistry inform your work?
It's the absolute foundation of it. I approach biology thinking like a chemist; I approach physics thinking like a chemist. One trait that's very strong in my research personality is empiricism: I am congenitally suspicious of model systems, and I'd far rather have the data from the real experiment. And those real experiments need to be as real as possible, too. If you say enzyme assay, I'll ask for cells. If you have cell data, I'll ask about mice. Mice lead to dogs, and dogs lead to humans, and there's where we really find out if we have a drug, and not one minute before.

In general, if you say that something's not going to work, I'll ask if you've tried it. Not every experiment is feasible, or even wise, but a surprising amount of data gets left, ungathered, because someone didn't bother to check. Never talk yourself out of an easy experiment.

Finally, a unique, interesting, or funny anecdote about your career
People who know me, from my wife and kids to my labmates, will now groan and roll their eyes, because I am a walking collection of such things. Part of it's my Southern heritage; we love a good story well told. I think I'll go back to grad school for this one; I'm not sure if I've ever told it here on the blog:

When I first got to Duke, I was planning on working for Prof. Bert Fraser-Reid, who was doing chiral synthesis of natural products using carbohydrate starting materials. In most graduate departments, there's a period where the new students attend presentations by faculty members and then associate themselves with someone that they'd like to work for. During this process, I wanted to set up an interview with Fraser-Reid, so I left a note for him to that effect, with my phone number. His grad students told me, though, that he was out of town (which was not hard to believe; he traveled a great deal).

That night I was back in my ratty shared house off of Duke's East Campus, which my housemates and I were soon to find out we could not afford to actually heat for the winter (save for a coal stove in the front room). And at 9 PM, I was expecting a call from a friend of mine at Vanderbilt, a chemistry-major classmate of mine from my undergraduate school (Hendrix) who knew that I was trying to sign up with Fraser-Reid's group. So at 9 PM sharp, the phone rings, and I pick it up to hear my friend's voice, as if through a towel held over the phone, saying that he was Dr. Fraser-Reid, at Duke.

Hah! Nice try. "You fool, he's out of town!" I said gleefully. There was a pause at the other end of the line. "Ah, is this Derek Lowe? This is Dr. Fraser-Reid, at Duke." And that's when it dawned on me: this was Dr. Fraser-Reid. At Duke. One of my housemates was in the room while this was going on, and he told me that he'd thought until then that watching someone go suddenly pale was just a figure of speech. The blood drained from my brain as I stammered out something to the effect that, whoops, uh, sorry, I thought that he was someone else, arrgh, expecting another call, ho-ho, and so on. We did set up an appointment, and I actually ended up in his group, although he should have known better after that auspicious start. This particular mistake I have not repeated, I should add. Ever restless and exploring, I have moved on to other mistakes since then.

Comments (7) + TrackBacks (0) | Category: General Scientific News | Graduate School | Life in the Drug Labs

August 14, 2012

Reproducing Scientific Results - On Purpose

Email This Entry

Posted by Derek

We've spoken several times around here about the problems with reproducing work in the scientific literature. You have to expect some slippage on cutting-edge work, just because it's very complex and is being looked at for the first time. But at the same time, it's that sort of work that we're depending on to advance a field, so when it turns out to be wrong, it causes more damage than something older and more obscure that falls apart.

There's a new effort which is trying to attack the problem directly. Very directly. The Reproducibility Initiative is inviting people to have their work independently confirmed by third-party researchers. You'll be responsible for the costs, but at the end of it, you'll have a certification that your results have been verified. The validation studies themselves will be published in the new PLOS ONE Reproducibility Collection, and several leading publishers have agreed to link the original publications back to this source.

I very much hope that this catches on. The organizers have rounded up an excellent advisory committee, with representatives from academia and industry, both of whom would be well served by more accurate scientific publication. I can especially see this being used when someone is planning to commercialize some new finding - going to the venture capital folks with independent verification will surely count for a lot. Granting agencies should also pay attention, and reward people accordingly.

Here's an article by Carl Zimmer with more on the idea. I'll be keeping a close eye on this myself, and hope to highlight some of the first studies to make it through the process. With any luck, this can become the New Normal for groundbreaking scientific results.

Comments (27) + TrackBacks (0) | Category: General Scientific News | The Scientific Literature

August 7, 2012

GSK's Anti-Doping Ad

Email This Entry

Posted by Derek

Courtesy of a reader in the UK, here's an ad from GlaxoSmithKline that I don't think has been seen much on this side of the Atlantic. I hadn't realized that they were involved in the drug testing for the London games; it's interesting that their public relations folks feel that it's worth highlighting. They're almost certainly right - I think one of the major objections people have when they hear of a case of athletic doping is a violation of the spirit of fair play.

But one can certainly see the hands of the advertising people at work. The naphthyl rings for the double-O of "blood" are a nice touch, but the rest of the "chemistry" is complete nonsense. Update: it's such complete nonsense that they have the double bonds in the naphthyl banging into each other, which I hadn't even noticed at first. Is it still a "Texas Carbon" when it's from London? In fact, it's so far off that it took me a minute of looking at the image to realize that the reason things were written so oddly was that the words were supposed to be more parts of a chemical formula. It's that wrong - the chemical equivalent of one of those meaningless Oriental language tattoos.

But as in the case of the tattoos, it probably gets its message across to people who've never been exposed to any of the actual symbols and syntax. I'd be interested to know if this typography immediately says "Chemistry!" to people who don't know any. I don't have many good opportunities to test that, though - everyone around me during the day knows the lingo!

Comments (52) + TrackBacks (0) | Category: Analytical Chemistry | General Scientific News

August 6, 2012

A Brief Word From Mars

Email This Entry

Posted by Derek

I was up late last night, watching the folks at JPL celebrate the landing of the Mars Science Laboratory, Curiosity. (And needless to say, I was glad to see that the elaborate landing technology worked so well, as opposed to the back-up technique of "lithobraking", which is reliable but a bit hard on the equipment). I'm looking forward to seeing updates on Martian chemistry for the next few years.

And since we are well into the 21st century, it's only fitting and proper that we have a laser-firing, nuclear-powered robot rolling around on Mars. On to Europa, Titan, and Enceladus!

Comments (50) + TrackBacks (0) | Category: General Scientific News

June 18, 2012

More "More Scientists" Debate

Email This Entry

Posted by Derek

My recent post here on whether the US needs a big influx of scientists and engineers has attracted some attention. Discover magazine asked to reprint it on their site, and then Slate asked if I would write a response for them expanding my thoughts on the subject, which is now up here.

It feels odd for me, as a scientist, to be taking this side of the issue. I certainly think that not enough people know enough science and mathematics, and I would like for these subjects to be taught better than they are in schools. But there's something about the attitude that "America needs more scientists, even mediocre ones" that really doesn't sit right with me. Science, and scientists, aren't like coal. We can't be stored for later use, nor hauled around to do whatever job it is that Generic Scientists are needed to do. It's messier than that, as a look at some of the science and technology industries (like the one I work in) might illustrate.

Comments (38) + TrackBacks (0) | Category: General Scientific News | Who Discovers and Why

June 11, 2012

Another Critical Shortage

Email This Entry

Posted by Derek

Here's a perfectly appropriate response to the Slate piece about needing more scientists: "Dear Slate: America Needs More Artists".

America needs Thomas Kinkades and Andy Warhols, but it really needs a lot more good artists, more expressive artists, more mediocre artists, and more starving artists.

In theory, "artsiness" has never been cooler. America sanctifies Steve Jobs (the iPod designer), and envies da Vinci (the Renaissance man-cum-robotic surgeon). There are hipster sculptors, hipster poets, and hipster, well...hipsters. There's 20x200, an entire industry devoted to finding unknown artists, and letting you buy a slice. And yet, American art is in crisis: in this economy, gigs and commissions are tough to come by. Much of our great art comes from overseas (Italy, Japan, Russia) because there aren't enough artists here at home. And many of our best visual and musical minds are snatched up by mainstream media, producing viral apps (Draw Something) or 'selling out' their musical talents (American Idol).

A crisis indeed. Won't anyone take the time to help?

Comments (16) + TrackBacks (0) | Category: General Scientific News

June 6, 2012

How Not to Do Science Education

Email This Entry

Posted by Derek

Slate has one of those assume-the-conclusions articles up on science and technology education in the US. It's right there in the title: "America Needs More Scientists and Engineers".

Now, I can generally agree that America (and the world) needs more science and engineering. I'd personally like enough of both to realize room-temperature superconductors, commercially feasible fixation of carbon dioxide as an industrial feedstock, and both economically viable fusion power and high-efficiency solar beamed down from orbit. For starters. We most definitely need better technology and more scientific understanding to realize these things, since none of them (as far as we know) are at all impossible, and we sure don't have any of them yet.

But to automatically assume that we need lots more scientists and engineers to do that is a tempting, but illogical, conclusion. And one that my currently-unemployed readers who are scientists and engineers don't enjoy hearing about very much, I'd have to assume. I think that the initial fallacies are (1) lumping together all science education into a common substance, and (2) assuming that if you put more of that into the hopper, more good stuff will come out the other end. If I had to pick one line from the article that I disagree with the most, it would be this one:

America needs Thomas Edisons and Craig Venters, but it really needs a lot more good scientists, more competent scientists, even more mediocre scientists.

No. I hate to be the one to say it, but mediocre scientists are, in fact, in long supply. Access to them is not a rate-limiting step. Not all the unemployed science and technology folks out there are mediocre - not by a long shot (I've seen the CVs that come in) - but a lot of the mediocre ones are finding themselves unemployed, and they're searching an awful long time for new positions when that happens. Who, exactly, would be clamoring to hire a fresh horde of I-guess-they'll-do science graduates? Is that what we really need to put things over the top, technologically - more foot soldiers?

But I agree with the first part of the quoted statement, although different names might have come to my mind. My emphasis would be on "How do we get the smartest and most motivated people to go into science again?". Or perhaps "How do we educate future discoverers to live up to their potential?" I want to make sure that we don't miss the next John von Neumann or Claude Shannon, or that they don't decide to go off to the hedge fund business instead. I want to be able to find the great people who come out of obscurity, the Barbara McClintocks and Francis Cricks, and give them the chance to do what they're capable of. When someone seems to be born for a particular field, like R. B. Woodward for organic chemistry, I want them to have every chance to find their calling.

But even below that household-name level, there's a larger group of very intelligent, very inventive people who are mostly only known to those in their field. I have a list in my head right now for chemistry; so do you. These people we cannot have enough of, either - these are people who might be only a chance encounter or sudden thought away from a line of research that would lead to an uncontested Nobel Prize or billion-dollar industrial breakthrough.

To be fair, Slate may well get around to some of these thoughts; they're going to be writing about science education all month. But I wish that they hadn't gotten off on this particular foot. You've got to guard yourself against myths in this area. Here come a few of them:

1. Companies, in most cases, are not moving R&D operations overseas because they just can't find anyone here to do the jobs. They're doing that because it's cheaper that way (or appears to be; the jury's probably still out in many instances).

2. We are not, as far as I can see, facing the constant and well-known "critical shortage of scientists and engineers". There have been headlines with that phrase in them for decades, and I wish people would think about that before writing another one. Some fields may have shortages, but that's a different story entirely.

3. And that brings up another point, as mentioned above: while the earlier stages of science and math education are a common pathway, things then branch out, and how. Saying that there are so-many-thousand "science PhDs" is a pretty useless statistic, because by that point, they're scattered into all sorts of fields. A semiconductor firm will not be hiring me, for example.

There are more of these myths; examples are welcome in the comments. I'll no doubt return to this topic as more articles are published on it - it really is an important one. That's why it deserves more than "America needs more mediocre scientists". Sheesh.

Comments (53) + TrackBacks (0) | Category: General Scientific News | Who Discovers and Why

May 25, 2012

Worst And/Or Craziest Misconceptions

Email This Entry

Posted by Derek

You run into a lot of scientific and medical misconceptions (particularly when you have a blog with a working e-mail address plastered on the front page of it!) There are plenty of harmless ones that are easy to correct, and at the other end of the scale there are major weltanschauung problems (like the "drug companies don't want to find a cure for cancer because it would put them out of business" line). Those involve what Kingsley Amis called "permanent tendencies of the heart and mind", and I'm not sure if they can be fixed at all.

I got to thinking about this subject again after seeing this item, which is pointing out to physicians that a meaningful number of their patients may well opt out of surgery for cancer because they believe that cancer spreads when exposed to air. This turns out to be a common enough belief that it's addressed on many medical sites. It's not one that I'd heard before, and I thought I'd heard quite a few of these.

So, in the spirit of discussions like this one, I'll toss out these questions: what's the farthest-from-reality misconception about medical/pharma topics you've encountered? And what widespread one do you think does the most harm? (Warning about that link: it goes to a hugely long thread, which will soak up your time as you continue running into yet-more-ridiculous beliefs that people have expressed).

My own candidates: the weirdest one I've encountered might be the person who still believed in spontaneous generation (that old bread just sort of "turned into" living mold, etc.). And the most harmful one, from a drug research perspective, might well be the constellation of "the government does all drug research" beliefs, or the one mentioned above, the "drug companies don't want to cure X" one, which shades into the "drug companies have a cure for X but they don't want to release it" belief.

Comments (149) + TrackBacks (0) | Category: General Scientific News

May 9, 2012

PhDs On Food Stamps?

Email This Entry

Posted by Derek

A number of people have sent me this article about the number of people with Master's and PhD degrees who are receiving food stamps. And while it's undeniable that the numbers have grown, I'd ask for everyone to keep their statistical glasses on. According to the chart at the end of the piece, the percentage of doctorate holders receiving assistance went from 0.05% in 2007 to 0.15% in 2010. (For MS/MA degree holders, it went from 0.5% to 1.3% over that same time).

So it can't be said that this is a widespread phenomenon. One would also want to see the numbers broken down by age cohort, and (especially) by field of study. The examples in the article are all history and English types. Also, if those figures are correct, the headline could have just as easily read "Master's Degree Holders Ten Times More Likely To Be On Food Stamps".
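For concreteness, the chart's ratios work out like this (a quick back-of-the-envelope check; the percentages are the ones quoted above from the article's chart):

```python
# Percentages of degree holders receiving food assistance,
# as quoted from the article's chart (2007 vs. 2010).
phd_2007, phd_2010 = 0.05, 0.15   # doctorate holders, %
ms_2007, ms_2010 = 0.50, 1.30     # master's holders, %

# Doctorates: the rate tripled, but remains a small fraction.
print(round(phd_2010 / phd_2007, 1))   # 3.0

# Master's holders vs. doctorate holders in 2010:
# roughly ten times as likely, as the alternate headline would have it.
print(round(ms_2010 / phd_2010, 1))    # 8.7
```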

Honestly, the number I find most alarming in that chart is the total number of advanced degree holders. We went from 20 million in 2007 to 22 million in 2010 - two million more in only three years? The population of the country went from 301 million to 313 million during that time, so that's a pretty good crop of degree holders. Given what the economy has been like during that period, I'm surprised the food stamp figures aren't even higher.

Looking at advanced degrees as a percentage of the population, we have 4.3% in 1970, 7.2% in 1980, 8.8% in 1990, 8.6% in 2000 (a decrease I'm at a loss to explain), and 10.6% in 2009. Those figures don't quite add up with the ones in the food stamp article, but the trend certainly is in the same direction. We have figures in the growth in bachelor's degree or higher going back to 1940, and they show the relentless uptrend you'd expect.

So it shouldn't come as a surprise that well-educated people are participating more in some of the downsides that hit the rest of the population. Well-educated people are becoming more and more of the population.

Comments (20) + TrackBacks (0) | Category: Business and Markets | General Scientific News | Graduate School

April 12, 2012

A Federation of Independent Researchers?

Email This Entry

Posted by Derek

I've had an interesting e-mail from a reader who wants to be signed as "Mrs. McGreevy", and it's comprehensive enough that I'm going to reproduce it in full below.

As everyone but the editorial board of C&E News has noticed, jobs in chemistry are few and far between right now. I found your post on virtual biotechs inspiring, but it doesn't look like anyone has found a good solution for how to support these small firefly businesses until they find their wings, so to speak. Lots of editorials, lots of meetings, lots of rueful headshaking, no real road map forward for unemployed scientists.

I haven't seen this proposed anywhere else, so I'm asking you and your readership if this idea would fly:

What about a voluntary association of independent research scientists?

I'm thinking about charging a small membership fee (for non-profit administration and hard costs) and using group buying power for the practical real-world support a virtual biotech would need:

1. Group rates on health and life insurance.

How many would-be entrepreneurs are stuck in a job they hate because of the health care plan, or even worse, are unemployed or underemployed and uninsurable, quietly draining their savings accounts and praying no one gets really sick? I have no idea how this would work across state lines, or if it is even possible, but would it hurt to find out? Is anyone else looking?

2. Group rates on access to journals and library services.

This is something I do know a bit about. My M.S. is in library science, and I worked in the Chemistry Library in a large research institution for years during grad school. What if there were one centralized virtual library to which unaffiliated researchers across the country could log in for ejournal access? What if one place could buy and house the print media that start-ups would need to access every so often, and provide a librarian to look things up-- it's not like everyone needs their own print copy of the Canada & US Drug Development Industry & Outsourcing Guide 2012 at $150 a pop. (But if 350 people paid $1 a year for a $350/yr online subscription . . . )

Yes, some of you could go to university libraries and look these things up and print off articles to read at home, but some of you can't. You're probably violating some sort of terms of service agreement the library and publisher worked out anyway. It's not like anyone is likely to bust you unless you print out stacks and stacks of papers, but still. It's one more hassle for a small company to deal with, and everyone will have to re-invent the wheel and waste time and energy negotiating access on their own.

3. How about an online community for support and networking-- places for blogs, reviews, questions, answers, exchanges of best practices, or even just encouragement for that gut-wrenching feeling of going out on your own as a new entrepreneur?

4. What sort of support for grantwriting is out there? Is there a hole that needs to be filled?

5. How about a place to advertise your consulting services or CRO, or even bid for a contract? Virtual RFP posting?

6. Would group buying power help negotiate rates with CROs? How about rates for HTS libraries, for those of you who haven't given up on it completely?

Is there a need for this sort of thing? Would anyone use it if it were available? How much would an unaffiliated researcher be willing to pay for the services? Does anyone out there have an idea of what sort of costs are involved, and what sort of critical mass it would take to achieve the group buying power needed to make this possible?

I'd be happy to spark a discussion on what a virtual biotech company needs besides a spare bedroom and a broadband connection, even if the consensus opinion is that the OP is an ill-informed twit with an idea that will never fly. What do you need to get a virtual biotech started? How do we make it happen? There are thousands of unemployed lab scientists, and I refuse to believe that the only guy making a living these days from a small independently-funded lab is Bryan Cranston.

A very worthy topic indeed, and one whose time looks to have come. Thoughts on how to make such a thing happen?

Comments (59) + TrackBacks (0) | Category: Business and Markets | Drug Development | General Scientific News | The Scientific Literature

March 19, 2012

Running Out of Helium?

Email This Entry

Posted by Derek

Update: fixed link in first paragraph - sorry!

According to this article, and some others like it over the last few years, we are. Sale of the US government's strategic helium reserve lowered prices, which led to increased use, which now appears to be leading to shortages. (If you want to see a distorted market, look no further).

I'm no expert in this field, but my guess is that we're mainly running out of cheap helium. Continued oil and natural gas exploration should reveal more of it, but just as those petrochemicals won't come cheap, neither will the helium. And come to think of it, I'm not sure how much helium is to be found in shale gas and the like, as opposed to traditional formations. Some quick Googling suggests that shale is too porous to contain much of the helium, which I can well believe.

So prepare to pay even more to keep those NMR magnets running - it's hard to imagine that it'll ever get cheaper than it's been. Peak Oil, I'm not so sure about, but Peak Helium may have already been realized. . .

Comments (26) + TrackBacks (0) | Category: General Scientific News

February 13, 2012

Nobel Prizes in Chemistry For People Who Aren't Chemists

Email This Entry

Posted by Derek

Nobelist Roald Hoffmann has directly taken on a topic that many chemists find painful: why aren't more chemistry Nobel prizes given to, well. . .chemists?

". . .the last decade has been especially unkind to "pure" chemists, as only four of ten Nobel awards could be classified as rewarding work comfortably ensconced in chemistry departments around the world. And five of the last ten awards have had a definite biological tinge to them.

I know that I speak from a privileged position, but I would urge my fellow chemists not to be upset."

He goes on to argue that the Nobel committee is actually pursuing a larger definition of chemistry than many chemists are, and that we should take it and run with it. Hoffmann says that the split between chemistry and biochemistry, back earlier in the 20th century, was a mistake. (And I think he's saying that if we don't watch out, we're going to make the same mistake again, all in the name of keeping the discipline pure).

We're going to run into the same problem over and over again. What if someone discovers some sort of modified graphene that's useful for mimicking photosynthesis, and possibly turning ambient carbon dioxide into a useful chemical feedstock? What if nanotechnology really does start to get off the ground, or another breakthrough is made towards room-temperature superconductors, this time containing organic molecules? What would a leap forward in battery technology be, if not chemistry? Or schemes to modify secreted proteins or antibodies to make them do useful things no one has ever seen? Are we going to tell everyone "No, no. Those are wonderful, those are great discoveries. But they're not chemistry. Chemistry is this stuff over here, that we complain about not getting prizes for"?

Comments (16) + TrackBacks (0) | Category: General Scientific News | Press Coverage | Who Discovers and Why

February 6, 2012

Let's Start Off the Meeting With An Ad, OK?

Email This Entry

Posted by Derek

I'm sitting in the main conference hall at the SLAS meeting as things get going. And I have to say, it's a big crowd, and there are some very interesting things on the agenda. But I've just seen something I've never seen before: an ad being played on the screens for the entire attending audience, just before the keynote address. Thermo Scientific has clearly put a lot of money into this meeting (as well they should), but sitting through an NFL-voice-over style ad during the kickoff to a scientific meeting ("The power to win!") is a real first.

Comments (11) + TrackBacks (0) | Category: General Scientific News

January 23, 2012

Strangest Presentation You've Seen?

Email This Entry

Posted by Derek

Friday's mention of the Brindley lecture prompts me to throw this question out: what's the most weirdly memorable scientific presentation you've ever seen?

I'll put one out there that still sticks in my mind. Back in 1998, I was attending the Gordon Conference on Heterocycles. One of the speakers was a young faculty member from Montana, who was supposed to be speaking on metal-catalyzed reactions of indoles. Instead, he came in with a completely different slide deck on origins-of-life chemistry, which made it clear, rather quickly, that he not only did not buy into the "RNA world" hypothesis, but considered it (and much other origins-of-life work) to be the next thing to a conspiracy.

The audience took this in with some visible discomfort, as the talk itself became more passionate and agitated. The whole topic was something that clearly upset and offended the speaker, but I can't say that he made many converts. There were a couple of questions from the floor at the end, but I think that many people were just hoping to get this one over with and move on. The speaker himself moved on shortly to a small Adventist school, in a department that says that it hopes to provide a "scriptural perspective" on scientific issues, but he doesn't seem to be listed on the faculty there now, and I've been unable to trace him after that. . .

Comments (29) + TrackBacks (0) | Category: General Scientific News

January 20, 2012

Worst Lecture of All, Or Greatest?

Email This Entry

Posted by Derek

Depends on your perspective! Since it's Friday, I present this memoir of the infamous Brindley lecture from 1983. G.S. Brindley appears to have been a pioneer in urology, and in fact discovered the first useful therapies for erectile dysfunction.

But the way he chose to announce these discoveries to the world was. . .well, read the article. Let's just say that he was intent on leaving no doubts, and that no doubts were left.

Comments (30) + TrackBacks (0) | Category: General Scientific News

November 4, 2011

What's the Hardest Thing?

Email This Entry

Posted by Derek

Here's a quick question for the crowd: when you're talking with people outside your field, or outside of science completely, what's the hardest thing in your area to explain? I end up doing a lot of explaining myself, and I find that a lot of key drug discovery concepts can be communicated pretty quickly.

But not all of them, and perhaps not all at the same time. I can talk about PK and absorption, metabolism, etc., and I can talk about molecular properties and selectivity, and toxicology. Keeping all of those in mind at the same time, though, seems to be difficult if you're not used to doing it, thus the trouble with explaining Paracelsus' remark that "the dose makes the poison". Selenium is a good place to experience that: try getting across to someone that there's an essential nutritional element that's also poisonous. It's like trying to say that cyanide is a vitamin - but then again, if carbon monoxide is a neurotransmitter, maybe it is.

I think that the other broad issue that's hard to communicate is the amount that we don't know. That specifically comes up in discussions of toxicology - people want to know if this drug, this compound, is toxic or not. And if it is, how do we know that the next one isn't like that? Those are all questions that do a sort of reverse origami trick: they start off in a neat, comprehensible shape, but unfold to heaps of crumpled paper as soon as you really start pulling on them. How can we know so much, yet know so little? Why have we been studying some of these systems for decades and still not understand them? That probably gets back to the repeated point that living biological systems are simply more complex - much more complex - than anything that anyone has ever dealt with in everyday life. And what's more, they're complex in different ways than we're ever used to dealing with; it's not just differences in degree (although those certainly apply) but differences in kind.

But don't confine yourself to the big meta-issues. There are plenty of smaller concepts and ideas that don't lend themselves to fast explanations. A meaningful one-paragraph (or one-sentence!) explanation of NMR imaging for someone with no background, for example, is no small undertaking. (My attempt: "We're all full of water molecules, in all sorts of environments in the body. And they behave differently when you put them in a strong magnetic field, which lets us pick up different signals from them and turn them into images.") How about hydrogen bonding? Or chirality? What are your sticking points when you try to explain what you do?

Comments (66) + TrackBacks (0) | Category: General Scientific News

October 31, 2011

Very Likely Not Real, But Still. . .(The E-Cat)

Email This Entry

Posted by Derek

I occasionally cover odd attempts at alternative - very alternative - energy sources here, because there's a chemistry angle to many of them. The various cold fusion claims have always gotten a slightly less frosty reception among professional chemists than among professional physicists, on average. And yes, there are two good explanations of that, which are not mutually exclusive: (1) that the chemists are willing to be a bit more open-minded since (among other things) they have less invested in the state of physics as it is, and (2) that the chemists are willing to be more open-minded because they know less about physics.

So far, the track record on these things has been pretty close to 100% hardtack disappointment, dry as dust and crunchy as hell. But as Tyler Cowen put it over at Marginal Revolution, the expected value of such things is so high that a small amount of attention is worthwhile. The latest headline-grabber is a mysterious thingie from Italy called the E-Cat, which I mentioned briefly here back in July.

The inventors apparently concluded a larger-scale demonstration over the weekend, as reported here, at the request of an unnamed client from the US. The problem, as that article shows, is that we really don't have a lot more to go on: this "client" could plausibly be DARPA, or that could (also plausibly) just be what the device's backers would like for everyone to think, the better to fleece the unwary in the next round.

So for now, I'm just noting this with cautious interest. I certainly hope that the people behind this are operating in good faith, in which case I will in good faith wish them well. But we'll see what happens next, if anything. For now, the "snake oil" tag stays on.

Comments (23) + TrackBacks (0) | Category: General Scientific News | Snake Oil

October 27, 2011

Liquid Handling

Email This Entry

Posted by Derek

While mentioning lab equipment, I thought I'd also note that I've been contacted by a fellow who's trying to interest people in a newly invented high-throughput low-footprint liquid handler device that he's prototyped. Haven't seen it in action, don't know him personally, but he's out in the Bay area for the next few days, and can be reached at carlcrott-at-gmail-dot-com, if you're in the market for that sort of thing. I figure in this business environment, people can use a break. . .

Comments (9) + TrackBacks (0) | Category: General Scientific News

October 17, 2011

The Singularity, Postponed

Email This Entry

Posted by Derek

I've had some problems over the years with the Singularity-Is-Near line of thought, and some problems with the "If we can build a new generation of microchips in five years, we ought to be able to cure cancer in ten" idea. Here's an article by Paul Allen in Technology Review that takes aim at both of these simultaneously:

The complexity of the brain is simply awesome. Every structure has been precisely shaped by millions of years of evolution to do a particular thing, whatever it might be. It is not like a computer, with billions of identical transistors in regular memory arrays that are controlled by a CPU with a few different elements. In the brain every individual structure and neural circuit has been individually refined by evolution and environmental factors. The closer we look at the brain, the greater the degree of neural variation we find. Understanding the neural structure of the human brain is getting harder as we learn more. Put another way, the more we learn, the more we realize there is to know, and the more we have to go back and revise our earlier understandings. We believe that one day this steady increase in complexity will end—the brain is, after all, a finite set of neurons and operates according to physical principles. But for the foreseeable future, it is the complexity brake and arrival of powerful new theories, rather than the Law of Accelerating Returns, that will govern the pace of scientific progress required to achieve the singularity.

Very true. Imagine a fiendishly complex chip diagram, but with not a single component of it standardized. It's one bespoke piece of hardware after another, billions of them, and the wiring between them was put together the same idiosyncratic way. And it's altering while you study it - in fact, it may be altering because you're studying it. Glorious stuff, and understanding it is going to give us extraordinary powers. But that's not happening soon, or on anyone's schedule.

Comments (19) + TrackBacks (0) | Category: General Scientific News | The Central Nervous System

October 4, 2011

Podcast Interview on Drug Discovery

Email This Entry

Posted by Derek

Here's an interview that I did recently with Paul Howard of the Manhattan Institute on the results of that immunotherapy leukemia trial (and on some broader topics around the current state of drug discovery). Anyone who would like to pitch a blockbuster syndicated radio show with me on structure-activity relationships and preclinical drug development, have your people call my people.

Comments (2) + TrackBacks (0) | Category: General Scientific News

Yep, That's A Nobel Prize, Right There

Email This Entry

Posted by Derek

Today's announcement of the Physics Nobel came as no surprise. I remember when those results came out in 1998 (that the universe's expansion was accelerating rather than slowing down to some degree), and immediately thinking "Nobel if it holds up". I thought the same thing about RNA interference when I first heard about it, and there are many other discoveries in the same category. Not all of them have been given Nobels, but they're the discoveries whose Nobel-worthiness is obvious the moment you hear about them.

Now, here's my question for today: how many of these have we had in chemistry? And how many have we had recently? It seems to me that (out of the indisputable chemistry-and-not-biology prizes), there aren't as many as you might find in other fields. Perhaps chemistry is a mature enough science that fundamental surprises and breakthroughs are not as common, and when they occur, they come on more slowly. Thoughts?

Comments (24) + TrackBacks (0) | Category: General Scientific News

October 3, 2011

Chemistry Nobel Time

Email This Entry

Posted by Derek

The first week in October is upon us again, and this Wednesday is the Nobel Prize in Chemistry. So what can we say about who should get it (and who actually will)?

Your first place to turn should be Paul Bracher's ChemBark post on the topic. He has a comprehensive list of candidates, but the only ones with better odds than the field bet are:

Spectroscopy & Application of Lasers (Zare/Moerner/+), 6-1
Nuclear Hormone Signaling (Chambon/Evans/Jensen), 7-1
Bioinorganic Chemistry (Gray/Lippard/Holm/–), 8-1
Techniques in DNA Synthesis, (Caruthers/Hood/+), 10-1

That's a very reasonable list, and I think that Zare/Moerner (and possibly et al.) are definitely going to win at some point. There's a strong case to be made for each of the others, too. Meanwhile, Thomson Reuters has narrowed things down quite a bit. Their three contenders are:

Electrochemistry (Bard et al.)
Molecular Dynamics (Karplus et al.)
Dendritic Polymers (Fréchet, Tomalia, Vögtle)

Those first two have been contenders for many years (which tends to move them down on the ChemBark list, and up on the Thomson one), but no one could complain about either of them. I really, really doubt that dendritic chemistry's going to win this year, though, and I'd like to know what put that topic so far up the list. Maybe some day, but not yet, in my opinion. My guess is that the sheer number of publications in the field has skewed things a bit (after all, keeping track of such things is Thomson Reuters' business). Wavefunction's list is a good one to check out, and seems much more in tune with reality.

There are some categories of research that I would like to see win at some point, although narrowing down the names won't be easy. I think that directed-evolution methods are a great area with a lot of potential, for one, and the whole activity-based protein profiling/in vivo cell labeling stuff is another. Eventually I expect some nanotech/molecular machine discovery to win, but only once it gets far enough along to connect with the real world. And the work on photochemical energy technology (carbon dioxide fixation, hydrogen generation, and so on) will also be a strong candidate when something looks world-changing enough. Other discoveries in the surely-Nobel-worthy category are GPCR structure (and that sort of thing has usually been moved over into the Chemistry prize) and DNA-based diagnostic methods (hard to narrow that one down, though).

This year? I think the committee will go back and pick up one of those prizes for long-time contenders; I don't expect any massive surprises. And I certainly don't expect anything in organic chemistry this time. But we'll see on Wednesday. What's that? You want me to really pick something? Fine. . .I'll guess the nuclear receptor people, or the single-molecule spectroscopy people, in basically a dead heat. I think that the ChemBark odds are correct.

Comments (32) + TrackBacks (0) | Category: General Scientific News

August 18, 2011

The NIH Wonders About the Future of Biomedical Workers

Email This Entry

Posted by Derek

A reader passes along this request for comment by the NIH. The "Advisory Committee to the NIH Director Working Group on the Future Biomedical Research Workforce" is asking for thoughts on issues such as the length of time it takes to get a PhD, the balance between non-US and US workers, length of post-doctoral training, the prospects for employment after such is completed, general issues relating to whether people choose biomedical research as a career at all, and so on.

These are, of course, issues that have come up here repeatedly (as well they should), so if you want to have a shot at influencing some NIH thinking on them, they're asking for anyone's thoughts by October 7. (Use this form).

Comments (9) + TrackBacks (0) | Category: Business and Markets | General Scientific News | Graduate School

August 8, 2011

Gilenya's Price

Email This Entry

Posted by Derek

A couple of weeks ago, we had this discussion about the cost-effectiveness of drugs for multiple sclerosis. It was pointed out that Novartis's new Gilenya (fingolimod) is priced even higher than the drugs in the study that found that MS drugs are among the priciest in the world for their medical benefit.

Now the United Kingdom's NICE has said that Gilenya has not (so far) shown enough efficacy to justify its price. There's going to be a lot of emotionally engaged comment on both sides of this issue, but people should have been able to see this coming. And by "people", yes, I also mean Novartis.

Comments (10) + TrackBacks (0) | Category: General Scientific News | The Central Nervous System

June 1, 2011

Return of the Arsenic Bacterium

Email This Entry

Posted by Derek

You'll all remember the big news about the arsenic-using bacteria - that Science paper from last December. What you may not realize is that the paper is only now coming out in print. The delay seems to have been to allow time for an extraordinary number of responses to be published at the same time. I'll summarize those, and the counterarguments made by the original authors.

Rosie Redfield of UBC, whose blog was one of the earliest criticisms of the paper, objects that the culture media used were not pure. She maintains that there was enough phosphate in the growth medium to account for all the cell growth seen, without having to invoke arsenic-containing DNA. She also has a problem with the way that the DNA fractions in the original paper were (not) purified, pointing out that the procedures used could easily drag along many contaminants.

In response, Wolfe-Simon et al. don't find the trace-phosphorus objection compelling, they say, because the arsenic-stimulated organisms were grown under the same P background as the controls at that point, and the arsenic group grew much better. As for the DNA purification, they go over their procedures, state that they didn't see evidence of particulate contamination, and point out that negatively-charged arsenate is unlikely to stick to DNA unless it's covalently bound.

A team from CNRS and JPL makes the point (as others did at the time of first publication) that arsenic's own redox chemistry makes the original assertion hard to believe. Under all known physiological conditions, arsenate should be less stable than arsenite, and arsenite can't be a plausible substitute for phosphate (even if you buy that arsenate can). They also believe that the bacteria are running on residual phosphorus: "GFAJ-1 appears to do all it can to harvest P atoms from the medium while drowning in As. . ."

Wolfe-Simon et al. reply by saying that they specifically looked for reduced arsenic species in the cells, without success, and suggest that something must be stabilizing arsenate that no one has yet seen or considered.

Another team response, from Hungary and Johns Hopkins, objects to the way that the P:As ratios were calculated in the paper. The error for the dry-weight arsenic percentage in the bacteria is larger than the value itself, so you can't really be sure that there was any arsenic at all. The mass spec data used in the paper, they say, also have such high fluctuations as to make the numbers unable to support the paper's claims.

In response, Wolfe-Simon et al. say that they don't find the arsenic numbers to be all that variable, considering the conditions. And the phosphorus numbers don't vary much at all, by comparison, and the arsenic numbers are always higher.

Stefan Oehler, from Greece, asks why density gradient centrifugation of the supposed arsenic-containing DNA wasn't done (as did other observers when the paper came out). As-DNA should be heavier. Comparing hydrolysis rates of the As-DNA with the normal phosphate form "could also have been easily done", and he says that without these data, the paper is unconvincing. One major suggestion he has is to see how and where the bacteria incorporate radioactive arsenic.

David Borhani (ex-Abbott) has objections that are similar to some of the others. He's not convinced that the "-P" media really don't have enough phosphorus left in them to explain the results, and says that the agarose gels shown are hard to square with the paper's claims. (The phosphorus-containing DNA looks more degraded than the putative arsenic-containing sample, for example, and the DNA being compared is of different sizes to start with). He has the same problems with the error bars as mentioned above.

Steven Benner (who, interestingly, appeared at the original press conference back in December, albeit not as a cheerleader), comes at the problem from a chemical angle. The rate constants for arsenate hydrolysis give you an expected half-life for such esters inside a cell of seconds to minutes (at best), which doesn't seem feasible for use in biomolecules. He goes over several possibilities for ways to make such linkages more stable - or for judging the literature on arsenate stability to be wrong - and can't make any of them work. Another big problem is that the phosphates in DNA have to survive as such for numerous steps in the cell, and it's hard to see how arsenate could substitute across such a wide range of biochemistry. He'd also like to see the As-DNA subjected to hydrolysis and to enzymes such as DNA kinase or exonuclease, to see how it behaves. "Above all", he says, do the radioactive arsenic experiment.
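The scale of Benner's objection follows from ordinary first-order kinetics: the half-life of an ester is ln 2 divided by its hydrolysis rate constant. A minimal sketch (the rate constant plugged in here is an illustrative assumption of mine, not a value from the exchange):

```python
import math

def half_life(k_per_second):
    """First-order half-life: t_1/2 = ln(2) / k."""
    return math.log(2) / k_per_second

# An assumed hydrolysis rate constant of 0.01 per second (illustrative only)
# gives a half-life of about a minute - the seconds-to-minutes range Benner
# argues is too short for arsenate esters to serve in biomolecules.
print(round(half_life(0.01)))  # 69 (seconds)
```

A phosphate diester in DNA, by contrast, has a hydrolysis half-life measured in many thousands of years, which is the gap Benner is pointing at.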

In response, Wolfe-Simon et al. say that there's very little data on the stability of arsenate esters of anything but very small molecules - steric hindrance, among other things, would be expected to make the bioesters more stable. They refer to a paper showing that arsenate esters of glucose were much more stable than expected, for example.

Patricia Foster of Indiana suggests that the process of raising the GFAJ-1 bacteria selected for mutants that have lost their phosphate inorganic transporter (Pit) system, but have pumped up their phosphate-specific transport (Pst) system. It's been shown in E. coli, she points out, that arsenate poisons the former transporter, but actually stimulates the latter, which would account for the apparent stimulatory effect of arsenic on GFAJ-1.

Wolfe-Simon et al. respond by saying that if the Pst pathway were stimulated, they'd expect to see evidence of arsenate detoxification pathways (thioarsenate, methylation, reduction), and they don't. (That seems weird to me - surely the organism, no matter what, is seeing a lot more arsenate than it can use, and would have to do some of these things?)

Finally, James Cotner and Edward Hall of Minnesota and Vienna, respectively, note that their own work was cited in the original paper on the phosphorus content of bacteria. They object, though, saying that their phosphorus-rich experiments make a poor comparison with the GFAJ-1 case. In fact, they say, they've now published a survey of the elemental content of freshwater bacteria, and that these can actually be highly depleted in phosphorus. The phosphorus content measured in GFAJ-1 does not, in fact, fall outside of the range seen in organisms grown under naturally P-limiting conditions.

Wolfe-Simon et al. reply that Cotner and Hall's numbers are taken from individual bacteria at the low end of the range, not whole populations, making them a poor comparison. Their whole-population values, they say, are actually similar to their own phosphorus control cultures, and are both higher than the arsenate-grown bacteria.

So, in the end, the authors are sticking to their original arsenic hypothesis. They agree that analyzing DNA after separating it from the gels would be a useful experiment (as Redfield and others propose), and they also say that they did not mean to suggest that the GFAJ-1 bacteria have a "wholesale" substitution of arsenate for phosphate, just that they do have some. And they're making GFAJ-1 available to people who want to take a crack at their own experiments.

This is a remarkable exchange, mostly for its sheer concentration in time and in publication. It is exactly how science is done, although it usually happens a bit more slowly and in a more disorganized fashion than what we're seeing here. These extraordinary claims have brought an extraordinary response.

I think that things have gone as far as they can with the data from the original paper, and it's fair to say that that's not far enough to convince a lot of people. Next step: more data, and more experiments. One way or another, this will get detangled.

Comments (35) + TrackBacks (0) | Category: General Scientific News

May 16, 2011

A Google Oddity

Email This Entry

Posted by Derek

A comment to the last post mentioned that if you search the word "biotechnology" in Google's Ngram search engine, something odd happens. There's the expected rise in the 1970s and 80s, but there's also a bump in the early 1900s, for no apparent reason. Curious about this, I ran several other high-tech phrases through and found the exact same effect.

Here's a good example, with some modern physics phrases. And you get the same thing if you search "nanotechnology", "ribosome", "atomic force microscope", "RNA interference", "laser", "gene transfer", "mass spectrometer" or "nuclear magnetic resonance". There's always a jump back in exactly the same period in the early 1900s.

So what's going on? I can understand some OCR errors, but why do these things show up in this specific Edwardian-age window? Can anyone at Google shed any light on this?

Comments (28) + TrackBacks (0) | Category: General Scientific News | The Scientific Literature

Ups and Downs

Email This Entry

Posted by Derek

I was thinking the other day that I never remembered hearing the phrase "Big Pharma" when I first got a job in this business (1989). Now I have some empirical proof, thanks to the Google Labs Ngram Viewer, that the phrase has only come into prominence more recently. (Fair warning: you can waste substantial amounts of time messing with this site). Here's the incidence rate of "big pharma" in English-language books from 1988 to 2000.

It comes from nowhere, blips to life in 1992, doesn't even really get off the baseline until 1994 or so, and then takes off. (The drops in 2005 and 2008 remain unexplained - did the log phase of its growth end in 2004?)

Update: that graph holds for the uncapitalized version of the phrase. If you put the words in caps, you get the even more dramatic takeoff shown below:

To be fair, though, there seems to have been a general rise in Big Pharma-related literature during that period. Try out this graph, comparing mentions of Merck, Pfizer, and Novartis since 1970. The last-named, of course, didn't even exist until the mid-1990s, but they (like the others) have spent the time since then zipping right up, with no apparent end in sight. (Merck, especially - what's with those guys?) And what accounts for this? Business books? Investing guides? Speculation is welcome.

Note: the above paragraph was written before realizing that the Google Ngram search is case-sensitive - so, as was pointed out in the comments, I was picking up on people not caring about capitalization more than anything else. Below is the correct graph, with initial capitals in the search, and it makes more sense. Merck still is the king of book mentions, though, for all the coverage that Pfizer gets.
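
The case-sensitivity point is easy to demonstrate outside of the Ngram Viewer itself. Here's a toy Python sketch - the corpus and counts are invented for illustration, not Google's data:

```python
# Toy illustration of why case-sensitive phrase counts diverge:
# "Big Pharma" and "big pharma" are different n-grams to a
# case-sensitive search, and their totals differ accordingly.
corpus = [
    "Big Pharma spent heavily on lobbying.",
    "Critics of big pharma were unimpressed.",
    "The Big Pharma business model came under scrutiny.",
]

def count_phrase(lines, phrase, case_sensitive=True):
    """Count occurrences of a phrase, optionally folding case."""
    total = 0
    for line in lines:
        haystack = line if case_sensitive else line.lower()
        needle = phrase if case_sensitive else phrase.lower()
        total += haystack.count(needle)
    return total

print(count_phrase(corpus, "Big Pharma"))         # capitalized hits only: 2
print(count_phrase(corpus, "big pharma"))         # lowercase hits only: 1
print(count_phrase(corpus, "big pharma", False))  # case folded, all hits: 3
```

Search both capitalizations separately (as the corrected graph does) and you see which one is actually driving the trend.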

I'll finish off with this one, using a longer time scale. Yes, folks, for better or worse, it appears that the phrase "organic chemistry" peaked out between book covers around 1950, and has been declining ever since. Meanwhile, "total synthesis" started rising during the World War II era (penicillin?), and kept on moving up until a peak around 1980. Interestingly, things turned around in 2000 or so, and especially since 2003. And this can't be ascribed to some sort of general surge in chemistry publications - look at the "organic chemistry" line during the same period. Is there some other field that's adopted the phrase?

Comments (20) + TrackBacks (0) | Category: Drug Industry History | General Scientific News | The Scientific Literature

May 11, 2011

Writing About Science, and Liking It

Email This Entry

Posted by Derek

Via John Hawks, here's an interesting interview with writer John McPhee, known to many for his long-form explorations (and explanations) of geology.

When he started doing that in the New Yorker, though, editor William Shawn told him to go ahead, although he warned him that "readers will rebel". And rebel they did - McPhee says that he got extremely polarized feedback from those pieces: readers either loved them or loathed them. His explanation?

Two cultures. There are some people whose cast of mind admits that sort of stuff, and there are others who are just paralyzed by it at the outset, no matter how crafty the writing might be. A really nice thing that happens is when people say, I never thought I’d be interested in that subject until I read your piece. These letters come about geology too, but there are some people who just aren’t going to read it at all. Some lawyer in Boston sent me a letter—this man, this adult, had gone to the trouble to write in great big letters: stop writing about geology. And it’s on the letterhead of a law firm in Boston. I did not write back and say, One thing this country could very much use is one less lawyer. Why don’t you stop doing law?

Good point! But I know what he's talking about. I remember William Rusher, who used to publish National Review, writing about how he had to tell a colleague that "there is no concept so simple that I cannot fail to understand it when presented as a graph". That made me feel the two cultures divide, for sure. But it's perhaps not as stark as the classic C. P. Snow formulation: there are plenty of scientists who appreciate literature and the arts, and (as McPhee notes) there are plenty of people from the humanities side who find that they enjoy scientific topics once they're exposed to them.

My two cultures, then, are the people who can appreciate both science and the arts, versus the people who can appreciate only one of the two. (I'm leaving aside people who can't appreciate either one). So there are one-mode-only folks like the lawyer quoted above (or William Rusher), and the corresponding scientists and engineers who might never pick up a book or appreciate a painting. And it may just be my own prejudices speaking, but I think that there are more one-mode-onlies who fit that first description, and that actually does take us back to a famous quote from C. P. Snow:

A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is the scientific equivalent of: Have you read a work of Shakespeare's?

Right he was, and is. If you (scientist or literary type) are up for an in-depth discussion of the Second Law, with reference to Shakespeare and much else, try this essay by Frank Lambert, who's put a lot of thought into the subject.

Comments (41) + TrackBacks (0) | Category: General Scientific News

December 17, 2010

Politics in the Lab

Email This Entry

Posted by Derek

Man, have I been avoiding this topic. But I think it's time. Slate recently published this piece on the political affiliations of scientists, with the provocative sub-head: "Most scientists in this country are Democrats. That's a problem."

Is it? Is that even true? The piece is based on this survey by the Pew Foundation, which was conducted in 2009 by surveying over 2,000 members of the AAAS. Now, I'm trying to come up with the figures, but my strong impression is that the organization skews pretty strongly academic, which might account for some of the numbers. (Science, for example, runs articles like this one, explaining the mysterious world of industry to job-seekers.)

So I'm not sure if the Pew numbers are accurate. Still. . .for what it's worth, they come out like this: for self-described party affiliation, Dem/Rep/Ind, the general public was 35/23/34, and the sample of scientists was 55/6/32. And in more philosophical terms, as self-identified liberal/moderate/conservative, the general public was 20/38/37, and the scientists were 52/35/9. Those are some pretty stark differences - correcting for sample bias would probably even things out some, but I can't imagine it would be enough to take care of a gap that large.

The Slate piece says that this is indeed a problem:

During the Bush administration, Democrats discovered that they could score political points by accusing Bush of being anti-science. In the process, they seem to have convinced themselves that they are the keepers of the Enlightenment spirit, and that those who disagree with them on issues like climate change are fundamentally irrational. Meanwhile, many Republicans have come to believe that mainstream science is corrupted by ideology and amounts to no more than politics by another name. Attracted to fringe scientists like the small and vocal group of climate skeptics, Republicans appear to be alienated from a mainstream scientific community that by and large doesn't share their political beliefs. The climate debacle is only the most conspicuous example of these debilitating tendencies, which play out in issues as diverse as nuclear waste disposal, protection of endangered species, and regulation of pharmaceuticals.

I think that's a reasonable summary, especially if you're the sort of person who thinks about politics all the time. But that's a key consideration: not everyone does. It's hard to remember this if you're interested in politics yourself, and if you spend a lot of time following current events and world affairs. Politics, and political ideology, is just one template people use to view the world. Everything can be fit into it one way or another, and it's fun to keep score. I imagine the point-totaling sounding like a pre-digital pinball machine: chunk-chunk-chunk-ding! This side scores, that side scores.

But how much of this overlaps with what goes on in the labs? The examples in the quoted paragraph certainly do, but there are many less politically contentious issues that are scientifically important. It's hard to fit disagreements over dark matter or RNA's role in early life forms into a left/right framework, much less intramural spats like the structure of the norbornyl cation, the usefulness of total synthesis, or how much palladium you really need to do a metal-catalyzed coupling.

In the end, I think the "it's a problem" conclusion of the Slate article should be amended to "it's a problem if you measure everything by politics". That's a temptation that should be avoided, as far as I'm concerned. Orwell was right to consider a world where everything was subordinate to political concerns as a nightmare. And while I have strong political opinions myself, and follow the whole business much more than I follow any traditional sport, I still would like to have some areas free of it.

Comments (63) + TrackBacks (0) | Category: General Scientific News

December 8, 2010

NASA's Arsenic Bacteria: A Call For Follow-Up Experiments

Email This Entry

Posted by Derek

Since the posts here on the possible arsenic-using bacteria have generated so many comments, I'd like to try to bring things together. If you think that the NASA results need shoring up - and a lot of people do, including me - please leave a comment here about what data or new experiments you'd want to see. I'll assemble these into a new post and try to get some attention for it.

The expertise among the readership here is largely in chemistry, so it would make sense to have suggestions from that angle - I assume that microbiologists are putting together their own lists elsewhere! I know that several readers have already put forward some ideas in the comment threads from the earlier posts - I'll go back and harvest those, but feel free to revise and extend your remarks for this one.

So, the questions on the table are: do you find the Science paper convincing? And if not, what would it take to make it so?

Comments (38) + TrackBacks (0) | Category: Analytical Chemistry | General Scientific News

December 7, 2010

Arsenic Bacteria: Does The Evidence Hold Up?

Email This Entry

Posted by Derek

It's time to revisit the arsenic-using bacteria paper. I wrote about it on the day it came out, mainly to try to correct a lot of the poorly done reporting in the general press. These bacteria weren't another form of life, they weren't from another planet, they weren't (as found) living on arsenic (and they weren't "eating" it), and so on.

Now it's time to dig into the technical details, because it looks like the arguing over this work is coming down to analytical chemistry. Not everyone is buying the conclusion that these bacteria have incorporated arsenate into their biomolecules, with the most focused objections being found here, from Rosie Redfield at UBC.

So, what's the problem? Let's look at the actual claims of the paper and see how strong the evidence is for each of them:

Claim 1: the bacteria (GFAJ-1) grow on an arsenate-containing medium with no added phosphate. The authors say that after several transfers into higher-arsenic media, they're maintaining the bacteria in the presence of 40 mM arsenate, 10 mM glucose, and no added phosphate. But that last phrase is not quite correct, since they also say that there's about 3 micromolar phosphate present from impurities in the other salts.
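
To put that impurity level in perspective, here's a back-of-the-envelope calculation (plain Python, concentrations as quoted above) of just how lopsided the medium is:

```python
# Molar excess of arsenate over contaminating phosphate
# in the growth medium, using the paper's stated values.
arsenate_M  = 40e-3   # 40 mM added arsenate
phosphate_M = 3e-6    # ~3 micromolar phosphate, from salt impurities

excess = arsenate_M / phosphate_M
print(round(excess))  # ~13,000-fold excess of arsenate over phosphate
```

So any cell in that flask is swimming in arsenate, which bears on the detoxification question raised earlier.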

So is that enough? Well, the main evidence is (as shown in their Figure 1) that if you move the bacteria to a medium that doesn't have the added arsenate (but still has the background level of phosphate), they don't grow. With added arsenate they do, but slowly. And with added phosphate, as mentioned before, they grow more robustly. It looks to me as if the biggest variable here might be the amount of phosphate that could be contaminating the arsenate source that they use. But their Table S1 shows that the low level of phosphate in the media is the same both ways, whether they've added arsenate or not. Unless something's gone wrong with that measurement, that's not the answer.

One way or another, the fact that these bacteria seem to use arsenate to grow seems hard to escape. And they're not the kind of weirdo chemotroph to be able to run off arsenate/arsenite redox chemistry (if indeed there are any bacteria that use that system at all). (The paper does get one look at arsenic oxidation states in the near-edge X-ray data, and they don't see anything that corresponds to the plus-3 species). That would appear to leave the idea that they're using arsenate per se as an ingredient in their biochemistry - otherwise, why would they start to grow in its presence? (The Redfield link above takes up this question, wondering if the bacteria are scavenging phosphorus from dead neighbor cells, and points out that the cells may actually still be growing slowly without either added arsenic or phosphate).

Claim 2: the bacteria take up arsenate from the growth medium. To check this, the authors measured intracellular arsenic by ICP mass spec. This was done several ways, and I'll look at the total dry weight values first.

Those arsenic levels were rather variable, but always ran high. Looking at the supplementary data, there are some large differences between two batches of bacteria, one from June and one from July. And there's also some variability in the assay itself: the June cells show between 0.114% and 0.624% arsenic (as the assay is repeated), while the July cells show much lower (and tighter) values, between 0.009% and 0.011%. Meanwhile, the corresponding amount of phosphorus is 0.023% to 0.036% in June (As/P of 5 up to 27), and 0.011% to 0.014% in July (As/P of 0.76 to 0.97).

The paper averages these two batches of cells, but it certainly looks like the June bunch were much more robust in their uptake of arsenate. You might look at the July set and think, man, those didn't work out at all, since they actually have more phosphorus than arsenic in them. But the background state should be way lower than that. When you look at the corresponding no-arsenic cell batches, the differences are dramatic in both June and July. The no-arsenate June cells showed at least ten times as much phosphorus, and a thousandth the arsenic, while the July run of no-arsenate cells showed (compared to the July arsenic bunch) 60 times as much phosphorus and a tenth the arsenic. The As/P ratio for both sets hovers around 0.001 to 0.002.
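
For anyone who wants to check the arithmetic, here's a small Python sketch of those As/P ranges. These are simple mass-percent ratios taken straight from the figures quoted above; the paper's own 5-to-27 and 0.76-to-0.97 values presumably pair measurements run by run, so the extremes computed here just bracket them:

```python
# As/P mass ratios from the dry-weight percentages quoted above.
june_as = (0.114, 0.624)   # % arsenic, range over repeated assays
june_p  = (0.023, 0.036)   # % phosphorus
july_as = (0.009, 0.011)
july_p  = (0.011, 0.014)

def ratio_range(as_range, p_range):
    """Extreme As/P ratios consistent with the reported ranges."""
    lo = min(as_range) / max(p_range)
    hi = max(as_range) / min(p_range)
    return lo, hi

print(ratio_range(june_as, june_p))  # roughly 3 up to 27
print(ratio_range(july_as, july_p))  # at or below 1 either way
```

The June numbers are the whole ballgame: arsenic well in excess of phosphorus. The July numbers, on their own, would convince nobody.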

I'll still bet the authors were very disappointed that the July batch didn't come back as dramatic as the June ones. (And I have to give them some credit for including both batches in the paper, and not trying just to make it through with the June-bugs). One big question is what happens when you run the forced-arsenate-growth experiment more times: are the June cells typical, or some sort of weird anomaly? And do they still have both groups growing even now?

One of the points the authors make is that the arsenate-grown cells don't have enough phosphorus to survive. Rosie Redfield doesn't buy this one, and I'll defer to her expertise as a microbiologist. I'd like to hear some more views on this, because it's potentially important. There are several possibilities - from most exciting to least:

1. The bacteria prefer phosphorus, but are able to take up and incorporate substantial amounts of arsenate, to the point that they can live even below the level of phosphorus needed to normally keep them alive. They probably still need a certain core amount of phosphate, though. This is the position of the paper's authors.

2. The bacteria prefer phosphorus, but are able to take up and incorporate substantial amounts of arsenate. But they still have an amount of phosphate present that would keep them going, so the arsenate must be in "non-critical" biochemical spots - basically, the ones that can stand having it. (This sounds believable, but we still have to explain the growth in the presence of arsenate).

3. The bacteria prefer phosphorus, but are able to take up and incorporate substantial amounts of arsenate. This arsenate, though, is sequestered somehow and is not substituting for phosphate in the organisms' biochemistry. (In this case, you'd wonder why the bacteria are taking up arsenate at all, if they're just having to ditch it. Perhaps they can't pump it out efficiently enough?) And again, we'd have to explain the growth in the presence of arsenate - for a situation like this, you'd think that it would hurt, rather than help, by imposing an extra metabolic burden. I'm assuming here, for the sake of argument, that the whole grows-in-the-presence-of-arsenate story is correct.

Claim 3: the bacteria incorporate arsenate into their DNA as a replacement for phosphate. This is an attempt to distinguish between the possibilities just listed. I think the authors chose the bacterial DNA because DNA has plenty of phosphate, is present in large quantities and can be isolated by known procedures (as opposed to lots of squirrely little phosphorylated small molecules), and would be a dramatic example of arsenate incorporation. These experiments were done by giving the bacteria radiolabeled arsenate, and looking for its distribution.

Rosie Redfield has a number of criticisms of the way the authors isolated the DNA in these experiments, and again, since I'm not a microbiologist, I'll stand back and let that argument take place without getting involved. It's worth noting, though, that most (80%) of the label was in the phenol fraction of the initial extraction, which should have proteins and smaller-molecular-weight stuff in it. Very little showed up in the chloroform fraction (where the lipids would be), and most of the rest (11%) was in the final aqueous layer, where the nucleic acids should accumulate. Of course, if (water-soluble) arsenate was just hanging around, and not being incorporated into biomolecules, the distribution of the label might be pretty similar.

I think a very interesting experiment would be to take non-arsenate-grown GFAJ-1 bacteria, make pellets out of them as was done in this procedure, and then add straight radioactive arsenate to that mixture, in roughly the amounts seen in the arsenate-grown bacteria. How does the label distribute then, as the extractions go on?

Here we come to one of my biggest problems with the paper, after a close reading. When you look at the Supplementary Material, Table S1, you see that the phenol extract (where most of the label was) hardly shows any difference in total arsenic amounts, whether the cells were grown in high arsenate/no phosphorus or high phosphorus/no arsenate. The first group is just barely higher than the second, and probably within error bars, anyway.

That makes me wonder what's going on - if these cells are taking up arsenate (and especially if they grow on it), why don't we see more of it in the phenol fraction, compared to bacteria that aren't exposed to it at all? Recall that when arsenic was measured by dry weight, there was a real difference. Somewhere there has to be a fraction that shows a shift, and if it's not in the place where 80% of the radiolabel goes, then where could that be?

I think that the authors would like to say "It's in the DNA", but I don't see that data as supporting enough of a change in the arsenic levels. In fact, although they do show some arsenate in purified DNA, the initial DNA/RNA extract from the two groups (high As/no P and no As/high P) shows more arsenic in the bacteria that weren't getting arsenic at all. (These are the top two lines in Table S1 continued, top of page 11 in the Supplementary Information). The arsenate-in-the-DNA conclusion of this paper is, to my mind, absolutely the weakest part of the whole thing.

Conclusion: All in all, I'm very interested in these experiments, but I'm now only partly convinced. So what do the authors need to shore things up? As a chemist, I'm going to ask for more chemical evidence. I'd like to see some mass spec work done on cellular extracts, comparing the high-arsenic and no-arsenic groups. Can we see evidence of arsenate-for-phosphate in the molecular weights? If DNA was good enough to purify with arsenate still on it, how about the proteome? There are a number of ways to look that over by mass-spec techniques, and this really needs to be done.
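
The mass-spec signature, at least, is easy to predict: each arsenate-for-phosphate swap replaces one phosphorus atom with one arsenic atom, raising the molecular weight by a fixed increment. A quick sketch, using standard atomic masses:

```python
# Predicted mass shift for arsenate-for-phosphate substitution:
# each swap exchanges one P atom for one As atom in the backbone,
# so the molecular weight rises by m(As) - m(P) per site.
M_AS = 74.9216   # standard atomic mass of arsenic, Da
M_P  = 30.9738   # standard atomic mass of phosphorus, Da

def mass_shift(n_substitutions):
    """Mass increase (Da) for n arsenate-for-phosphate swaps."""
    return n_substitutions * (M_AS - M_P)

print(round(mass_shift(1), 2))   # ~43.95 Da per substituted site
print(round(mass_shift(10), 1))  # ~439.5 Da for ten sites
```

A ~44 Da bump per site is a big, clean shift by mass-spec standards, which is exactly why this experiment would be so convincing either way.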

Can any of the putative arsenate-containing species be purified by LC? LC/mass spec data would be very strong evidence indeed. I'd recommend that the authors look into this as soon as possible, since this could address biomolecules of all sizes. I would assume that X-ray crystallography data on any of these would be a long shot, but if the LC purification works, it might be possible to get enough to try. It would certainly shut everyone up!

Update: this seems like the backlash day. Nature News has a piece up, which partially quotes from this article by Carl Zimmer over at Slate.

Comments (35) + TrackBacks (0) | Category: Biological News | General Scientific News

November 15, 2010

Kitchen Chemistry Gear

Email This Entry

Posted by Derek

I (and other chemists) have been talking for years about the connections between organic chemistry and cooking. The usual saying is that you should never trust the lab work of an organic chemist who's hopeless in the kitchen. I agree with that one - I've known good chemists who don't cook (among them, a colleague in grad school who used his oven as a filing cabinet), but I don't think I've ever known one who can't.

Well, the techno-culinary fashion in recent years is blurring the line even more, from the other direction. Check out this web site from the Kohler people, makers of sinks, faucets, and the like. Vacuum apparatus is available for you to experiment with sous vide techniques at home - and if you scroll down, the crossover is complete. Yep, there's a rota-vap, right out of the lab and ready for the kitchen counter. I've always wondered if those would be good for reducing a sauce, and now, well, we're going to find out. . .

Comments (47) + TrackBacks (0) | Category: General Scientific News

October 20, 2010

Is Cancer A Disease of the Modern World?

Email This Entry

Posted by Derek

This paper in Nature Reviews Cancer is getting more attention in the popular press than most papers in that journal manage. Titled "Cancer: an old disease, a new disease or something in between?", it goes over the archaeological evidence for cancer rates in ancient populations, and goes on to speculate whether the incidence of the disease is higher under modern conditions.

I'd be interested in knowing that, too, but the problem seems to be that there's not much evidence one way or another. The authors concentrate on the evidence that can be found in bone samples, since these are naturally the most numerous, but it seems to be quite hard to get any meaningful histology data from ancient bone tissue. As for other tissue, the Egyptian record is probably the most statistically robust, thanks to deliberate mummification and the desert conditions, but even that isn't too definitive. The Greeks definitely described metastatic tumors, though (and in fact, gave us our name for the disease).

Still, they believe that the archaeological record indicates a smaller incidence of cancer than you'd expect, although given the long list of confounding factors they present, I'm not sure how sturdy that result is. One of the biggest of those is the shorter life expectancies in ancient populations, and it's not easy to get around that one. As the authors themselves point out, working-class ancient Egyptians seem to have mostly died at ages between 25 and 30 (!), and there aren't many forms of cancer that would be expected to show up well under those demographic conditions. (Osteosarcoma is the main tumor type the authors look for as being not so age-dependent).

The paper itself is fairly calm about its conclusions:

Despite the fact that other explanations, such as inadequate techniques of disease diagnosis, cannot be ruled out, the rarity of malignancies in antiquity is strongly suggested by the available palaeopathological and literary evidence. This might be related to the prevalence of carcinogens in modern societies.

But the press reports (based, I think, partly on further statements from the authors) haven't been. "Cancer Is A Modern Disease", "No Cancer In Ancient Times" go the headlines. (Go tell that last one to the ancient Greeks). And it's impossible to deny the environmental causes of some cancers - I'll bet that lung cancer rates prior to the introduction of tobacco into the Old World were pretty low, for example. Repeated exposure to some industrial chemicals (benzene and benzidine, right off the top of my head) are most definitely linked to increased risk of particular tumor types.

So in that way, modern cancer incidence probably is higher, at least for specific forms of the disease. But (as mentioned above) the single biggest factor is surely our longer lives. Eventually, some cells are going to hit on the wrong combination of mutations if you just give them enough time. And the widely reported statement from Professor Rosalie David, one of the paper's authors, that "There is nothing in the natural environment that causes cancer", is flat-out wrong. What about UV light from the sun? Aflatoxins from molds? Phorbol esters in traditional herbal recipes?

That statement strongly suggests a habit of mind that I think has to be guarded against: the "Garden of Eden" effect. That's the belief, widely held in one form or another, that there was a time - long ago - when people were in harmony with nature, ate pure, wholesome natural foods (the kind that we were meant to eat), and didn't have all the horrible problems that we have in these degenerate modern times. (You can see a lot of Rousseau in there, too, what with all that Noble Savage, corrupted-by-modern-society stuff).

This 1990 article (PDF) by Bruce Ames and Lois Gold, "Misconceptions on Pollution and the Causes of Cancer" is a useful corrective to the idea that modern environments cause all cancers. You'll have to guard yourselves, though, against the prelapsarian Golden Age fallacy. It's everywhere.

Comments (31) + TrackBacks (0) | Category: Cancer | General Scientific News

October 18, 2010

So, How Well Does Winning a Nobel Set You Up?

Email This Entry

Posted by Derek

Financially, maybe not as well as you'd think. Ask Martin Chalfie, one of the fluorescent-protein Chemistry prize winners from 2008. . .

Comments (26) + TrackBacks (0) | Category: General Scientific News

September 28, 2010

Nobel Season 2010

Email This Entry

Posted by Derek

As we head towards October, the thoughts of a very select group of scientists may be turning to their chances of winning a Nobel Prize - and the thoughts of the rest of us turn to laying odds on the winners. I've handicapped the race here before (here's the 2009 version), and that's one place to start a list. Another excellent roundup can be found over at Chembark, and another well-annotated one at the Curious Wavefunction. Meanwhile, Thomson/Reuters sent me their citation-voodoo list the other day, but to my eyes, they're always a bit off the mark.

So who are the favorites? Last year I mentioned Zare, Bard, and Moerner for single-molecule spectroscopy, and I think that after a run of biology-laced prizes that a swing back over to nearly-physics is pretty plausible. If the committee is going to stick with nearly-biology, then perhaps humanized antibodies, microarrays, or chaperone proteins will make it in, but I really don't think that this is the year (in the Chemistry prize, anyway). On the chemistry/medicine interface, there's always the chance that the committee could turn around and honor Carl Djerassi after all these years, but that's the only med-chem themed prize I can see. I think the chances of a pure organic synthesis prize are very low indeed - and that includes palladium-catalyzed couplings, too, unfortunately. There are too many people deserving of credit there, "too many" meaning "more than three" for Nobel purposes, and not all of them are still alive.

The more I think about it, the more skeptical I am of a Nobel for dye-based solar cells (Grätzel et al.) or any form of asymmetric catalysis this year. If anything, the committee waits too long before recognizing things, and it's just too early for these (and some other ideas floating around out there). The Thomson/Reuters list seems to be very big on metal-organic framework materials, for example, and I just don't see it. Waiting too long is a problem, but giving trendy things out too soon can be an even bigger one.

On the other end of the scale, I used to confidently predict a Nobel for RNA interference (in one field or another), and they finally took care of that one. The only Nobel I feel similarly sure of is in Physics, for the "dark energy" finding that the expansion of the universe is accelerating. At some point that one's going to win - maybe when there's more of an explanation for it, although that could be a bit of a wait. This is an area where I and the Thomson/Reuters people agree (and a lot of physicists seem to go along, too).

Want to make your own odds? This Chembark post is a fine overview of the factors involved. Suggestions welcome in the comments from anyone who feels as if their psychic powers are tuned up. . .

Comments (46) + TrackBacks (0) | Category: Chemical News | General Scientific News

September 23, 2010

Chemical Biology - The Future?

Email This Entry

Posted by Derek

I agree with many of the commenters around here that one of the most interesting and productive research frontiers in organic chemistry is where it runs into molecular biology. There are so many extraordinary tools that have been left lying around for us by billions of years of evolution; not picking them up and using them would be crazy.

Naturally enough, the first uses have been direct biological applications - mutating genes and their associated proteins (and then splicing them into living systems), techniques for purification, detection, and amplification of biomolecules. That's what these tools do, anyway, so applying them like this isn't much of a shift (which is one reason why so many of these have been able to work so well). But there's no reason not to push things further and find our own uses for the machinery.

Chemists have been working on that for quite a while. We look at enzymes and realize that these are the catalysts that we really want: fast, efficient, selective, working at room temperature under benign conditions. If you want molecular-level nanotechnology (not quite down to atomic!), then enzymes are it. The ways that they manipulate their substrates are the stuff of synthetic organic daydreams: hold down the damn molecule so it stays in one spot, activate that one functional group because you know right where it is and make it do what you want.

All sorts of synthetic enzyme attempts have been made over the years, with varying degrees of success. None of them have really approached the biological ideals, though. And in the "if you can't beat 'em, join 'em" category, a lot of work has gone into modifying existing enzymes to change their substrate preferences, product distributions, robustness, and turnover. This isn't easy. We know the broad features that make enzymes so powerful - or we think we do - but the real details of how they work, the whole story, often isn't easy to grasp. Right, that oxyanion hole is important: but just exactly how does it change the energy profile of the reaction? How much of the rate enhancement is due to entropic factors, and how much to enthalpic ones? Is lowering the energy of the transition state the key, or is it also a subtle raising of the energy of the starting material? What energetic prices are paid (and earned back) by the conformational changes the protein goes through during the catalytic cycle? There's a lot going on in there, and each enzyme avails itself of these effects differently. If it weren't such a versatile toolbox, the tools themselves wouldn't come out being so darn versatile.

There's a very interesting paper that's recently come out on this sort of thing, to which I'll devote a post by itself. But there are other biological frontiers beside enzymes. The machinery to manipulate DNA is exquisite stuff, for example. For quite a while, it wasn't clear how we organic chemists could hijack it for our own uses - after all, we don't spend a heck of a lot of time making DNA. But over the years, the technique of adding DNA segments onto small molecules and thus getting access to tools like PCR has been refined. There are a number of applications here, and I'd like to highlight some of those as well.

Then you have things like aptamers and other recognition technologies. These are, at heart, ways to try to recapitulate the selective binding that antibodies are capable of. All sorts of synthetic-antibody schemes have been proposed - from manipulating the native immune processes themselves, to making huge random libraries of biomolecules and zeroing in on the potent ones (aptamers) to completely synthetic polymer creations. There's a lot happening in this field, too, and the applications to analytical chemistry and purification technology are clear. This stuff starts to merge with the synthetic enzyme field after a point, too, and as we understand more about enzyme mechanisms that process looks to continue.

So those are three big areas where molecular biology and synthetic chemistry are starting to merge. There are others - I haven't even touched here on in vivo reactions and activity-based proteomics, for example, which is great stuff. I want to highlight these things in some upcoming posts, both because the research itself is fascinating, and because it helps to show that our field is nowhere near played out. There's a lot to know; there's a lot to do.

Comments (33) + TrackBacks (0) | Category: Analytical Chemistry | Biological News | Chemical News | General Scientific News | Life As We (Don't) Know It

August 3, 2010

Know How to Make Praziquantel? Tell The World.

Email This Entry

Posted by Derek

One of the people I met this past weekend was Matt Todd, chemistry professor at the University of Sydney. We talked about a project his lab is working on, and I wanted to help call attention to it.

They're working on praziquantel, also known as PZQ or Biltricide, which is used to cure schistosomiasis in the tropics. It's on the WHO's list of essential medicines for this reason. But PZQ is used now as a racemate, and this is one of those cases where everyone would be better off with a single enantiomer - not least because the active enantiomer is significantly better tolerated by patients than the racemic mixture. Problem is, there's no cheap enantioselective synthesis or resolution.

So what Todd's group has done is crowdsourced the problem. Here's the page to start with, where they lay out the current synthetic difficulties - right now, those include enantioselective Pictet-Spengler catalysts and help with the resolution of a key intermediate. They were in need of chiral HPLC conditions, but that problem has recently been solved. I'd like to ask the chemists in the crowd here to take a look, because it wouldn't surprise me if one of us had some ideas that could help. Don't leave your suggestions here, though; do it over at their pages so it's all in one place.

This sort of thing is an excellent fit with open-source models for doing science: it's all pro bono, and the more eyes that take a look at the situation, the better the chance that a solution will emerge. I don't think it's getting the publicity it deserves. And no, in case anyone's wondering, I don't think that this is how we're all going to end up discovering drugs. Figuring out how to do this for large commercial projects tends to bring on frantic hand-waving. But in cases like this - specific problems where there's no chance for profit to push things along - I think it can work well. It makes a lot more sense than that stuff I was linking to last week!

Comments (22) + TrackBacks (0) | Category: Business and Markets | Chemical News | General Scientific News | Infectious Diseases

July 9, 2010

Lechleiter's Prescription for Science

Email This Entry

Posted by Derek

John Lechleiter of Eli Lilly has an op-ed in today's Wall Street Journal on innovation in the US. Needless to say, he's worried:

A recent study ranked the U.S. sixth among the top 40 industrialized nations in innovative competitiveness, but 40th out of 40 in "the rate of change in innovation capacity" over the past decade. The ranking, published last year by the Information Technology and Innovation Foundation, measured what countries are doing—in higher education, investment in research and development, corporate tax rates, and more—to become more innovative in the future. The U.S. ranked dead last.

He goes on to say that we need a climate that appreciates new technology (which I certainly think we have), the financial system to support it (which is where he makes the case for favorable tax treatment), and the people who can do it. That's the longest single section of the whole piece:

The final and most important elements are the seeds of innovation, which equate to talented people and their ideas. Human beings—with their talent and energy, creativity and insights—are a priceless resource, but one that is woefully underdeveloped in this country.

There are three policies necessary to cultivate these seeds of innovation. First, with our kids falling further behind on international comparisons in education, we've got to get serious about broad improvement in science and math instruction in our grade schools and high schools.

Second, we need immigration laws that allow and encourage top scientists from other countries to choose to work in the United States. This does not entail drastic changes, but a sensible increase in visas for highly skilled immigrants and a shorter, simpler green-card application process.

Third, we need a well-funded basic research infrastructure within academic and government labs. What's required is not some new "Manhattan Project," but a long-term funding commitment necessary to attract more outstanding scientists to basic research and keep them engaged in productive work throughout their careers.

Well, there are going to be a lot of people reading this who will snort and say "Great, domestic science policy advice from Lilly, the company that's outsourcing everything short of what has to be picked up by a crane". And that call for better science education, together with the immigration reform paragraph, takes us into the "Danger! Undersupply of Scientists!" territory that drives many actual scientists crazy as they scan the employment ads and reformat their CVs.

I agree that science and engineering should be valued more in this country. But given our culture, what would make it so would be the perception that these fields are great places to have lucrative jobs, and that perception is currently taking an awful beating. Justifiably. So I'm not seeing this as a supply-of-scientists problem, as much as I see it as a shortage-of-ways-for-scientists-to-make-a-living one. . .

Comments (61) + TrackBacks (0) | Category: Business and Markets | General Scientific News | Press Coverage

April 8, 2010

Let's Sequence These Guys

Email This Entry

Posted by Derek

A very weird news item: multicellular organisms that appear to be able to live without oxygen. They're part of the little-known (and only recently codified) phylum Loricifera, and these particular organisms were collected at the bottom of the Mediterranean, in a cold, anoxic, hypersaline environment.

They have no mitochondria - after all, they don't have any oxygen to work with. Instead, they have what look like hydrogenosome organelles, producing hydrogen gas and ATP from pyruvate. I'm not sure how large an organism you can run off that sort of power source, since it looks like you only get one ATP per pyruvate (as opposed to the dozen or more available by sending it through the Krebs cycle and oxidative phosphorylation), but the upper limit has just been pushed past a significant point.

Comments (3) + TrackBacks (0) | Category: Biological News | General Scientific News | Life As We (Don't) Know It

March 26, 2010

Try It At Home

Email This Entry

Posted by Derek

Technical book author (and occasional commenter here) Robert Bruce Thompson has a channel on YouTube called "The Home Scientist" that's quite interesting. Many of these seem to be companion videos for his book, The Illustrated Guide to Home Chemistry Experiments. This is real, well-done chemistry with reagents that can be easily purchased and manipulated by a competent non-chemist. Well worth sending on to people who would like to get a feel for what the science is like!

Comments (8) + TrackBacks (0) | Category: General Scientific News

March 12, 2010

Garage Biotech

Email This Entry

Posted by Derek

Freeman Dyson has written about his belief that molecular biology is becoming a field where even basement tinkerers can accomplish things. Whether we're ready for it or not, biohacking is on its way. The number of tools available (and the amount of surplus equipment that can be bought) have him imagining a "garage biotech" future, with all the potential, for good and for harm, that that entails.

Well, have a look at this garage, which is said to be somewhere in Silicon Valley. I don't have any reason to believe the photos are faked; you could certainly put your hands on this kind of equipment very easily in the Bay area. The rocky state of the biotech industry just makes things that much more available. From what I can see, that's a reasonably well-equipped lab. If they're doing cell culture, there needs to be some sort of incubator around, and presumably a -80 degree freezer, but we don't see the whole garage, do we? I have some questions about how they do their air handling and climate control (although that part's a bit easier in a California garage than it would be in a Boston one). There's also the issue of labware and disposables. An operation like this does tend to run through a goodly amount of plates, bottles, pipet tips and so on, but I suppose those are piled up on the surplus market as well.

But what are these folks doing? The blog author who visited the site says that they're "screening for anti-cancer compounds". And yes, it looks as if they could be doing that, but the limiting reagent here would be the compounds. Cells reproduce themselves - especially tumor lines - but finding compounds to screen, that must be hard when you're working where the Honda used to be parked. And the next question is, why? As anyone who's worked in oncology research knows, activity in a cultured cell line really doesn't mean all that much. It's a necessary first step, but only that. (And how many different cell lines could these people be running?)

The next question is, what do they do with an active compound when they find one? The next logical move is activity in an animal model, usually a xenograft. That's another necessary-but-nowhere-near-sufficient step, but I'm pretty sure that these folks don't have an animal facility in the basement, certainly not one capable of handling immunocompromised rodents. So put me down as impressed, but puzzled. The cancer-screening story doesn't make sense to me, but is it then a cover for something else? What?

If this post finds its way to the people involved, and they feel like expanding on what they're trying to accomplish, I'll do a follow-up. Until then, it's a mystery, and probably not the only one of its kind out there. For now, I'll let Dyson ask the questions that need to be asked, from that NYRB article linked above:

If domestication of biotechnology is the wave of the future, five important questions need to be answered. First, can it be stopped? Second, ought it to be stopped? Third, if stopping it is either impossible or undesirable, what are the appropriate limits that our society must impose on it? Fourth, how should the limits be decided? Fifth, how should the limits be enforced, nationally and internationally? I do not attempt to answer these questions here. I leave it to our children and grandchildren to supply the answers.

Comments (42) + TrackBacks (0) | Category: Biological News | Drug Assays | General Scientific News | Regulatory Affairs | Who Discovers and Why

February 26, 2010

A Friday Book Recommendation

Email This Entry

Posted by Derek

This isn't exactly med-chem, but its focus probably overlaps with the interests of a number of readers around here. I recently came across a copy of A Field Guide to Bacteria and enjoyed it very much. I don't think there's another book quite like it available: it describes where you're likely to find different varieties of bacteria (from hot springs to your fridge), how they behave in a natural environment (as opposed to a culture dish) and how to identify them by field marks, if possible. It's not written for microbiologists, but it can provide a different perspective even if you work in the field (since many people that do focus on pathogens - really a very small subset of bacteria, when you get down to it).

I'm already inspired to set up some Winogradsky columns with my kids, perhaps with some unusual chemical additives to see what happens. If we discover anything, I'll report back. . .

Comments (11) + TrackBacks (0) | Category: Book Recommendations | General Scientific News | Infectious Diseases

January 22, 2010

Maybe You Need Some More Testosterone Over There

Email This Entry

Posted by Derek

This one's also from the Department of Placebo Effects - read on. An interesting paper out in Nature details a study where volunteers took small doses of testosterone or placebo, and then participated in a standard behavioral test, the "Ultimatum Game". That's the one where two people participate, with one of them given a sum of money (say, $10), that's to be divided between the two of them. The player with the money makes an offer to divide the pot, which the other player can only take or leave (no counteroffers). A number of interesting questions about altruism and competition have been examined through this game and its variants - basically, the first thing to ask is how much the "dictator" player will feel like offering at all. (If you like, here's the Freakonomics guys talking about the game, which features in a chapter of their latest, SuperFreakonomics).
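
For concreteness, here's a minimal sketch of the game's payoff structure (the dollar amounts and the responder's rejection threshold are made-up illustration values, not anything from the study):

```python
# A bare-bones version of the Ultimatum Game described above.
# The pot size, offer, and rejection threshold are hypothetical numbers.

def ultimatum_round(pot, offer, rejection_threshold):
    """One round: the proposer offers `offer` out of `pot`; the responder
    accepts only if the offer meets their threshold. No counteroffers.
    Returns the (proposer, responder) payoffs."""
    if offer >= rejection_threshold:
        return pot - offer, offer   # deal accepted, both players get paid
    return 0, 0                     # offer rejected: nobody gets anything

# A fair-ish offer is accepted; a lowball one costs both players everything.
print(ultimatum_round(10.0, 4.0, 2.5))   # -> (6.0, 4.0)
print(ultimatum_round(10.0, 1.5, 2.5))   # -> (0, 0)
```

The whole research question is what sets that rejection threshold - and, on the other side, how far above it a proposer decides to go.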

What's been found in many studies is that the second players often reject offers that they feel are insultingly low, giving up a sure gain for the sake of pride and sending a message to the first player. I think of this as the "Let me tell you what you can do with your buck-fifty" option. So what does exposure to testosterone do for this behavior? As the authors of the new paper talk about, there are two (not necessarily exclusive) theories about some of the hormone's effects. Increases in aggression and competitiveness are widely thought to be one of these, but there's also a good amount of literature to suggest that status-seeking behavior is perhaps more important. But if someone is going to be aggressive about the ultimatum game, they're going to make a lowball offer and damn the consequences, whereas if they're looking for status, they may well choose a course that avoids having their offer thrown back in their face.

Under double-blind conditions for testosterone dosing in female subjects (sublingual dosing four hours before the test), it was the second behavior that showed up. Update: keep in mind, women have endogenous testosterone, too. The subjects who got testosterone made more generous offers (from about $3.50 up to closer to $4.00). The error bars on that measurement just miss overlapping (p = 0.031). But here's the part I found even more interesting: the subjects who believed that they got testosterone made significantly less fair/generous offers than the ones who believed that they got the placebo (p = 0.006). Because, after all, testosterone makes you all tough and nasty, as everyone knows. As the authors sum it up:

"The profound impact of testosterone on bargaining behaviour supports the view that biological factors have an important role in human social interaction. This does, of course, not mean that psychological factors are not important. In fact, our finding that subjects’ beliefs about testosterone are negatively associated with the fairness of bargaining offers points towards the importance of psychological and social factors. Whereas other animals may be predominantly under the influence of biological factors such as hormones, biology seems to exert less control over human behaviour. Our findings also teach an important methodological lesson for future studies: it is crucial to control for subjects’ beliefs because the pure substance effect may be otherwise under- or overestimated. . ."

Comments (13) + TrackBacks (0) | Category: Biological News | General Scientific News | The Central Nervous System

January 8, 2010

Find That Pattern

Email This Entry

Posted by Derek

I have to take my hat off to this guy at the Times of London. The British press recently played a story about how various ancient sites were linked up in uncanny triangular formations - well, it turns out that the same chilling patterns are found in other ancient monuments as well. Read and be enlightened.

Comments (13) + TrackBacks (0) | Category: General Scientific News

December 16, 2009

Pass the Popcorn

Email This Entry

Posted by Derek

Year-end rushing around has left me little time for blogging last night or this morning. But a discussion with a colleague the other day leads me to ask a quick question of the readership: has there ever, in your view, been a realistic depiction of a research chemist in some sort of popular entertainment (TV, movie, reasonably-selling novel)? I'm hard-pressed to think of many examples myself, partly because what we do isn't (a) all that easy to explain in a dramatic setting, and (b) tends to operate with non-dramatic pacing, to put it mildly. But I'd be glad to hear some suggestions. . .

Comments (66) + TrackBacks (0) | Category: General Scientific News

December 2, 2009

Copyright 1671: I Like the Sound of That

Email This Entry

Posted by Derek

Thanks to the Royal Society, here's the sort of scientific paper that they just don't make any more: "A Letter of Mr. Isaac Newton, Professor of the Mathematicks at the University of Cambridge, Containing His New Theory About Light and Colors". Along the way, in between making fundamental observations about refraction, rainbows, white light, complementary colors, and human perception, he invents the reflecting telescope that I take out into my yard on clear nights.

Newton was the Real Deal if anyone ever was. Like Bernoulli, you may recognize the lion by his paw.

Comments (20) + TrackBacks (0) | Category: General Scientific News

December 1, 2009

Climategate and Scientific Conduct

Email This Entry

Posted by Derek

Everyone has heard about the "Climategate" scandal by now. Someone leaked hundreds of megabytes of information from the University of East Anglia's Climatic Research Unit, and the material (which appears to be authentic) is most interesting. I'm not actually going to comment on the climate-change aspect of all this, though. I have my own opinions, and God knows everyone else has one, too, but what I feel needs to be looked at is the scientific conduct. I'm no climatologist, but I am an experienced working scientist - so, is there a problem here?

I'll give you the short answer: yes. There appear to be several, as shown by many troubling features in the documents that have come out. The first one is the apparent attempt to evade the UK's Freedom of Information Act. I don't see how these messages can be interpreted in any other way than as an attempt to break the law, and I don't see how they can be defended:

Can you delete any emails you may have had with Keith re AR4?
Keith will do likewise. He's not in at the moment - minor family crisis. Can you also email Gene and get him to do the same? I don't have his new email address. We will be getting Caspar to do likewise.

A second issue is a concerted effort to shape what sorts of papers get into the scientific literature. Again, this does not seem to be a matter of interpretation; such messages as this, this, and this spell out exactly what's going on. You have talk of getting journal editors fired:

This is truly awful. GRL has gone downhill rapidly in recent years.
I think the decline began before Saiers. I have had some unhelpful dealings with him recently with regard to a paper Sarah and I have on glaciers -- it was well received by the referees, and so is in the publication pipeline. However, I got the impression that Saiers was trying to keep it from being published.

Proving bad behavior here is very difficult. If you think that Saiers is in the greenhouse skeptics camp, then, if we can find documentary evidence of this, we could go through official AGU channels to get him ousted. Even this would be difficult.

And of trying to get papers blocked from being referenced:

I can't see either of these papers being in the next IPCC report. Kevin and I will keep them out somehow - even if we have to redefine what the peer-review literature is !

Two questions arise: is this defensible, and does such behavior take place in other scientific disciplines? Personally, I find this sort of thing repugnant. Readers of this site will know that I tend to err on the side of "Publish and be damned", preferring to let the scientific literature sort itself out as ideas are evaluated and experiments are reproduced. I support the idea of peer review, and I don't think that every single crazy idea should be thrown out to waste everyone's time. But I set the "crazy idea" barrier pretty low, myself, remembering that a lot of really big ideas have seemed crazy at first. If a proposal has some connection with reality, and can be tested, I say put it out there, and the more important the consequences, the lower the barrier should be. (The flip side, of course, is that when some oddball idea has been tried and found wanting, its proponents should go away, to return only when they have something sturdier. That part definitely doesn't work as well as it should.)

So this "I won't send my work to a journal that publishes papers that disagree with me" business is, in my view, wrong. The East Anglia people went even farther, though, working to get journal editors and editorial boards changed so that they would be more to their liking, and I think that that's even more wrong. But does this sort of thing go on elsewhere?

It wouldn't surprise me. I hate to say that, and I have to add up front that I've never witnessed anything like this personally, but it still wouldn't surprise me. Scientists often have very easily inflamed egos, and divide into warring camps all too easily. But while it may have happened somewhere else, that does not make it normal (and especially not desirable) scientific behavior. This is not a standard technique by which our sausage is made over here.

What I've seen in organic chemistry are various attempts to steer papers to particular reviewers (or evade other ones). And I've seen people fire off angry letters to journal editors about why some particular paper was published (and why the letter writer's manuscript in response had not been accepted in turn, likely as not). The biggest brawl of them all was still going early in my career (having started before I was born): the fight over the nonclassical norbornyl cation, the very mention of which is still enough to make some older chemists put their hands over their ears and start to hum loudly. That one involved (among many others) two future Nobel Prize winners (H. C. Brown and George Olah), and got very heated indeed - but I still don't recall either one of them trying to get journal editors fired after publishing rival manuscripts. You don't do that sort of thing.

And that brings up an additional problem with all this journal curating: the CRU people have replied to their critics in the past by saying that more of their own studies have been published in the peer-reviewed literature. This is disingenuous when you're working at the same time to shape the peer-reviewed literature into what you think it should look like.

The third issue I want to comment on is the state of the data and their analysis. I have deep sympathy for the fellow who tried to reconcile the various poorly documented and conflicting data sets and buggy, unannotated code that the CRU has apparently depended on. And I can easily see how this happens. I've been on long-running projects, especially some years ago, where people start to lose track of which numbers came from where (and when), where the underlying raw data are stored, and the history of various assumptions and corrections that were made along the way. That much is normal human behavior. But this goes beyond that.

Those of us who work in the drug industry know that we have to keep track of such things, because we're making decisions that could eventually run into the tens and hundreds of millions of dollars of our own money. And eventually we're going to be reviewed by regulatory agencies that are not staffed with our friends, and who are perfectly capable of telling us that they don't like our numbers and want us to go spend another couple of years (and another fifty or hundred million dollars) generating better ones for them. The regulatory-level lab and manufacturing protocols (GLP and GMP) generate a blizzard of paperwork for just these reasons.

But the stakes for climate research are even higher. The economic decisions involved make drug research programs look like roundoff errors. The data involved have to be very damned good and convincing, given the potential impact on the world economy, through both the possible effects of global warming itself and the effects of trying to ameliorate it. Looking inside the CRU does not make me confident that their data come anywhere close to that standard:

I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO and one with, usually overlapping and with the same station name and very similar coordinates. I know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh! There truly is no end in sight... So, we can have a proper result, but only by including a load of garbage!

I do not want the future of the world economy riding on this. And what's more, it appears that the CRU no longer has much of their original raw data. It appears to have been tossed over twenty years ago. What we have left, as far as I can see, is a large data set of partially unknown origin, which has been adjusted by various people over the years in undocumented ways. If this is not the case, I would very much like the CRU to explain why not, and in great detail. And I do not wish to hear from people who wish to pretend that everything's just fine.

The commentator closest to my views is Clive Crook at The Atlantic, whose dismay at all this is unhidden. I'm not hiding mine, either. No matter what you think about climate change, if you respect the scientific endeavor, this is very bad news. Respect has to be earned. And it can be lost.

Comments (170) + TrackBacks (0) | Category: Current Events | General Scientific News | The Dark Side | The Scientific Literature

September 16, 2009

Real Electrons

Email This Entry

Posted by Derek

I posted images of real pentacene molecules the other day, but now the single molecule/single-atom imaging field has reached another milestone. There's a paper coming out in Physical Review B from a team in Kharkov using a field emission electron microscope. At heart, that's a pretty old type of machine, first invented back in the 1930s, and it's long provided images of the arrangements of atoms in metal surfaces. (More precisely, you're getting an image of the work function, the energy needed to remove electrons from the material).

But this latest work is something else entirely. The researchers have improved the resolution and sensitivity, narrowing things down to single-atom tips. So instead of a tungsten surface, we have a single carbon atom at the end of a chain. And instead of the behavior of the electrons in a bulk metal, we have the electron density around one nucleus. Behold the s and p orbitals. Generations of students have learned these as abstractions, diagrams on a page. I never thought I'd see them, and I never thought I'd see the day when it was even possible. As always, I react to these things with interest, excitement, and a tiny bit of terror at seeing something that I assumed would always be hidden.

Comments (53) + TrackBacks (0) | Category: General Scientific News

September 3, 2009

Real Molecules

Email This Entry

Posted by Derek

Most of you will have heard about the recent accomplishment at the IBM Zürich labs, using an atomic force microscope with unprecedented resolution. They've imaged individual molecules so well that the atoms and bonds are alarmingly clear. I thought I'd put up one of the less-used images from the paper - here are some pentacene molecules (five fused benzene rings) sitting around on a surface. Not a simulation, not a model: real molecules. It gives me a slight chill, to tell you the truth.

Comments (37) + TrackBacks (0) | Category: General Scientific News

July 2, 2009

Jargon Will Save Us All

Email This Entry

Posted by Derek

Moore's Law: the number of transistors on a chip doubling every 18 months or so, etc. Everyone's heard of it. But can we agree that anyone who uses it as a metaphor or prescription for drug research doesn't know what they're talking about?
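
For reference, the law itself is nothing more than exponential doubling - a two-line sketch, with the conventional 18-month period as the only assumption:

```python
# Moore's Law as commonly stated: counts double every ~18 months.
def moores_law(initial_count, months, doubling_period=18):
    """Projected count after `months`, doubling every `doubling_period` months."""
    return initial_count * 2 ** (months / doubling_period)

# After three years (two doubling periods), the count has quadrupled:
print(moores_law(1_000_000, 36))  # -> 4000000.0
```

No such curve has ever described drug output, which is rather the point.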

I first came across the comparison back during the genomics frenzy. One company that had bought into the craze in a big way press-released (after a rather short interval) that they'd advanced their first compound to the clinic based on this wonderful genomics information. I remember rolling my eyes and thinking "Oh, yeah", but on a hunch I went to the Yahoo! stock message boards (often a teeming heap of crazy, then as now). And there I found people just levitating with delight at this news. "This is Moore's Law as applied to drug discovery!" shouted one enthusiast. "Do you people realize what this means?" What it meant, apparently, was not only that this announcement had come rather quickly. It also meant that this genomics stuff was going to discover twice as many drugs as this real soon. And real soon after that, twice as many more, and so on until the guy posting the comment was as rich as Warren Buffett, because he was a visionary who'd been smart enough to load himself into the catapult and help cut the rope. (For those who don't know how that story ended, the answer is Not Well: the stock that occasioned all this hyperventilation ended up dropping by a factor of nearly a hundred over the next couple of years. The press-released clinical candidate was never, ever, heard of again.)

I bring this up because a reader in the industry forwarded me this column from Bio-IT World, entitled, yes, "Only Moore's Law Can Save Big Pharma". I've read it three times now, and I still have only the vaguest idea of what it's talking about. Let's see if any of you can do better.

The author starts off by talking about the pressures that the drug industry is under, and I have no problem with him there. That is, until he gets to the scientific pressures, which he sketches out thusly:

Scientifically, the classic drug discovery paradigm has reached the end of its long road. Penicillin, stumbled on by accident, was a bona fide magic bullet. The industry has since been organized to conduct programs of discovery, not design. The most that can be said for modern pharmaceutical research, with its hundreds of thousands of candidate molecules being shoveled through high-throughput screening, is that it is an organized accident. This approach is perhaps best characterized by the Chief Scientific Officer of a prominent biotech company who recently said, "Drug discovery is all about passion and faith. It has nothing to do with analytics."

The problem with faith-based drug discovery is that the low hanging fruit has already been plucked, driving would-be discoverers further afield. Searching for the next miracle drug in some witch doctor's jungle brew is not science. It's desperation.

The only way to escape this downward spiral is new science. Fortunately, the fuzzy outlines of a revolution are just emerging. For lack of a better word, call it Digital Chemistry.

And when the man says "fuzzy outline", well, you'd better take him at his word. What, I know you're all asking, is this Digital Chemistry stuff? Here, wade into this:

Tomorrow's drug companies will build rationally engineered multi-component molecular machines, not small molecule drugs isolated from tree bark or bread mold. These molecular machines will be assembled from discrete interchangeable modules designed using hierarchical simulation tools that resemble the tool chains used to build complex integrated circuits from simple nanoscale components. Guess-and-check wet chemistry can't scale. Hit or miss discovery lacks cross-product synergy. Digital Chemistry will change that.

Honestly, if I start talking like this, I hope that onlookers will forgo taking notes and catch on quickly enough to call the ambulance. I know that I'm quoting too much, but I have to tell you more about how all this is going to work:

But modeling protein-protein interaction is computationally intractable, you say? True. But the kinetic behavior of the component molecules that will one day constitute the expanding design library for Digital Chemistry will be synthetically constrained. This will allow engineers to deliver ever more complex functional behavior as the drugs and the tools used to design them co-evolve. How will drugs of the future function? Intracellular microtherapeutic action will be triggered if and only if precisely targeted DNA or RNA pathologies are detected within individual sick cells. Normal cells will be unaffected. Corrective action shutting down only malfunctioning cells will have the potential of delivering 99% cure rates. Some therapies will be broad based and others will be personalized, programmed using DNA from the patient's own tumor that has been extracted, sequenced, and used to configure "target codes" that can be custom loaded into the detection module of these molecular machines.

Look, I know where this is coming from. And I freely admit that I hope that, eventually, a really detailed molecular-level knowledge of disease pathology, coupled with a really robust nanotechnology, will allow us to treat disease in ways that we can't even approach now. Speed the day! But the day is not sped by acting as if this is the short-term solution for the ills of the drug industry, or by talking as if we already have any idea at all about how to go about these things. We don't.

And what does that paragraph up there mean? "The kinetic behavior. . .will be synthetically constrained"? Honestly, I should be qualified to make sense of that, but I can't. And how do we go from protein-protein interactions at the beginning of all that to DNA and RNA pathologies at the end, anyway? If all the genomics business has taught us anything, it's that these are two very, very different worlds - both important, but separated by a rather wide zone of very lightly-filled-in knowledge.

Let's take this step by step; there's no other way. In the future, according to this piece, we will detect pathologies by detecting cell-by-cell variations in DNA and/or RNA. How will we do that? At present, you have to rip open cells and kill them to sequence their nucleic acids, and the sensitivities are not good enough to do it one cell at a time. So we're going to find some way to do that in a specific non-lethal way, either from the outside of the cells (by a technology that we cannot even yet envision) or by getting inside them (by a technology that we cannot even envision) and reading off their sequences in situ (by a technology that we cannot even envision). Moreover, we're going to do that not only with the permanent DNA, but with the various transiently expressed RNA species, which are localized to all sorts of different cell compartments, present in minute amounts and often for short periods of time, and handled in ways that we're only beginning to grasp and for purposes that are not at all yet clear. Right.

Then. . .then we're going to take "corrective action". By this I presume that we're either going to selectively kill those cells or alter them through gene therapy. I should note that gene therapy, though as promising as ever, is something that so far we have been unable, in most cases, to get to work. Never mind. We're going to do this cell by cell, selectively picking out just the ones we want out of the trillions of possibilities in the living organism, using technologies that, I cannot emphasize enough, we do not yet have. We do not yet know how to find most individual cell types in a complex living tissue; huge arguments ensue about whether certain rare types (such as stem cells) are present at all. We cannot find and pick out, for example, every precancerous cell in a given volume of tissue, not even by slicing pieces out of it, taking it out into the lab, and using all the modern techniques of instrumental analysis and molecular biology.

What will we use to do any of this inside the living organism? What will such things be made of? How will you dose them, whatever they are? Will they be taken up through the gut? Doesn't seem likely, given the size and complexity we're talking about. So, intravenous then, fine - how will they distribute through the body? Everything spreads out a bit differently, you know. How do you keep them from sticking to all kinds of proteins and surfaces that you're not interested in? How long will they last in vivo? How will you keep them from being cleared out by the liver, or from setting off a potentially deadly immune response? All of these could vary from patient to patient, just to make things more interesting. How will we get any of these things into cells, when we only roughly understand the dozens of different transport mechanisms involved? And how will we keep the cells from pumping them right back out? They do that, you know. And when it's time to kill the cells, how do you make absolutely sure that you're only killing the ones you want? And when it's time to do the gene therapy, what's the energy source for all the chemistry involved, as we cut out some sequences and splice in the others? Are we absolutely sure that we're only doing that in just the right places in just the right cells, or will we (disastrously) be sticking copies into the DNA of a quarter of a per cent of all the others?

And what does all this nucleic acid focus have to do with protein expression and processing? You can't fix a lot of things at the DNA level. Misfolding, misglycosylation, defects in transport and removal - a lot of this stuff is post-genomic. Are we going to be able to sequence proteins in vivo, cell by cell, as well? Detect tertiary structure problems? How? And fix them, how?

Alright, you get the idea. The thing is, and this may be surprising considering those last few paragraphs, that I don't consider all of this to be intrinsically impossible. Many people who beat up on nanotechnology would disagree, but I think that some of these things are, at least in broad hazy theory, possibly doable. But they will require technologies that we are nowhere close to owning. Babbling, as the Bio-IT World piece does, about "detection modules" and "target codes" and "corrective action" is absolutely no help at all. Every one of those phrases unpacks into a gigantic tangle of incredibly complex details and total unknowns. I'm not ready to rule some of this stuff out. But I'm not ready to rule it in just by waving my hands.

Comments (46) + TrackBacks (0) | Category: Drug Industry History | General Scientific News | In Silico | Press Coverage

May 8, 2009

Altermune - Real Stuff or Not?

Posted by Derek

Kary Mullis is an outlier among Nobel Prize winners. Attendees at some of his invited talks in the years after his award will know what I’m talking about. These were famously random affairs, with the audience never knowing quite what to expect when the next slide came up on the screen. And his own book, Dancing Naked in the Mind Field, will give you about as much flakiness as you can stand.

But although he's been way off base about a lot of things, he may not be that way about everything. I notice (h/t Biotechniques) that he gave a lecture recently at San Jose State, and instead of hearing about the discovery of PCR, the students got an update on Mullis’s company Altermune, whose website is intertwined with Mullis's own. The site is worth a look. Mullis has a vigorous writing style, and the rest of the front page is his pitch for his company’s approach to immunotherapy for infectious disease:

We have been slowly developing chemistry- the art of dealing, using instruments we devise, with things that are much too small for us to see. They have plus and minus charges on them that we can't feel; they have oily places on them much too tiny for us to notice oil and they have water-loving patches too small for us to see oil droplets beading up on the water. Microbes need all of these things, specific types of them, in fact, to survive, and none of them are beyond the scope of our instruments and our synthetic tools. That's our advantage. Just in this last century we have come to know these things the way we used to know javelins and swords.

How can we help our immune system? Altermune has a shot at it.

Give its antibodies - its workhorse molecules - bionic arms. That's right, little chemical extensions that allow an old antibody to do new tricks. Altermune, LLC, in collaboration with Biosearch in Novato, CA, this summer, fitted up some antibodies whose job used to be binding to something called galactose-alpha-1,3-galactosyl-beta-1,4-N-acetyl glucosamine, with new bionic arms, synthesized on an Applied Biosystems ABI 3900, arms that can tightly seize an influenza virion, shake it a little bit for emphasis, and turn it over to a hungry human macrophage for further processing. The change was accomplished with a swallowed drug. No need to send the antibodies back to the factory. Viruses never saw the ABI 3900 coming.

It looks like he’s using DNA aptamers as recognition elements for specific pathogens, which are used to bring on a response from the ubiquitous antibodies that target 1,3-Gal-Gal antigens. Here's the patent on the technique. And I have to say, that’s not necessarily a crazy idea at all. That epitope has been suggested before as a way to boost immune response, and marrying that to an aptamer could work. (Other aptamer conjugates are under investigation). Of course, the problem (as with all nucleic-acid based things) is, how do you dose it (and how long does it hang around once you do?)

Mullis seems to be talking about oral delivery, which is a real challenge. But that makes me wonder about a report from a company called RXi, which claims to be having some success in delivering their RNAi therapy to macrophages through the gut. They're packaging things in beta-glucan particles and taking advantage of a transport system (and of the fact that there are macrophages in the gut wall waiting for whatever comes out of the food supply). Perhaps something like this would do the trick for an immunological approach like Altermune's?

The immune system scares me, to be honest. I think that evolutionarily we've always walked a narrow path between "strong enough to fight off threats" and "touchy enough to get you killed". Versions of the machinery that threw their hosts into anaphylactic shock too easily have been weeded out by strong selection pressure - you probably wouldn't live long enough to pass that blueprint on. But it's still a tricky thing to mess with (ask TeGenero). Using existing antibodies might be the most sensible way to do it. . .

Comments (13) + TrackBacks (0) | Category: General Scientific News | Infectious Diseases

April 3, 2009

The Mechanical Chemist?

Posted by Derek

We use a lot of automated equipment in the drug discovery business. There’s an awful lot of grunt work involved, and in many cases a robot arm is better suited to the task – transferring solutions, especially repetitive transfers of large numbers of samples, is the classic example. High-throughput screening would just not be possible if you had to do it all by hand; my fingers hurt just imagining all the pipetting that would involve.

But I wouldn’t say that the process of medicinal chemistry is at all automated. That’s very much human-driven, and a lot of the compounds on most med-chem projects are made by hand, one at a time. Sure, there are parallel synthesis techniques, plates and resins and multichannel liquid handlers that will let you set up a whole array of reactions at once. But you do that, typically, only after you’ve found a hot compound, and that’s often done the old-fashioned way. (And, of course, there are a lot of reactions that just don’t lend themselves to efficient parallel synthesis).

But I remember the first time I saw an automated synthetic apparatus, back at an ACS meeting in the mid-1980s. There was a video in the presentation (a real rarity back then), and it showed this Zymark arm being run to set up an array of reactions, assay each of them after an overnight run, and report on the one that performed the best. “Holy cow”, I thought, “someone’s invented the mechanical grad student”. Being a grad student at the time, I wasn’t so sure what I thought about that.

This all comes to mind after reading a report over at Wired about a robotic system that has been claimed to have made a discovery without much human input at all. “Adam”, built at Aberystwyth University in Wales, seems to have been set up to look for similarities in yeast genes whose function hadn’t yet been assigned, and then (using a database of possible techniques) set up experiments to test the hypotheses thus generated. The system was also equipped to be able to follow up on its results, and eventually uncovered a new three-gene pathway, which findings were confirmed by hand.

And Ross King, leading the project at Aberystwyth, is apparently extending the idea to drug discovery. Using a system that (inevitably) will be called “Eve”, he plans to:

. . .autonomously design and screen drugs against malaria and schistosomiasis.

"Most drug discovery is already automated," says King, "but there's no intelligence — just brute force." King says Eve will use artificial intelligence to select which compounds to run, rather than just following a list.

Well, I won't take the intelligence comment personally; I know what the guy is trying to say. I’ll be very interested to see how this is going to be implemented, and how it will work out. (I'll get an e-mail off to Prof. King asking for some details). My first thought was that Eve will be slightly ahead of a couple of the less competent people I’ve seen over the course of my career. And if I can say that with a straight face (and now that I think about it, I believe that I can), then there may well be a place for this sort of thing. I’ve long held that jobs which can be done by machines really should be done by machines.

But how is this going to work? The first way I can see running a computational algorithm to design drugs would be some sort of QSAR, and we were just talking about that here the other day – most unfavorably. I can imagine, though, coding a lot of the received wisdom of drug discovery into an expert system – Topliss tree for aryl substituents, switch thiophene for phenyl, move nitrogens around the rings, add a para-fluoro, check both enantiomers, put in a morpholine for solubility, mess with the basicity of your amine nitrogens, no naphthyls if you can help it, watch your logD - my med-chem readers will know just the sorts of things I mean.

Now, automating that, along with feedback from the primary and secondary assays, solubility, PK, metabolite ID and so on. . .mix it in with literature-searching capability for similar compounds, some sort of reaction feasibility scoring function, ability to order reagents from the stockroom, analyze the LC/MS and NMR traces versus predictions, weight the next round of analogs according to what the major unmet project goals are. . .well, we're getting to the mechanical medicinal chemist, sure enough. Now, not all of these things are doable right now. In fact, some of them are rather a long way off. But some of them could be done now, and the others, well, they're certainly not impossible.
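Just to make the idea concrete, the rule-based core of such an expert system could be caricatured in a few lines of Python. This is a toy sketch under loud assumptions: the transforms are naive string substitutions on SMILES standing in for real structure manipulation, and the parent compound and rules are purely illustrative, not actual medicinal-chemistry logic:

```python
# A toy rule-based analog generator, in the spirit of the received-wisdom
# heuristics above. Each rule maps a parent SMILES string to one analog;
# the substitutions below are naive text edits, for illustration only.
TRANSFORMS = [
    ("phenyl -> thiophene swap", lambda s: s.replace("c1ccccc1", "c1ccsc1")),
    ("add para-fluoro", lambda s: s.replace("c1ccccc1", "c1ccc(F)cc1")),
    ("append morpholine for solubility", lambda s: s + ".C1COCCN1"),
]

def propose_analogs(smiles):
    """Apply each transform to the parent structure, skipping no-ops."""
    analogs = []
    for name, rule in TRANSFORMS:
        product = rule(smiles)
        if product != smiles:
            analogs.append((name, product))
    return analogs

# Hypothetical parent compound (acetanilide) for demonstration.
for name, analog in propose_analogs("CC(=O)Nc1ccccc1"):
    print(name, "->", analog)
```

A real version would need a proper cheminformatics toolkit for the structure handling, and, much harder, the assay-feedback and scoring machinery described above.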

I'm not planning on being replaced any time soon. But the folks cranking out the parallel libraries, the methyl-ethyl-butyl-futile stuff, they might need to look over their shoulders a bit sooner. That's outsourcing if you like - from the US to China and India, and from there to the robots. . .

Comments (28) + TrackBacks (0) | Category: Drug Development | Drug Industry History | General Scientific News | Life in the Drug Labs

March 26, 2009

Fan Mail

Posted by Derek

For those who haven't seen it and might be interested, I wanted to point out this excellent profile of Freeman Dyson in the New York Times Magazine. He's a particular scientific hero of mine, and I'm very glad indeed that he's still around.

And here's some more recent Dyson for those who wish.

Comments (42) + TrackBacks (0) | Category: General Scientific News

March 6, 2009

Tie Me Molecule Down, Sport

Posted by Derek

There are a huge number of techniques in the protein world that rely on tying down some binding partner onto some kind of solid support. When you’re talking about immobilizing proteins, that’s one thing – they’re large beasts, and presumably there’s some tether that can be bonded to them to string off to a solid bead or chip. It’s certainly not always easy, but generally can be done, often after some experimentation with the length of the linker, its composition, and the chemistry used to attach it.

But there are also plenty of ideas out there that call for doing the same sort of thing to small molecules. The first thing that comes to mind is affinity chromatography – take some small molecule that you know binds to a given protein or class of proteins well, attach it to some solid resin or the like, and then pour a bunch of mixed proteins over it. In theory, the binding partner will stick to its ligand as it finds it, everything else will wash off, and now you’ve got pure protein (or a pure group of related proteins) isolated and ready to be analyzed. Well, maybe after you find a way to get them off the solid support as well.

That illustrates one experimental consideration with these ideas. You want the association between the binding partners to be strong enough to be useful, but (in many cases) not so incredibly strong that it can never be broken up again. There are a lot of biomolecule purification methods that rely on just these sorts of interactions, but those often use some well-worked-out binding pair that you introduce into the proteins artificially. Doing it on native proteins, with small molecules that you just dreamed up, is quite another thing.
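That tradeoff (tight enough to survive the wash, but not so tight that you can never elute) can be caricatured with a toy model. The proteins and Kd values below are invented for illustration, and real pulldowns are governed by on/off kinetics rather than a simple affinity cutoff:

```python
# A toy model of an affinity pulldown: each protein in the "lysate" has a
# dissociation constant (Kd, in nM) for the immobilized small molecule.
# A wash step retains only binders below the cutoff. All values invented.
proteins = {
    "target kinase": 5.0,
    "related kinase": 80.0,
    "serum albumin": 50_000.0,
    "random lysate protein": 1_000_000.0,
}

def pulldown(lysate, wash_cutoff_nM):
    """Return the proteins tight enough to survive the wash, best binders first."""
    kept = [(kd, name) for name, kd in lysate.items() if kd <= wash_cutoff_nM]
    return [name for kd, name in sorted(kept)]

print(pulldown(proteins, wash_cutoff_nM=100))  # only the two kinases survive
```

The elution problem is the mirror image: drop the cutoff too far and nothing washes off, either, which is exactly the bind the text describes.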

But that would be very useful indeed, if you could get it to work reliably. There are techniques available like surface plasmon resonance, which can tell with great sensitivity if something is sticking close to a solid surface. At least one whole company (Graffinity) has been trying to make a living by (among other things) attaching screening libraries of small molecules to SPR chips, and flowing proteins of interest over them to look for structural lead ideas.

And Stuart Schreiber and his collaborators at the Broad Institute have been working on the immobilized-small-molecule idea as well, trying different methods of attaching compound libraries to various solid supports. They’re looking for molecules that disrupt some very tough (but very interesting) biological processes, and have reported some successes in protein-protein interactions, a notoriously tempting (and notoriously hard) area for small-molecule drug discovery.

The big problem that people tend to have with all these ideas – and I’m one of those people, in the end – is that it’s hard to see how you can rope small molecules to a solid support without changing their character. After all, we don’t have anything smaller than atoms to make the ropes out of. It’s one thing to do this to a protein – that’ll look like a tangle of yarn with a small length of it stretching out to the side. But on the small molecule scale, it’s a bit like putting a hamster on a collar and leash designed for a Doberman. Mr. Hamster is not going to be able to enjoy his former freedom of movement, and a blindfolded person might, on picking him up, have difficulty recognizing his essential hamsterhood.

There's also the problem of how you attach that leash and collar, even if you decide that you can put up with it once it's on. Making an array of peptides on a solid support is all well and good - peptides have convenient handles at both ends, and there are a lot of well-worked-out reactions to attach things to them. But small molecules come in all sorts of shapes, sizes, and combinations of functional groups (at least, they'd better if you're hoping to see some screening hits with them). Trying to attach such a heterogeneous lot of stuff through a defined chemical ligation is challenging, and I think that the challenge is too often met by making the compound set less diverse. And after seeing how much my molecules can be affected by adding just one methyl group in the right (or wrong) place, I’m not so sure that I understand the best way to attach them to beads.

So I’m going to keep reading the tethered-small-molecule-library literature, and keep an eye on its progress. But I worry that I’m just reading about the successes, and not hearing as much about the dead ends. (That’s how the rest of the literature tends to work, anyway). For those who want to catch up with this area, here's a Royal Society review from Angela Koehler and co-workers at the Broad that'll get you up to speed. It's a high-risk, high-reward research area, for sure, so I'll always have some sympathy for it.

Comments (12) + TrackBacks (0) | Category: Analytical Chemistry | Drug Assays | General Scientific News

January 28, 2009

Science and Its Values

Posted by Derek

Dennis Overbye had an essay in the science section of the New York Times yesterday, entitled "Elevating Science, Elevating Democracy". That gets across the spirit of it pretty well; it's one of those soaring-rhetoric pieces. It starts off with a gush of at-last-we-have-Obama, but what op-ed in the Times doesn't these days? We're going to be sweeping that stuff into piles and pulling it down out of the trees for months. (Before sending me an e-mail, keep in mind that I'd have a similar reaction no matter whose name was involved; I'm just not a person with high expectations from politicians).

But once he gets past the genuflections, I don't disagree with Overbye's main points. He says that science has a reputation of being totally results-oriented and value-neutral, but wants to point out that there are values involved:

"Those values, among others, are honesty, doubt, respect for evidence, openness, accountability and tolerance and indeed hunger for opposing points of view. These are the unabashedly pragmatic working principles that guide the buzzing, testing, poking, probing, argumentative, gossiping, gadgety, joking, dreaming and tendentious cloud of activity — the writer and biologist Lewis Thomas once likened it to an anthill — that is slowly and thoroughly penetrating every nook and cranny of the world."

We forget what a relatively recent and unusual thing it is, science. In most societies, over most of human history, there hasn't been much time or overhead for such a pursuit. And even when there has, most of the time the idea that you could interrogate Nature and get intelligible, reproducible answers would have seemed insane. Natural phenomena were thought to be either beyond human understanding, under the capricious control of the Gods, or impossible to put to any use. In retrospect, it seems to have taken so painfully long to get to the idea of controlled one-variable-at-a-time experimentation. Even the ancient Greeks, extraordinary in many respects, had a tendency to regard such things as beneath them.

So let's shed the politics and celebrate the qualities that Overbye's highlighting. Run good, strong experiments. Run them right, think hard about the results, and don't be afraid of what they're telling you. That's what got us to where we are now, and what will take us on from here.

Update: a comment from Cosmic Variance.

Comments (11) + TrackBacks (0) | Category: General Scientific News | Who Discovers and Why

January 16, 2009

Short Items: Viral NMR, Alarming Rings, Cheap Reading, Etc.

Posted by Derek

From PNAS, here’s an ingenious method that’s allowed NMR-based imaging of particles as small as viruses. I didn’t even think that this was possible – so now that it is, look for all kinds of variations on it over the next few years, as is the way of NMR techniques. Single-cell MRI? As the authors (from IBM) point out, this is a sudden 100-million-fold improvement in volume resolution compared to conventional NMR. It always makes me smile to see that things like this can happen.
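A quick back-of-the-envelope on that 100-million-fold figure, assuming (my assumption, not a claim from the paper) that the gain is the same along all three axes: volume resolution scales as the cube of linear resolution, so the implied per-axis improvement is the cube root of the volume number.

```python
# Volume resolution goes as (linear resolution)**3, so a 1e8-fold volume
# improvement corresponds to the cube root of that along each axis.
volume_gain = 1e8
linear_gain = volume_gain ** (1 / 3)
print(f"{linear_gain:.0f}-fold finer along each axis")  # roughly 464-fold
```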

This one should go into my “Things I Won’t Work With” folder immediately. Courtesy of Pat Dussault, whose lab has been turning out alarming stuff like this for some years now, we have six-membered rings made up of two carbons and four oxygens. There is no way to do that without putting on protective gear, needless to say – the only question is which stylish ensemble to wear.

James Tour unveils the off-road version of the nanocar.

And finally, I wanted to pass along this scientific reading suggestion to everyone. If you’re into magnetic resonance properties of silicon isotopes, you can read the book. After all, the list price is only $8539.00 (and don't forget, it's eligible for Free Super Saver Shipping!) But the rest of us can enjoy the Amazon reviews, which range from very satisfied customers (“My only question was whether one copy would be enough”) to very unsatisfied indeed. . .

Comments (22) + TrackBacks (0) | Category: Chemical News | General Scientific News

December 17, 2008

Awkward Conversations

Posted by Derek

We need a lighter topic today, and I’ve got one appropriate to the season, since many people will be having parties and family get-togethers over the next couple of weeks. And although some of these will be full of scientists, there are others where you might be the lone representative from the world of chemistry, biology, or medicine. That can be a good thing – or not so good, depending on how the conversation turns. A reader e-mailed me an account of a recent encounter with a relative who assured him of the benefits of foot-bath detoxification to cure what ails you. As you'd imagine, he didn't quite sign on to that idea, and the discussion went through a few rocky rapids.

I know that this sort of thing has happened to me several times. I’ve had to deal with the topics of how no, it’s not a conspiracy of the drug companies to make vitamin-based therapies illegal – and how yes, I have been working for X number of years in the drug industry without discovering a single thing that’s on the market, and how that’s statistically rather likely. And I’ve explained how it’s hard to come up with a cure for Alzheimer’s when you don’t even know what causes Alzheimer’s, which argument generally meets with agreement. But that reasonable discussion gets canceled out by plenty of others.

Dealing with the crazier propositions takes some real tact. I’m a pretty even-keeled guy, so I generally take a calm approach, just telling them how it is for me after X years of experience in the drug industry. I've found it's harder for people to spout craziness when there's some reasonable person sitting across the table from them who makes a living on the opposite side of their beliefs. And, truth be told, many of the wilder beliefs in the health field aren't necessarily all that strongly held. Most of them don't stand up to much scrutiny (and contradict each other, to boot), and I've found that people pick up and discard them with relative ease.

But you do run into passionate believers now and then. I'd be interested in hearing from people how they've dealt with conversations like this. My usual progression goes something like:

1. That's interesting - where did you hear about this?
2. No, it's true, I really have been working on those diseases for years now. As far as I can tell, they're pretty hard to deal with.
3. Gosh, that anecdotal evidence sure does sound convincing. Pity the FDA won't let us use any of that where I work. Those nutritional supplement manufacturers sure have it easy since the Hatch-Waxman act, don't they?
4. Hmm, since Fact X seems to be true about Disease Y, based on all that I know about the subject, how do these fit together?
5. Well, you know, the laws of physics/chemistry/math that I learned don't seem to cover that particular effect - have they added some recently?
6. No, I think that if there were any conspiracy that big, I probably would have noticed it at some point. Unless you're suggesting that I'm part of the cover-up?
7. Actually, people in the drug industry die from Disease Y, too. You'd think that if we were sitting on the cure for it, we'd have some sort of employee program or something. . .

Comments (30) + TrackBacks (0) | Category: General Scientific News

December 8, 2008

Enhancing the Brain: Here We Go

Posted by Derek

Depending on what news sources you follow, you may have heard a lot about it already: taking cognition-enhancing drugs to improve normal brain function. An editorial in Nature has just come out in favor of it, so although I wrote about this back in April, it’s time to talk over the issue again.

Let's define what we're talking about first. We really don’t have anything to selectively affect memory or general intelligence per se, but we do know something about how to affect attention span and wakefulness. So right now, cognition enhancement is mostly going to be found via the stimulants used for attention-deficit disorders, along with Cephalon’s Provigil (modafinil) for narcolepsy. These are the drugs at issue.

Nature started off this latest round of debate a few months ago, when they took an informal survey to see how many scientists used these. The results came in as “more than you might think”, although still a decided minority. One got the impression that these were reached for during grant-writing time in academia, for the most part, which would make their usage pattern similar to what you’d find among the student population. My guess is that the number of people using these in industrial research would be far smaller, for several reasons. For one thing, our work moves in different rhythms. As opposed to academia, we rarely have situations where a Big Creative Work has to be produced (or a huge pile of facts memorized) under time pressure. We do have big reports and presentations that come due, of course, but by the time the big ones are due there have been a lot of smaller ones, and the slides and material are largely summaries of those. That’s not to say that many of us couldn’t benefit from some extra attention to our work, it’s just that the opportunities for such aid aren’t as clear-cut.

Any discussion of this topic has to start with the question of how much good such drugs do. I’m willing to stipulate that for situations like the ones I’ve been describing – a need for long, sustained periods of focus and attention to detail – these compounds do indeed help. They may be more beneficial for some people than for others, but yes, I think that their effect is real. (If anyone has evidence to the contrary, I’d be glad to hear it – I should also mention that I have no personal experience to draw on).

And that brings up another question, the second big one that always comes up in such a discussion: is it right to do this sort of thing? Now that’s a tangle, because a value judgment has come into the room. And anyone who wants to take a hard line has to deal with the fact that we already have a legal, well-known, widely used drug for cognitive enhancement: caffeine. If that doesn’t increase wakefulness, I’d like to know what does.

The comparison with steroid use in sports will also come up, although I regard that one as partially a red herring. The whole point of athletic competition is different from the point of achievement in the arts and sciences. All sports are essentially artificial constructs that we agree on rules for, and doping makes people worried and/or furious that these rules are being bent. Science, on the other hand, is the real world. If Barry Bonds did indeed break home run records with chemical aid – personally, I think he did – then a lot of people (including me) have a problem with that. But if someone comes up with, say, a proof of the Riemann Hypothesis with the use of modafinil and methylphenidate, well. . .a proof is a proof.

But there is a competitive aspect that the sports analogy does bear on: several junior faculty may all be vying for tenure at the same time, for example. If they’re all roughly equal in ability, does the appointment end up going to the one who uses pharmacological help most effectively? That’s where the same uneasy feeling starts to set in. It’s when you look at head-to-head, human-to-human cases that the arguing really gets going.

We’re going to have more and more of this to deal with in the future. I don’t expect it any time soon, but we’ll eventually be able to do more for memory – and, for all I know, for higher cognition. There are too many therapeutic reasons to investigate such things, and too many ways for any useful drug to quickly find its way to the population that doesn’t necessarily have anything wrong with it.

All of these issues are addressed by the authors of the latest Nature commentary, naturally. For example:

"Consider an examination that only a certain percentage can pass. It would seem unfair to allow some, but not all, students to use cognitive enhancements, akin to allowing some students taking a maths test to use a calculator while others must go without. (Mitigating such unfairness may raise issues of indirect coercion, as discussed above.) Of course, in some ways, this kind of unfairness already exists. Differences in education, including private tutoring, preparatory courses and other enriching experiences give some students an advantage over others.

Whether the cognitive enhancement is substantially unfair may depend on its availability, and on the nature of its effects. Does it actually improve learning or does it just temporarily boost exam performance? In the latter case it would prevent a valid measure of the competency of the examinee and would therefore be unfair. But if it were to enhance long-term learning, we may be more willing to accept enhancement. After all, unlike athletic competitions, in many cases cognitive enhancements are not zero-sum games. Cognitive enhancement, unlike enhancement for sports competitions, could lead to substantive improvements in the world."

The editorial comes down to several main points: that we need more solid data on the benefits and risks of such drugs for normal individuals, that competent adults should have the option to use them, and that policies should be worked out to deal with issues of fairness, coercion, and the like.

My own thoughts on this are deeply confused and divided. That’s partly because I’m a weirdo: I don’t drink alcohol, and in fact, I don’t even drink coffee. That goes back to what I’d have to classify as a deep reluctance to mess with the way my brain works through chemical means, a trait that was already well in place by the time I was a teenager, but which was only reinforced as I learned more and more biochemistry. So on one level, I have to think that we really don’t know enough about how the existing cognitive enhancing drugs work, let alone what we’ll know about future ones, and that alone would keep me away from them.

But I can come up with plenty of thought experiments that shake me up: imagine that the risks are better known, and that they're no greater than, say, caffeine's (but with more benefits). What then? What if such things turn out, many years in the future, to be necessary to work at any reasonably high level in science, since everyone else will be taking them, too? Is part of my problem with drugs that alter brain function a streak of Puritanism - would I feel better about using such things if I knew that they were guaranteed not to be enjoyable? And so on. . .I have to confess, I found such issues a lot easier to deal with inside the confines of old science fiction stories.

Comments (20) + TrackBacks (0) | Category: General Scientific News | The Central Nervous System

September 25, 2008

Protein Folding: Complexity to Make More Complexity?

Email This Entry

Posted by Derek

Want a hard problem? Something to really keep you challenged? Try protein folding. That'll eat up all those spare computational cycles you have lounging around and come back to ask for more. And it'll do the same for your brain cells, too, for that matter.

The reason is that a protein of any reasonable size has a staggering number of shapes it can adopt. If you hold a ball-and-stick model of one, you realize pretty quickly that there are an awful lot of rotatable bonds in there (not least because they flop around while you're trying to hold the model in your hands). My daughter was playing around with a toy once that was made of snap-together parts that looked like elbow macaroni pieces, and I told her that this was just like a lot of molecules inside her body. We folded and twisted the thing around very quickly to a wide variety of shapes, even though it only had ten links or so, and I then pointed out to her that real proteins all had different things sticking off at right angles in the middle of each piece, making the whole situation even crazier.
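The combinatorics behind that macaroni demonstration can be sketched in a few lines. This is a Levinthal-style back-of-envelope estimate, not a real folding calculation: the assumption that each residue contributes two freely rotatable backbone bonds with three states apiece is a deliberate oversimplification, chosen only to show how fast the numbers blow up.

```python
# Crude estimate of the number of backbone conformations for a chain.
# Assumes (oversimplifying) two rotatable backbone bonds per residue,
# each sampling three staggered states.

def conformation_count(n_residues, bonds_per_residue=2, states_per_bond=3):
    """Back-of-envelope count of chain conformations."""
    return states_per_bond ** (n_residues * bonds_per_residue)

# Even a ten-link toy chain has billions of shapes:
print(conformation_count(10))            # 3^20 = 3,486,784,401

# A 56-residue protein, like the ones in the PNAS paper, is astronomical:
print(f"{conformation_count(56):.2e}")
```

The point of the exercise is that no protein could ever find its folded state by random search, which is why a few key residues steering the chain into one energetic landscape or another matters so much.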

There's a new (open access) paper in PNAS that illustrates some of the difficulties. The authors have been studying man-made proteins that have substantially similar sequences of amino acids, but still have different folding and overall shape. In this latest work, they've made it up to two proteins (56 amino acids each) that have 95% sequence identity, but still have very different folds. It's just a few key residues that make the difference and kick the overall protein into a different energetic and structural landscape. The other regions of the proteins can be mutated pretty substantially without affecting their overall folding, on the other hand. (In the picture, the red residues are the key ones and the blue areas are the identical/can-be-mutated domains).

This ties in with an overall theme of biology - it's nonlinear as can be. The systems in it are huge and hugely complicated, but the importance of the various parts varies enormously. There are small key chokepoints in many physiological systems that can't be messed with, just as there are some amino acids that can't be touched in a given protein. (Dramatic examples include the many single-amino-acid genetic disorders).

But perhaps the way to look at it is that the complexity is actually an attempt to overcome this nonlinearity. Otherwise the system would be too brittle to work. All those overlapping, compensating, inter-regulating feedback loops that you find in biochemistry are, I think, a largely successful attempt to run a robust organism out of what are fundamentally not very robust components. Evolution is a tinkerer, most definitely, and there sure is an awful lot of tinkering that's been needed.

Comments (8) + TrackBacks (0) | Category: General Scientific News | In Silico

July 1, 2008

The Gates Foundation: Dissatisfied With Results?

Email This Entry

Posted by Derek

Well, since last week around here we were talking about how (and how not to) fund research, I should mention that Bill Gates is currently having some of the same discussions. He’s doing it with real money, though, and plenty of it.

The Bill and Melinda Gates Foundation definitely has that – the question has been how best to spend it. They started out by handing out money to the top academic research organizations in the field, just to prime the pump. Then a few years ago, the focus turned to a set of “Grand Challenges”, fourteen of the biggest public health problems, and the foundation began distributing grant money to fight them. But according to this article, from a fellow who’s writing a book on the topic, Gates hasn’t necessarily been pleased with the results so far:

”. . .Gates expected breakthroughs as he handed out 43 such grants in 2005. He had practically engineered a new stage in the evolution of scientific progress, assembling the best minds in science, equipped with technology of unprecedented power, and working toward starkly-defined objectives on a schedule.

But the breakthroughs are stubbornly failing to appear. More recently, a worried Gates has hedged his bets, not only against his own Grand Challenge projects but against how science has been conducted in health research for much of the last century.”

My first impulse on hearing this news is not, unfortunately, an honorable one. To illustrate: I remember a research program I worked on at the Wonder Drug Factory, one that started with a series of odd little five-membered-ring molecules. Everyone who looked them over had lots of ideas about what should be done with them, and lots of ideas about how to make them. The problem was, the latter set of ideas almost invariably failed to work.

This was a terribly frustrating situation for the chemists on the project, because we kept presenting our progress to various roomfuls of people, and the same questions kept coming up, albeit in increasingly irritated tones. “Why don’t you just. . .” We tried that. “Well, it seems like you could just. . .” It seemed like that to us, too, six months ago. “Haven’t you been able to. . .” No, that doesn’t work, either. I know it looks like it should. But it doesn’t. Progress was slow, and new people kept joining the effort to try to get things moving. They’d come in, rolling up their sleeves and heading for the fume hood, muttering “Geez, do I have to do everything myself?”, and a few weeks later you’d find them frowning at ugly NMR spectra next to flasks of brown gunk, shaking their heads and talking to themselves.

I’d gone through the same stage myself, earlier, so my feelings about the troubles of the later entrants to our wonderful project devolved to schadenfreude which, as mentioned, is not the most honorable of emotions. I have to resist the same tendency when reading about the Gates Foundation – sitting back and saying “Hah! Told you this stuff was hard! Didn’t believe it, did you?” isn’t much help to anyone, satisfying though it might be on one level. I’m cutting Bill Gates more slack than I did Andy Grove of Intel, though, since Gates seems to have taken a longer look at the medical research field before deciding that there’s something wrong with it. I note, though, that we now have well-financed representatives of both the hardware and software industries wondering why their well-honed techniques don’t seem to produce breakthroughs when applied to health care.

Now the Gates people are trying a new tactic. The “Explorations” program, announced a few months ago, is deliberately trying to fund people outside the main track of research in its main areas of focus (infectious disease) in an effort to bring in some new thinking. I’ll let Tadataka Yamada of the Gates Foundation sum it up, from the NEJM earlier this year:

”New ideas should not have to battle so hard for oxygen. Unfortunately, they must often do so. Even if we recognize the need to embrace new thinking — because one never knows when a totally radical idea can help us tackle a problem from a completely different angle — it takes humility to let go of old concepts and familiar methods. We have seemed to lack such humility in the field of global health, where the projects related to diseases, such as HIV, malaria, and tuberculosis, that get the most funding tend to reflect consensus views, avoid controversy, and have a high probability of success, if "success" is defined as the production of a meaningful but limited increase in knowledge. As a result, we gamble that a relatively small number of ideas will solve the world's greatest global health challenges. That's not a bet we can afford to continue making for much longer.”

What’s interesting about this is that the old-fashioned funding that Yamada is talking about is well exemplified by the previous Gates Foundation grants. After last week’s discussion here about “deliverables” in grant awards, it’s interesting to look back at the reaction to the 2003-2005 round of “Grand Challenges” funding:

”Researchers applying for grants had to spell out specific milestones, and they will not receive full funding unless they meet them. "We had lots of pushback from the scientific community, saying you can't have milestones," says Klausner. "We kept saying try it, try it, try it." Applicants also had to develop a "global access plan" that explained how poor countries could afford whatever they developed.

Nobel laureate David Baltimore, who won a $13.9 million award to engineer adult stem cells that produce HIV antibodies not found naturally, was one of the scientists who pushed back. "At first, I thought it was overly bureaucratic and unnecessary," said Baltimore, president of the California Institute of Technology in Pasadena. "But as a discipline, to make sure we knew what we were talking about, it turned out to be interesting. In no other grant do you so precisely lay out what you expect to happen."

I have to think, then, that in no other grant are the chances of any breakthrough result so slim. It would be interesting to know what the Gates people think, behind closed doors, of the return they’ve gotten on the first round of grant money, but perhaps the advent of the Explorations program is already comment enough. (One round of Explorations funding has already taken place, but a second round is coming up this fall. You can start your application process here).

The next question is, naturally, how well the Explorations program might work – but that’s a big enough topic for a post of its own. . .

Comments (28) + TrackBacks (0) | Category: General Scientific News | Who Discovers and Why

May 29, 2008

Nullius in Verba

Email This Entry

Posted by Derek

Since I was talking the other day about the analytical habit of mind, this is a good time to link to an article by someone who has it like few other people alive: Freeman Dyson, who is thankfully still with us and still thinking hard. At the moment, he seems to be thinking about something that involves chemistry, physics, economics, and plenty of politics.

He has an article in the latest New York Review of Books that is one of the most sensible things I have ever seen on the issue of global warming. I strongly urge people to read it, because it’s a perspective that you don’t often see. (It ends, in fact, with a small note of despair at how seldom that particular viewpoint comes up). I found it particularly interesting, as you might guess, because I agreed with it a great deal.

Dyson stipulates at the beginning that carbon dioxide levels are, in fact, rising, and that they have been for some time. And he also is willing to stipulate that this will lead, other factors being equal, to a rise in global temperatures. He doesn’t get into the details, although there are endless details to get into, but goes on to make some larger points.

One of them is economic. One of the books he’s reviewing, by economist William Nordhaus, is an attempt to work out the best course of action. Nordhaus is not denying a problem, to put it mildly: his estimate comes out to about 23 trillion dollars of harm in the next hundred years (in constant dollars, yet) if nothing is done at all. The question is, how much will the various proposed solutions cost in comparison?

His numbers come out this way: the best current policy he can come up with, a carefully tuned carbon tax that increases year by year, comes out to only 20 trillion dollars of damage, as opposed to 23 – that is, a net saving of three trillion constant dollars. The Kyoto Protocol, turned down by the US Senate during the Clinton years, comes out to 22 trillion dollars of harm (one trillion to the good) if the US were to participate, and completely even (no good whatsoever) without the US. The Stern Review plan, endorsed by the British government, comes out to 37 trillion dollars of total harm, and Al Gore’s proposed policies come out to 44 trillion dollars: that is, twenty-one trillion dollars worse than doing nothing at all.

As Dyson correctly points out, these latter two proposals appear to be “disastrously expensive”. And the problem with such courses of action is that this money could be used for something better: Nordhaus also calculates the effect of finding some reasonably low-cost method to cut back on carbon dioxide emissions, such as a more efficient means of generating solar or geothermal power, the advent of genetically engineered plants with a high carbon-sequestering ability, etc. That general route comes out to roughly 6 trillion dollars of total harm, which is seventeen trillion better than doing nothing (and thirty-eight trillion better than the Full Albert). That’s by far the most attractive solution, if it can be realized. But doing an extra ten or twenty trillion dollars of damage to the global economy first would make realizing it rather unlikely.
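To keep the comparisons straight, the figures quoted from Nordhaus can be tabulated directly. This is just a restatement of the numbers in this post (trillions of constant dollars of total harm over the next century), with "net" defined as the difference from doing nothing:

```python
# Nordhaus's harm estimates as quoted above, in trillions of constant
# dollars over the next hundred years. Positive "net" = better than
# doing nothing; negative = worse.
damages = {
    "do nothing": 23,
    "optimal carbon tax": 20,
    "Kyoto (with US)": 22,
    "Kyoto (without US)": 23,
    "Stern Review plan": 37,
    "Gore proposals": 44,
    "low-cost backstop technology": 6,
}

baseline = damages["do nothing"]
for policy, harm in damages.items():
    net = baseline - harm
    print(f"{policy:30s} harm = {harm:2d}T   net vs. nothing = {net:+3d}T")
```

Laid out this way, the ranking is hard to argue with: the cheap-technology route is seventeen trillion to the good, while the most aggressive regulatory proposals come out well below the do-nothing baseline.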

And there are other effects. To quote Dyson:

” The practical consequence of the Stern policy would be to slow down the economic growth of China now in order to reduce damage from climate change a hundred years later. Several generations of Chinese citizens would be impoverished to make their descendants only slightly richer. According to Nordhaus, the slowing-down of growth would in the end be far more costly to China than the climatic damage.”

But there’s a factor that neither of the books he reviews mentions: that atmospheric carbon dioxide exchanges, on a relatively fast time scale, with the Earth’s vegetation. About eight per cent of it a year cycles back and forth, and that holds out hope for a biotech solution. Engineered organisms could fix this carbon into useful forms, or (failing that) just take it out of circulation completely. But we need to go full speed ahead on research to realize that.

The last part of his review addresses a larger question. Environmentalism, he states, is now more of a religious question than anything else. (Other people have realized that, and many who do bemoan the fact, but Dyson has no problem with it, saying that the ethics of environmentalism are “fundamentally sound”.) But here’s his problem:

”Unfortunately, some members of the environmental movement have also adopted as an article of faith the belief that global warming is the greatest threat to the ecology of our planet. That is one reason why the arguments about global warming have become bitter and passionate. Much of the public has come to believe that anyone who is skeptical about the dangers of global warming is an enemy of the environment. The skeptics now have the difficult task of convincing the public that the opposite is true. Many of the skeptics are passionate environmentalists. They are horrified to see the obsession with global warming distracting public attention from what they see as more serious and more immediate dangers to the planet. . .”

The distressing thing, as he mentions, is that many organizations (including, I'm sorry to say, the Royal Society among other groups of scientists), have decided that the issue is settled and that anyone dissenting from this view is to be slapped down. As for me, I’m not completely convinced by the current climate data, so I probably am to the right even of Dyson on this issue. Here he is, though, willing to stipulate that most of the basic assumptions are true, but finding no place for someone who can do that and still not see global warming as the Single Biggest Issue Of Our Time.

I know how he feels: I consider myself an advocate of the environment, but I think the best way to preserve it is to do more genetic engineering rather than less. Better crops will mean that we don’t have to plow up more land to feed everyone, and we won’t have to dump as many insecticides and herbicides on that land we’re using. That means that I also think the best way to preserve unspoiled spaces is to do less organic farming, and not more: organic farming, particularly the hard-core varieties, uses too much land to generate too little food, and it does so mainly to give people in wealthy countries a chance to feel good about themselves.

And I think the best way to preserve wild areas and biodiversity is to have more free trade and economic development, not to slow it down. Richer countries have lower birth rates, for one thing. (I actually think that the planet would be better off with fewer people on it, but I’m not willing to achieve that goal by killing off a few billion of us).

And finally, economic growth is what’s giving us the chance to find technologies to get us out of our problems. I know that there’s another way to look at it – that the technology we have got us into this problem, and that we should reverse course. But I don’t think that’s even possible, or desirable. I’d rather have engineered plants cleaning out the atmosphere, and I’d rather have electricity from fusion or orbiting solar arrays. I’d rather find cheaper ways to get some of our fouler industries off the planet entirely, and mine the asteroids and comets. I’d rather people get richer and smarter, with more time and resources to do what they enjoy. How we’re going to do any good by putting on hair shirts and confessing our sins escapes me.

Comments (62) + TrackBacks (0) | Category: Business and Markets | Current Events | General Scientific News

May 27, 2008

An Eye For the Numbers

Email This Entry

Posted by Derek

My wife and I were talking over dinner the other night – she’d seen some interview with the owner of a personal data protection service, and he made the pitch for his company by saying something about how out of (say) a million customers, only one hundred had ever reported any attempts on their credit information or the like. And my wife, who spent many years in the lab, sat waiting for what seemed to her to be the obvious follow-up question: how many people out of a million who didn’t subscribe to this guy’s service report such problems?

But (to her frustration) that question was never asked. We speculated about the reasons for that, partly out of interest and partly as a learning experience for our two children, who were at the table with us. We first explained to them that both of us, since we’d done a lot of scientific experiments, always wanted to see some control-group data before we made up our minds about anything – and in fact, in many cases it was impossible to make up one’s mind without it.

After a brief excursion to talk about the likely backgrounds and competencies of news readers on TV, we then went on to say that looking for a control set isn’t what you could call a universal habit of mind, although it's a useful one to have. You don’t have to have scientific training to think that way (although it sure helps), but anyone with a good eye for business and finance asks similar questions. And as we told the kids, both of us had also seen (on the flip side) particularly lousy scientists who kept charging ahead without good controls. Still, the overlap with a science and engineering background is pretty good.
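The missing comparison can be made concrete in a couple of lines. The subscriber figure is the one from the interview; the background numbers are invented placeholders, because that unasked-for control figure is exactly the data point the pitch leaves out:

```python
# The pitch: 100 incidents among 1,000,000 subscribers. Without a
# control group, that rate is uninterpretable. The background figures
# here are hypothetical -- they stand in for the number the interviewer
# never asked about.
subscribers = {"n": 1_000_000, "incidents": 100}
background = {"n": 1_000_000, "incidents": 100}  # hypothetical control group

rate_sub = subscribers["incidents"] / subscribers["n"]
rate_bg = background["incidents"] / background["n"]

print(f"subscriber rate: {rate_sub:.4%}")
print(f"background rate: {rate_bg:.4%}")
# If the two rates are equal, the headline number tells you nothing at all.
```

And if the background rate turned out to be *higher* than the subscriber rate, the pitch would suddenly mean something; the whole argument turns on a number that was never produced.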

What I’ve wondered since that night is how many people, watching that same show, had the same question. That would be a reasonable way to determine how many of them have the first qualification for analyzing the data that come their way. And I’m just not sure what the percentage would be, for several reasons. For one thing, I’ve been working in the lab for years now, so such thinking is second nature to me. And for another, I’ve been surrounded for an equal number of years by colleagues and friends who tend to have science backgrounds themselves, so it’s not like my data set is representative of the population at large.

So I’d be interested in what the readership thinks, not that the readership around here is any representative slice of the general population, either. But in your experience, how prevalent do you think that analytical frame of mind is? The attitude I’m talking about is the one that when confronted with some odd item in the news, says “Hmm, I wonder if that's true? Have I got enough information to decide?" It's an essential part of being a scientist, but if you're not. . .?

Comments (32) + TrackBacks (0) | Category: General Scientific News | Who Discovers and Why

May 7, 2008

Science By Country

Email This Entry

Posted by Derek

Update: here's the map that I was imagining, thanks to Andy in the comments section. It's on the Worldmapper site linked to below, but I missed it while putting the post together. Most of my speculations turned out to be reasonable, although Venezuela (for one) looks a bit better than I thought it would, and Iran looks a bit worse. Africa and the Islamic world are, as hypothesized, almost invisible.

I’d like to see a map of the world with country size dependent on the number of scientific publications and patents – perhaps you’d want to use publications per capita, or per educated capita. That's a cartogram, and although there are plenty of interesting ones on the web, I haven't found that one yet. The US would loom large, that’s for sure. Japan might be the most oversized compared to its geography, although Singapore would also be a lot easier to pick out. Western Europe would expand to fill up a lot of space, with Germany, England, and France (among others) taking up proportionally more room inside the region and (perhaps) Spain and Portugal taking up somewhat less. Switzerland would swell dramatically.

South America would be dominated, I think, by Brazil, even more than it is on the map. You’d be able to find Argentina and Chile, but I think some other countries (like Venezuela) would dwindle in comparison. Africa, as it does so often in maps of this kind, would appear to have been terribly shrunk in all directions, with a few countries – Egypt, South Africa – partially resisting the effects. Moving on to Asia, India would appear even larger than it is, unless you went for the per-capita measurement to cut it back down a bit, and China would be a lot more noticeable than it was ten (or especially twenty) years ago.

Another region that would basically disappear would be the Middle East and most of the rest of the Islamic world. Iran would hang in there, smaller but recognizable, and you’d be able to find Pakistan, too. But the Arab countries (with the minor exception of Egypt) would nearly vanish. The figures from the Organization of the Islamic Conference (the multinational group involved) show that from 1995-2005, the Islamic countries contributed 2.5% of all the peer-reviewed scientific papers. That’s all the more interesting when you consider the amount of potential funding that washes around that part of the world.

This disconnect has been noticed by the region’s scientists, as well it might. The OIC has designated a committee of science ministers to help with a multiyear plan for modernizing things, but no one’s sure if any real money will be forthcoming. According to this Nature article (headlined "Broken Promises"), the OIC countries allocate less than 0.5% of their GDP to research and development. Most of the money promised just to fund that science committee never showed up. Lip service is, of course, a feature of politics (and politicians) everywhere, but I don't think I'm out of line if I suggest that it's very close to an art form in that part of the world.

And that's a very short-sighted approach. Many of these countries are sitting on huge amounts of money at the moment, which should be invested against the day that their oil runs out (or against the day that the world decides that it's not as desperate for oil as it once was). That latter day will, presumably, be hastened along by the countries who spend more on research. . .

Comments (21) + TrackBacks (0) | Category: General Scientific News

April 17, 2008

Getting Smarter Already?

Email This Entry

Posted by Derek

There have been several articles in Nature recently about performance-enhancing drugs. But these aren’t steroids or blood-cell therapies: they’re performance enhancers for scientists and engineers. Chief among them are Ritalin (methylphenidate), Provigil (modafinil), and various beta-blockers, to enhance concentration and wakefulness. The whole topic came to the fore last December, in an article suggestively titled "Professor's Little Helper". Here are the results of their informal readership poll. It's not a huge trend, at least not yet. The fraction of their self-selected sample who had never taken any such compound was in the solid 70% range, and you'd expect people with some experience to be disproportionately represented in such a poll. But usage is out there, nonetheless.

The first question to ask in these situations is, do such drugs work? As you’d guess, there’s no controlled data set to work with. There is, under current regulations, absolutely no way that any company with such a compound would run a trial for cognition enhancement in otherwise healthy people. The FDA has made it clear over the years that they are in the business of regulating drugs that help sick people, not ones for people who have no disease at all. In fact, I don’t think that the current regulatory framework even accommodates the idea of making people “better than well”, and if someone proposed such a study, it’s a solid bet that the FDA would turn it down.

So, in the absence of anything rigorous, we have a flood of anecdotal data, which is what the Nature pieces are full of. Take that along with the many reports of students using these drugs, and you have something significant going on, which has been coming on for a while now. Back when I used to work on Alzheimer’s, we used to speculate about what would happen if we ever did come across something that usefully enhanced human memory. I was sure that a large off-label market would develop among college students. I have to admit, I never considered their professors.

But do they work? Well, I’m willing to stipulate that they do, but I’m not sure to what extent. One confounding variable, which will be very hard to address outside of a controlled trial, is the placebo effect. I have to think that there’s a strong one in this area: if you think you’ve taken something that helps your concentration and memory, those functions will measurably improve. How much this counts for is impossible to say – but again, I’m willing to stipulate that there are pharmacological effects above and beyond placebo. In other words, I believe that a controlled trial of healthy individuals would, in fact, show improvement in cognition while taking such compounds. How much, and in what particular tasks, and for how long, and across what subgroups of people, and across what particular dosing regimens, and in what proportion to objectionable side effects, I have no idea. But I think that there’s something there.

And there will be more. I feel sure that other compounds will be developed that affect normal cognition in ways that are (at least under some circumstances) beneficial. They will not, however, be approved for that purpose. That’s a long, long way off. They’ll be approved for Alzheimer’s, or sleep disorders, or some category of attention deficit disorder, which is how we have the compounds we have now.

This situation is similar to various possible anti-aging therapies. There, too, I think that compounds will come eventually that should be able to show benefits, according to what we understand about aging in other species. But they won’t be approved for that. They’ll be approved for diabetes, most likely, considering the strong links between insulin action and lifespan, or possibly for other slow-developing degenerative disorders. But if aging itself is a slowly developing degenerative disorder, what then?

I’ve been meaning to write something about this story for a while, but one of the problems has been that I’m still quite divided about what I think about it. (Normally my opinions come to me more quickly, for better or worse). Some background: people who’ve known me personally for a while generally know that I’m personally very much opposed to chemically altering the way that I think or feel. I never drank in high school, for example, which I can tell you made me stick out a bit in late-1970s Arkansas. Nor did I in college or afterwards; I still don’t drink now. And that personal prohibition goes even more for other recreational drugs, as you’d imagine.

My reason for that has long been that I enjoy my brain the way it is, and have seen no reason to mess up its function for fun. But the advent of cognition enhancing drugs is a scalpel to dissect that line of thought. What if the ingested chemicals add to some of the parts of my brain that I value the most? That “mess up its function” clause has been taken out and flipped upside down. And what if it’s for work, and not for recreation? Is that more allowable, because it’s somehow less frivolous? (All right then, what if I were to enjoy having a better memory, which I likely would?) That gets to a less creditable reason for my objection to alcohol and other such drugs – perhaps I’m not just objecting to them on practical grounds. Perhaps I’m objecting because I don’t want other people to have a good time, at least not like that.

Food for, well, thought. I’m still working this one out, I have to say. The issue of caffeine will come up as I do – I don’t drink tea or coffee, actually, having never wanted to end up in the position of needing either one to function. I don’t object to caffeinated soft drinks, though, although I don’t generally seek them out. And I have turned to them when I’ve needed to stay awake – so how high a horse can I get on, anyway? Caffeine is a good proving ground for positions on the newer compounds.

Comments are, as always, welcome. I suspect that this is one of those issues that everyone has an opinion on. . .

Comments (27) + TrackBacks (0) | Category: General Scientific News

February 12, 2008

DNA Forklifts, DNA Pliers

Email This Entry

Posted by Derek

Manipulating nanoscale objects is a very hot research area these days, but no one’s quite sure whether it should be called physics or chemistry. The single-atom stuff (like the famous 1989 spelling of I-B-M using an early scanning tunneling microscope tip) would probably be the former, while moving whole molecules around would probably be the latter.

Now we’re to the point where you might consider it biology, since several recent papers describe ingenious uses of DNA as nanoscale pliers and Velcro. A report in Science from a group in Munich demonstrates a nanoscale depot on a chip, formed by short DNA strands bound to its surface. Various molecules are tagged with complementary single strands of DNA. When you bring the two close enough, they hybridize, winding together spontaneously into a small double helix, which Velcros each molecule down to a defined position.

The second key to the work is that each of the molecules has a second, different DNA strand bonded to its other side. This one is complementary to a single strand attached to the tip of an atomic force microscope, so when that moves in close enough, those two hybridize as well. For the moment, the target is bound front and back.

But here's the trick: the two DNA helices are engineered so that the double helix on the bottom opens base-by-base, like a zipper, while the one on the AFM tip shears off all at once. That gives them different strengths, so when you pull up on the AFM tip, you can see the force profile of the "zipper" strand giving way as the attached molecule pulls free. Now it's dangling from the tip of the AFM, its newly freed DNA strand waving in the, uh, nano-breeze, I guess.

The molecule was then moved to another portion of the chip, where more DNA strands awaited. These, like the tip strands, were also in the stronger "shear" geometry, but these were even longer, with more residues to wrap up with that free DNA strand on the molecule of interest. Lowering the two into proximity caused them to hybridize, and now pulling up on the tip caused the tip strand to unwind instead, leaving the molecule stuck on the new location on the chip. The AFM tip could then be sent back to the depot to pick up another molecule, and so on. (The illustration, courtesy of Science for nonprofit use, will give you the idea). The fluorescent molecules they used could then be imaged on the chip, confirming that they'd been arranged as expected.
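
The whole transfer cycle comes down to an ordering of rupture forces: at every step, the weaker of the two attachments gives way. Here's a toy sketch of that logic in Python; the force values are hypothetical round numbers chosen only to respect the ordering described above (zipper weaker than shear, longer shear strongest), not measurements from the paper:

```python
# Toy model of the DNA pick-up / drop-off cycle.
# Force thresholds are illustrative guesses, in piconewtons.
ZIP_DEPOT = 20      # zipper-geometry anchor holding the cargo in the depot
SHEAR_TIP = 50      # shear-geometry duplex on the AFM tip
SHEAR_TARGET = 65   # longer shear-geometry duplex at the drop-off site

def weaker_bond_breaks(bond_a, bond_b):
    """When the tip pulls, the weaker of the two attachments gives way."""
    return "a" if bond_a < bond_b else "b"

# Pick-up: depot anchor (zipper) vs. tip strand (shear) -> depot loses,
# and the cargo comes away on the tip.
assert weaker_bond_breaks(ZIP_DEPOT, SHEAR_TIP) == "a"

# Drop-off: tip strand vs. longer target strand -> tip loses,
# and the cargo stays put at its new location.
assert weaker_bond_breaks(SHEAR_TIP, SHEAR_TARGET) == "a"
```

The design choice is the whole trick: because each new attachment is engineered to be stronger than the last, the same upward pull releases the cargo at the depot and deposits it at the destination, with no switching mechanism needed at all.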

The whole process took care, as you can imagine. The team kept the number of DNA strands on the tip quite low, in order to have a better idea of what was going on. Under their conditions, about one-third of the time they picked up just one unit from the “warehouse”, and another twenty per cent of the time they got two at once. In the dropoff step at the new location, they sometimes noticed that no extra force was needed to pull the tip up, which indicated that they hadn't made a connection. In those cases, a shift of the tip assembly a few nanometers one way or another generally brought things within range for a successful transfer. It's not like you can see what's going on - light itself doesn't come small enough to let you do that in the normal sense - so you just have to feel your way along.

This is an early proof of concept, so it's not like we're going to be assembling nanomachines next week through this technique. (The DNA tags, for one thing, are rather large compared to the molecules that they're attached to). But the idea is there, and the idea works. We're starting to move single molecules around to where we want them to go, and making them stay put once they've been delivered.

Comments (2) + TrackBacks (0) | Category: Chemical News | General Scientific News

February 5, 2008

Room At The Bottom, For Sure

Email This Entry

Posted by Derek


Commenting appears to still be hosed around here, which is a shame, because I have some ask-the-readership posts stacked up. Writing posts under these conditions feels like shouting into a void! I hope things will be fixed soon, but it's quite a tangle behind the scenes.

Time is short today, at any rate, so here's a link to an image that I found simultaneously exciting and unnerving. There's a large project going on to make the world's best electron microscope, through several simultaneous improvements in the electron beam's shape and brightness, refinement of the detectors, damping vibrations in the sample stage, and so on.

So here's the latest. Those are two gold crystalline domains meeting each other at the corner - and those ping-pong balls are the gold atoms. You can clearly see them arranging to meet each other's packing structure at the interface, and if you look to the edges you can see some depth data as well. Those resolutions (well below one angstrom) are real, by the way, and the damn instrument is only about half done.

The group reports that when they scan a sample multiple times, they can see individual gold atoms moving around between images. The next steps will include moving to lower-energy electrons for use in biological samples, and I can't even guess what we'll see then. More on the project here.

Comments (1) + TrackBacks (0) | Category: General Scientific News

November 15, 2007

And Speaking of Discovering Things. . .

Email This Entry

Posted by Derek

After extolling the joys of finding things out in the post directly below, I couldn't resist linking to this story for those who haven't seen it. Now, this guy is really out there on the edge, and I wish him well with his theory (available here on Arxiv for the mathematically inclined). What I especially like is that he's ready to make some testable predictions.

You know, when Feynman met Dirac, the first thing he mentioned to him was how wonderful it must have been to discover the equation that bears his name. If Garrett Lisi's theory can predict particles out of thin air the way Dirac's equation did the positron, he'll be remembered the same way. Good luck to him, and to those like him.

Comments (7) + TrackBacks (0) | Category: General Scientific News | Who Discovers and Why

November 8, 2007

Dumber in English?

Email This Entry

Posted by Derek

I just came across this article, provocatively titled "Dumber in English". What the author, Stefan Klein, really means is "Dumber In Your Second Language", and he's almost certainly right about that.

I know that when I was doing my post-doc in Germany, I was significantly less nimble in German. I didn't have much practice in the language, and that meant a lot of mental overhead while using it. I never became truly comfortable with it, although I did get better. The throbbing headaches stopped after a few weeks, for example, which was certainly a visible and welcome sign of improvement. After a while, I started to dream in the language (and once in a great while I still do, with progressively less impressive fluency). I knew that I was really learning the stuff when I dropped a piece of apfelkuchen into a mud puddle, and reflexively swore in German. (Not to fear, the cake was in a paper bag, and was recoverable with quick action).

Scientifically, I was working under a handicap, and I knew it. My secret weapon, though, was the way the chemical literature was (and is) largely written in English. But this is a particularly painful thing for Germans, since their language was once on top of the heap in chemistry, physics, and several other sciences as well. Reading Klein's description of a recent conference in his native country, you can feel it:

". . .All the speakers – six Germans, plus three from the United States and one from Great Britain – were outstanding. And they all spoke either English or, in the case of a German speaker, now and then something similar. Unusual word-choices and serpentine sentences can make a speech seem more brilliant than it actually is.

But who in the audience spoke English? No one. And even the four foreign guest speakers could easily have understood a lecture in German, because simultaneous translation was available over headsets that were readily on hand. As someone from the sponsoring foundation told me, of course it would be better if the local guests would simply speak German. This would increase the public resonance. But the professors had another idea. Their argument: People only take a conference seriously when English is the official language. . ."

He brings up the historical practice of scholarly Latin, and how this dissolved in the 16th and 17th centuries as thinkers began to write in the vernacular. (This, though, actually hindered the flow of information, as far as I can see - a lingua franca isn't such a bad thing). He also worries that science will come to be even more separated from the general run of the population in non-English speaking countries, but I'm not so sure. Most native English speakers don't have much of a connection with the subject, despite every linguistic advantage. There's also the problem of whether some languages will cease to develop their scientific vocabularies, preferring the English terms out of convenience. As far as I can tell, this is already happening - mind you, English borrows terms from the other languages as well, although not to the same extent.

Klein also brings out some examples of concepts that he feels come across better in their original German than in translation. But here I'm not so convinced. Einstein's complaint about "spukhafte Fernwirkung", to pick one, is generally rendered into English as "spooky action at a distance", which seems to me to get the concept across very well, as opposed to Klein's clunky "long distance ghostly effect". There are definitely things that don't translate well from German to English (and across any other language pair you can name), but this isn't one of them.

I don't see anything stopping the rise and dominance of English in the sciences (and to be sure, neither does Klein). I realize that I write from the perspective of a native English speaker, and having had to live in another tongue, I can sympathize with those who have to come to grips with the language. (Especially our ridiculous spelling, although I'll vote for that over German grammar any day of the week). To my mind, the advantages of being able to speak the same language, however roughly, outweigh the problems of a scientific tower of Babel.

Comments (32) + TrackBacks (0) | Category: General Scientific News

October 15, 2007

Enzyme Humility

Email This Entry

Posted by Derek

There was a fascinating comment added to the recent discussion here on ammonia synthesis. It pointed out that the amount of nitrogen made available by the man-made Haber process is outclassed by the amount fixed biologically. The legumes do their share, but a lot more is handled by free-living single-celled organisms. What's really startling is the estimate for the total amount of nitrogenase enzyme, by weight, that is responsible for the production of at least 100 million metric tons a year of reduced nitrogen: about twelve kilos.
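
A quick back-of-the-envelope calculation shows just how startling that estimate is. The figures below are simply the round numbers quoted above, so treat the result as order-of-magnitude only:

```python
# Back-of-envelope check of the nitrogenase figures quoted above.
# Inputs are the round numbers from the comment, not measured values.
SECONDS_PER_YEAR = 3.156e7

n_fixed_kg_per_year = 100e6 * 1000.0   # 100 million metric tons -> kg
enzyme_mass_kg = 12.0                  # estimated total nitrogenase, worldwide

per_kg_per_year = n_fixed_kg_per_year / enzyme_mass_kg
per_kg_per_second = per_kg_per_year / SECONDS_PER_YEAR

# Each kilogram of enzyme handles billions of times its own mass per year
print(f"{per_kg_per_year:.1e} kg N fixed per kg enzyme per year")
print(f"{per_kg_per_second:.0f} kg N fixed per kg enzyme per second")
```

That works out to every kilogram of enzyme pushing out a couple of hundred kilograms of reduced nitrogen every second, around the clock. No reactor we know how to build comes close.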

It's important for us, as chemists, to contemplate figures like that lest we forget how unimpressive our own techniques are in comparison. Not all enzymes are that impressive, but many of them are extremely impressive indeed. One of Clarke's laws gets quoted a lot, the one about any sufficiently advanced technology being indistinguishable from magic. But there's no magic involved - these are things that we could do, if we just knew enough about how to do them.

Enzymes use a variety of effects to work these wonders, but a lot of it comes down to holding the reacting species in one place and lining everything up perfectly. It isn't as important to hold on to the starting materials or products, as it is to interact with and stabilize the highest-energy species in the whole process, the fleeting transition state. Various chemical groups can be brought to bear that activate or deactivate specific bonds, and everything works, at its best, with near-perfect timing. If you want molecular-level nanotechnology, this is it, and there's absolutely no reason why it has to be done inside a peptide backbone. If we understood enough, all sorts of other polymers, with all sorts of new functionality built into them, could presumably do things that Nature has never needed to do, under conditions that we could select for.

But we're unfortunately a long way from that. There's still a tremendous amount of argument about how even model enzymes actually work, with some rather exotic mechanisms being proposed. And if we don't understand what's going on, we sure can't design our own imitations. Making enzymes from scratch brings together a whole list of Very Hard Problems, from protein folding to femtosecond reaction dynamics, and making enzymes out of something other than proteins will be even harder. We're going to need to be a lot smarter, as a species, to figure out how to do it.

But learning more about such stuff is one of the things we do best. At least for the last few centuries it has been, and if we keep it up, there seems no reason why we shouldn't be able to figure out this one, too. Then, at long last, human ingenuity will have pulled even with blue-green algae, the fungi that live in rotting logs, and various sorts of pond scum. The little guys have had a big head start, but we're gaining fast.

Comments (8) + TrackBacks (0) | Category: General Scientific News

October 12, 2007

Unnatural, And Proud Of It

Email This Entry

Posted by Derek

The Haber-Bosch ammonia synthesis doesn’t intrude itself into the public consciousness much, but this year’s Nobel gave it a bit of a push. One thing I’ve noticed, though, is that whenever the topic of artificial fertilization comes up, it always kicks up a small dust storm of comment around it.

These vary widely in their reasonableness. Pointing out that artificially fixed nitrogen moved agriculture from (ultimately) a solar-powered base to (largely) a fossil-fuel base is both accurate and a good starting point for further discussion. See the comments to the Nobel post for an example – a person can argue that the Haber process didn’t require fossil fuels per se, or that we use more of them cooking the food than we do growing it (which may be true), or that we use more of them moving the food around (which I think is almost certainly true, and which opens up another set of questions) and so on.

Other good topics for discussion are how close various parts of the world were to a Malthusian food crisis when the ammonia synthesis came along, the other industrial effects of relatively cheap ammonia, the tradeoff of intensive fertilized farming in smaller areas versus more traditional routes in larger ones, etc. But if you’d like an example of an unreasonable comment, I’ll let this one over at Megan McArdle’s Atlantic Monthly blog stand in for a lot of similar fuzzy-mindedness:

"Higher yields due to the petroleum rich Haber-Bosch method also mean faster soil erosion and increased need of rotation etc. Combined with applying this method for inefficient livestock agriculture - it has destroyed NOT saved the rainforest and other ecosystems. Chemical fertilizer in ecology are like statism for the economy. You can force short-term results but nothing more!

At least 800 million people still go hungry.. their way forward into a sustainable future is less livestock agriculture and (more) organic natural farming.

Haber-Bosch is on the same environmental level as coal, oil! Not good, not sustainable, ideologically toxic for survival. We have to get rid of it pronto if we want our children to have "a nice life".

. . .All the social sciences, all the non-biological sciences like chemistry and physics should drop immediately what they are doing and learn more about their mother (and forget as much as possible about their "father" - you know who I mean?)!"

It’s hard to know where to start with this sort of thing. But I think I’ll do what Richard Dawkins did for Prince Charles a few years ago. Dawkins’s “You’re an idiot” style of debate isn’t always productive (for example, I think he does more harm than good to his cause as an atheist), but in this case I think the board across the nose was a good idea. He pointed out that if we’re going to use “naturalness” as a criterion, then agriculture isn’t going to make the cut, either. And that doesn’t mean factory farming and Roundup-Ready seeds; that means agriculture of any kind beyond remembering where the good patch of wild blueberries is and getting there before the bears do:

I think you may have an exaggerated idea of the naturalness of "traditional" or "organic" agriculture. Agriculture has always been unnatural. Our species began to depart from our natural hunter-gatherer lifestyle as recently as 10,000 years ago - too short to measure on the evolutionary timescale.

Wheat, be it ever so wholemeal and stoneground, is not a natural food for Homo sapiens. Nor is milk, except for children. Almost every morsel of our food is genetically modified - admittedly by artificial selection not artificial mutation, but the end result is the same. A wheat grain is a genetically modified grass seed, just as a pekinese is a genetically modified wolf. Playing God? We've been playing God for centuries!

The large, anonymous crowds in which we now teem began with the agricultural revolution, and without agriculture we could survive in only a tiny fraction of our current numbers. Our high population is an agricultural (and technological and medical) artifact. It is far more unnatural than the population-limiting methods condemned as unnatural by the Pope. Like it or not, we are stuck with agriculture, and agriculture - all agriculture - is unnatural. We sold that pass 10,000 years ago.

Dawkins is correct. We live in an unnatural world, and that goes for a lot of prehistory, too. Our world has been unnatural ever since we started applying our intelligence to it. When humans first started building shelters to get out of the cold and rain, I suppose you could say that this is no more than what an animal does when it digs a den. Killing a mammoth partly in order to use its bones for a house is a step beyond that, but in the same league as what beavers do to birch trees. But clearing land, planting seeds in it, tending and harvesting a crop, and saving some of its seeds to plant again is another order of living. Just because it all happened a long time ago (and because no one yet knew how to write it down) doesn’t make it any more in tune with ancient natural harmonies or whatever. (Try this PDF on for size).

We've been trying to fertilize the soil for thousands of years with whatever was on hand - manure, dead fish, the ashes of the plants that were burnt to make the field. And we've been modifying the genetic profile of our food crops over that same time with awe-inspiring persistence and dedication. (Good thing, too). No, when we move from that to artificial fertilizers and genetically engineered seeds, we’re talking about differences in degree rather than differences in kind. Large differences in degree, true, and worth discussing they are, but not on the basis of either their antiquity or their "naturalness".

Comments (21) + TrackBacks (0) | Category: Current Events | General Scientific News

August 14, 2007

Winning, By Tying Losers Together

Email This Entry

Posted by Derek

A co-worker put me on to an interesting paper earlier this year by Harvard's George Whitesides (with a co-author credit going to a well-known chem-blogger). Whitesides, a perennial favorite in Nobel betting, does a lot of absolutely first-tier physical organic chemistry, an area that I love to read about (and one that I'd probably be an awful practitioner of).

Almost all drugs bind to sites on proteins. Some proteins have only one site (that we know of) that a small molecule will fit into, while others have several. There have been a lot of attempts over the years to go after the latter group by hitting more than one site at the same time - but with only one drug. Imagine two different drug molecules, each fitting into a different site on a single (multi-sited) protein. Now imagine combining them into one compound, by attaching some sort of linking chain between them, and you've got one (larger) molecule that can reach around and fill two binding sites.

This has worked in some cases, at least on a research level (I'm not aware of any drugs that have yet made it to market by taking advantage of this effect, though). (Update: there is a marketed protein, bivalirudin, that binds to two sites on thrombin, but I'm still not aware of any small molecule drugs in this category). You can pick up huge amounts of affinity by this trick, though, to the point that neither of the original "business ends" of the molecule need to be particularly good binders on their own. And since we in the industry are distressingly good at producing molecules that don't bind to things very well, the idea of combining some of these into multivalent wonders is appealing.
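
The arithmetic behind that affinity gain is worth seeing. In the simplest thermodynamic picture, the dissociation constant of the linked compound is roughly the product of the two individual Kd values divided by an "effective molarity" contributed by the tether. The numbers below are hypothetical, chosen only for illustration, and the effective molarity in particular varies enormously with linker design:

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.0      # room temperature, K

def kd_to_dg(kd_molar):
    """Dissociation constant (M) -> binding free energy (kcal/mol)."""
    return R * T * math.log(kd_molar)

# Two mediocre fragments, each binding with only millimolar affinity,
# joined by a linker with an assumed effective molarity of 10 M.
kd_a = 1e-3
kd_b = 1e-3
c_eff = 10.0

# ~1e-7 M: two weak millimolar binders become a 100 nM compound
kd_linked = kd_a * kd_b / c_eff

print(f"Fragment A alone:  {kd_to_dg(kd_a):.1f} kcal/mol")
print(f"Linked compound:   Kd ~ {kd_linked:.0e} M "
      f"({kd_to_dg(kd_linked):.1f} kcal/mol)")
```

The free energies of the two binding events (plus the linker term) add, which means the dissociation constants multiply - and that multiplication is where the "huge amounts of affinity" come from.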

But there are a lot of unknowns. Figuring out how to modify the original structures in order to tie them together is, as they say, non-trivial. (If you hang around scientists and engineers much, you know to head for cover when you hear that expression). And what kind of chain should you use, anyway? How long does it have to be, and what happens if it's too long or too short? And what's the linking chain doing, anyway - sticking to the surface of the protein, waving around by itself, or what?

Whitesides and his people have used carbonic anhydrase as a model system, which is an enzyme whose structure and behavior is as well known as these things get. They find, not unreasonably, that when the linking chain is too short the activity of your wonder molecule just gets killed: you're stuck with one end bound to the protein, and a big tail flopping around uselessly, unable to reach the next binding site. The "just-right" chain length is the best, naturally. But (interestingly) you don't pay much of a penalty for being longer than necessary, even several times longer. Apparently the chain will coil around and find something to do with itself as long as the two ends are bound.

And while it's doing this, it doesn't appear to be contacting the protein in any meaningful way. This took a lot of careful experimental thermodynamics to check, but there's no extra binding energy involved with any of the common chains. So if you're going to try this trick, Whitesides's advice is not to worry about what chain to use. Stick with a plain-vanilla linker, as flexible as possible, make it a bit longer (at least at first) than you think you'll need, and you've improved your chances right there. And he has the numbers to back this up, which is what physical organic chemistry is all about: opinions made solid by data. It's good stuff.

Comments (29) + TrackBacks (0) | Category: Drug Development | General Scientific News

July 19, 2007

Hype In Spaaaace!

Email This Entry

Posted by Derek

This week's award for the most straight-faced research whopper goes to. . .the government of Brazil, of all the possible candidates. In their attempts to bounce back from a disastrous explosion at their launch site a few years ago, the Brazilians have successfully fired a sounding rocket with an experimental payload.

I'm not quite sure what exactly was in these experiments - from press reports, it looks like some enzyme kinetics and some DNA repair studies. Both of these were to be looked at under microgravity (aka free fall), which I have to say does not sound like a very fruitful area of research to me - of all the forces that affect enzyme behavior, gravity seems like one of the least likely to show any effects.

And there's the problem that (since this was far from an orbital flight) the payload experienced only about seven minutes of free fall. With a faster enzymatic reaction, you might be able to run something similar on a "Vomit Comet" airplane flight, frankly. And as for the DNA repair work, that was to be after exposure to ambient radiation, which no doubt can be simulated quite well on the ground. But that wouldn't be so good for publicity and national pride, would it?

So, what will these experiments lead to, you ask? I'll let the experimental coordinator field that one, although you may well have guessed the answer already: "Eventually, the results could help us develop new processes and pharmaceutical products to treat cancer." Well, sure - with a sufficiently open-minded definition of the word "eventually". And the word "treat". And probably the word "new", and while we're at it, the word "results" as well.

Comments (17) + TrackBacks (0) | Category: General Scientific News

June 7, 2007

The Chamber of DNA Secrets

Email This Entry

Posted by Derek

There are plenty of headlines today about the large Wellcome Trust-funded genomic study of common diseases. Unfortunately, most of those headlines are misleading. The ones that say "Genes Identified For Common Diseases" are the most common wrong ones, but any that include secrets, keys, new dawns, locks being opened, or mysteries being solved are also full of it. (You'll need to go to people who know what they're talking about for less sensational coverage - try the RSC, for one).

Not that this isn't a fine study, and a very interesting piece of work - far from it. This is just the kind of rigor and effort (14,000 patients, 3,000 controls) that's needed to trace out these sorts of connections. Contrary to popular belief, most genomic effects on disease are subtle and shifty, and tangled up thoroughly with both environment and with dozens (hundreds? thousands?) of other genetic markers. These folks are doing the right thing in the right way.

But the press, at least some of it, isn't. The genes identified in this study are not enough to tell you if you're going to get a particular disease or not, not by themselves. And they're not going to lead to therapies any time soon, either, because in many cases we have no idea how or why they're connected to the diseases in question. Nor do we have drug candidates that target the proteins that the genes code for, and it wouldn't surprise me a bit if most of them turn out to be un-druggable from the start with our current technology. I speak from sad experience on that issue, like many other folks in the drug industry.

That's not to say that we won't figure out how these things are involved in disease, or how to attack them therapeutically. But we didn't just open a locked chest full of the secret keys to health here - we found fragments of a map that'll tell us where to look for the clues to the pieces of an even bigger puzzle. Such is the state of things, though, that this really is an advance, and it wouldn't hurt the public to know it.

Comments (6) + TrackBacks (0) | Category: General Scientific News

April 19, 2007

The Big Time

Email This Entry

Posted by Derek

Well, we're exactly on the opposite side of the year for Nobel season, but Paul over at Chembark has the latest odds on the next Chemistry prize. There are a couple of ringers in the list, but it's an excellent reference for big achievements by living chemists. It's also a useful thing for people who are immersed in synthetic organic chemistry to look over, because we sometimes have an exaggerated view of our place in the chemical world. I'll post more on this sort of thing in a few months, but clip and save Paul's post until then. . .

Comments (0) + TrackBacks (0) | Category: General Scientific News

March 23, 2007

Naked Synthesis

Email This Entry

Posted by Derek

There's an unusual article in Nature that several folks have e-mailed me about. It's unusual for several reasons. For one thing, it's synthetic organic chemistry, and there's not much of that in Nature at all - it's an interesting choice of journal on the part of the authors, Phil Baran of Scripps and two of his students, Thomas Maimone and Jeremy Richter. The title also gives away the other odd feature (as a title should): "Total Synthesis of Marine Natural Products Without Using Protecting Groups".

I was talking about protecting groups here just a couple of months ago. In synthesizing complex molecules, they're often necessary, because there will often be several similarly reactive groups exposed at the same time, and you need to be able to distinguish them. Or you'll need to do something severe to another end of the molecule-in-progress, which an amine or alcohol somewhere either won't let you do or won't survive if you try.

The trouble, as any synthetic chemist can tell you, is that protecting groups introduce their own complexities. Ideally, you want to be able to put them on and remove them with no loss of material, but that's impossible. Ideally, you'd want each one to be removable under conditions that won't disturb any of the others, or anything else in your molecule, but that can be a tall order too as they start to add up. And ideally, you'd want all of them to be able to stand up to anything else you'd like to do, until it's time for them to leave, but that's not available in the real world, either. Sometimes a big part of the work (mental and physical) that goes into a total synthesis is figuring out how to manage all the protecting groups.

Baran makes the case that this has gone too far. He's made several complex molecules without protecting anything at all. There's a price to be paid, of course - some of the steps along the way have not-so-impressive yields because of the bareback conditions. But the counterargument is that the overall yield of the synthesis is often higher in spite of this, because there are so many fewer steps, and the cost and complexity are cut similarly.

Of course, you can't do this by just plowing ahead with the same reactions that a protecting-group-laden synthesis would use. They're on there for a reason, and that method would send you right into the ditch. Baran tries instead to mimic the biochemical synthesis of these molecules as much as possible, since after all, cells don't use protecting group chemistry, either.

This is an idea with a long and honorable history in organic chemistry, starting with Sir Robert Robinson's startling one-pot synthesis of tropinone back in 1917. That one is usually taken as the father of all biomimetic syntheses, although it's been pointed out (by no less an authority than Arthur Birch) that this is partly a legend. But it's a legend that has done the work of reality, leading to a whole series of biologically-inspired syntheses. This latest paper is a call to make biomimetic synthesis the centerpiece of the field again.

I'm sympathetic to that view, but it's not going to be easy. Read closely, the paper shows that this kind of work can be very difficult indeed, even when the biogenic pathways to your target molecules have been studied (which isn't always the case). There are a lot of steps here that required careful coaxing to work in reasonable yields, or at all - no one should confuse the lack of protecting groups with a savings in time. And these difficulties also undermine the claim of reduced cost and complexity a bit, since they represent plenty of time and effort - and if they aren't synonymous with cost and complexity, I don't know what is. Academia may obscure this a bit, since we're only talking graduate student labor here, but it's a real issue.

Where I see this making an impact industrially is in process chemistry. Many times companies work out several parallel routes to an important drug substance, looking for the lowest overall cost. That's where attention to no-protecting-group methods could pay off. Process groups already try to avoid these steps anyway, for the same reasons.

But for the most part, drug substances aren't so complex that they need lots of protecting group manipulation. We could always try to get into more complicated structures through these routes, but this leads to a chicken-and-egg problem. The medicinal chemists generally don't have the time to investigate the picky conditions needed to make no-protection chemistry work, so they're not going to have access to the shorter, higher-yielding syntheses needed to do analoging work. (And there's the real problem that these analogs might need complete re-optimization of the trickier steps each time, which would be a real nightmare). The process chemists would have the time and mandate to work out the no-protection stuff, on the other hand, but if med-chem can't deliver a good drug candidate, then they have nothing to optimize.

The Nature link above is subscriber-only, but you can read the supporting information with all its synthetic details here if you like. It's a pretty big PDF file, though, so be warned. I'd be interested to hear what readers, both academic and industrial, think about this one.

Comments (28) + TrackBacks (0) | Category: Academia (vs. Industry) | Drug Development | General Scientific News

February 22, 2007

Inspirational Reading?

Email This Entry

Posted by Derek

An undergraduate reader sends along this request:

I was wondering if you had some recommended readings for a second year student, eg books that you have read and made a palpable impression on you when you were my age.

That's a good question, despite the beard-lengthening qualification of "when you were my age". The books that I would recommend aren't the sort that would require course material that a sophomore hasn't had yet, but rather take a wider view. I would recommend Francis Crick's What Mad Pursuit, for one. It's both a memoir of getting into research, and a set of recommendations on how to do it. Crick came from a not-very-promising background, and it's interesting to see how he ended up where he did.

Another author I'd recommend is Freeman Dyson. His essay collections such as Disturbing the Universe and Infinite in All Directions are well-stocked with good writing and good reading on the subject of science and how it's conducted. Dyson is a rare combination: a sensible, grounded visionary.

Another author to seek out is the late Peter Medawar, whose Advice to a Young Scientist is just the sort of thing. Pluto's Republic is also very good. He was a fine writer, whose style occasionally comes close to being too elegant for its own good, but it's nice to read a scientific Nobel prize winner who suffers from such problems.

I've often mentioned Robert Root-Bernstein's Discovering, an odd book about where scientific creativity comes from and whether it can be learned. I think the decision to write the book as a series of conversations between several unconvincing fictional characters comes close to making it unreadable in the normal sense, but the last chapter, summarizing various laws and recommendations for breakthrough discovery, is a wonderful resource.

Those are some of the ones that cover broad scientific topics. There are others that are more narrowly focused, which should be the topic of another post. And I'd also like to do a follow-up on books with no real scientific connection, but which are good additions to one's mental furniture. I have several in mind, but in all of these categories I'd like to throw the question open to the readership as well. I'll try to collect things into some reference posts when the dust eventually clears.

Comments (26) + TrackBacks (0) | Category: Book Recommendations | General Scientific News | Who Discovers and Why

October 9, 2006

Forty NMR Magnets and 3000 Proteins Later. . .

Email This Entry

Posted by Derek

A recent issue of Nature (443, 382, 28 September 2006, subscriber link) carried an intriguing article about Japan's five-year "Protein 3000" project, which is now winding down. Carried out under the auspices of RIKEN, the project was designed to use a large-scale NMR facility to solve the structures of at least 3000 proteins, and along the way advance the understanding of protein folding in solution.

Whether or not it succeeded depends on who you ask, because the answer isn't obvious. The project does seem to be on track to make its numerical goals, but according to the article, many protein-structure people think that a large number of the structures that have been solved are, well, junk - easy, closely-related ones that were put on the list to run up the numbers. While the organizers dispute that, as they certainly would, another problem is that understanding protein folding has turned out to be (you know what's coming) harder than expected. The project was supposed to cover a large swath of a hypothetical 10,000 different folds, but now the real number is thought to be two or three times that. So the best case was that Protein 3000 would have worked out about a third of all possible protein folds, but now they're looking at perhaps 5 to 10% of them.

The Japanese government has a real weakness for big programs like this. I think that Protein 3000 has been one of their biggest forays into that area, but in the past they've announced all sorts of gaudy projects in computation and the like, most of which haven't worked out quite as planned. The "Fifth Generation" project is perhaps the most abject failure of the lot, but at least that one seems to have produced a number of researchers who could do something else. But the Protein 3000 business has some folks worried:

Several researchers have also expressed concern that the factory approach at the NMR facility has deprived young researchers there of the skills necessary to solve more complicated and important scientific riddles. It might have "destroyed the next generation", says one.

(Kurt) Wüthrich, who helped plan the NMR centre in 1998 and was a science adviser in 2000-04, agrees that the facility is a wasted opportunity. "A centre of that size should contribute to methodology, but there has been nothing," he says. "It became a one-man show with 40 NMR machines - there is no knowledge."

Not a good review, considering it comes from a man who knows a bit about the use of NMR to attack protein structure. What I find instructive about such things is that these projects are often just the sort that large government-level granting agencies take it into their heads to fund. Sometimes they work out, but the majority of the time they don't.

Comments (7) + TrackBacks (0) | Category: General Scientific News

October 4, 2006

Cheer Up, You Chemists

Email This Entry

Posted by Derek

The Kornberg Nobel seems to have set off some "whither chemistry" noises over here (see the comments to this post). I wanted to highlight an especially provocative one:

Derek, I hope I don't offend my chemist colleagues (I'm myself a former chemist), but as a chemist you have to realize that Chemistry is a science of a lesser public impact. Done at the edges of important matters, it's physics, done at the edges of interesting issues, it becomes biology. You ask for the final explanation of matter and energy and you are a physicist, you are interested in the beauty and complexity of life, you are a biologist. Sorry, chemistry is a practical science, but today its mostly a set of tools.

Hey, I did say it was provocative. As you'd guess, I don't agree, but that doesn't mean that I don't understand this point of view. There seem to be a fair number of chemists with similar sinking feelings, to judge from the letters that show up from time to time in Chemical and Engineering News. The problem is, the same argument by exclusion slice-and-dice can be applied to any other scientific discipline, so long as you define its edges by labeling the things around it as "important" and "interesting".

I could turn things the other way by wondering if, then, some of the important parts of physics are the parts that overlap with chemistry, and some of the most interesting parts of biology are the ones that do likewise. But I don't want to get into a shouting contest about whose work is most useful or exciting, because I don't think it gets us anywhere to talk in those terms.

For me, chemistry is the science that deals with the behavior of systems on a molecular level. As you go down to the atomic level, you get into physics (and by the time you're in the subatomic range, you're in physics up to your eyebrows). As you go up from single molecules to larger and larger molecular systems, you start to shade into biology, because the largest and most complicated of those we know about are living organisms.

So rather than bemoan these other disciplines poaching on chemistry's territory, or decide that all the good stuff belongs to them and that chemistry is left with nothing, I'd prefer to think that the field is in an excellent position. We sit right at the point where both those other fields start to get really tough. Look at physics - you can do quantum mechanics on single isolated particles, but once you start bringing in more of them, things get very sticky very quickly. That's why there are all those molecular modeling programs, each full of its own assumptions and approximations, because that's the only way you can approach the calculations at all. Moving up from single particles to atoms and on to molecules is a huge leap in complexity.

And as for biology, the complexity has become more apparent by movement in the other direction. If you thought classical zoology or botany were pretty tangled up, take a look at them on the molecular level! Biology has made tremendous advances through the treatment of its smallest mechanical parts as real molecules behaving chemically. Look at the med-chem concept of a receptor - it was a revelation when people finally realized that this wasn't just a convenient bit of mental shorthand, but a concept that reflected an actual physical entity. And of course, the question of when a collection of molecular machines can be considered a living organism has set off arguments for decades.

No, being in the middle of the range has its advantages. These folks are in our territory because there's so much here to attract them. As chemists, we have to realize this and make the most of it, not sit around moaning about how other people are hogging the spotlight.

Comments (16) + TrackBacks (0) | Category: General Scientific News

Another Chemistry Prize for Biology

Email This Entry

Posted by Derek

As everyone will have heard, Roger Kornberg has been awarded the chemistry Nobel for his work on RNA polymerase. This is certainly deserved, since his lab has been working on this important area for years, gradually zooming in on the enzyme's structure and function through biological and X-ray methods.

But he wasn't on anyone's short list to win the Chemistry prize, and I doubt if Kornberg considers himself a chemist. For some time now, the Nobel people have been using the prize as an overflow from the Medicine/Physiology area, which this morning led Paul Bracher over at the Endless Frontier blog to call for chemistry to colonize the Physics prize. Kornberg wasn't on his long list of candidates with odds, because most everyone on his list was, well, a chemist.

But it is nice to have another enzyme-studying Kornberg from Stanford with a Nobel. Arthur Kornberg is still alive, and still publishing papers as of a few years ago. I hope he's in good enough health to enjoy his son's achievement.

Comments (22) + TrackBacks (0) | Category: General Scientific News

October 2, 2006

RNA Interference: Film at Eleven

Email This Entry

Posted by Derek

Every time a Nobel Prize is announced, reporters try to put in some sort of "news you can use" context. That's usually pretty easy to do with the Medicine/Physiology prize, and usually impossible with Physics. Chemistry falls into a middle ground - as opposed to some of the pure-knowledge physics awards, the chemistry discoveries are being used to do something in the physical world, but explaining what that is can be tough.

How did the popular press handle today's award? I invite readers to share any particularly clueless news stories, but most of the reports I've heard have stressed the potential therapeutic value of RNA interference. There's often been a list of diseases that might be treated, with no particular timeline given, which is a good thing. NPR at least had some disclaimers in there, mentioning near the end that researchers still needed to find a way to dose the compounds, get them to the tissues of interest, make sure that they weren't toxic, and prove that they do affect the diseases they're targeted for.

Minor details, all of 'em. Right? That's just about 85% of drug development right there, actually, and the fact that these can be lumped together at the end of a news segment might be why (among other things) the "government research discovers all the drugs" idea has such staying power. I think that people see all those hard steps without realizing that they're hard. All that stuff about dosing, toxicity, selectivity, it's all what you do in the last few months before you hit the pharmacy shelves, I guess, along with picking a color for the package.

RNA interference is probably going to have a long climb before it starts curing many diseases, because many of those problems are even tougher than usual in its case. That doesn't take away from the discovery, though, any more than the complications of off-target effects take away from it when you talk about RNAi's research uses in cell culture. The fact that RNA interference is trickier than it first looked, in vivo or in vitro, is only to be expected. What breakthrough isn't?

Comments (14) + TrackBacks (0) | Category: General Scientific News

Nobel Update: RNAi Wins

Email This Entry

Posted by Derek

I'd been predicting for years that RNA interference would be worth a Nobel, and this year the committee did what many expected them to do. But not many people expected them to do it this early - not even Craig Mello himself. And he's being modest in that quote about having an "inkling" that it "might be possible", but that's understandable. Congratulations to him and to Andrew Fire!

I notice that the committee didn't go back as far as the first observations of these effects in plants (or in nematodes). The explanation for all these results started with Fire and Mello, and that's where the committee started as well.

Update: Paul Bracher sets the odds for Wednesday's prize in chemistry. I might run some of the numbers a bit differently, but not terribly so, and it's a pretty comprehensive list of possibles.

Comments (3) + TrackBacks (0) | Category: General Scientific News

September 27, 2006

Nobel Fever Is Upon Us

Email This Entry

Posted by Derek

As we move into October, we enter what scientists know as Nobel Season. The Chemistry prize will be announced next Wednesday, so it's time for the annual sport of figuring out who will get it. For those keeping score at home, here's the list of previous winners.

Last year I looked around for a betting market, but the action was pretty thin. I haven't detected any great amounts of money hitting the table this year, either, but Thomson ISI does have their poll going again. But it's next to useless, as far as I can tell, because I don't see any of their listed choices as front-runners for the award this year. (Nature's Sceptical Chymist blog agrees).

For example, I have nothing against Dave Evans and Steve Ley, who are both top-rank synthetic organic chemists. But if they're on the list in that category, there are several others who should be, too - not that I think it's necessarily going to be a synthetic organic year. And it wouldn't surprise me to see Stuart Schreiber eventually win a Nobel, but I think the committee can safely wait on that one, too, since his resume is still lengthening nicely. And as for Tobin Marks, it would surprise me to see another organometallic-themed award right after last year's. No, if Thomson's site had a secure connection to put my credit card down on a bet, I'd take the field rather than their choices and feel very happy about it.

Over at the Endless Frontier, Paul Bracher has his money down on either the green fluorescent protein folks (Tsien et al) or perennial pick George Whitesides. Keep in mind, though, that he works for Whitesides, so he may not have the most objective opinion there. I wouldn't object to a win for him, but I'd like to see how the committee would phrase it - the traditional line on a Whitesides pick is that he's contributed to too many areas to pin down. More Nobels have gone to hedgehogs than to foxes.

The green fluorescent protein suggestion is a good one, though. And perhaps this will be the year that the committee recognizes RNA interference, which could land in the chemistry award as easily as anywhere else. My runner-up to those suggestions is nanoscale structures (Stoddart et al.), but I have that pick running substantially behind the other two. That's mainly because the other two have demonstrated some real-world utility, but hey, fullerenes. Add your own picks in the comment section, and let's see who calls it.

Comments (26) + TrackBacks (0) | Category: General Scientific News

September 19, 2006

By Any Other Name

Email This Entry

Posted by Derek

There's a paper in the latest Ang. Chem. that will be of interest to everyone who's into the way that various chemicals smell. And hey, what organic chemist isn't?

It's by a flavor and fragrance chemist, who lists many tables of compounds that have very minor structural variations but completely different smells. One noteworthy example is geraniol, which is a large component of the scent of roses. Adding a methyl group next to its primary allylic alcohol converts it to an analog with an "intense fungal odor", which I don't think I'm going to be lining up to sample any time soon. And you'd have thought that the smell of geraniol would be pretty robust - you can saturate the allylic double bond, and it's still rosy. Take that compound and substitute an aryl group for the isobutenyl on the other end - still rosy. But don't mess with that primary alcohol.

The take-home lesson is that there are no major SAR trends in odor that you can count on. A substitution that works in one series can do nothing when applied to a closely related compound, or it can take the odor off in a completely unexpected direction. That aryl-for-isobutenyl switch I mentioned, for example, isn't silent if you try it on benzylacetone (4-phenyl-2-butanone). The starting ketone smells "sweet and floral", but the corresponding methylheptenone is described as "pungent, green, herbaceous".

The reason for all this craziness is that there are hundreds of olfactory receptors, most of which appear to respond to huge numbers of compounds as agonists. (There's that induced fit again!) And it's not like the agonists all smell the same, either. There also appear to be multiple binding sites involved, and possibly other protein cofactors as well. The structural complexities are bad enough, but there are probably neural processing effects laid on top of them, which makes the author predict that "consistently accurate prediction of odors will not be possible for a very considerable time". He's quick to point out that it's not like the flavor and fragrance industry has the money to underwrite the work needed to do it, either.

Does this remind you of anything, fellow medicinal chemists? If the perception of smell is the physiological readout in this case, how different is this from all the physiological states we're trying to produce with our small drug molecules? How well do we really understand their binding, and how much can we trust our SAR models? Hey, the fragrance people have big advantages over us - they can immediately test their molecules just by sticking them under their noses, which is like a five-second clinical trial with no FDA needed. And they're still as lost as geese. A lot of the time, so are we.

Comments (24) + TrackBacks (0) | Category: General Scientific News | Life in the Drug Labs

September 10, 2006

If You Want Your Explanations Overnight, It'll Cost You

Email This Entry

Posted by Derek

OK, the votes in the comments to the Explain This! post came out with NMR/MRI as the clear winner, with a strong plurality wanting to make sure that Fourier transforms are part of the explanation. So that's what I'll take on, but it's not going to appear this week. I'm going to try to pitch the explanation to an intelligent lay reader who doesn't have any particular physics, chemistry, or math skills. I'm out of my mind.
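Since the winning topic leans so heavily on the Fourier transform, here's a minimal numerical sketch of the idea at the heart of FT-NMR, assuming numpy. The "signal" below is a made-up stand-in for a free-induction decay (a decaying oscillation at one frequency), not real instrument data; the point is just that the transform turns a wiggle in time into a peak at a frequency:

```python
# Toy FT-NMR illustration: a decaying sinusoid (a stand-in for a
# free-induction decay) Fourier-transformed into a peak at its frequency.
# All numbers here are invented for illustration.
import numpy as np

n = 1024
t = np.linspace(0, 1.0, n, endpoint=False)   # 1 s of "signal", arbitrary units
freq_hz = 50.0
fid = np.cos(2 * np.pi * freq_hz * t) * np.exp(-3 * t)  # decaying oscillation

spectrum = np.abs(np.fft.rfft(fid))          # magnitude spectrum
freqs = np.fft.rfftfreq(n, d=t[1] - t[0])    # frequency axis in Hz

peak = freqs[np.argmax(spectrum)]
print(peak)  # the biggest peak sits at the input frequency, 50 Hz
```

The decay in the time domain shows up as the width of the peak in the frequency domain - which is, in very rough outline, the relationship the eventual explanation will have to get across in words.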

In second place were various suggestions about X-ray crystallography, and perhaps that'll be topic number two. Chirality would be tied with that, except there were actually more votes against it than for it, with people finding it not all that hard to explain. (They've clearly never tried to explain to someone whose specialty is running a Morris water maze assay why all the compounds flipped from R to S just because a group changed out on the far end of the molecule). Other multiple-vote getters were Woodward-Hoffmann/FMO, structure determination in general, and antibiotic resistance.

Many of the single-vote topics would be good as well, and some of them would be quite tricky. The person who suggested point group symmetry, though, brought back some memories. I'd never covered anything in that area as an undergraduate, for one reason or another. So there I am in my first year, taking an optical spectroscopy class, and on about the second day the professor launches into a discussion of symmetry operations and their relevance to infrared absorption bands (which is considerable).

And this was the first lecture I had ever heard where I understood nothing but the common verbs and the minor parts of speech. I listened to the whole thing with mounting alarm. It had taken me all the way to graduate school to come definitively to the limits of my knowledge, but the pavement ran out right there. I was so stunned I couldn't even take notes - I'd never tried to take notes on something that I wasn't comprehending at all, so I didn't know how.

That evening, I stalked over to the chemistry library and checked out, among other things, Harry Gray's book on group theory, renowned as the first one on the subject "that you could read in bed without a pencil in your hand". And I didn't go to bed myself until I understood just what I'd been listening to that morning, because I didn't enjoy the experience one bit, and wanted to make sure that it never happened again.

Comments (3) + TrackBacks (0) | Category: General Scientific News | Graduate School

August 22, 2006

Explain This, Hot Shot!

Email This Entry

Posted by Derek

Via the excellent Arts and Letters Daily, I found this piece by science writer K. C. Cole on dealing with editors in the popular press. She and others in her field have had their difficulties over the years when writing about things that even the researchers involved are confused about:

Editors, however, seem to absorb difficulty differently. If they don't understand something, they often think it can't be right - or that it's not worth writing about. Either the writers aren't being clear (which, of course, may be the case), or the scientists don't know what they're talking about (in some cases, a given).
Why the difference? My theory is that editors of newspapers and other major periodicals are not just ordinary folk. They tend to be very accomplished people. They're used to being the smartest guys in the room. So science makes them squirm. And because they can't bear to feel dumb, science coverage suffers.

She points out some of the problems - that many scientific discoveries deal with things that are more or less invisible to the ordinary senses, happen on time scales that are too short or too long to be easily perceived, contradict some common-sense notions of what must be right, and so on.

There's also the problem, as she correctly observes, that sometimes there is no description in lay language that can really explain a topic. My guess is that pure mathematics suffers from this the most: try explaining the Riemann zeta function in one coherent paragraph to someone who doesn't know much math. Following right on math's heels, as usual, is physics, but its weirder aspects can have a gee-whiz factor that makes up for their difficulty. Meanwhile, the fields I spend my time in (chemistry and biology) have their incomprehensible moments, but I think that they're amenable to explanation most of the time.

Which brings up a challenge. I've been trying to think of the most difficult thing to explain in chemistry to people who don't know the field. Since I have readers in both camps, I'll invite the pros to suggest some tough topics, and I'll tackle, in reasonably de-geeked language for the general readership, whichever one gets the most votes. The chemists can then comment on how accurate the explanation really was and suggest modifications, and we'll end up with something that might be useful. If this idea proves popular, we'll run one every so often and put it in a new category page.

I may come out of this looking like an idiot, but being willing to run that risk is an important part of my research style. Let's see how it works in the blogging business. Topics, anyone? I'd suggest something with a good mix of usefulness and broad interest along with general public incomprehension (NMR might be a good example).

Comments (47) + TrackBacks (0) | Category: General Scientific News

August 9, 2006

Ray Kurzweil's Future

Email This Entry

Posted by Derek

Ray Kurzweil's people sent me a copy of his book The Singularity is Near quite a while ago when it first came out. I kept meaning to write about it, but several things kept interfering. One of the things was the book itself.

I'm of two minds about Kurzweil and the worldview he represents. As many will know, he's about as much of a technological optimist as it's possible to be, and I have a lot of that outlook myself. But I wonder - does it extend to my own field of research? More generally, and more disturbingly, am I only optimistic about the areas whose details I don't know very well?

These questions came up again when I read a recent op-ed by Kurzweil in the Philadelphia Inquirer. It's a good summary of his thinking, and it includes this paragraph:

"The new paradigm is to understand and reprogram our biology. The completion of the human genome (our genetic code) project three years ago is now allowing us to do that. This process is also exponential: The amount of genetic data we are able to sequence (decode) has doubled every 10 months, while the price for decoding each gene base pair drops by half in the same time frame (from $10 per base pair in 1990 to less than a penny today). For example, it took us 15 years to sequence HIV, yet we sequenced the SARS virus in only 31 days, and can now sequence a virus in just a few days."

That, to me, is a mixture of accurate information, reasonable optimism . . .and unreasonable assertions. Yes, we're sequencing things faster than ever before, and part of that increase comes through computational advances, which are a ferocious driver of everything they concern. But it's a very long leap from that to saying that such sequencing is allowing us to "reprogram our biology". Reading the DNA letters quickly does not, unfortunately, grant us an equally speedy understanding of what they mean. And we shouldn't forget that sequences are only a part of biological understanding, a realization that the genomics boom of the late 1990s drove home very forcefully and expensively.

Then we come to this:

"Being able to decode the human genome allows us to develop detailed models of how major diseases, such as heart disease and cancer, progress, and gives us the tools to reprogram those processes away from disease. For example, a technique called RNA interference allows us to turn unhealthy genes off. New forms of gene therapy are also allowing us to add healthy new genes. And we can turn on and off enzymes, the workhorses of biology. Pfizer Inc.'s cholesterol-lowering drug Torcetrapib, for example, turns off one specific enzyme that allows atherosclerosis, the cause of almost all heart attacks, to progress. Phase II FDA trials showed it was effective in preventing heart disease, so Pfizer is spending a record $1 billion on the phase III trials. And that's just one example of thousands of this "rational drug design" approach now under way."

Oh, dear. Let's take these in order. First, being able to decode the human genome does not allow us to develop detailed models of how major diseases progress. It allows us to begin to think about doing that, and to be, for the most part, mistaken again and again. Many diseases have a genetic component, or two, or a thousand, but we don't understand them yet, nor their incredibly tangled relationships with development and environment. You'd think we'd know the genetic components of diabetes or schizophrenia, but we don't, and it's not for lack of trying. And as for the diseases for which the genetic component is less important, the sequencing of the human genome has been a non-event.

And yes, there is a highly interesting technique called RNA interference which can turn "unhealthy genes" off. It works quite well (although not invariably) in a glass tube or a plastic dish. A plastic dish, that is, in which you have carefully cultured cells in which you have carefully determined the presence of the gene of interest. And for many interesting conditions, you first need to find your gene, for which see above. Moving out of the cell culture labs, it should be noted that RNAi has significant hurdles to overcome before it can do anything in human beings at all, and may (like its forerunner, antisense DNA) still be destroying venture capital twenty years from now. Readers of this site once voted it the currently hyped technology most likely to prove embarrassing.

As for new forms of gene therapy allowing us to add healthy new genes, well, that's another hope that I'd like to see fulfilled. But there have been a number of disturbing and fatal complications along the way, from which the whole gene therapy field is still trying to recover. For Kurzweil to leave that sentence in the present tense, in the sense of this-is-happening-right-now, is putting it rather hopefully.

And yes, we can turn off enzymes. Some of them. This has nothing to do with gene sequencing or RNA interference, though, or any other particularly new technologies - enzymes as drug targets go back decades, and enzyme inhibitors as drugs go back centuries. Of course, you need to find your enzyme and make sure that it's relevant to the disease, and find a compound that inhibits it without inhibiting fourteen dozen other things, but that's how I earn my living.

And yes, Pfizer hopes to make all kinds of money off torcetrapib, but I'm not aware that they used a "rational drug design" strategy. In the industry, we tend to use that term, when we can use it with straight faces, to mean drug design that's strongly influenced by X-ray crystal structures and computational modeling, but I don't think that this was the case for torcetrapib. Kurzweil seems to be using the phrase to mean "drugs targeted against a specific protein", but that's been the dominant industry mode since the days of bell-bottoms. And if there are thousands of programs comparable in size to torcetrapib, they must be taking place on other planets, because there's not enough drug development money here on Earth for them.

Finally, the end of the paragraph. Where does all this lead? Later in Kurzweil's article, he says:

"So what does the future hold? By 2019, we will largely overcome the major diseases that kill 95 percent of us in the developed world, and we will be dramatically slowing and reversing the dozen or so processes that underlie aging."

And here, I think, is where I can clearly differentiate my thinking from his. As opposed to a pessimist's viewpoint, I agree that we can overcome the major diseases. I really do expect to put cancer, heart disease, the major infections, and the degenerative disorders in their place. But do I expect to do it by 20-flipping-19? No. I do not. I should not like to be forced to put a date on when I think we'll have taken care of the diseases that are responsible for 95% of the mortality in the industrialized world. But I am willing to bet against it happening by 2019, and I will seriously entertain offers from anyone willing to take the other side of that bet.

Why am I so gloomily confident? For us to have largely overcome those conditions by 2019, odds are excellent that these new therapies will have to have been discovered no later than 2014 or so, just to have a chance of being sure that they work. That gives us seven years. It isn't going to happen.

So I'm back to wondering: am I a technological optimist at all? I must be, because I still think that science is the way out of many of our problems. But am I only optimistic about things of which I'm ignorant? That's probably part of my problem, yes, painful though it is to admit. Am I willing to be as optimistic as Ray Kurzweil? Not at all. . .

Comments (69) + TrackBacks (0) | Category: Drug Development | General Scientific News

July 20, 2006

Peptide Craziness

Email This Entry

Posted by Derek

Since I was talking about peptide synthesis yesterday, and the usefulness of peptides in general, I thought a few back-of-the-envelope calculations would be interesting. After all, if we're going to be making the things, we should know what we're getting into.

How about a combichem library of the things? Let's see. . .since someone mentioned vasopressin and oxytocin, let's figure that other 9-mers could have some interesting activity. What if we want them all? With the twenty most common amino acids in hand, and our peptide synthesizer machines recently serviced and reloaded, we throw the switch and. . .a mere five hundred and twelve billion peptides later, our library is ready for screening.

Ahem. That's well more than ten thousand times the number of organic substances that are indexed in Chemical Abstracts. This exponential stuff gets out of hand pretty quickly. Storing the stuff will be a problem. At, say, ten milligrams per compound, we're looking at five million metric tons of peptides, and that's before the glass vials are added in. Protein folks look aghast if you talk about producing as much as ten milligrams of any given peptide, but hey, if we're going to turn the things into drugs, we have to get ready to work on scale.

And if we're going to make drugs, we're probably going to have to deal with some unnatural amino acids to improve metabolic stability. That has an effect, too, as you'd figure. Adding in just one extra gives you another two hundred and eighty billion or so peptides, which is certainly value for your synthesis dollar. If you're going to get the deluxe package, with the D and L forms of the nineteen chiral ones, that'll run your screening file up to 200 trillion total, which is going to put a real strain on the chemical synthesis capacity of the entire world economy. Call ahead.

So a library of 26-mers, the size of Fuzeon, is going to be really hard to handle. That comes to a cool 6.71 times ten to the thirty-third power, which is beginning to get into the realm of really substantial numbers. At ten mgs per compound, we're looking at 6.7 times ten to the 25th metric tons, which is only a bit more than. . .ten thousand Earths. Well, ten thousand Earths made up of an even mixture of the twenty amino acids, that is, rather than boring old inorganic rock.
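For anyone who wants to check the envelope, here's the arithmetic in a few lines (the Earth-mass figure is the standard value, about 5.97 times ten to the 21st metric tons):

```python
# Reproducing the back-of-the-envelope peptide library counts.
n_9mer = 20 ** 9                  # all 9-mers from the 20 common amino acids
extra = 21 ** 9 - n_9mer          # effect of adding one unnatural residue
n_dl = 39 ** 9                    # D + L forms of the 19 chiral ones, plus glycine
n_26mer = 20 ** 26                # every Fuzeon-sized 26-mer

mass_tons = n_26mer * 0.010 / 1e6 # 10 mg each, in grams, then grams -> metric tons
earths = mass_tons / 5.97e21      # Earth's mass, in metric tons

print(f"9-mers: {n_9mer:,}")                   # 512,000,000,000
print(f"extra from one new residue: {extra:,}")
print(f"D/L 9-mer library: {n_dl:.2e}")
print(f"26-mers: {n_26mer:.2e} ({earths:,.0f} Earths of peptide)")
```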

Let's just say that there's a lot of patent space, and plenty of reduction-to-practice loopholes, and leave it at that. . .

Comments (11) + TrackBacks (0) | Category: General Scientific News

July 17, 2006

Pounding Sand


Posted by Derek

My chemistry readership is used to thinking in terms of reaction mechanisms. Those of you outside the field who've gone as far as organic chemistry will have come across them, too: pushing electrons for fun and profit. Chemists really do think in those terms, I can tell you - it's not just something they torture the sophomores with.

Here's a page with some good examples of classic mechanisms. (Update: that link may not be able to handle the attention. Other mechanism pages can be found here and here, and there's a well-done Flash site here.) Non-chemists will note mainly the profusion of curved arrows curling around the page, and wonder what we must be getting out of that stuff. The idea, though, is that chemical reactions involve bonds between atoms breaking, forming, and rearranging, and those bonds are formed through electrons. So most of what goes on in organic chemistry can be thought of - very successfully - as the movement of electrons, and that's what the mechanisms are showing schematically.

But reaction mechanisms are also one of the things that chase people out of the field completely as students. The problem is, the lazy way to teach an organic chemistry course is as a Huge Heap of Reactions, to be memorized and tested on. But while there's no way around learning and understanding these things, teaching them as if they were species names in zoology is a crime.

There's an easier way, which more competent professors point out. The thing is, electrons don't just zip around randomly. They're negatively charged, so they prefer to go toward things with positive charges and away from other negatively charged ones. The various chemical elements can be more electron-withdrawing or electron-donating, so that means that any bond between two different ones is likely to be an unequal affair. The electrons are going to settle more on the end of the bond that's pulling on them, giving it a bit of a negative charge, and leaving the other end with a bit of a positive one. If you can keep track of full and partial charges, which isn't that hard, you're a long way toward solving any mechanism that a test can throw at you.

That page I linked to has some carbonyl (carbon-double-bond-oxygen) mechanisms, and I'm telling you the truth: the few things on that page are the foundation of umpteen dozen reaction mechanisms, which means that you have a choice when you're studying organic: you can memorize the whole shaggy list, or you can learn the fundamentals and apply them over and over in different combinations. Why anyone would do it the hard way escapes me.

But I've seen people take on a lot of tasks that way. When I was in high school, we still had to memorize and recite poems - not especially good ones, stuff like Longfellow's "The Builders" and the end of William Cullen Bryant's "Thanatopsis", poems fit to give Aaron Haspel the shakes, but poems nonetheless. (Good to see that he seems to be blogging again, by the way). And I recall one guy standing up to take on one of these set pieces, and as I listened to him slowly, haltingly stumble through it ("So. . .live. . .that. . .when-thy. . .summons. . . comes-to. . .join. . ."), my opinion of his skills evolved. At first, I thought that he was terrible at memorizing a poem. And, well, I still thought that when he finished, which was quite a while later. But what I came to realize was that he was a lot better than I was at memorizing a long string of random words, which is what he'd reduced "Thanatopsis" to. He went through all the commas, all the phrases like a snowplow. None of it meant anything; it just had to be shifted by brute force. And that's how he did it, and how some chemistry students do it still. It doesn't have to be that way.

Comments (19) + TrackBacks (0) | Category: General Scientific News

July 10, 2006

Aluminum: Friend or Foe?


Posted by Derek

One of the comments to the previous post mentioned having some trouble with a procedure out of one of the lesser journals. "Trouble", in this sense, meant "vigorous unexpected fire". But when he mentioned that it involved a mixture with aluminum chloride, I knew to look out.

Chemists everywhere live by thermodynamics. And one of the basic principles is that if a reaction's starting materials are more energetic than its products, then it's favorable. It doesn't mean that it's always just going to take off spontaneously - sometimes the intermediate step is much higher in energy, and the reaction can't get over the hump. But if there's not too high a levee between the two energy states, things will indeed flow downhill for you.

It's a good thing, too, since one such reaction is burning the nitrogen in the air, thereby changing it into poisonous nitrogen oxides. (Correction: late-night brain freeze there - I had in mind the fixation of nitrogen to ammonia, which is energetically favorable but has a high activation energy. Oxidation of nitrogen itself to NO is an uphill process, but under high temperature/high pressure conditions, like those found in your car engine, it does take place). Another one of those is burning aluminum, which also has a good-sized barrier to get past (otherwise using aluminum foil in your oven would be a spectacularly bad idea). The product of that reaction, aluminum oxide (or alumina) is one of the most below-sea-level compounds I can think of, compared to the metal or many of its compounds. Give 'em a chance, and they'll take off on you.

The classic example of this is the thermite reaction: aluminum + iron oxide goes to aluminum oxide + iron. Oh, and some heat. Well, OK, a lot of heat, enough to spray molten iron all over the place. (YouTube) You have to set the reaction off with something pretty hot (burning magnesium ribbon is traditional), but once it gets going, it tosses off enough spare heat to roll right along.
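To put a number on just how far downhill thermite runs, here's a sketch from textbook standard heats of formation (the values are the usual tabulated ones, rounded; treat them as approximate):

```python
# Thermite: 2 Al + Fe2O3 -> Al2O3 + 2 Fe
# Standard heats of formation in kJ/mol; elements in their
# standard states (Al metal, Fe metal) are zero by definition.
dHf_Fe2O3 = -824.2
dHf_Al2O3 = -1675.7
dH_rxn = dHf_Al2O3 - dHf_Fe2O3    # products minus reactants
print(f"delta H = {dH_rxn:.1f} kJ per mole of Fe2O3 consumed")  # about -850 kJ
```

Eight hundred-odd kilojoules per mole, nearly all of it coming off as heat, which is why the iron comes out molten.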

So no, I'm not surprised that some aluminum chloride would take off on someone. Regard all aluminum compounds without a bond to oxygen with a little suspicion. Many of them (and all the aluminum metal you see) came from alumina, and they're scheming to get back.

Comments (13) + TrackBacks (0) | Category: General Scientific News

December 29, 2005

Outside Reading


Posted by Derek

Over at Asymmetrical Information, here's some interesting speculation on what would happen if someone actually developed an HIV vaccine that worked. Plus, bonus mix of Paul Krugman-bashing and grudging Krugman approval!

What's the difference between an inventor and a tinkerer? And why don't more people realize that there is one? Thoughts on this, with reference to the Wright Brothers' alarming aircraft engine and the man who built it, here. Bonus Jared Diamond-bashing!

Monitoring mosques and other buildings for radiation, are we? It's interesting that we need to get as close as we are. . .and are we actually pointing any instruments at these sites while we do it? Jay Manifold connects the dots, since the media are doing such a lousy job of it. Bonus two-cultures complaint included, in this link.

You don't see quite as much of it in chemistry, but here's a blast at accelerated publication policies in scientific journals.

Comments (1) + TrackBacks (0) | Category: General Scientific News

December 15, 2005

Attack of the Angry Viruses


Posted by Derek

I mentioned the anti-science types in Europe the other day, and I should mention that I have some personal history with them. I did my post-doc in Germany, long enough ago that there were two countries by that name. (I came back to the US about two months before the Berlin Wall fell, and had been traveling in Eastern Europe while it was falling apart, but that's another story).

I was doing free radical chemistry at the Technische Hochschule Darmstadt (in one of the buildings shown here, lower center), when one morning I arrived to find fire trucks and ambulances all over the place. Three or four floors above our labs, someone had firebombed the place.

I went up and had a look. They did a pretty low-tech job of it, with a can of gasoline, an immersion heater, and an electrical timer, but they did an awful lot of damage to the labs up there. And why did these particular labs get the treatment? Because they were engaged in work with recombinant DNA, naturally.

A group calling itself the Zornige Viren (Angry Viruses, or maybe Viruses of Rage) left a note claiming responsibility. I read a copy of the thing, which went on about the military-industrial complex seeking to extend its patriarchal hegemony over untrammeled Nature and the very essences of our beings, la la la. You can imagine how melodious that all sounds in German. They were (as far as I know) never heard from again, but their attitude lives on.

Just take a look at these figures. There's a huge gap between most of European public opinion and the US (and between Europe and many Asian countries as well). I remember seeing similar magazine and newspaper survey results when I lived there. No matter what, genetic engineering always got hammered.

You can speculate for many paragraphs about why this is so, and people have. A fear of eugenics, from the racial theories of the Nazis? A romanticized view of nature and the land, from people who have gradually paved over large parts of it? Worries about private entrepreneurs owning rights to genetic material and running amuck without the State being able to restrain them? Not all the reasons to be cautious are prima facie wrong, nor are they confined to Europe. Somehow, though, they've combined there into a solid mass.

It must make it difficult for biotech researchers in Germany and France, though, if people ask them what they do. No doubt they talk about developing new medical therapies or diagnostic tools rather than say "I mess around with DNA all day". Safer to speak of the outcomes than the tools.

Comments (33) + TrackBacks (0) | Category: General Scientific News

December 12, 2005

Play by Play in the Lab


Posted by Derek

The submissions for tomorrow's Grand Rounds continue to come in, and my wife continues to try to throw me off by referring to it as "Ground Rounds". While working on that, whatever I end up headlining it, I can recommend reading the recent series of posts from Chad Orzel over at Uncertain Principles. Under the heading of "A Week in the Lab", he's going into the details of his research project in atomic spectroscopy. "Slow-Motion Experimental Physics Live-Blogging"!

Update: I originally had a link to one of the series here, but Chad has assembled the whole series of posts in one handy spot.

I assume that he'll eventually get things working, but he probably assumes that, too. That doesn't make the lab work any easier. Although this is very far indeed from the kind of work that I do, the rhythms of research are the same. Fix three things before you do the one thing you were trying to - it's a universal law.

Comments (1) + TrackBacks (0) | Category: General Scientific News

October 25, 2005

Start Your Engines


Posted by Derek

I'm going to take a break this evening from the med-chem side of my science. There's a paper in the preprint section of the ACS journal Nano Letters that's one of the neatest things I've seen in a while. Jim Tour's group at Rice has been working in this area for quite a while now, and they now report something they call a "nanocar."

It's a single large molecule, built from standard organic chemistry reactions. There are two straight axles, made out of acetylene compounds (which are rod-shaped), and another connector running between them. On both ends of each axle is a fullerene (a buckyball), and getting those attached was apparently one of the trickiest parts of the whole synthesis, which took several years. The other tough part seems to have been hanging enough greasy chains off the various structural parts of the thing so that it could be dissolved in an organic solvent. Here's the synthetic scheme and a drawing of the molecule. (That link currently seems to work for non-subscribers - the full article is here.)

Those fullerenes are wheels. They can turn independently, because the bond between them and the next acetylene is freely rotatable, and that seems to be just what they're doing. By finally making one of these that could be taken up into a solvent, Tour's group managed to get some of these things onto a gold metal surface, which is a perfect background to use for Scanning Tunneling Microscope (STM) imaging. And here they are. (The fullerenes show up very well in STM imaging, and they're pretty much all you can see.) Buckyballs are already known to stick very well to gold, so Tour's people had to heat up the metal to get things moving. Once they got up to about 170 C, though, the molecules - the nanocars - began to roll around.

Now, molecules sitting on metal surfaces move around all the time, but they mostly just slide and hop by thermal wiggling. There are several lines of evidence to show that these are really rolling, though. For one thing, a three-wheeled symmetrical variety was made, and it just spins in place. (That link also has a nifty rendered version of both types of molecule, but those are rather idealized portraits. For one thing, they don't show all the long side chains decorating the frame, which would make the whole car look rather Rastafarian.) The cars also appear to only move along their long axis, with slight pivots as one set of wheels breaks free before the other side does. (The nano-differential has yet to be invented). Finally, the team used the STM tip to drag a nanocar along, and showed that it couldn't be towed sideways - the wheels dug in rather than rolling.

It's easy to dismiss this work as a stunt, which is what I once did with one of Tour's other ideas. But this is the beginning of the real thing. A larger, more functionalized version of the nanocar might carry other molecules along and dump them at will, which is what this group seems to be working on now. These are small steps toward controlled nanoscale delivery, which is a small step toward a nanotech assembler.

We're a long way from that. But for now, there are any number of interesting experiments waiting to be run. You have to wonder how these things will behave on other surfaces, for one thing. If they drive better on some than others, you could imagine directing them around on small roads which have been fabricated by chip-building techniques. There are other molecular forms that could be used as wheels, and other potential ways to move them around rather than just heating them up. Just looking at these structures gave me an idea of my own: how about making the axle part of the molecule by incorporating a structure that would absorb at particular infrared wavelengths? That would show up as motion in the chemical bonds, and might provide a means to make a motor to drive these things. Eventually we're going to have grad students standing around an STM rig, betting on which of their designs will make it across an atomic landscape first. . .

Comments (5) + TrackBacks (0) | Category: General Scientific News

September 7, 2005

The Tiniest Doors Begin to Open


Posted by Derek

Ahmed Zewail and his group at Caltech are the kings of the very, very short time scale. For many years now, he's been using extremely short laser pulses to accomplish a long list of previously unheard-of results in spectroscopy. (A non-specialist wouldn't go far wrong by thinking of him as a molecular-scale Harold Edgerton.) His work has not gone unrecognized.

And now there's a fighting chance that he and his people have recently accomplished something that would be worthy of a second Nobel: UEM, for Ultrafast Electron Microscopy. They're taking electron microscope snapshots, one trillionth of a second at a time.

And what is this technique good for? Well, electron microscopy has long been used for imaging all sorts of materials and biological samples. Fast freezing of the samples has revealed an extraordinary amount of information in the past, and Zewail's new method basically allows this to happen in real time, at room temperature, under normal conditions. The energies required to do it aren't huge, and it's quite likely that we'll be able to get useful data without destroying delicate targets. We could end up with extreme slow-motion movies of molecular processes, imaged at electron-diffraction resolutions. We're actually going to be able to watch nanotechnology experiments as they happen.

We'd be able to see catalyst molecules moving and rearranging as they do their work, and watch the shifting environment of metal atoms inside enzyme active sites. Subtle changes in crystal structures, happening too fast for us to follow, would become clear. We could conceivably see cell membranes flex and shift as ligands bind to their embedded receptors, and see processes inside cells that no one has ever been able to observe or even suspected were there. People in completely unrelated branches of science are going to be climbing over each other to get access to these machines.

I've never met Prof. Zewail, but his paper isn't the work of a retiring personality. Its abstract states that ". . .the long sought after but hitherto unrealized quest for ultrafast electron microscopy has been realized." That's inelegantly phrased (the quest wasn't the thing that was sought after, for one thing), but I take his point. The concluding section echoes Watson and Crick, surely on purpose:

". . .even biological changes at longer times have their origin in the early atomic motions. It should be readily apparent that such dynamical evolution is critical to function. It does not escape our notice that UEM is a significant advance for this purpose. . .we foresee the emergence of new vistas in many fields, from materials science to nanoscience and biology."

No, he's not a modest man. But this discovery isn't something to be modest about. It isn't bragging if you can do what you say.

(Want more details? This is wandering off into physics, but the fast laser pulses generate electrons through the photoelectric effect. They hit the photocathode of the electron microscope, which is made of the rather exotic material lanthanum hexaboride. One of the keys to getting this to work was to use pulse energies that deliver about one electron per wave packet. That allows the microscope to focus them - Zewail points out that their earlier attempts generated larger "bunches" of electrons which were difficult to focus, and whose pulses broadened out as the like-charged electrons repelled one another. The delicate touch was crucial. Of course, none of this would do much good without modern scintillators and CCD chips, which can detect single electrons after they pass through the samples. For the real fanatic, Zewail's paper is in PNAS 102, 7069.)

Comments (18) + TrackBacks (0) | Category: General Scientific News

August 23, 2005

Outside Reading


Posted by Derek

A few more links for further reading. . .

Professor Bainbridge wonders about what the Merck verdict says about our jury system. (The quotes he gives from the Wall Street Journal report make me feel like breaking something loud and costly.) There's plenty of comment on the same topic here at Asymmetrical Information as well.

Colby Cosh asks if engineers have become mentally crippled by early exposure to PowerPoint, in light of some disturbing reports from inside NASA. (And an engineer replies that the problem probably comes from the work environment, not the schooling. I don't know whether to be relieved or not. I see an awful lot of the stuff myself, so I hope it isn't taking too great a toll.)

And Medpundit looks back on a Nobel Prize-winning chapter in medicine. But it's not one that a lot of people like to be reminded of. . .

Comments (2) + TrackBacks (0) | Category: General Scientific News

August 16, 2005

For Further Enlightenment


Posted by Derek

A few varied links for your reading pleasure:

Carl Zimmer does a fine job banging the Intelligent Design folks over the head next door at The Loom. And (via Arts and Letters Daily), here's another handy demolition of the whole concept.

Speaking of A&LD, "Michael Blowhard" pens a well-deserved fan letter to them here, while wondering why the arts haven't taken more notice of the hard sciences than they have.

It looks like I finally have company in the inside-pharma-blogging world: take a look at the Medicine Vault, written by someone on the other side of the aisle in biology. Welcome!

Why would attempting to measure the dipole moment of an electron give anyone the willies? Chad Orzel explains, in a good example of telling people outside the field what some of the big news inside it is.

Comments (0) + TrackBacks (0) | Category: General Scientific News

August 7, 2005

An Off-Topic Ramble


Posted by Derek

Note for new readers: I don't talk much about politics on this site, since there are more than enough blogs to cover every political position imaginable. But once in a while we veer off course. . .

The uproar over President Bush's support for "Intelligent Design" seems to have died down a bit. (You can find commentary all over the blog world, naturally - My fellow Corantean Carl Zimmer was, understandably, dismayed. For some cries of distress on the pro-Bush side, try Sissy Willis, Jane Galt, and this roundup at Instapundit.)

I wasn't too thrilled myself. I have no time for the ID folks. I think that the best of them are mistaken, and the worst are flat-out intellectually dishonest. But I wasn't that surprised by Bush's statement, either. It wouldn't surprise me to find out that he doesn't know enough biology to know how silly his support (wishy-washy though it was) makes him sound to people who do.

But I also think that, as a politician, Bush made a back-of-the-envelope calculation that saying this sort of thing wouldn't do him any harm, and (within error bars) it probably hasn't. I'm not sure how much of a slice of the electorate people like me represent (voted for Bush twice, convinced that Intelligent Design is pernicious), but I'll bet it's not too big. And other issues, which frankly - though I hate to admit it - I find to be more pressing, still leave me not regretting my vote in the last election. If Bush goes further in promoting ID teaching, I will of course oppose that in any way I can think of, in the same way I opposed his steel and textile tariffs. That doesn't mean I'm cheerful about the situation, but there's no possible President who wouldn't tick me off about something or another.

I would expect most Presidents to outsource their needs for any knowledge of evolutionary biology, anyway. It's not a job requirement. Now, I know that being smart enough to see problems with Intelligent Design would seem, on the other hand, to be a job requirement, but it depends on what a person turns their attention to. And a review of Presidential history suggests that performance is not well correlated with intelligence, anyway. If anything, the distribution is a bit U-shaped. Dullards like Franklin Pierce and Warren Harding failed, but on the other end of the scale, academicians like Woodrow Wilson failed in different ways.

Aaron Haspel's discussion of "Chet" - friendly, hard-working, well-adjusted, riotously well-paid Chet - is worth reading in this context. And I'll let James Branch Cabell have the last word, in a famous passage from Jurgen, when he meets that fantasy's nearest thing to God:

". . .And of a sudden Jurgen perceived that this Koshchei the Deathless was not particularly intelligent. Then Jurgen wondered why he should ever have expected Koshchei to be intelligent? Koshchei was omnipotent, as men estimate omnipotence: but by what course of reasoning had people come to believe that Koshchei was clever, as men estimate cleverness? The fact that, to the contrary, Koshchei seemed well-meaning, but rather slow of apprehension and a little needlessly fussy, went far toward explaining a host of matters which had long puzzled Jurgen. Cleverness was, of course, the most admirable of all traits: but cleverness was not at the top of things, and never had been."

I'll try to talk a bit about Chets (and George Bushes) as I've experienced them in the drug industry in an upcoming post.

Comments (38) + TrackBacks (0) | Category: General Scientific News | Intelligent Design

March 31, 2005

Why Carbon Matters

Email This Entry

Posted by Derek

One of my correspondents wrote to ask "What makes carbon so special?" That is, how come all the life we know about is based on it?

There are several qualities that we organic chemists (and living beings) admire about carbon. But before counting the ways, let me start by saying that I'm talking about "life as we know it." As that Steven Benner review article that I spoke about a few weeks ago makes clear, you can imagine chemical domains at other temperatures and pressures that could support life of another kind.

But for the only life we've found so far, the Earthly kind, the temperature and pressure space is roughly bounded by the territory of liquid water. Higher pressures will let it stay liquid up to higher temperatures, and we have organisms that will ride right along with them. Likewise, high ionic strength will let you keep a liquid matrix down to much lower temperatures, and we have that covered here on Earth, too.

Inside this range, carbon has a lot of advantages. It forms very stable bonds to itself, first of all. Forming and breaking them (under controlled conditions!) is one of the major challenges of organic synthesis. Carbon atoms can be strung out to give you virtually any size molecule you want; there seems to be no upper limit and there's no reason to expect one. This is important, because a likely requirement for any kind of chemical-based life is large molecules with structural diversity. Life's bound to be complex, and carbon compounds give you all the complexity you can handle - straight and branched chains, rings, whatever you want.

And those bonds come in more than one flavor. While carbon-carbon single bonds form a 3-D tetrahedral lattice (found in its pure form in diamond), double bonded carbons can all flatten out into the same plane. The best natural example is graphite, made up of flat sheets of tiled carbon rings, full of alternating double and single bonds. The sliding motion of those sheets over each other gives pencil lead its properties. And there are triple-bonded carbons, too, which end up in a straight line. Carbon gives you a wonderful 1D / 2D / 3D building set.
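The 1D / 2D / 3D picture comes down to bond angles. As a quick illustration (my own sketch, not part of the original post), the idealized angles for carbon's three bonding geometries can be computed in a few lines; real molecules deviate somewhat from these ideals:

```python
import math

def tetrahedral_angle():
    """sp3 carbon: the angle between bonds pointing at the corners
    of a regular tetrahedron is arccos(-1/3), about 109.47 degrees."""
    return math.degrees(math.acos(-1.0 / 3.0))

# Idealized geometries for the three bond flavors mentioned above.
GEOMETRIES = {
    "sp3 (single bonds, diamond)": tetrahedral_angle(),  # ~109.47 deg, 3-D
    "sp2 (double bond, graphite)": 120.0,                # trigonal planar, 2-D
    "sp  (triple bond, alkynes)":  180.0,                # linear, 1-D
}

for name, angle in GEOMETRIES.items():
    print(f"{name}: {angle:.2f} degrees")
```

The three angles are what make carbon a "building set": the same element gives you lines, sheets, and lattices depending on its bond order.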

There's another key thing about the element. More structural (and reactive) diversity comes from all the ways that carbon can form bonds with other elements. Oxygen, sulfur, nitrogen, phosphorus and many other elements readily form carbon derivatives under Earthly conditions, and these give you the crazy variety of organic chemistry. We've got solids, liquids, and gases, acids and bases of all strengths, nonpolar compounds and polar ones fitted with all kinds of electron-rich and electron-poor zones, and reactivity all the way from rock-solid to burst-into-flames.

I think that it's much more likely that we'll find life that uses different carbon-based compounds than it is that we find life based on siloxanes or some other framework. Organic chemistry is too useful to avoid. Now, organic chemists are another matter entirely. . .

Comments (2) + TrackBacks (0) | Category: General Scientific News

February 23, 2005


Email This Entry

Posted by Derek

Back in the early days of my pre-Corante blog, I wrote a piece about some other kinds of chemistry that might be used in living systems. There's now a wonderful one-stop review for all sorts of speculations on this topic, which incorporates everything I've ever thought of and plenty more. Steven Benner at the University of Florida, who my fellow Corantean Carl Zimmer has interviewed, and two co-workers (here's his research group) published "Is There a Common Chemical Model for Life in the Universe?" in Current Opinion in Chemical Biology late last year. (here's the abstract; I can't find the full text available yet on the Web.)

I can't say enough good things about this article. This is the sort of topic I've enjoyed thinking about for years, but there were still plenty of things in this review that had never occurred to me. Benner goes over the likely requirements for life as we know it, life as we'd probably recognize it, and life upon which we can barely speculate. As a chemist, he's particularly strong on discussions of the types of bonds that could best form the complex molecules that chemical-metabolism-based life needs. Energetic considerations - how much chemical bond energy is available, how soluble the materials are, how reactive they are at the various temperatures involved - are never far from his mind.

He devotes sections to ideas about living systems without chemical solvents (gas clouds, solid states) and the more familiar solvent-based chemistry. There's plenty of water out there in the universe - which is why bad movies about aliens coming to drain our oceans are so laughable - and it's natural enough that we should concentrate on water-based life. But there's plenty of ammonia out there, too, along with methane, sulfuric acid, and other potential solvents like the supercritical dihydrogen found in the lower layers of gas giant planets.

So, is all this stuff out there? Is life something that is just going to happen to susceptible chemical systems, given enough time? If so, which ones are susceptible? Benner's thoughts are, I think, best summed up by his take on Titan:

"Thus, as an environment, Titan certainly meets all of the stringent criteria outlined above for life. Titan is not at thermodynamic equilibrium. It has abundant carbon-containing molecules and heteroatoms. Titan's temperature is low enough to permit a wide range of bonding, covalent and non-covalent. Titan undoubtedly offers other resources believed to be useful for catalysts necessary for life, including metals and surfaces.

This makes inescapable the conclusion that if life is an intrinsic property of chemical reactivity, life should exist on Titan. Indeed, for life not to exist on Titan, we would have to argue that life is not an intrinsic property of the reactivity of carbon-containing molecules under conditions where they are stable. Rather, we would need to believe that either life is scarce in these conditions, or that there is something special, and better, about the environment that Earth presents (including its water)."

As for me, I can't wait to find out. I want Titan rovers, Jupiter and Saturn dirigibles, Venusian atmosphere sample return, instrument-laden miniature submarines melting down through the ice on Europa and Enceladus: the lot. How much of this will I ever get a chance to see in my lifetime? Current betting is running to "none of it, damn it", but things can change. Depends on how easily and cheaply we can get payloads up to (and out of) Earth orbit.

Comments (6) + TrackBacks (0) | Category: General Scientific News | Life As We (Don't) Know It

January 14, 2005

How Often Do We Land on Another World?

Email This Entry

Posted by Derek

I'd like to remind everyone that something very unusual is happening today: we appear to have successfully landed on Titan, Saturn's largest moon. Word came in a few minutes ago that the Huygens probe has sent at least two hours of observations back, which at least means that its parachutes opened.

Huygens carries all sorts of fine spectroscopic equipment to figure out what's going on under Titan's massive cloud deck, along with down- and sideways-pointing cameras and a spotlight. At this point, we don't know if it landed with a thunk, a splat, or a splash, but we'll be finding out later today when the information is sent back to Earth by the Cassini spacecraft (in orbit around Saturn).

Figuring out what we're seeing might take a bit longer. Titan is one of the most alien places you could find in our solar system. Barring some really excellent new technology, which I certainly hope for, this will be one of the few landings on another world that we'll all get a chance to see. It's a great day for the species.

Comments (1) + TrackBacks (0) | Category: General Scientific News

January 6, 2005

More Fun With DNA

Email This Entry

Posted by Derek

I mentioned hooking up small molecules to DNA yesterday. A comment to that post prompts me to write about something I've been thinking about for some time: the work of David Liu at Harvard. I have several of his papers in my files, and he's recently published a long review article in Angewandte Chemie, for those of you with access to the journal (43, 4848, the International Edition, of course.) Turns out that he has an informative website summarizing the work, too.

In short, what he's been doing is trying to get chemical reactions to go in a much different way than chemists usually do. The inside of a reaction flask is a very weird and specialized environment. We have to really bang on things to make them react - high concentrations, special solvents, catalysts, lots of heat. By the standards of living systems, it's the Spanish Inquisition. Meanwhile, cells make all kinds of things happen by keeping the reactants around in very low concentration (or trickily compartmentalized, a factor not to be ignored), and then sticking them together with other reactants inside the active site of an enzyme. The middle of an enzyme is like a reaction flask that's just big enough for the two molecules, and all sorts of unlikely chemistry happens under those conditions, things that you just can't get away with in bulk solutions.

I should declare my biases here: I find this principle tremendously appealing, and I've had a number of idea spasms in this area myself, which have come on like malarial relapses over the last two years. A number of scattered reports of this kind of thing have shown up over the last few years; I long to join them. Reducing these brainstorms to practice hasn't been easy, but I continue to think that this general area of research has a huge amount of untapped potential for organic chemistry and drug discovery.

Liu has been taking advantage of the ferocious drive that single strands of DNA have to combine with their complementary partners. He and his group have added chemical linkers to the 3' and 5' ends of complementary strands and decorated them with molecules that could react with each other when they're jammed together by the zipping-up of the DNA ladder. This gives you several interesting possibilities by taking advantage of the huge molecular biology infrastructure of manipulating DNA. Foremost of these is, as I mentioned in the last post, the peerless signal amplification of the PCR reaction, which lets you run everything on microscopic scale and turn up the volume later to see what happened.

Liu's group has tried all sorts of variations on this idea, with different reaction types and different linkers at different positions up and down the DNA chain from each other, and results have been very encouraging. A lot of things are going on. They've found a number of different reactions that can take place under DNA-templating conditions, and they're still expanding the list. They act differently, in surprising ways. Sometimes it's the rate of DNA hybridization that determines the reaction course, and sometimes it's the rate of the small-molecule reaction they're trying to encourage. Along the way, they've shown that some reaction sequences that would normally be incompatible in the same flask can be made to happen in an orderly fashion on the DNA templates.

They've also recently reported using these systems to discover new reactions - splitting and recombining the reactants in classic combinatorial chemistry style, but with that microscale advantage that DNA labeling gives you. You could have thousands of reactions going on in amounts of solvent that a chemist like me wouldn't even notice in the bottom of a flask. Some of these reactions will only work under the DNA-template conditions, which is useful on that side of the research, but not so good for making real-world (that is to say, my-world) quantities of compounds. But some of them look like they can make the leap to non-DNA conditions.

That's just a quick overview - for more details, see Liu's site link above. This is a quickly evolving area, and I'm sure that a lot of neat ideas are waiting to be tried (or even thought of in the first place.) I'm a fan. This is something new, and the more completely new approaches we have to do organic chemistry, the better off we are.

Comments (6) + TrackBacks (0) | Category: General Scientific News

January 4, 2005

Tadpoles to the Rescue?

Email This Entry

Posted by Derek

Speaking of odd ideas that might have applications in drug discovery, there's an interesting one in the latest issue of Nature Methods (2, 31). A group at the Molecular Sciences Institute in Berkeley reports a new way to detect and quantify molecular binding targets. And if you think that this sounds like something we're interested in over here in the drug discovery business, you are correct-o-matic.

This idea piggybacks, as you might expect, on the mighty king of detection and quantification in molecular biology, PCR. The ability to hugely amplify small amounts of DNA is unique, the biochemical equivalent of a photomultiplier, and many people have taken advantage of it. In this case, they also make ingenious use of weird beasts called inteins, about which a great deal of background can be found here. Briefly, inteins are sort of DNA parasites. They insert into genes and are read off into an extraneous stretch of protein in the middle of the normal gene product. But then they quickly clip themselves out of the protein - they have their own built-in cut-and-splice mechanism - and leave the originally intended protein behind them, none the worse for wear.

The MSI group takes the molecule of interest - say, a protein ligand - and attaches an intein to it. They take advantage of its splicing mechanism to have the intein remove itself and attach a stretch of specially whipped-up DNA, which serves as a tag for the later PCR detection. They call this conjugate a "tadpole", for its shape in their schematics (the DNA tag is the tail, naturally.) Said tadpole goes off and does its thing in the assay system, binding to whatever target it's set up for, and you do a PCR readout.

The paper demonstrates this in several different systems, going all the way up to a real-world example with blood serum. What's impressive about the technique is that it seems to work as well as antibody methods like ELISA. Getting a good reliable antibody is no joke, but these folks can make smaller proteins with much worse intrinsic affinity perform just as well. And if you turn around and do the trick starting with an antibody, you can increase the sensitivity of the assay by orders of magnitude. And you get a real quantitative readout, with about +/- 10% accuracy. To give you the most startling example, the authors were able to detect as few as 150 single molecules of labeled bovine serum albumin in a test system.
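To get a feel for those numbers, here's a back-of-the-envelope calculation of my own (the 30-cycle run is an assumption of mine, and real PCR efficiencies fall short of perfect doubling, so treat these as upper bounds):

```python
# How small is 150 molecules, and what does PCR amplification buy you?
AVOGADRO = 6.022e23  # molecules per mole

start_molecules = 150  # the detection limit quoted above
moles = start_molecules / AVOGADRO
print(f"150 molecules is about {moles:.2e} mol")  # ~2.49e-22 mol

cycles = 30  # a typical PCR run (my assumption, not from the paper)
copies = start_molecules * 2 ** cycles  # ideal doubling each cycle
print(f"After {cycles} ideal doublings: {copies:.2e} copies")  # ~1.61e11
```

Starting from an amount of material some ten orders of magnitude below anything a balance could weigh, thirty doublings put you well into easily detectable territory - which is why the PCR tag does the heavy lifting here.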

The "News and Views" piece on all this in the same issue points out that the technique gets round some real problems with the existing methods. Labeling proteins with DNA or fluorescent tags is a messy and imprecise business, and it can be very hard to tell how many labels your target carries (or how many different species are really present in your new reagent.) The intein method is one-to-one label-to-protein, with no doubts and no arguing. Cell biologists are going to have to get used to knowing what they're looking at, but I think that they'll be able to adjust.

The news article calls the technique "ultrasensitive, amplified detection of anything," and that's pretty close. As the MSI authors point out, it removes the limitations of antibody technology: no longer can you detect only the things that an immune system has a reaction to. Screening of protein libraries could provide low- to medium-affinity partners (which is all you need) for all kinds of poorly-studied molecules.

I'd be interested in seeing if the system can be adapted for small (i.e., drug-sized) molecules conjugated to DNA. They wouldn't be tadpoles any more, though - more like eels - and might behave oddly compared to their native state. But even if you stick with the larger protein molecules, important biology may well be a lot easier to uncover. And we've got an endless appetite for that stuff. It's good news.

Comments (5) + TrackBacks (0) | Category: General Scientific News

August 10, 2004


Email This Entry

Posted by Derek

I need to take a moment to remember two extraordinary scientists: Francis Crick and Thomas Gold. Both distinguished themselves by being willing not to care about what other people thought of them and their work, which is a useful spice for the stew as long as you don't add the whole jar.

Of the two, Gold was the harder for his colleagues to take. He worked in a variety of fields in a way that is hardly ever seen in modern science. Along the way, he had some spectacular misfires, but you have to be doing spectacular work to have those at all. And his successes (in things as diverse as pulsars and the bones of the inner ear) were impossible to deny. He may yet be proven right about his final provocation, the idea that geological hydrocarbons are, for the most part, just that: geological and not biogenic. He expanded this idea to propose the "deep, hot biosphere" which both generates methane and adds biogenic signatures to inorganic petroleum, and that part, at least, is looking more correct every year.

Cosmology, physiology, astronomy, geology - I don't think we're going to see his like again. To be honest, there are many people who will hope we don't. Gold was not inhibited about pointing out the failings, as he saw them, of fellow researchers, and there were many who saved up ammunition to pay him back in kind. I don't think science could function well with a population made out exclusively of Tommy Golds. But it would function even more poorly without any at all.

Francis Crick, the more famous of the two, probably seemed to the public to have dropped out of sight for the last fifty years after the DNA discovery. But molecular biologists know how important he was in the years after that first proposal, helping to work out the RNA code and other fundamental issues. Later on he turned to even harder areas, such as the physiological nature of consciousness and memory. No one person is going to solve those, and Crick didn't. But he took some fine swings at them, and he'll be missed.

Comments (1) + TrackBacks (0) | Category: General Scientific News

June 16, 2004

The Dull Edge of Nanotech

Email This Entry

Posted by Derek

There's a type of paper that's showing up often in the major chemistry journals these days, and it's a type that didn't even exist a few years ago. I can't count the number of reports of nanometer-sized structures that have been described recently. Rods, filaments, sheets, cylinders, shells - you name it and someone's got it. That inorganic salt plugging up your filter? Turns out it's not just an annoyance, it's a publishable nanostructure!

On one level you can see why this happens, with all the publicity that nanotechnology has these days. But that's not what most of the papers are really about. No particular use or general principles are suggested, for the most part, just "We found these things, and they look like this." (You can spot these papers quickly in the abstracts at the front of the journals, because they're invariably illustrated with a photomicrograph of the new structure.)

There's a place for that kind of paper, naturally, but are there dozens of places? Some of these things may turn out to be useful, or at least point the way to something useful, but for now they're largely just being described as curiosities, and they're being published because - well, because they can be. Perhaps some of these groups are hoping that someone, someday, will make a breakthrough that makes their paper look ahead of its time.

The techniques to look for these structures have been around for some years, so it's not like we're just now able to see them. It's just that up until recently, no one has cared all that much. I have to wonder what would have happened if someone had submitted a paper to JACS fifteen years ago about, say, scandium salts that form nanoscale helices when precipitated out just so. Would the editors and reviewers have known what to make of it? Or would they have tossed it back, telling the authors to come back when they had more to say?

There's a lot of serious nanotech work being done in chemistry, but this stuff isn't it. I have to think that these papers are going to look a bit strange and dated in coming years, once this stamp-collecting phase passes. When will the editors at the likes of the Journal of the American Chemical Society, the Journal of Organic Chemistry, Organic Letters, and Angewandte Chemie call a halt?

Comments (5) + TrackBacks (0) | Category: General Scientific News | The Scientific Literature

March 25, 2004

A Birthday Worth Noting

Email This Entry

Posted by Derek

No time for a real update today, but (thanks to Instapundit) I wanted to recognize Nobel winner Norman Borlaug, whose birthday is today. He should be much better known than he is, since (as the man behind the "Green Revolution") he has beyond a doubt kept hundreds of millions of people from starving to death.

What's remarkable is that he's still out there, doing what he's been doing for the last forty years. Not everyone is happy about it, though, as this Gregg Easterbrook article in the Atlantic Monthly and this Ron Bailey interview from Reason show. More information can be found here, at the site of the Borlaug Heritage Foundation.

A worthy goal for a person would be to attempt to accomplish one-tenth as much.

Comments (0) | Category: General Scientific News

March 9, 2004

Nuclear Fusion, Wordsworth, German Cooking. The Usual.

Email This Entry

Posted by Derek

I've been remiss in not mentioning the new paper that's coming out in Physical Review E from the group that's reported possible sonochemical fusion. Their original paper from two years ago was the subject of one of my early blog posts (see the March 4 entry.) I'm very happy to hear that this work is still going on, and has been further refined. This increases the odds that there's something here worth studying. Physical Review is not a pushover of a journal (and neither is Science, where the first paper appeared), so the Purdue/Rensselaer effort has already made it through tougher scrutiny than other unconventional fusion claims. (I know that they've had trouble getting their papers through, but that's to be expected.) This group seems to be doing this the right way, responding to critics by quietly improving their work and not rushing out to claim Instant Free Energy, Persecution by the Powers That Be, and all the rest of it.

I've mentioned before that the Pons-Fleischmann debacle of 1989 is a very fresh memory with me. I've tried several times, unsuccessfully I think, to write about it in blog posts and other places. I'll give it one more try, with apologies to those who've heard me speak about it before.

It's hard for me to convey what a bolt from the blue that news was. I was living in Germany, doing my post-doctoral work. That Easter Sunday I heard the news on Armed Forces Radio, which I had playing in my lab. (Yes, I was in the lab.) I'd like to be able to see my reaction - I'm sure my head jerked up abruptly to stare disbelieving at the radio. The report credited the Financial Times newspaper, so I trotted down to my car and drove to the train station to buy a copy. I still have it, the original color of the newsprint somewhat altered by time and oxygen.

The weather was nice that day; winter was finally breaking apart in Germany. I headed back to my lab for a while, reading the paper as the sun came in through the windows, before going off to an Easter meal with my labmate and his family. I remember his father asking him in German, using a slang term equivalent to "Yanks": "Have you heard? The Amis have done nuclear fusion!" He was smiling. And I was proud, I was excited, and I didn't know anything more than I'd read in the newspaper. All I could say was that something huge might have happened.

It hadn't. That whole castle began to crumble back into an entropic sandpile as the tide came in over it. It took months, it took years, but it's pretty safe to say that dream is as good as dead, despite occasional odd reports. (Or because that's all there are.) But for a while there, I knew what Wordsworth was talking about when he wrote "Bliss it was in that dawn to be alive", although in a better cause, I hope. I was absolutely elated by the news, by what it could mean - cheap energy, a transformed world, oh, the usual. But just as much, what excited me was the thought that discoveries like this were still out there to be made. The world was strange and it had surprises up its sleeve.

I'd waited to hear more about the sonochemical fusion work ever since it came out, but after a few weeks everything was quiet. No arguing, no counterclaims - I'd already had sinking feelings of regret that another attempt at a breakthrough wasn't working out. So it was another surprise when I saw the news. Winter's beginning to break here - I'd been able to enjoy the temperature when I went out to get the paper from the yard. And here it was, the headline about "fusion results replicated."

The hair stood up on my arms and on my neck. But this time, instead of pacing around alone in my lab, I tried, in the kitchen of our house, to explain to my two small children why I'd jumped up like that. I've been somewhat altered by time and oxygen, but I'm glad that such things might still happen, and that I can still react like this when they do.

Comments (0) | Category: General Scientific News

January 8, 2004

A Request From Biology

Email This Entry

Posted by Derek

A fellow researcher, working over at The Competition, sent along a couple of good questions. He's a biologist, and was reading the posts here earlier this week about compound repositories. He writes:

"I would like to pose to you a question I have tried for years to get chemists to address: why has no one yet come up with a better "universal" solvent than DMSO for dissolving compounds for routine screening in assays? As you point out, DMSO has numerous liabilities for this purpose (perhaps more on the biological side), and yet we all continue to use it routinely, because there doesn't seem to be a better alternative. In the various screening labs in which I have been involved for the past dozen or so years, we have tested a number of possible alternative solvents on an ad hoc basis, none of which seemed any better. Surely modern organic chemistry can do better?"

All I can offer my colleague is this: modern organic chemistry may not be quite as powerful as you've been led to believe. Perhaps the problem is that you've been listening to too many of us modern organic chemists. We do tend to go on.

He's right that DMSO has its down side, and there are some that I didn't even mention. For one, DMSO and air make for a decent oxidizing system, enough to cause trouble in electron-rich molecules. Things will start to change color on you in a DMSO solution that's been left open. And it does have its biological problems. Too much DMSO in an assay system will cause the proteins involved to change their conformations, probably inactivating them. At the very least, your data start to go haywire.

Is there anything we mighty chemists can do about this? Well. . .actually. . .no, probably not a whole lot. The problem is, anything that has "universal solvent" properties is going to have "universal denaturant" properties when it comes to large biomolecules. Proteins, carbohydrates, and nucleic acids are made to hang around with water. Anything that isn't water is going to cause trouble, sooner or later.

Coming up with a solvent that acts just like water, but isn't, may well be impossible. Water's just too weird. Its boiling point and viscosity are way off the estimates you'd get from looking at related things like ammonia and hydrogen sulfide, due to its extraordinary hydrogen-bonding powers. And it's those bonds that do the trick with biomolecules. It's sui generis - there's no other molecule that small that can do hydrogen bonds that strongly, at those angles, in two directions at once.
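To put rough numbers on "way off the estimates": here are literature boiling points at one atmosphere for water and two neighboring small hydrides (the values below are my own addition, rounded, for illustration):

```python
# Boiling points (deg C, 1 atm), rounded literature values.
BOILING_POINT_C = {
    "H2O (water)":            100.0,
    "NH3 (ammonia)":          -33.3,
    "H2S (hydrogen sulfide)": -60.0,
}

# Naive periodic-trend reasoning says the heavier H2S should boil
# at a higher temperature than water. Hydrogen bonding instead puts
# water about 160 degrees above its group-VI neighbor.
gap = BOILING_POINT_C["H2O (water)"] - BOILING_POINT_C["H2S (hydrogen sulfide)"]
print(f"Water boils {gap:.0f} C above hydrogen sulfide")
```

That 160-degree anomaly is the hydrogen-bonding network at work, and it's the same network that makes water irreplaceable for biomolecules.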

DMSO gets by because it's also small (although not as small as water.) It's the smallest sulfoxide possible, so it has the most character. The key is that the sulfur and oxygen atoms in the sulfoxide bond have a lot of charge on them - the oxygen's nearly a minus charge; the sulfur's nearly a plus. That dipole lets it really line up with any polar groups a molecule might have, and its two methyl groups give it a chance to dissolve some hydrocarbons that water won't accept. (Larger sulfoxide analogs just add greasiness, and are less powerful solvents. That's the wrong direction, and you can't go any further the other way.)

And all the other attempts at DMSO substitutes tend to follow that same path, things that are polar because of their high dipole moments. Some of the also-rans are N-methylpyrrolidone (NMP), DMPU, and the toxic HMPA. None of them are as good as DMSO, and they all suffer from its disadvantages. There may well be some funky structures out there that haven't been given a fair hearing, but I sure can't think of many right now. I'm afraid that we're just going to have to live with DMSO, and respect water's magical powers for what they are.

Comments (0) + TrackBacks (0) | Category: Drug Assays | General Scientific News

December 17, 2002

Looking Back, Looking Forward

Email This Entry

Posted by Derek

I've been Christmas shopping for my two kids (ages 4 and 2 1/2) and have seen plenty of things that they'll be getting when they're older. Like chemistry sets - although the ones they sell now (standard chemist's complaint coming) are wimpy and underpowered.

But there are ways of fixing that. When I was around 9 years old, I was given a chemistry set augmented through the efforts of my father, who rounded up some more interesting chemicals than the ones provided. For example, there was a 100 gram cardboard container of copper (II) sulfate, whose deep blue crystals I was immediately taken with. (I haven't had to use that compound in several years now, but every time I see it in the lab I recognize an old friend, unchanged.)

There was some sodium potassium tartrate, which meant, years later, that I was probably the only first-year chemistry student at my college who knew what "Rochelle salt" was. I still use that one once in a while, too, to complex out aluminum from a reduction - since it doesn't have a distinctive color, though, I have to remind myself every so often that I'm using a compound that I've known since I was ten. Dishing out its rod-shaped crystals is more of a reminder than the solution can be.

Rochelle salt is pretty innocuous. But my father had bought some potassium permanganate, which is a rather high oxidation state for a kid to have on his shelf. You can get in some actual trouble with permanganate, since it would just as soon slide down to the mud of manganese dioxide and ditch plenty of energy along the way. I was always amazed by its color, the very definition of purple as it dissolved in water. (If you left the solutions standing around, though, they would find a way to turn back into muck.) As I experimented with it, I came across several mixtures that gave it room to run, sometimes violently, with plenty of heat and fizz.

Some of those involved elemental sulfur, so I'm sure I got some whiffs of sulfur dioxide and other odd gases along the way. I used plenty of that stuff in a research project at my former company, but its smell induced no Proustian recollections - getting a good dose of it was more like standing in a steaming extraterrestrial swamp. What takes me back immediately, though, is powdered sulfur itself. Every time I come across it, I think "Now that smells like a chemistry lab!"

I didn't have any of the things that I use more often now - no hexane, no ethyl acetate or any of the other dozen solvents under my fume hood. The only organic solvent I had was some carbon tetrachloride, for a butterfly kill jar, and I never used it with the rest of the chemicals, since I knew that the salts wouldn't dissolve in it. Certainly I had no air-sensitive reagents, and probably a good thing, too. (I was impressed to read in Oliver Sacks's Uncle Tungsten that he had actually prepared things like phosphine as a boy - that's something I'd think twice about handling even now.)

Actually, my ten-year-old self would have been a bit puzzled by a career choice in organic chemistry, as opposed to inorganic (although the news that I was a working scientist would have gone over well.) I had my father's college copy of the CRC Handbook, from back in the days when it really was a handbook, and I used to skip over the organic chemical tables and the half-breed organometallics. Those sections of the book had to wait a few years to become intelligible to me (like the tables of integrals - what an odd feeling it was to leaf through those after taking calculus and suddenly finding myself able to read them.)

That handbook is up on a shelf in this room as I write, over my left shoulder. And my two children are asleep down the hall. A few years from now, we'll all sit down together.

Comments (0) + TrackBacks (0) | Category: General Scientific News

December 1, 2002

Place Your Bets

Email This Entry

Posted by Derek

Something I mentioned in a post last week got me thinking. . .does anyone want to put some money down on whether the European Union will accept the new strain of rice I was speaking about? After all, it's genetically engineered, no doubt about it - what's more, it has genes that didn't even come from plants at all, but were spliced in from bacteria. Sounds just like the sort of thing that they've been putting their feet down about.

That wouldn't be much of a problem, normally - not a heck of a lot of rice gets grown in Europe (well, Arborio strains in Italy, yeah, but most of the rest of the continent isn't really warm enough.) And it's not like the enhanced cold tolerance of the new plants will convince European farmers to start growing it, either, because - genetic fears aside - the EU already produces more food than it knows what to do with.

No, the problem is that other, poorer, countries have been leery of growing genetically modified crops because they trade with the EU. And the Europeans are worried that some of these modified strains might make it, by mistake, into their own countries. You may have read about Zambia (not a country that can really afford to turn down free food) rejecting offers of grain from the US because of fears of European retaliation. A recent effort by Denmark has dragged several other European countries, kicking and screaming, into accepting small amounts of inadvertently mixed genetically modified grain, but at a very strict level. Perhaps more African nations will feel safe feeding their starving populations with free food, once everyone in Brussels thinks about the situation a while longer in some really good restaurants. (A cheap shot, I know, but this sort of thing really gets on my nerves.)

So, how about it? Will Europe nervously sidle away from evil Franken-rice - part grain, part bacteria, all terrifying? Or will they have come slightly back to their senses by the time this lifesaving innovation is released to the public domain?

Comments (0) + TrackBacks (0) | Category: General Scientific News

November 26, 2002

Unequivocal Good News

Email This Entry

Posted by Derek

You may have noticed the recent report about the creation of an engineered stress-resistant rice. This looks like a real triumph of plant biotechnology, and it's the best news I've seen in some time.

The idea behind this work has been floating around for some time (and the Cornell team that succeeded has itself been at it since at least 1996.) There's a sugar called trehalose (chemically, two glucose molecules attached roughly head-to-head) that's been known for many years as a biological preservative. Everything that can survive severe drying, from yeast and bacteria on up, seems to produce this stuff under stress. The best guess about its function is that it replaces the water that would normally surround key proteins and cell membranes, and stabilizes them until (and during) rehydration. Trehalose tends to form a noncrystalline glassy solid state, which probably is what happens inside the cells. (By the way, it's completely nontoxic, and found in many foodstuffs already.)

Plants have been engineered to produce it before, but there have been problems. If the sugar is produced all through the plant, there's often some stunted growth or other odd effects - these plants showed drought tolerance, but that correlated pretty well with how weird they looked. One key was to get the gene expressed only in chloroplasts, the chlorophyll-containing organelles that do the metabolic heavy lifting (in the same way that mitochondria do it outside the plant world.) That has the added advantage of making the gene(s) much harder to transfer to other plants in the wild.

The Cornell team managed to get a lot of control over how the sugar is expressed - with different genetic promoters, they can cause it to show up in different parts of the rice plant, or under different conditions (only under stress, for example.) That and an improvement in the gene that was used seem to have done the trick.

So what sort of rice plant is this? One that actually seems to do a better job of photosynthesis, for reasons that aren't really clear yet. One that can take salt-water conditions, stand 10-degree lower temperatures, and stand up to ten-day droughts. Any of these will kill a normal rice plant, but these survive. This is going to open up huge marginal areas to cultivation.

Do you suppose the European Union will ban these plants? Can you just see activists pulling them out of the ground? Do you think it's any coincidence at all that this result was realized in a country that sets researchers free? The Cornell group has already announced that they're going to release this technique to the public domain, as a benefit to mankind. The same technique looks to be applicable to corn, soybeans, wheat - you name it. If the promise of this work is realized, a huge step has been taken to alleviate human suffering. I'm as happy as I can be about this, and I'd like to salute the people who made it happen. And to take a little time to reflect about what tremendous things can be accomplished, even here at the beginning of our knowledge. . .

Comments (0) + TrackBacks (0) | Category: General Scientific News

November 24, 2002

A New Form of Hype, uh, Life?

Email This Entry

Posted by Derek

Craig Venter and his wife Claire Fraser had a controversial effort going a few years ago, known as the "minimal life" project. The question was: what's the smallest number of genes an organism can have and still function? They got reasonably far along with it, then shelved it for a number of reasons. I'd seen interviews with Venter recently where he was mentioning it again, so the news that the project is underway again didn't come as a shock.

You wouldn't want to look for a minimal gene set starting from something as complex as a mammal - or any decent-sized organism, for that matter. They picked a really small single-celled creature called a Mycoplasma. It's a good choice, because they're pretty minimal organisms to start with - they're so small (and can be so hard to detect) that over the years they've been a major pain in the neck as a contaminant in cell culture labs. The species that they chose, for example, has only 517 genes. The plan is to knock them out systematically, one at a time, and see if the resulting organism can survive (and if it can, what its limitations might be.) It's a large project, but not a prohibitive one - before the project was mothballed in 1999, they'd already narrowed the list down to about 300 genes.

There could be some complications: for example, depending on the order in which things get knocked out, you could end up assuming that a particular function is non-removable when in fact it could be part of a system that has to be taken out all-or-none. These patterns will probably become apparent as the work goes on, and should provide some interesting information.

Once they get down to the minimal instruction set, we get into the territory that unnerves people. The amount of DNA that we're talking about is still going to be long, but in relative terms it might well be short enough to produce in a DNA synthesizer. If you do that, and toss it into a cell that's had all its genetic material stripped out, you could bootstrap your own organism. That'll be weird in two ways: it'll be a new species, made up on the spot, and it'll have been made, to some extent, from reagents on the shelf.

There shouldn't be anything unsettling about that, but to many people there is. It's true that a virus has recently been produced in the same way, but most people (including me) don't really think of viruses as being living organisms. This experiment doesn't bother me, partly because they're still relying on all the already-built cellular machinery to accept the new genetic material. Building a whole cell from scratch would be a much greater effort, one that I really don't think anyone can swing yet. But they will, at some point. . .vitalism dies hard: it's going to be interesting to see the press coverage when it finally bites the dust in wide-screen stereo sound.

Editorials are already appearing - the link takes you to the Washington Post, which seems not to have been keeping up with the pace of molecular biology. It's a little late to worry about "a living thing that is at least partly a human creation," guys. The bacteria that have been engineered into making interferon and insulin for us for years now are partly a human creation, you know, as are uncounted recombinant cell lines throughout academia and industry. If you want to get picky about it, chihuahuas and sweet corn are partly human creations, too: we just took longer to make those because we didn't have very good tools.

Some of that coverage is going to be breathless what-if-the-new-life-form-escapes stuff, no doubt. I think the answer should be clear to anyone who's thought about the biology: what will happen is, the organism will be outcompeted very quickly and die out. Think about it - if it were easy for an organism to survive in the wild with such a small genetic code, there'd be some critters out there doing it. Perhaps when life was just getting going it was possible to get away with it, but not now, after billions of years of fine-tuning. As Neal Stephenson colorfully puts it near the beginning of his Cryptonomicon:

Like every other creature on the face of the earth, (he) was, by birthright, a stupendous badass, albeit in the somewhat narrow technical sense that he could trace his ancestry back up a long line of slightly less highly evolved stupendous badasses to that first self-replicating gizmo - which, given the number and variety of its descendants, might justifiably be described as the most stupendous badass of all time. Everyone and everything that wasn't a stupendous badass was dead.

No, this is going to be one finicky creature, able to survive only where everything is built for its pleasure. For that reason, I'm not sure about its usefulness, either, as a platform for adding new functions. Venter's been talking about using this to engineer an organism that will be able to make hydrogen for fuel cells, or take up carbon dioxide to ameliorate greenhouse warming. I'd have to classify both those reasons as, well, hooey. There are already plenty of organisms that will take up carbon dioxide. You probably have some big ones growing in your back yard, and they're a lot more robust than this organism will be. It'll still be frail enough to be a tremendous headache to culture and keep happy, unless you start adding stuff back into it to make it more robust and free-living. And if you're going to do that, why not start off with something that's already been optimized for being robust and free-living? Don't get me wrong: there are a lot of good scientific reasons to do this work. I just worry that the explanations offered for the general public are - at the very least - inadequate.

The whole project could be explained in terms of cars and trucks: what we have here is an attempt to disassemble a small car down to the most primitive conveyance possible, by removing parts one by one until nothing extraneous remains. This stripped-down go-cart will indeed be a new vehicle, one that's so simple that it could be built from things lying around the house, stuff that you wouldn't normally associate with cars at all. People who think that you need a huge factory to build a car will be amazed. But this thing won't stand a chance on the open road, and will probably barely make it around your back yard on a warm day. You'll be able to turn it into any sort of motorized thing you want, by adding stuff back on to it - but you might be better off taking one of the more complicated, capable cars that zip around on the main roads and working on that instead. (It's been a while since I came across a metaphor I could, uh, ride that far. . .)

Comments (0) + TrackBacks (0) | Category: General Scientific News

November 20, 2002

What I, um, Meant to, um, Say

Email This Entry

Posted by Derek

I was talking with a friend at another company, and we both had occasion to recall executives we've heard who seemed unable to give a coherent speech. You've heard the sort of thing: unfocused thoughts drift by, like plastic bags being blown around an abandoned lot. . .no thought makes it all the way through a sentence before another one lands on top of it, splintering it with an irreversible crack. . .main points are composted under a heap of irrelevant clippings. I've spoken about this before, in reference to Sam Waksal and people who are very smooth at presenting their own work in the best possible light. This is at the other end of the scale - people who should, you'd think, be a lot slicker than they are.

How does someone get to a position like that and remain so inarticulate? This question comes up in politics as well, and in both cases I think it's because the person must be a lot better one-on-one or in small groups. There are plenty of people who can handle themselves well in a conversation who can't give a decent talk. (Not that I really can relate to that - I'm rarely tongue-tied, although I can think of a few times when I would have been better off that way.)

There are plenty of professors that bring the same question to mind, of course. I can't safely quote from the executives that I'm thinking of, but here's a sample of one of the worst professors of my experience. (Note: the subject matter has been changed to a cake recipe, to protect the guilty.)

"OK, you remember that last time we were going to learn how to mix a - well, I think I told you that we were going to try one of these, and if I didn't, then - this is a little like the stuff that we're actually going to get to next week, except that that doesn't have so many eggs in it, because eggs, well, eggs are a tricky thing because they have, they have the protein in them that makes stuff - well, that's not something that we're going to get into for a while, but at any rate you may know that the egg white has a lot of, a lot of properties that are really useful when you try to whip things up with a lot of air in them, which is, which isn't what the cake today really has, actually, because this one starts out with this mixture that I think I told you about last time - I'm not sure if we finished the entire thing, so just try to remember where we left off and sort of, sort of. . ."

I'm not exaggerating. I can round up witnesses that heard this person lecture - not on baking, but by the time he got through making a batter out of his chosen field, it might as well have been. You'd have been able to learn just as much about layer cakes as you could about the subject matter of the course. I would sit there for the entire hour and no note-taking impulse would ever trigger my hand to move. I've seen some good lecturers, and plenty of mediocre ones, but I've only come across a couple that actually could do you harm. You could feel yourself becoming less intelligent as you sat there; the only way to handle the course was to make sure to miss as many lectures as possible.

So, at what point does someone really think like they talk?

Comments (0) + TrackBacks (0) | Category: General Scientific News

November 14, 2002

Left and Right, Revisited

Email This Entry

Posted by Derek

I've posted a correction in my original post below. And I've also located the paper that got me thinking about the whole thing. These folks knocked out a protein that's needed for nodal cilia to form in the embryo - what they got were nonviable mouse embryos that were left/right randomized. The flow produced by the normal cilia goes to the left, and they believe that this is a key left-right differentiation pathway. (You can see Quicktime movies of the cilia here.)

So although I botched the prokaryotic/eukaryotic cilia/flagella details, I think my point stands. I've had some interesting e-mail about my thoughts on human perception of chirality - as soon as I wrap my brain around them, I'll post some more. For all six or eight of you that care, that is - now that's service!

Comments (0) + TrackBacks (0) | Category: General Scientific News

November 12, 2002

Y'all Are Going to Think I'm Nuts, But. . .

Email This Entry

Posted by Derek

. . .here's a question that has bothered me: How do we know our right from our left? No, really. The more I've learned (and internalized) about chirality, the more tricky this question gets. (Until you've thought about handedness and non-superimposability for a while, these things just seem natural, of course. You have to train yourself to get this weird.)

But what we learn in chemistry is that chiral objects cannot be distinguished by an achiral probe. For example, you can't use plain silica gel columns to separate enantiomers; you have to pay through the nose for columns with chiral stuff on them. The thing is, we humans use chiral probes every day (our hands), so we take the ability to discriminate chiral objects for granted. We shouldn't, though, because the next question is: what chiral probe do we use to tell which hand is which?
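The achiral-probe point can actually be demonstrated numerically. Here's a toy model (geometry only, not real chemistry): the sorted list of pairwise distances between four points is achiral information, and it comes out identical for a shape and its mirror image - while the signed volume of the tetrahedron they form is itself a handed quantity, and it flips sign. An achiral measurement can't see chirality; only a chiral one can.

```python
import itertools

def pairwise_distances(points):
    # Achiral information: the full sorted set of point-to-point distances.
    # A reflection is an isometry, so this list can't tell enantiomers apart.
    return sorted(
        round(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5, 9)
        for p, q in itertools.combinations(points, 2)
    )

def signed_volume(points):
    # Chiral information: the signed volume of the tetrahedron formed by
    # four points (a 3x3 determinant). Reflection flips its sign.
    (x0, y0, z0), (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = points
    a = (x1 - x0, y1 - y0, z1 - z0)
    b = (x2 - x0, y2 - y0, z2 - z0)
    c = (x3 - x0, y3 - y0, z3 - z0)
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
          - a[1] * (b[0] * c[2] - b[2] * c[0])
          + a[2] * (b[0] * c[1] - b[1] * c[0]))

# Four non-coplanar points (a "chiral molecule") and their mirror image.
mol = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 3.0)]
mirror = [(-x, y, z) for x, y, z in mol]

print(pairwise_distances(mol) == pairwise_distances(mirror))  # True
print(signed_volume(mol), signed_volume(mirror))              # 6.0 -6.0
```

The determinant plays the role of the chiral probe here: you need a handed reference (a sign convention) to get an answer that distinguishes the two forms.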

There, that's the original question restated. We're bilaterally symmetric, right? In stereochemistry terms, we're meso, with our own built-in reflection plane, and we shouldn't be able to distinguish chiral objects. Of course, we're not really that symmetric. We have identifying marks on each arm and hand, usually, that would give the game away. And faces often have the same sort of thing (sometimes a deliberately applied "beauty mark," which is interesting when you consider that research seems to say that the most beautiful perceived faces are the most symmetric ones.)

But I don't think that that's the real answer to my question. Where we really start to lose symmetry is in our internal organs. As everyone knows, the heart is on the left side (except in rare cases!), and the other organs follow suit in their own positions. The organ that I'm thinking of is the brain, which looks rather symmetric, true, but is about as full of handedness as an organ can get. If you're right-handed, as is well known, you do a lot of your verbal processing in your left hemisphere, and a lot of non-verbal work in your right. And your eyes each feed into the crossover hemisphere (which has allowed some really alarming experiments with brain surgery patients that we'll have to talk about some time.)

And that's where I think the origin of our ability to perceive chirality lies. Our information-processing organ itself is chiral. But let's keep going: it's worth asking how the brain (and the rest of our internal arrangement) got that way, when you consider that we started out from a single cell (which then divided straight down the middle.) A lot of research has gone into answering that, and determining the earliest stage at which the blastula breaks symmetry.

The latest theory, I believe, is that molecular signals and growth factors circulate around the outside of the developing cells in a handed fashion, and that this may be the origin of the asymmetry. So where does this chiral flow come from? Well, it's driven by cilia on the cell surface, and it's well known that these always turn in one direction. (The mechanism (PDF file) is fascinating; it's a true molecular motor. You can just picture it as a Victorian-era machine, all polished brass and oiled fittings.) Correction: This is a bacterial flagellum, not a eukaryotic one. Our flagella and cilia work differently, so this picture (though still very interesting) isn't relevant.

And the components of this machine are all different proteins. Which means that the direction of their motion relative to each other is determined by their three-dimensional shape, which is determined by the twists and turns of their constituent amino acids. . .which are chiral. And we're back to single molecules again.

So that's how you can tell your right from your left, as far as I can see. Simple, really.

Comments (0) + TrackBacks (0) | Category: General Scientific News

November 4, 2002

The Good Old Days of Really Bad Teeth, Revisited

Email This Entry

Posted by Derek

I was happy to see that Instapundit linked to my anti-Rousseau rant the other day. I hope it was therapeutic for everyone. I've received some interesting e-mail in response to it (none, yet, from any dentists.) There was one today from an archaeologist, though, who pointed out that the Indian populations who depended on corn typically ground it by hand between stones. This introduced a generous amount of grit into the resulting meal, which really did a severe job on the customer's teeth over the years. (Having seen and handled some of the grinding stones, I can attest to their grit-supplementing powers. The ones I remember were worn into sloping bowl shapes in the middle, and all that extra rock powder had to go somewhere. . .)

This wear allowed decay to set in all the more easily, and to cause quicker damage to the tooth once it did. The sugar content of the corn just added fuel to the bacterial fire, too. One thing that I hadn't thought about is what all this powdered rock did to the GI tracts of the consumers. It seems like a surefire recipe for intestinal trouble - I mean, fiber's a good thing and all, but no one's suggesting that folks eat handfuls of polishing compound. Does anyone have any information (or informed speculation)?

Comments (0) + TrackBacks (0) | Category: General Scientific News

October 29, 2002

Et in Arcadia Ego

Email This Entry

Posted by Derek

There's a backlog of pharmaceutical news to catch up on, but I couldn't resist linking to this article from today's New York Times. It's a pet subject of mine, and the only fault I can find is the tone of surprise that comes through in it.

It's titled "Don't Blame Columbus," and it reports on studies on health and life expectancy in Pre-Columbian America. It's the most comprehensive look at the subject yet. Most of this information comes from the bones, naturally, but there's a lot of good information there. Unfortunately for the previous owners of said skeletons, the story they tell is often one of anemia, osteomyelitis, tuberculosis and malnutrition. But at least you didn't have to put up with that for too long: living to the age of 50 was a rare accomplishment - 35 to 40 was more like it.

The study found a long-term decline in health as the populations grew in different areas, which is interesting. But any surprise people have at the general results surprises me. When my brother and I were small children, we accompanied our parents to archaeological digs back in Arkansas. My father was a dentist, and he was there for some forensic work on the teeth of the Indian remains. What he told me back then has stayed with me: these folks had lousy teeth. They had cavities, they had abscesses, impactions, the lot. (The weakened condition of their gums due to lack of Vitamin C probably had a lot to do with it.)

So, growing up, I knew that the Hollywood depiction of Indian life was rather idealized. For one thing, all the movie actors had great teeth. And the young braves weren't like those 24-year-old actors - they were maybe 14. And the ancient medicine man, he wasn't 80 years old at all. He was in his 40s; he just looked 80. You never saw extra tribesmen in the background, hobbling around because of poorly set broken bones or clutching their jaws in pain. No skin problems, no infections, not even so much as a bad allergy - no doubt about it, the tribe to belong to was MGM.

You can imagine how I feel about the rest of the cheap thinking that goes along these lines. Oh, the way preindustrial cultures loved the land, lived in harmony with it while everyone ate the wholesome diet of natural purity and stayed true to those simple values that we've lost touch with. . .spare me. I'm with Hobbes: the life of man in the natural state was solitary, poor, nasty, brutish and short. And let's not forget it.

I'd like to blame Rousseau for the whole thing - after all, he's the usual suspect for introducing the whole Noble Savage concept. (He extended the concept to children, too, of course. Who knows what the history of philosophy would be like had he actually raised any of his brood instead of farming them all out?) But I think that the sources of this mistake - which it is, a terrible one - go deep into human nature. No matter where you go, it seems that there's always a myth of the Golden Age, the simple, pure time when everything was right.

Manure. Fertilizer. The only thing worse than mourning this illusion is trying to do something about it: you could always set up some wonderful political system to bring Arcadia back. The last hundred years have been a stupefying object lesson in what you get when you try.

Well, enough venting for one evening. I seem to have taken off into the clouds of political theory, not bad mileage considering that I started from a pathologist's report. Tomorrow we'll be back on the ground, I promise.

Comments (0) + TrackBacks (0) | Category: General Scientific News

October 20, 2002

Faces In the Clouds

Email This Entry

Posted by Derek

In the last post I mentioned the tendency people have to look for causes. It's innate; there's nothing to be done. We're conditioned by the world of our senses: a leaf falls in front of us, so we look up to find the tree. And this works fine, most of the time, for the macroscopic objects that we can see, touch, hear and smell.

It stops working so well on the microscopic scale. (And it goes completely to pieces on the really submicroscopic scale, when the colors of quantum mechanics start to seep through into the picture, but that's another story.) When we don't have direct sensory experience of the steps in a process, our intuition can be crippled. You can learn your way around the problems, but that has to be a conscious effort - the rules that we've all been practicing since birth won't be enough.

And this is where a lot of really bad ideas are born. Take the idea of a "cancer cluster." If we see a pile of large rocks with no others around, we assume that something moved them there. If we see a group of similar plants, we assume that they've grown there together from seeds or roots. But what is there to say when there's a group of cancer victims in a given area?

The temptation is overwhelming to say "something put them there." But it doesn't have to be so. People who haven't thought much about statistics don't usually have a good feel for what "random" means. It doesn't mean "evenly scattered, with no particular pattern." It means "no particular pattern, and let the chips fall where they may." Looked at locally, a large random distribution isn't even at all - it's lumpy and patchy. Show a dozen untrained eyes a large scatterplot of random numbers and they'll never guess that there's no design behind it. Surely that bunch down there means something? And that swath that cuts over this way! Imposing patterns is what we do.

Discriminating between these accidental groups and any that might have a cause is fiendishly difficult. Generally, the only proof is statistical - you end up saying that you can't reject the null hypothesis, that this group is not larger than you would expect by chance. So in the absence of any hypothetical cause, there's no reason to assume that it's anything other than noise. Does that convince anyone? No one that really needs the convincing.
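That lumpiness takes about ten lines to demonstrate. Here's a toy simulation (the numbers are made up for illustration, not drawn from any real epidemiology): scatter a thousand "cases" uniformly at random across a hundred equal districts, so each district expects ten, and then look at the worst district - the one a headline writer would call a cluster.

```python
import random
from collections import Counter

random.seed(17)  # fixed seed so the run is reproducible

# Scatter 1,000 "cases" uniformly at random over a 10 x 10 grid of
# equally sized districts (hypothetical numbers, for illustration).
N_CASES, GRID = 1000, 10
counts = Counter(
    (random.randrange(GRID), random.randrange(GRID)) for _ in range(N_CASES)
)

expected = N_CASES / (GRID * GRID)   # 10 cases per district, on average
worst = max(counts.values())         # the "cluster" someone will find
print(f"expected per district: {expected}, worst district: {worst}")
```

Under pure randomness the per-district counts are roughly Poisson with mean 10, and the maximum over a hundred districts will routinely land near twice the mean - no cause required. The hard statistical question isn't "is there a lump somewhere?" (there always is) but "is this lump bigger than the biggest one chance alone would hand you?"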

Statistics are all that'll save you, though, because the alternative is just noise and advocacy: trying to settle arguments by who's louder and more convinced that they're right. People who understand the math get upset when they argue with people who don't, because they can't make themselves understood. Their best evidence is in a language that the other side can't speak. Likewise, the advocates get terribly frustrated with the statistics-mongers, because they seem to be in the business of denying what's right in front of their eyes.

And that's why scientists and engineers are so happy to talk with other scientists and engineers. It's not that there aren't arguments - oh yeah, plenty of 'em - but there's at least a chance that you can convince people with data. Outside of those fields, I've come increasingly to think, the chances of doing that are often minimal.

Comments (0) + TrackBacks (0) | Category: General Scientific News

October 16, 2002

Cloning's Growing Pains

Email This Entry

Posted by Derek

Ian Wilmut and his colleagues have an interesting review in a recent issue of Nature (no web link) on the status of mammalian cloning. It's still so difficult that it almost qualifies as a stunt. Several species have had the nuclear-transfer technique that produced Dolly the sheep applied successfully (if you can use that word for a technique that has at least a 95% failure rate.) But others haven't, and it's not clear why some work and some don't (for example, mice and rats, respectively.)

What the article makes very clear is that the animals produced in this way are far from normal, and that we don't even have a good handle on the extent of their abnormalities yet. (In cases like cattle, we're going to have to wait years to see how they age, for one thing.) They point out that close examination of even the young cloned animals turns up differences, many of which are probably deleterious.

Why all the problems? Shouldn't the genetic material in the new nucleus just pop right into the cell and go to work? These experiments have been a dramatic demonstration that any scheme that treats a cell and its nucleus as separate entities has serious shortcomings. There are epigenetic influences at work (changes in inheritance by means other than changes in DNA sequence), and we're just barely starting to understand them. Subtle chemical changes in the DNA bases and their associated proteins can lead to large differences down the line, and it appears that some of these signals get scrambled and mismatched during the nuclear transfer.

These are not secrets; the workers in this area have been very upfront about all these problems. (Wilmut published an article not long ago titled "Are There Any Normal Cloned Mammals?") Many researchers are using the technique because these problems exist, actually - it's a good way to study phenomena that would otherwise be hard to unravel. But all this makes me think that those persistent reports of a cloned human baby are probably nonsense.

They'd better be. With the state of the art being what it is, anyone who has actually tried this on humans is going to have a lot to answer for.

Comments (0) + TrackBacks (0) | Category: General Scientific News

October 13, 2002

Nobelity and Lesser Nobelity

Posted by Derek

When I referred to Nobels this year as being well-deserved, that got me to thinking. How many scientific Nobels haven't been? If you go back to the early years of the awards, there actually are some stinkers. And there are a few mild head-scratchers, like Einstein winning for the photoelectric effect (rather than the still-controversial-at-the-time relativity.) But in recent decades, it's hard to find many problematic Chemistry, Medicine, or Physics prizes. As a chemist, when I look back over the list of laureates in my field, I don't find much to argue about.

The three-awardee limitation has caused problems now and then. And the timing of the awards in general has been arguable - sometimes the committees just wait too long before honoring someone. (Maybe that's why the Karolinska folks went out on a limb by honoring Stanley Prusiner and the prion hypothesis a couple of years back, probably the most out-on-the-edge medicine Nobel ever. Fortunately, the hypothesis seems to be holding up.) And there are always people that could have won, but never did.

But just check out the other prizes - you couldn't ask for a clearer example of the differences between the sciences and the humanities. Whatever controversies there are in the science prizes start to just look like quibbling compared to what goes on with Literature and Peace. Think of the percentage of those that have been won by people considered by many to be frauds or windbags. Then subtract out the nonentities, and make allowances for clearly deserving candidates who never won - and what do you have left?

You find blunders on the order of say, tapping Maurice Maeterlinck over Tolstoy for Literature. And while we're on the subject, how about ignoring James Joyce, ignoring Vladimir Nabokov, Jorge Luis Borges. . .write your own list, it's easy. Meanwhile, just in English-speaking awards, we have Sinclair Lewis (hrm,) John Steinbeck (hrmmm,) Pearl Buck (hrrrrrrmmmmm.) Other languages get to join the fun, too - how about Dario Fo in Italian? Why not give it to George Carlin while you're at it, a reasonably close equivalent in English?

I won't even get started on the Peace prize. Deserving people and organizations have won it, but so have blood-drenched maniacs, delusional self-promoters, and insufferable twits. (You can attach your own names to those as you wish; I can assure you that I'm thinking of my selections right now.)

No, the science prizes are oases of sanity compared to those two: Literature winners who have produced nothing except rivers of drivel, and Peace winners who wouldn't know peace if it crawled up their leg. Heck, as long as they award the Chemistry prize to someone who knows what the periodic table is, we're ahead of the game.

Comments (0) + TrackBacks (0) | Category: General Scientific News

October 10, 2002

Another Stuffed Shirt

Posted by Derek

Talking about the Nobels brings to mind a story from Sydney Brenner, one of those honored with the Medicine prize this year. He related this story in a column he did for Current Biology a few years ago (8 (23), 19 Nov 1998, R825 if you want to look it up.) He was visiting a company in Japan ("W---- Pharmaceuticals") that made some sort of herbal brew from fermented garlic, which tasted just as awful as you'd guess. It had to be given in capsules, but the dose was large enough that they couldn't sell them filled without losing many of the packages to breakage and leaks. So (turning this into a marketing tool,) they sold the stuff as a kit, with empty capsules and a dropper to make your own dose.

Brenner mentioned that he'd like to try the stuff, so they trotted off and brought him one. While he was mixing up his garlic dose, he seems to have had an inspiration: he swallowed it, then cried out, gave a strangled gurgle, and pitched off his chair onto the floor.

Well, that got everyone's attention, as you can imagine. He relates that he kept one eye partially open to gauge the effect of his performance, and what he saw was a stunned roomful of Japanese businessmen with the blood draining from their faces. He claims to have noted a couple of expressions that he interpreted as preliminary thoughts about what to do with the body.

He let them off the hook pretty quickly, which was probably wise, springing to his feet, waving and laughing to the hysterical relief of his hosts. As he says:

"I am quite famous in Japan for this, and every now and then, somebody comes up to me, shaking their head, nudging me and saying 'W---- Pharmaceuticals!'"

My kind of guy! And the Nobel he shared is another well-deserved one. The roundworm C. elegans has been an extremely useful model organism, since it's multicellular, but not too much so. You can follow the fate of every single one of its cells as it develops, and some rather odd stuff happens. For example, as it turns out, not all of them make it. Particular excess cells die out at particular times, and this programmed cell death (apoptosis) has now been the subject of more research articles than you can shake an Eppendorf vial at. (That's what the mentions of "cancer treatments" in the popular press coverage of the prize were about - tumor cells generally should have fallen on their metabolic swords and died at some point, but mysteriously haven't.)

This work has set off discovery in all sorts of other areas, too. There are a surprising number of cellular pathways that are conserved all the way to humans, and it's a heck of a lot easier to study them in the worms. Looking for these is almost a guarantee of working on something fundamental, because anything that's similar across that sort of phylogenetic gap is bound to be pretty important. Getting a crib sheet to the key pathways along with a fine model organism, all in the same research program - that's how to do it, all right.

Comments (0) + TrackBacks (0) | Category: General Scientific News

October 9, 2002

Nobel Time!

Posted by Derek

Congratulations to John Fenn, Koichi Tanaka, and Kurt Wüthrich for sharing the 2002 Chemistry Nobel. The common theme is characterization of proteins and other macromolecules, and the discoveries are (respectively) electrospray ionization for mass spectrometry, laser desorption for the same, and 2-D NMR techniques.

I'll write more on this tonight, but for what my opinion's worth, I'd say these are well-deserved accolades for some important techniques that otherwise wouldn't be as well recognized.

Comments (0) + TrackBacks (0) | Category: General Scientific News

The Bigger They Are

Posted by Derek

The Chemistry Nobel this year doesn't include any household names, even by the standards of my branch of the science. But (as I said this morning,) I think the award is a good one. The ability to deal with large molecules like proteins as molecules is a relatively recent development. Before these sorts of methods were worked out, you stepped into another world when you worked with such things. The precision of "real" organic chemistry (such as it is!) disappeared.

A newly discovered protein might weigh, oh, 60,000 or so, give or take a few hundred units (or a few thousand.) That's pretty fuzzy, when you compare it to the world of small molecules, which can be measured out to four decimal places. (Doing that, you have to correct for picky things like the 1% abundance of isotopic carbon-13 atoms rather than the usual carbon-12 - not the sort of thing that kept the protein chemists up at night, that's for sure.) And the 3-D structure of your new beast? Good luck! Maybe it would come to you in a vision. . .failing that, you could try to crystallize it and hope for the best in an X-ray analysis. But many proteins don't crystallize (or at least don't crystallize well on human time scales,) many that do don't give good data, and even the ones that do don't always give you realistic structures. After all, the proteins floating around in your cells aren't packed in a crystal lattice with billions of their identical twins. You'd better hope they aren't, anyway. They're surrounded by water, other proteins, lipids, and who knows what.
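To put numbers on that four-decimal-place business, here's a minimal Python sketch of the monoisotopic-versus-average distinction. The atomic masses are standard values, and the example molecule (caffeine, C8H10N4O2) is just an illustration I've picked, not anything from the post:

```python
# Monoisotopic vs. average molecular weight: why a small molecule's mass
# can be quoted to four decimal places while a 60 kDa protein is fuzzy.
# Masses are standard values; caffeine is an arbitrary example molecule.

MONO = {"C": 12.000000, "H": 1.007825, "N": 14.003074, "O": 15.994915}
AVG  = {"C": 12.011,    "H": 1.008,    "N": 14.007,    "O": 15.999}

def mass(formula, table):
    """Sum atomic masses for a formula given as {element: count}."""
    return sum(table[el] * n for el, n in formula.items())

caffeine = {"C": 8, "H": 10, "N": 4, "O": 2}
mono = mass(caffeine, MONO)  # the all-12C, all-1H isotopologue
avg  = mass(caffeine, AVG)   # abundance-weighted average mass

print(f"monoisotopic: {mono:.4f}")  # 194.0804
print(f"average:      {avg:.2f}")   # 194.19
```

The gap between the two numbers comes almost entirely from that ~1% of carbon-13; for a protein with thousands of carbons, the isotope distribution smears the "molecular weight" into a broad envelope.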

So, before the mass spec techniques of today's prize were developed, you could put a big ol' protein into a mass spectrometer, sure - and get an extraordinary mess of fragments out the other end. That's not always bad, of course (one of the points of mass spectra is the information that the fragmentation pattern provides,) but you'd like to be able just to see the mass of the parent, too. Now we can. Ridiculously huge molecules can be made to fly off, intact, into the hard vacuum of the mass spectrometer, there to be sorted by mass and charge. A few years ago, some lunatics even tried this on an intact virus. (PDF file.) They ionized it (without destroying it, thanks to these methods) and flew it down the mass spectrometer. When they collected the virus particles at the other end, they were still infectious - the only viruses to survive an ionizing flight in a vacuum, if that's the verb to use with a virus. (Unless, of course, they're raining down on us from space, a possibility this experiment does not dispel.)
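The arithmetic behind reading out those big parent masses is pleasantly simple. An electrosprayed protein picks up some number z of protons and shows up at m/z = (M + z·mp)/z, so two adjacent charge-state peaks are enough to solve for both z and the neutral mass M. Here's a sketch; the peak values are invented for a hypothetical ~17 kDa protein, not taken from any real spectrum:

```python
# Electrospray charge-state deconvolution: recover the neutral mass M
# from two adjacent peaks, where a peak carrying z protons appears at
# m/z = (M + z*PROTON)/z. Peak values below are made up for illustration.

PROTON = 1.00728  # proton mass, Da

def deconvolve(mz_low, mz_high):
    """Given adjacent charge-state peaks (mz_low carries one more proton
    than mz_high), return (charge on mz_high, neutral mass M)."""
    z = (mz_low - PROTON) / (mz_high - mz_low)  # solve the two peak equations
    z = round(z)                                # charges are integers
    return z, z * (mz_high - PROTON)

z, M = deconvolve(808.198, 848.557)
print(f"charge {z}, neutral mass {M:.0f} Da")  # charge 20, ~16951 Da
```

In practice you average the M values from every peak pair in the envelope, which is part of why these measurements come out so precise.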

The same goes for NMR, the veg-o-matic analytical technique of the organic chemist (thanks to all the tricks you can play with it.) Here's a brief history: The original method (Nobel Prize!) showed you the hydrogens in a molecule, and that's still the first thing we do. Want to see the carbons, instead? You can tune it for that, as well as plenty of more exotic nuclei. Then, in the 1950s and early 1960s, it was discovered that the splitting of the NMR lines (coupling constants) would tell you the dihedral angle between the two adjacent hydrogens that caused it, and suddenly 3-D structural information began to be extracted (as well as another Nobel or two.) Then the nuclear Overhauser effect was exploited (if you don't know how NOE works, I'll need to see payment - in cash or precious metals - before I explain it. Inquire within.) That tells you if particular hydrogens are close together in space, no matter how the rest of the molecule might be connected. More 3-D structural information started to come into focus.
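That angle-coupling relationship is the Karplus equation, 3J(θ) ≈ A·cos²θ + B·cosθ + C. As a sketch, here it is in Python with one common general-purpose parameter set (A = 7.76, B = -1.10, C = 1.40 Hz); real systems need substituent-specific corrections, so treat the numbers as illustrative:

```python
# Karplus equation: predicted vicinal H-H coupling constant (Hz) as a
# function of the dihedral angle. The A, B, C parameters are one common
# general-purpose set; real molecules need substituent corrections.

import math

def karplus(theta_deg, A=7.76, B=-1.10, C=1.40):
    """3J coupling (Hz) for H-C-C-H dihedral angle theta (degrees)."""
    t = math.radians(theta_deg)
    return A * math.cos(t) ** 2 + B * math.cos(t) + C

# Couplings are largest near 0 and 180 degrees, smallest near 90:
for theta in (0, 60, 90, 120, 180):
    print(f"{theta:3d} deg -> {karplus(theta):5.2f} Hz")
```

The shape of that curve is the whole trick: a ~10 Hz coupling points to hydrogens near 180 degrees apart, a ~2 Hz coupling to something near 60-90, and stitching enough of those together gives you the 3-D picture.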

The next big thing was 2-D NMR spectra, where you could extract (among other things) all the possible coupling constants or all the possible NOEs simultaneously. (These sorts of techniques will take most small unknown molecules and nail their structure up on the wall in a matter of minutes or hours, the sort of thing that used to take years of hair-pulling effort.) Now we're getting to the area of today's Nobel: Applying these techniques to really complex molecules, like proteins, allows a look at their real structure. That's the structure in solution, with whatever added molecules you want or need. In short, it gives you a look at the real animal, instead of a stuffed and mounted version (which, as mentioned above, is more like what X-ray crystallography does.)

There are limitations. Some kinds and sizes of proteins don't give good spectra, and many of them live in environments too complex to (yet) be reproduced in an NMR experiment. But the field's moving right along. If we're going to realize the promise of medicinal chemistry, we're going to need as much of this sort of work as we can get. The molecules of the living cell aren't special - they're big, they're complex, they do amazing things - but they're just molecules. It's good to be able to work with them that way.

Note: for a very nice technical discussion of today's prizes, see this PDF from the Royal Swedish Academy. Try to avoid most newspaper articles, since (as usual) the subject matter of the awards will be unrecognizably diluted.

Comments (0) + TrackBacks (0) | Category: General Scientific News

August 14, 2002

Our Friend the Phosphate Group, Redux

Posted by Derek

By the way, just to introduce some medicinal chemistry into this week's postings, I should point out that there's another way in which kinases outnumber phosphatases: the number of inhibitors known. It's true that we went a long time without good structural classes of compounds to inhibit kinases, but the dam burst some years back.

Now we've beaten several classes of heterocyclic structures completely into the ground, and the patent landscape looks like Yasgur's farm after they got finished holding Woodstock. But we do have kinase inhibitors, and plenty of 'em. So where are the phosphatase blockers?

Look around the literature, and you see all sorts of odd and funky structures, but no unifying themes. The first outfit that finds a general drug-friendly structural template to go after these targets will have quite a franchise on its hands.

Comments (0) + TrackBacks (0) | Category: General Scientific News

August 13, 2002

Our Friend the Phosphate Group

Posted by Derek

As for phosphorylation, I've had some folks write to talk about the importance of phosphate cleavages for cellular energy production, and about the conformational effects of phosphorylation. All that's well taken - but I guess what I was getting at yesterday is that (for example) sulfation would seem to be a perfectly reasonable way to modify proteins. Why didn't life end up using it?

Perhaps the phosphate energy part is the key. That's such a basic mechanism that enzymes to handle phosphate groups must be archaic indeed. It could be that evolution just found a use for them, since they were there anyway, and that competing methods of post-translational modification (like sulfation) never got off the ground. Of course, there's always glycosylation - wonder when that kicked in, evolutionarily?

Comments (0) + TrackBacks (0) | Category: General Scientific News | Life As We (Don't) Know It

June 16, 2002

A Strange Compound, In Strange Places

Posted by Derek

A recent paper (Angew. Chem. Int. Ed. 41, 1740, for those with chemistry libraries at hand) illustrates some interesting things about "natural" and "unnatural" compounds.

It's well known that polychlorinated molecules (DDT, PCBs and others) are quite stable and persistent. Glenn Reynolds over at Instapundit stirred up some folks a few days ago with a reference to DDT, but no matter whose views you subscribe to, there's no denying that the stuff hangs around. One reason is that the compounds are quite lipophilic - they don't sit in aqueous solution waiting to react with things, and they tend to accumulate in lipid tissues of animals, which takes them out of circulation. Another reason is that polychlorinated compounds just aren't all that reactive in general; they're poor substrates for many standard reactions.

(As an aside, that's one of the reasons that the atmospheric effects of chlorofluorocarbons took such a long time to be recognized. These compounds are almost completely unreactive under biological conditions, and not much more lively under even forcing artificial ones. But no one had thought much about what would happen if they got into the upper atmosphere, where they could be hit by hard ultraviolet, the sort that doesn't make it to the ground. . .)

At any rate, there's been a clear distinction between polychlorinated man-made compounds and others which occur in nature. There are thousands of them, actually, many made by marine organisms (who have plenty of chlorine, bromine, and iodine to play around with in seawater.) These tend to be more water-soluble and reactive, though, and haven't been found to persist in tissues.

Until now, it seems. The paper I referred to gives leading references to an odd compound called Q1. It's been found in biological samples all over the world, and sometimes in some seriously large concentrations. Analysis showed it to have an empirical formula that doesn't correspond to any compound that's ever been reported. The paper has the real structure, confirmed by two different syntheses, and it's a rather odd-looking chlorinated bipyrrole.

The thing is, no one knows where the stuff comes from. And (for once) it doesn't appear to be us, since the compound was completely unknown until now. Nothing even that close to it is produced industrially. The closest things are some other halogenated pyrroles found in some marine bacteria, lending credence to the theory that this is a biogenic material. No one's found one like this, though.

As I've spoken to people about this, reactions have been interesting. Most of my fellow chemists have found the structure intriguing, and the idea that it's a natural product pretty weird. But I've had a couple of colleagues react by saying "That has to be coming from people." When I point out the features mentioned in the paragraph above, it doesn't seem to convince them. "It'll turn out to be from us," I've been told.

Well, I'm with the authors of the paper in not thinking so. Nature's got a lot of surprises, and making something that looks just like a synthetic pollutant doesn't seem like too much of a stretch. Think of it as a polychlorinated prank, pulled off by some dinoflagellate or red algae.

Comments (0) + TrackBacks (0) | Category: General Scientific News