June 29, 2010
The Ideal Synthesis
Phil Baran of Scripps has a paper out on the "ideal synthesis" of complex molecules. It's mostly a review of a number of his group's own syntheses, but it's done in light of his definition of "ideal": all bond-forming steps, with no protecting group manipulations or oxidation-state maneuvering.
That's a tough standard, but many biosynthetic routes reach 100% against it. I think that the highest figure from one of the Baran group's own syntheses is 84%, but he emphasizes that comparing these figures across the synthesis of different molecules isn't too meaningful, since they each carry their own issues. Comparing different routes to the same molecule is what he has in mind; it's a pity that no one else is ever going to make maitotoxin.
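The percentage figures above can be read as a simple ratio: steps that build the molecule's skeleton versus total steps, with protecting-group and oxidation-state steps counting against the route. A minimal sketch of that bookkeeping (the step labels and the 25-step route here are hypothetical illustrations, not taken from the paper):

```python
def ideality(steps):
    """Percent of steps that are bond-forming ('construction')
    rather than concessions (protecting groups, redox shuffling)."""
    construction = sum(1 for s in steps if s == "construction")
    return 100.0 * construction / len(steps)

# A hypothetical 25-step route: 21 bond-forming steps plus
# 4 concession steps works out to 84% ideal.
route = ["construction"] * 21 + ["protecting-group"] * 2 + ["redox"] * 2
print(round(ideality(route)))  # 84
```

A fully enzymatic biosynthesis, with no protections or redox detours, would score 100% on this measure.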
He also emphasizes that "ideality" isn't the only consideration in a synthesis. It gets at some key issues, but others (availability of reagents, ease of experimental procedures or purifications) can trump ideality out in the real world. You certainly see that in process chemistry in the drug industry. A reliable procedure that always gives the same (but lower) purity will win out over a temperamental one that sometimes gives wonderful material but sometimes craps out. And an elegant-looking route that gives a small amount of an intractable impurity isn't so elegant, compared to a slightly longer one that delivers material that's easily cleaned up.
The same goes for reagents. Ideally, you'd want to be able to buy all of them, and cheaply, too. But that's where the comparison with those 100% ideal biosynthetic routes breaks down. The enzymes that accomplish them are nothing if not bespoke reagents, doing one thing only but nearly perfectly. And there's that matter of a billion years of evolutionary overhead to factor in to the development costs. Of course, the other great thing about enzymes is that they're catalytic, and can just keep turning over reactions constantly. If they were one-time-use, like many of our reagents from the catalogs, it wouldn't matter how incredibly high-yielding and specific they were; the horrendous waste of time and material required to produce them for just one transformation would rule them out. Average those expenses out over the turnover numbers of a typical enzyme, though, and things look very good indeed.
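The amortization argument at the end of that paragraph is just a division: a catalytic reagent's production cost gets spread over every reaction it turns over, while a stoichiometric one pays full price each time. A toy sketch, with entirely made-up numbers for illustration:

```python
def cost_per_transformation(production_cost, turnovers):
    """Production cost amortized over the number of reactions performed."""
    return production_cost / turnovers

# One-time-use stoichiometric reagent: cheap to make, but the whole
# cost lands on a single transformation.
stoichiometric = cost_per_transformation(10.0, 1)

# Enzyme: enormously expensive to evolve and produce, but averaged
# over ten million turnovers it comes out far cheaper per reaction.
enzyme = cost_per_transformation(1_000_000.0, 10**7)

print(stoichiometric, enzyme)  # 10.0 0.1
```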
I think that Baran's criteria are well worth keeping in mind, although I also think that most synthetic chemists already think this way, to one degree or another. I always gritted my teeth when I put on a protecting group during my total synthesis days, because I knew that I was adding another step (and more potential trouble) down the line when it had to come off again. Mind you, I was putting the thing on to avoid what I saw as even more immediate trouble, but I guess that's one of the things that Baran is saying, that it's time to try to stop making such deals if we can.
Category: Chemical News