About this Author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek, email him directly: email@example.com
May 31, 2006
You have to figure that Merck is getting tired of restating and re-explaining its Vioxx numbers. I certainly am. The only people who aren't fed up with it, I'm sure, are the hordes of lawyers for the plaintiffs. They're munching popcorn and waving pom-poms as Merck staggers around in circles.
The latest episode is a statistical mixup in the APPROVe trial, which I last wrote about here. In the NEJM publication of the results and in Merck's submission of the data to the FDA, they mention using log(time) as a variable in their primary statistical method. But they report statistical tests in the paper which used a model with plain old linear time as a variable. Merck says that "The results of diagnostic steps specified in that data analysis plan indicate that the linear test is an appropriate method to assess changes in the relative risk over time", although they'd surely rather not have to backtrack and make that argument.
This issue affects the measurement of the change in relative risk over time, not the magnitude of the risk itself. Merck's taking pains to point out that the overall magnitude of those relative risks was described correctly. That's fine, I guess, as far as it goes. The problem is, Merck has already been making a big deal out of that change in risk with time, namely that patients weren't at risk unless they'd been taking Vioxx for at least 18 months. So this is, unfortunately for them, a relevant issue.
What's the difference come to? For the difference between risk levels before and after the 18-month threshold, Merck reported a p-value of 0.01 using linear time, but if you run the method the way it's actually outlined in the paper (log time), you get p = 0.07, which is certainly worse. In fact, in my experience, you start losing your audience at p values of 0.03 or 0.05, and that's what seems to be happening. When Merck says that this error does not affect the conclusions of the study, they're only partly correct. What it affects is the believability of the conclusions, and once again, the revision makes things look worse for them.
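For readers who want to see how the time scale can move a p-value like that, here's a minimal sketch. This is not Merck's actual analysis (APPROVe used a Cox model; the numbers below are simulated, and the weighted-regression trend test is my own stand-in), but it shows the same phenomenon: testing for a time trend in relative risk against linear time versus log(time) gives different answers from the same data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Simulated six-month bins with equal person-time in each arm, and a
# relative risk that drifts upward in the drug arm over time.
t = np.arange(1.0, 7.0)                        # bin index (x 6 months)
exposure = 2000.0                              # person-years per arm per bin
ev_placebo = rng.poisson(0.010 * exposure, size=6)
ev_drug = rng.poisson(0.010 * np.exp(0.25 * (t - 1)) * exposure, size=6)

def trend_pvalue(x):
    """Inverse-variance-weighted regression of log relative risk on a
    time covariate x; two-sided p-value for the slope (flat RR = no trend)."""
    y = np.log(ev_drug / ev_placebo)                 # observed log RR per bin
    w = 1.0 / (1.0 / ev_drug + 1.0 / ev_placebo)     # approximate inverse variances
    xb = np.sum(w * x) / np.sum(w)                   # weighted mean of x
    yb = np.sum(w * y) / np.sum(w)                   # weighted mean of y
    sxx = np.sum(w * (x - xb) ** 2)
    slope = np.sum(w * (x - xb) * (y - yb)) / sxx
    z = slope * np.sqrt(sxx)                         # slope / SE(slope)
    return 2.0 * (1.0 - norm.cdf(abs(z)))

p_linear = trend_pvalue(t)
p_log = trend_pvalue(np.log(t))
print(f"linear time: p = {p_linear:.3f}   log(time): p = {p_log:.3f}")
```

The two transforms are asking the same scientific question - does relative risk change with follow-up time? - but they weight early and late bins differently, which is exactly how a 0.01 on one scale can turn into a 0.07 on the other.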
Honestly, guys. What's with you these days?
+ TrackBacks (0) | Category: Cardiovascular Disease
May 30, 2006
Perchloric acid almost makes my list by itself, although technically I can't quite include it, since I've already used it. I used the commercial grade, which is 70% strength in water, and it's pretty nasty stuff. It'll chew through your lab coat and give you burns you'll regret, as you'd expect from something that's rather stronger than nitric or sulfuric acid.
But it has other properties. The perchlorate anion is in a high oxidation state, and what goes up, must come down. A rapid drop in oxidation state, as chemists know, is often accompanied by loud noises and flying debris, particularly when the products formed are gaseous and have that pesky urge to expand. If you take the acid up to water-free concentrations, which is most highly not recommended, you'll probably want to wear chain mail, because it's tricky stuff. You can even go further and distill out the perchloric anhydride (dichlorine heptoxide) if you have no sense whatsoever. It's a liquid with a boiling point of around 80 C, and I'd like to shake the hand of whoever determined that property, assuming he has one left.
Perchlorate salts show similar tendencies. The safety literature is just full of alarming stories about old lab benches that had had perchlorates soaked into them years before and exploded when someone banged on them. They're a common component of solid rocket fuels and fireworks, as you'd figure. As with other lively counterions, the alkali metal salts (lithium, sodium, etc.) are comparatively well-behaved, with things heading downhill as you go to larger and fluffier cations. I've used things like zinc and magnesium perchlorate, but I would refuse, for example, to share a room with any visible samples of the lead or mercury salts.
People have made organic perchlorate esters, too, which doesn't strike me as a very good idea - unless, of course, you're actively searching for a way to blow up your rota-vap. Which is exactly what happened in the paper I saw on the synthesis of ethyl perchlorate, as I recall. If you'd like to make your mark, this seems to be a relatively unexplored field. The problem is, the mark you're most likely to make is in the nature of a nasty stain on the far wall.
Perhaps the most unnerving derivative I know of is fluorine perchlorate. That one was reported in 1947 (JACS 69, 677) by Rohrback and Cady. It's easily synthesized, if you're tired of this earthly existence, by passing fluorine gas over concentrated perchloric acid. You get a volatile liquid that boils at about -16 C and freezes at -167.3, which exact value I note because the authors nearly blew themselves up trying to determine it. The liquid detonated each time it began to crystallize, which is certainly the mark of a compound with a spirited nature.
The gas, meanwhile, blows up given any chance at all - contact with a rough surface, with tiny specks of any type of organic matter, that sort of thing. The paper notes that it has "a sharp acid-like odor, and irritates the throat and lungs, producing prolonged coughing". My sympathies go out to whichever one of them discovered that. No, if it's all the same to science, I think I'll let others explore the hidden byways of perchlorate chemistry. . .
+ TrackBacks (0) | Category: Things I Won't Work With
May 29, 2006
I was struck by a point that came up in the comments to the last post: since discovery organizations are going to have a certain percentage of failures, why not use that as a measure of whether or not they're doing their job? Perhaps there should be a "failure quota" - if too many things have worked, perhaps it's because you're playing things too safe.
It's an intriguing idea, but I can see a few potential problems. For one thing, you'd need to be able to distinguish between playing it too safe and being really good (or really lucky). For another, there are quite a few organizations that are spending all their time trying to play it as safe as possible. If your research budget is running a bit lean because you don't have that many good products out there, then you may not feel like taking many extra risks. In that situation, the whole phrase "too many things have worked" just doesn't even parse.
It would be useful, though, for drug discovery organizations of any type to be a bit more realistic about how many of their efforts are going to fail. I mean, everyone knows the statistics, but everyone pretends that it's not going to be their own project that goes down. This is wishful thinking. Clearly, most of the time it is going to be your own project, because most projects don't make it.
This isn't a license to give up. We should still do whatever we can think of to keep it from happening to our projects. But we shouldn't be amazed when our best efforts fail.
+ TrackBacks (0) | Category: Who Discovers and Why
May 25, 2006
Raymond Firestone is a retired medicinal chemist with a long and distinguished career, most recently at Bristol-Myers Squibb. He's never been very shy about speaking his mind, in person or in print, and it's nice to see that time has not mellowed him. A colleague, under the e-mail title of "Ray Firestone being Ray Firestone", pointed out a letter from him in a recent issue of Nature, in which he responds to the idea that the Bayer-Schering deal (and others like it) are necessary for innovation:
My experience, during 50 years' research in big pharma, is the opposite. Large companies are always inefficient because their command structure makes them so. Any organization with many layers, where power flows from the top down, works against innovation: look at the widely reported depletion of big-budget companies' pipelines.
The reason is that people in the middle layers, who neither control events nor engage in discovery, are too afraid to respond favorably to genuinely new ideas. If they encourage one and then it flops, as most innovations do, they are marked for demotion or dismissal. But if they kill novel programs, no one will ever know that a great thing died before it was born, and they are safe. . .Nowadays most of the innovation takes place in small outfits, because it is not crushed there.
I can't say that he doesn't have a point, because I've seen just what he's talking about. But the flip side, which unfortunately isn't as common, is that some large organizations have been able to innovate because they're big enough not to mind a little failure here and there. And large organizations provide more places for people (and projects) to hide for a while, which is occasionally beneficial.
Anyway, if anyone has Firestone's e-mail address, feel free to send him to this recent post, which should make him feel right at home.
+ TrackBacks (0) | Category: Who Discovers and Why
Things were pretty disrupted on the servers around here yesterday, and we seem to have lost a few of the most recent comments. I couldn't get on last night to put up a new post, and won't be able to until this evening (work, y'know).
For my readers with a lab right outside their door, my advice is to go set up something weird. It always helps to have something going that's off track from your regular work. Mind you, the stuff I have going on in that category is in the process of ruining my health, because we still haven't been able to analyze my control experiments. But if you pick something that doesn't depend on one critical piece of machinery, you should be fine.
+ TrackBacks (0) | Category: Blog Housekeeping
May 23, 2006
There are a number of reagents that you used to be able to buy which are no longer around. Some of these have just fallen out of favor, but a compound has to go pretty far down the list before no one sees any profit in selling it. The more common reasons for the disappearance are a bit more dramatic.
A notorious example is "Magic Methyl" (methyl fluorosulfonate). Fluorosulfonate is about as good a covalent leaving group as nature provides, and Magic Methyl was accordingly one heck of a way to methylate anions that turned up their noses at anything difficult. Problem was, though, that it also tended to methylate the user. There was at least one fatality in the 1970s from a not-very-large spill of the stuff, and by the time I got to grad school it had been pulled from commercial supply. It's never coming back, either. You can still make the stuff and use it yourself, and people do once in a while (not to mention things that are even more reactive, although that one's not volatile, at least). But there are research organizations that forbid even that.
There are substitutes, but nothing's quite in the same league. Methyl triflate is the closest thing going, as far as I know. It's an open question as to how much less nasty that one is - you can still buy it by the gallon. No one's been killed by it, but if someone dropped a bottle near me I'd still hold my breath and dive out the door.
Dess-Martin reagent is one that's appeared and disappeared over the years. It's a useful oxidizing reagent, which tends to react very cleanly, even on substrates that are hard to work with otherwise. The reagent itself is reasonably well-behaved, as long as you don't heat it. But the intermediate compound in the synthesis (IBX) has been known for some time to be erratically explosive, especially if it's allowed to dry out. It's sensitive to impact, which always made for a good time when the moment came to get it out of the funnel after filtering it.
The fun didn't stop there. The last step in the synthesis, right after the IBX formation, was famously wonky, and has only been ironed out in recent years. Or so I'm told - I made a couple of hundred-gram batches of the stuff, fifteen years ago, going two for three in attempts on the last step, and do not plan to do so again. You can buy the reagent at the moment, but it's been dropped from catalogs before (as Aldrich did during the 1990s).
+ TrackBacks (0) | Category: Things I Won't Work With
May 22, 2006
The New York Times has a good article today on the Vioxx data that I was talking about here last week. Check the graphic of the Kaplan-Meier charts especially; it's a good illustration of the problem. Merck is technically correct that the latest data still don't show a statistically meaningful difference between the Vioxx group and placebo until at least 18 months. As the article makes clear, they're hitting that theme very hard.
But Merck is also living in a dream world if they think that's going to help them much at this point. The problem is, the data look as if they're trending worse from a much earlier stage, and finally reach significance at the later time points. No lawyer in the world is going to walk away from that without driving it into the jury's heads that the danger is plain to see, yes, right there from the beginning, and don't talk to me about p-values when anyone can just look at this chart - your chart! - and see what's really going on. . .etc. We live by statistical arguments in the drug industry, but the people who are being called to jury duty sure don't. If I were one of the plaintiff's attorneys, I'd use the voir dire to make sure that anyone who knew anything about statistics never saw the inside of the jury box.
What's worse, to nonscientists, making statistics the centerpiece of your defense sounds shifty. People don't trust them; it's not for nothing that there are all those variously attributed quotations about "Lies, damned lies, and statistics". Now, if someone asks "Why are you so sure?" about something where I work, the answer "p less than point-oh-oh-five" will stop the questioner in their tracks. Not so in most workplaces, where that answer would make you sound as if you're dodging the question. And let's face it, the only p-values that strong that Merck can show are the ones that work against them.
The other problem is that a statistical approach is valid for large samples, the larger the better. But the jury isn't looking at a large sample. They're not there to decide how much Vioxx might have raised aggregate cardiovascular risk in certain subgroups, they're there to decide if it caused a heart attack in that guy sitting over there. The attorneys are going to keep things as personal as possible.
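Those Kaplan-Meier charts the Times reproduced are, under the hood, a very simple calculation: at each event time, multiply the running survival estimate by the fraction of still-at-risk patients who got through that time without an event. Here's a bare-bones sketch (illustrative only, with a toy data set - not anyone's actual trial code):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.
    times: follow-up time per patient; events: 1 = event, 0 = censored.
    Returns (time, estimated survival) pairs at each distinct time."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, at_risk, curve = 1.0, len(times), []
    for t in np.unique(times):          # unique() returns the times sorted
        here = times == t
        d = int(events[here].sum())     # events at this time
        if d:
            surv *= 1.0 - d / at_risk   # the product-limit step
        curve.append((float(t), float(surv)))
        at_risk -= int(here.sum())      # events and censorings both leave the risk set
    return curve

# Tiny toy cohort: events at months 3 and 20, censored at 12 and 24.
print(kaplan_meier([3, 12, 20, 24], [1, 0, 1, 0]))
# → [(3.0, 0.75), (12.0, 0.75), (20.0, 0.375), (24.0, 0.375)]
```

The point for the courtroom is that the chart shows every downward step, whether or not the gap between two curves is statistically significant - which is exactly why juries will see "trending worse from the beginning" no matter what the p-values say.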
+ TrackBacks (0) | Category: Cardiovascular Disease
I know I have some people stopping by to see how the experiments I described on Saturday have turned out. Well, the runs that I did late last week were not kind to the instrument they ran through, so one of my colleagues is now trying to get the machine back to its usual state. I'm ready to go as soon as things look normal, which could be this afternoon. If not, I'll put everything in the freezer and we'll run 'em tomorrow. And yes, the suspense is getting to me.
+ TrackBacks (0) | Category: Birth of an Idea
May 21, 2006
During long meetings, my thoughts turn to all sorts of useful topics - pressing things like, "If we ever meet intelligent aliens, what will they know about chemistry compared to us?" (I'm having to make some assumptions with that thought, of course, because any aliens that can send us so much as a ham sandwich from another star system already have us totally outclassed). But the question doesn't have to involve any space travel; you could just as easily ask what we'd be doing now if the history of the science had gone differently. Did it have to evolve the way it did?
For example, there are an awful lot of old carbonyl-condensation reactions - aldol, Claisen, Dieckmann, etc. Are these inevitable early discoveries? You could make a case for "yes", because the starting materials are often such basic organic chemicals (aldehydes, esters), and their reactions would probably be among the first things explored. Besides, the reactions of stabilized carbanions are a cornerstone of organic chemistry, and even if things got a bit out of order you'd think that this would still have to be the case. The same goes, and more so, for nucleophilic substitution. I don't see any sort of organic chemistry getting very far without the discovery of things like the Williamson ether synthesis and the Finkelstein reaction, and the principles behind them.
The wild cards would probably be organometallic reactions. Grignard reagents might be an example of things that were discovered earlier than they should have been. We still don't know all the details of their formation and reactivity, a hundred years on. And on the other side, did it have to take so long for the palladium couplings we all use to be discovered? After all, palladium was already known to do a lot of interesting organic chemistry, even fifty years ago. But as late as the 1980s, palladium-catalyzed carbon-carbon couplings were a bit exotic. Think, though, of what the field would look like if someone had stumbled over the Suzuki coupling in, say, 1949. . .
The history of oxidation and reduction, though, could easily be moved around, since there are so many means to accomplish similar ends. It's possible to imagine a world where the early organic synthesis papers aren't so full of Jones reagent and the other chromiums, but where some sort of permanganate or ruthenium reagent was the favorite. As for reduction, like him or hate him, where would boron reagents have been without H. C. Brown? ("Probably more widely used", I can hear some people muttering. . .)
That brings up the whole topic of personality. Historians frown on the "great man" viewpoint, but inside one scientific discipline it's hard to ignore it. Organic synthesis would certainly exist if R. B. Woodward had never been born, but it's for certain that it wouldn't look the way it does now. . .
+ TrackBacks (0) | Category: Chemical News | Drug Industry History
May 20, 2006
Well, it's about two in the afternoon here on Saturday. I don't blog from work, but this isn't exactly a workday, is it? I'm here to set up my control experiments that I spoke about, and let me tell you, it's quiet around here. There's not a lot of work done on weekends in industry - in fact, it's discouraged, for insurance reasons. Just from the standpoint of common sense, it's not a good idea to come in alone and set up a big dialkylzinc reaction or a high-pressure hydrogenation when there's no one in sight.
But I'm not using anything nasty in these experiments - heck, I could probably drink some of the solutions, although that would be a pretty expensive cocktail, and three hundred microliters wouldn't be very refreshing. Right now I'm waiting for some stuff from the freezer to come up to room temperature. Then, as Portnoy's therapist said, we may perhaps to begin.
(Ten minutes later): The frozen stuff is about thawed out, and in the meantime I went down and borrowed a half-mL of reagent from the biologists downstairs. (There's no sign of life in their labs today, either). It's a common enough chemical, but it's not something you'd find as easily in a chemistry lab. (Keep in mind that biologists have things like reagent-grade olive oil in their cabinets). They had a one-liter bottle of what I needed sitting around, so I think my 500 microliters won't be a problem.
(Five minutes later): OK, things look ready to go. I've got some fresh solutions made up, labeled by hand on the sides of the glass vials in the traditional blue Sharpie. Now to get things in the vials. I'm running ten vials today - five experiments, each in duplicate. There's a repeat of the vial thirty-three run that looked good last Thursday, of course, and a repeat of the corresponding blank control. Then I'm running three more controls, each of which should knock my unusual effect back down to nothing in a different way. If these go off the way I hope, it'll be pretty convincing evidence that I'm right.
Of course, as I've written before, these are nerve-wracking experiments to set up, because (looking at them another way), what I'm doing is trying as hard as possible to kill off my exciting results. If I were dealing in mystic revelations here, once would be enough - heck, that first moment of inspiration several years ago would be enough. But for scientists and engineers, no one believes in anything until it's been done again, over and over, and until it's resisted strenuous attempts to make it go away. It's perverse, but it works. Now to the lab bench.
(Over an hour later): Man, that was unpleasant. Took a lot longer than I figured. For one thing, I messed up one calculation and had to redo a few vials. Another problem is that since there are five arms to the experiment, each with two vials, it means that I couldn't save much time by making stock solutions and portioning them out. Each vial was more of a hand-crafted affair. But they're all done, and sitting on my lab bench, where they'll stay until Monday morning. During the day I should be able to get them analyzed, and if all goes according to plan, I'll know Monday afternoon if I'm looking at something wonderful, or yet another handful of dry leaves and lint.
+ TrackBacks (0) | Category: Birth of an Idea
May 18, 2006
A quick request: I'd be interested in hearing anyone's experiences with any of the various adaptive clinical trial designs. I'm starting to work on an article on the subject, and thought it would be worth hearing some real-world experiences. Feel free to e-mail me or use the comments - thanks!
+ TrackBacks (0) | Category: Clinical Trials
This morning I got the results in from the first experiments that I spoke about here. Most of them did nothing at all. Nothing in the blank controls, nothing in the experimental wells.
There were forty vials to examine, and there was nothing to report for quite a while. But vial number 33 - that one appears to have worked, if the reading from it is accurate. I can hardly believe what I'm seeing.
But einmal ist keinmal, especially where wonderful results are concerned. I'm coming in over the weekend to set more controls and repeats to have them done on Monday, which is the next chance I'll have to get anything analyzed. If I can make it happen again, I've just had the most interesting and important result of my entire scientific career.
And I just can't tell you how surprised I am at that possibility.
+ TrackBacks (0) | Category: Birth of an Idea
May 17, 2006
I mentioned yesterday that my opinion of Merck and their handling of the Vioxx cases isn't very high these days. The reason for this is the press release that the company sent out a few days ago on follow-up data to the APPROVe study, which is the one that caused the company to withdraw Vioxx in the first place.
That study was looking at possible use of Vioxx for the prevention of precancerous colon polyps. That may sound slightly insane if you're not following the field, but there's some biochemical rationale that suggests a role for inhibition of COX-2 against colon cancer. (This would be another huge market, naturally, which is why Merck - and Pfizer - have both looked into it). As the world knows, the study also showed clear evidence of an increased cardiovascular risk after 18 months of Vioxx use, and that's what started us all on the bumpy road to where we are today.
The APPROVe study was designed to have a one-year follow-up period to evaluate how long any colon-related benefits persisted. Unfortunately, it wasn't really designed (or powered, as the clinicians say) to address cardiovascular safety, so everyone just has to take what they can from the data we have. Merck, naturally, takes the current data to mean that Vioxx is doing just fine. They point out that in the post-drug follow-up year, the cardiovascular risk for the group that was taking Vioxx doesn't seem to be statistically different from the group that had been taking placebo.
Which is fine, as far as it goes. A more objective look at the data, though, shows that they didn't miss statistical significance by all that much. The numbers seem to be all against Vioxx, which is enough to make you wonder if the lights would have truly flashed red in a more statistically appropriate study. As it is, Merck is in the position of saying that a study which wasn't expected to show a statistical difference in heart safety between Vioxx and placebo didn't show a difference - and that that's good news.
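"Not powered" can be made concrete with a quick back-of-the-envelope calculation. The patient counts and event rates below are made-up round numbers chosen to resemble a rare-event follow-up, not APPROVe's actual figures:

```python
import numpy as np
from scipy.stats import norm

def power_two_proportions(p1, p2, n_per_arm, alpha=0.05):
    """Approximate power of a two-sided z-test comparing two event rates."""
    se = np.sqrt(p1 * (1 - p1) / n_per_arm + p2 * (1 - p2) / n_per_arm)
    z_crit = norm.ppf(1 - alpha / 2)       # 1.96 for alpha = 0.05
    z_effect = abs(p2 - p1) / se           # true difference, in standard errors
    return 1 - norm.cdf(z_crit - z_effect)

# Rare events, ~1300 patients per arm, true risk doubled from 0.5% to 1%:
power = power_two_proportions(0.005, 0.010, 1300)
print(f"power to detect the doubling: {power:.0%}")
```

On numbers like these, even a genuine doubling of cardiovascular risk would be detected only about a third of the time - which is why "no significant difference" from an underpowered follow-up is very thin evidence of safety.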
Even if the numbers had gone the company's way, statistical arguments are a notoriously hard sell for the defense in front of a jury. Having a bunch of muddy but trending-ugly data is one of the worst things that could have happened to Merck, actually. No one knows, from these numbers, just when the effect of Vioxx on cardiovascular risk might wear off. It's a playground for the lawyers - can't you just hear it? "Isn't it true that more patients had heart attacks on Vioxx? Even during the year after they'd stopped taking the drug? No, no, I didn't ask you for a lesson in statistics - just tell me if more people had heart attacks or not!"
No, no courtroom help there. I hope, for Merck's sake, that no one at the company believes there is, and that no one's charging them by the hour to try to convince them otherwise. At this point, they're going to need something better, and I'm not sure where they're going to get it. It's past the time when we can usefully argue about whether Vioxx should have been withdrawn, about what its risk-benefit ratio is, and whether Merck should be facing thousands of lawsuits or not. They are, and more than this latest batch of data will be needed to fight them.
(See also Jim Hu's comments).
Update: According to today's WSJ, things have gotten even muddier. Here's the subscriber link, and this is a Reuters summary.
+ TrackBacks (0) | Category: Cardiovascular Disease | Toxicology
May 16, 2006
The Wall Street Journal ran an interesting article by David Armstrong the other day on the New England Journal of Medicine and the Merck/Vioxx affair. It's subscriber-only on the WSJ site, but the Pittsburgh Post-Gazette picked it up here. It brings up an angle that I hadn't completely considered:
While Merck has taken the brunt of criticism in the affair, the New England Journal's role in the Vioxx debacle has received little attention. The journal is the most-cited medical publication in the world, and its November 2000 article on Vioxx was a major marketing tool for Merck. . .Internal emails show the New England Journal's expression of concern was timed to divert attention from a deposition in which Executive Editor Gregory Curfman made potentially damaging admissions about the journal's handling of the Vioxx study. In the deposition, part of the Vioxx litigation, Dr. Curfman acknowledged that lax editing might have helped the authors make misleading claims in the article. He said the journal sold more than 900,000 reprints of the article, bringing in at least $697,000 in revenue. Merck says it bought most of the reprints.
The article goes on to detail the role of a public relations consultant in the release and timing of the "Expression of Concern", which I've expressed my own concerns about. The journal seems to have been worried about its own name, and seeking to put the focus back on Merck. And some of these efforts may have gone a bit over the line. Remember the infamous missing data?
Perhaps the most sensational allegation in the journal's expression of concern was that the authors of the November 2000 article deleted heart-related safety data from a draft just two days before submitting it to the journal for publication. The journal said it was able to detect this by examining a computer disk submitted with the manuscript.
The statement was ambiguous about what data the authors deleted, hinting that serious scientific misconduct was involved. "Taken together, these inaccuracies and deletions call into question the integrity of the data," the editors wrote.
In reality, the last-minute changes to the manuscript were less significant. One of the "deleted" items was a blank table that never had any data in it in article manuscripts. Also deleted was the number of heart attacks suffered by Vioxx users in the trial -- 17. However, in place of the number the authors inserted the percentage of patients who suffered heart attacks. Using that percentage (0.4 percent) and the total number of Vioxx users given in the article (4,047), any reader could roughly calculate the heart-attack number. . .
. . .Many news organizations, including The Wall Street Journal, misunderstood the ambiguous language and incorrectly reported that the deleted data were the extra three heart attacks -- which, if true, would have reflected badly on Merck. The New England Journal says it didn't attempt to have these mistakes corrected.
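The back-calculation the excerpt describes is just arithmetic on the two numbers that did stay in the article:

```python
heart_attack_rate = 0.004   # the 0.4% figure that remained in the manuscript
n_vioxx_users = 4047        # total Vioxx patients reported in the article

estimate = heart_attack_rate * n_vioxx_users
print(round(estimate))      # → 16, within rounding of the 17 events recorded
```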
So, the matter of the missing heart attacks, which was the subject of a lot of heated language around here, appears to be closed. This sheds an interesting light on last December's "reaffirmation" of concern, where the NEJM made so much of the heart attack data and how it should have been included. Just about everyone who read that came away thinking that the whole fuss was about the deletion of the three MI events in the Vioxx treatment group. As you'll see from the comments to that post, many of us spent our time arguing about whether they should have been included or not, what the clinical cutoff date was, and so on.
We could have saved our breath. The heart attacks weren't deleted from the manuscript, and those who thought that they had been were responding to a well-thought-out public relations campaign. My opinion of the NEJM is not being enhanced by these revelations, let me tell you.
Problem is, my opinion of Merck isn't at its highest level these days, either. More on that tomorrow. . .
+ TrackBacks (2) | Category: Cardiovascular Disease | The Dark Side | The Scientific Literature | Toxicology
May 15, 2006
Drug companies (the big ones, anyway) are companies just like the rest of them. They come with all the baggage that companies making kitty litter, microcircuitry, and frozen pizzas have - all the assistant vice presidents, all the memos and paperwork, and all the management fads. Oh, especially those.
My theory has long been that the business-school people in the industry come in, get a good hard look at how things really work, and assume that something must be terribly wrong. Surely things don't have to be this way - all the craziness, the risk-taking, the uncertainty. If you people would just implement some modern management techniques, maybe we could straighten some of this stuff out!
Maybe not. A lot of things are just plain old unstraightenable, but the managerial consultant has not been born who will tell you that. No, at many drug companies, a new fad comes rolling along every few years - some new buzzword-laden scheme that promises to re-invent, re-do, re-invigorate and basically make things work like it says in the three-ring binder that comes with the off-site course where you learn it all.
But there's something holding these ideas back at a lot of science-driven organizations. The contempt that most of the scientific staff has for "modern management techniques" is hard to overstate. Problem is, we're used to having to prove our hypotheses, and show data (with appropriate controls, yet) in support of them. But I've suspected for years that most of the management fads that sweep through the world have nothing to back them up at all, and this suspicion has been confirmed by an article by Matthew Stewart in the latest Atlantic (subscriber-only) called "The Management Myth".
Stewart, an ex-consultant, goes into some of the history and current state of the racket, and it's exactly as I'd pictured it. Most of the time, there's no data (other than anecdotal fairy tales) to back up the latest six-sigma-good-to-great-seven-habits-of-continuous-improvement craze. On the rare occasions when people do try to gather the data, it often turns out to be impossible. And even when it's possible, the numbers often do not confirm any hypothesis at all:
"In many of my own projects, I found myself compelled to pacify recalcitrant data with entirely confected numbers. But I cede the place of honor to a certain colleague, a gruff and street-smart Belgian whose hobby was to amass hunting trophies. The huntsman achieved some celebrity for having invented a new mathematical technique dubbed "the Two-Handed Regression." When the data on the correlation between two variables revealed only a shapeless cloud - even though we knew damn well there had to be a correlation - he would simply place a pair of meaty hands on the offending bits of the cloud and reveal the straight line hiding from conventional mathematics."
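For contrast, here's what the honest version of that exercise looks like - a throwaway Python sketch (the data are entirely made up, naturally) fitting a line to a genuinely shapeless cloud. No amount of staring makes the slope or the correlation budge from zero:

```python
import numpy as np

# A genuinely shapeless cloud: two independent variables, no relationship.
rng = np.random.default_rng(42)
x = rng.normal(size=200)
y = rng.normal(size=200)

# Honest least-squares: slope and correlation both come out near zero,
# however badly someone needs them not to.
slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]
print(f"slope = {slope:.3f}, r = {r:.3f}")
```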
(He also mentions something that hadn't occurred to me, that the best and highest-paying customers for all this voodoo are companies that are on their way down. They'll try anything. So if you're working for a company that's going crazy with this stuff, you might want to consider that a useful alarm bell).
+ TrackBacks (0) | Category: Business and Markets
May 14, 2006
I haven't given any updates on my side project experiments recently. I've been preparing a number of starting materials and getting things ready for another big run. I'm using a number of systems that people use for other (more normal) purposes at work, but I'm bending things around so much that everything has to be re-checked. And I don't have priority over anyone, which is as it should be for something this speculative, so I have to work in between what everyone is supposed to be doing. Finally, I think everything is in order. I'm setting up a new round of experiments tomorrow.
It'll be a few days before I know if anything has worked, though. The experiment itself is rather lengthy, and the analysis isn't trivial, either. I actually have two or three different variations of the idea all about ready to run, so it's going to be a real flurry of activity by the long, slow standards I've been working by. I wanted to take more risks in my research this year, and here they are, reporting for duty.
Are any of these things really going to work? I wish I could evaluate the chances better, because that would help me figure out what to run next. As it is, this is such terra incognita stuff that I just don't know what to expect. I shouldn't complain about that, though, since that's what being a scientist is supposed to be about. It's an odd feeling to be living it, though. There's nothing quite like it.
I've been out on several edges of knowledge over the years. Plenty of chemists experience the no-one's-ever-made-this-molecule edge (in industry, of course, we count on that being the case). You can get out to that territory pretty quickly, even now. Then there's the discovery of a new reaction, the no-one's-ever-made-something-this-way edge of knowledge. I've been in on one or two of those, too, and there are research groups that make it their whole business.
But this one is really out there, to the point where colleagues raise their eyebrows at me when I explain it to them. This, though, is where I've wanted to be ever since I started doing research. Win or lose, I feel privileged just to set experiments like these up. Here we go.
+ TrackBacks (0) | Category: Birth of an Idea
May 11, 2006
I've been spending a good part of the last couple of days rota-vapping down toluene. Why would I do such a thing, you ask? Because I ran a big column in the stuff, first time I've ever done that, I think. The chemists in the audience will guess that I examined many alternatives before settling on this one, and they are correct. Toluene's not especially toxic or smelly, but it is rather high-boiling compared to most of the solvents that we use for purifying things. You have to turn your water bath up and allow for plenty of time when you're taking a lot of it off. In this case, it turned out to be by far the best solvent for separating two closely running compounds.
Here's how I got there - it's a good illustration of a day-to-day chemical problem. I wandered through all sorts of common (and uncommon) solvent mixtures, checking each by thin-layer chromatography (a quick way to see if things separate). In ethyl acetate/hexane, the standard brew, you could just tell that there were two things in there after running a TLC plate three times. Switching to ether/hexane, which sometimes does the trick, was no help. Straight dichloromethane gave too fast-running a spot, and mixtures of dichloromethane-hexane didn't separate anything. Chloroform/hexane, on the other hand, wasn't too bad - not as good as toluene, but at least you could see some daylight in between the two spots. Isopropanol/hexane did no good at all.
I was trying here to run a bunch of different solvent types. Hexane is a common theme, since it's pretty much the plain vanilla of solvents. More polar stuff is added to it, and you see what happens. In this case, an ester co-solvent did a little bit of good, but ether and alcohol additions did nothing. Chlorinated solvents showed some promise - well, at least chloroform did. But it's an oddity, more polar than the others of its kind. Toluene was the only aromatic solvent I tried (it's really the only convenient one for chromatography), and something about its flat shape and electron clouds did the trick. This stuff is brutally empirical.
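If you wanted to keep score on a screen like this, it comes down to ranking solvent systems by the gap between the two spots. A toy sketch in Python - every Rf value here is invented for illustration, since the real numbers only come off your own TLC plates:

```python
# A toy version of the solvent screen described above. All Rf values are
# invented for illustration - real ones come from your own TLC plates.
tlc_results = {
    "EtOAc/hexane": (0.32, 0.33),   # spots nearly on top of each other
    "ether/hexane": (0.28, 0.28),   # no separation at all
    "DCM":          (0.85, 0.86),   # runs too fast to judge
    "CHCl3/hexane": (0.40, 0.47),   # some daylight
    "IPA/hexane":   (0.30, 0.30),   # no good at all
    "toluene":      (0.35, 0.50),   # the winner
}

# Rank systems by the gap between the two spots and take the best.
best = max(tlc_results, key=lambda s: abs(tlc_results[s][0] - tlc_results[s][1]))
print(best)
```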
So toluene it was, with a little ethyl acetate at the end to speed the latter spot along. I got some mixed fractions, naturally, but quite a few clean ones, certainly enough for my purposes. (If I get bored or desperate, I can go back and re-run the mixed ones).
+ TrackBacks (0) | Category: Life in the Drug Labs
May 10, 2006
There's been a lot of press coverage the last week or so about two new routes to Tamiflu (oseltamivir). Roche famously starts from shikimic acid, most of which they get from Chinese star anise, and the new syntheses are attempts to get around that bottleneck.
E. J. Corey's getting more attention than Masakatsu Shibasaki, partly because he's a Nobel winner and partly because he's made a point of placing his synthesis in the public domain. (Shibasaki's applied for a patent). It's nice to see organic synthesis make the headlines, but unfortunately, a lot of the coverage has been of the "Nobel Prize Winner Solves Tamiflu Problem" sort. I've also seen several stories that suggest that Corey's route opens the door (at last, right?) to mass production.
Not so fast. Roche has already been producing rather large amounts of oseltamivir, although they'd be glad to find a better route. And it's not like they haven't been trying themselves, as this PDF will make clear. And it's far from clear that Corey's route will be of commercial value, even though his overall yield, as given, is about 27%, which news articles are saying is roughly twice the yield from shikimic acid. (Note, though, that that Roche PDF claims a higher yield than Corey's - I'm not sure who's right).
Let's get technical and take a look at the chemistry. First off, the repeated claim that Corey's route starts from two of the cheapest feedstocks available - butadiene and acrylic acid - is only partly true. The key Diels-Alder reaction actually uses trifluoroethyl acrylate, which is substantially more expensive than acrylic acid, although admittedly ten times cheaper than the same amount of shikimic acid from the same source. Moving on, there are eleven steps, and according to the supplementary material for the paper (where the full experimentals are), steps 1, 3, 4, 5, 6, and 8 have chromatography in their workup. The others are run through a plug of silica or are taken on crude, which tells me that Corey's students probably tried to do the same with the remaining steps but took a hit on the yields. Every chromatographic purification adds a great deal to the cost of a process route, needless to say.
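One more thing worth keeping in mind when comparing routes: overall yield is the product of the per-step yields, so long sequences erode fast. A quick back-of-the-envelope, using the reported figures and the (unrealistic) assumption that every step performs equally:

```python
# Implied average step yield for a ~27% overall yield over eleven steps:
overall, steps = 0.27, 11
avg_step = overall ** (1 / steps)
print(f"average per-step yield: {avg_step:.1%}")     # just under 89%

# The converse: even uniformly good steps multiply down quickly.
print(f"eleven 90% steps overall: {0.90 ** 11:.1%}")  # about 31%
```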
There are other wrinkles. Steps 1 and 2 start at -78 degrees before coming up to more process-friendly temperatures. Step 8 is a slow addition at -40, and step 9 is an inverse addition at -20. Low-temperature reactions are certainly doable on scale, but again, they'll add to the cost and complexity. Those last two steps involve an acylaziridine intermediate, whose thermal stability would need to be checked out and which could partially negate the advantage of not using azide in the route.
The scale of the reactions in this paper is in the ten-gram range, which is fine, until you get to steps 8 and 9. Those low-temperature reactions are shown on 300 and 160 milligrams, respectively. That drop of well over an order of magnitude in scale points to another area that would need to be checked out; there can be a huge difference between something that works on a couple of hundred mgs and a useful process, especially in the cold.
All this isn't to say that Corey's route doesn't work, or that it can't work on scale. But it's important to keep in mind that the kind of chemistry done in his lab is about as far from industrial scale as you can get. It may be that the more interesting features of his route (the catalyzed Diels-Alder, for example) could be combined with some of Roche's own process ideas and turned into something feasible. But for now, this is an interesting route that's a long way from solving anyone's Tamiflu shortage.
To be fair, Corey himself isn't responsible for most of the hype, though I do wish he wouldn't let himself be quoted as saying that he thinks the Tamiflu production problems are "solved". Headline writers know nothing about organic chemistry or drug development, and they run with what's in the press releases. Of course, there's the larger question hanging over all of this: will Tamiflu even do anyone any good if there is a human outbreak of avian flu? And that, nobody knows.
+ TrackBacks (1) | Category: Chemical News | Infectious Diseases
May 9, 2006
Many readers will have heard of the years-long campaign in England against Huntingdon Life Sciences, a research animal breeding and testing company. (These tug-of-war articles from Wikipedia on HLS and the campaign against it are detailed overviews, as well as a good example of that site's simultaneous strengths and weaknesses).
Now shareholders of GlaxoSmithKline, one of Huntingdon's customers, are getting anonymous letters from activists, threatening them with release of (unspecified) personal information if they don't sell their shares. These are similar tactics to the ones these groups used when HLS was trying to list on the New York Stock Exchange last year. You'd think that these attacks would have slowed down after the recent convictions of several anti-Huntingdon activists for terrorist activities, but apparently not.
In that case, names and addresses of researchers and investors were listed on a web site as well, but the defendants claimed that they had nothing to do with the violence and harassment that often followed. This defense was undermined by the evidence of their own statements, some posted on the web and some caught on videotape, friendly things like "The police can't protect you!"
Now, if anyone has been writing passionate, outraged books and screenplays about the researchers who've been carrying on through all this, I've missed them. That's because no one likes the idea of animal experimentation - it's not going to sell popcorn at the multiplex, that's for sure. And, to be frank, it's not like those of us who design, order, and carry out the experiments are high-fiving each other about how many rats we've gone through, either.
It's true: I don't actually like the fact that every successful modern drug has risen to its place on top of a small mountain of dead animals. But not liking doesn't keep it from being true, and not liking it doesn't mean that I have an alternative, either. I don't. What the animal rights campaigners - the more rational ones, anyway - don't seem to realize is that tens of millions of dollars are waiting for the person who can come up with a way of not using so many mice, rats, and dogs. (The less rational ones wouldn't care even if they knew).
They're expensive, you know, animals are. We don't just have them running around in rooms with a bunch of straw on the floor. They live in facilities that are expensive to build and expensive to maintain, and you have to hire a lot of people whose only job is to take care of them. The anti-testing people seem to have visions of drug company employees cackling at the thought of getting to use more animals, when the truth is that we'd dump them in a minute if we could.
But here's the hard part: we can't. Not for now, and not for some time to come. We don't know enough biology to do it. As it stands, if you were able to model every relevant system in a rat, well enough to use your model for predictive screening, you'd have basically built a rat yourself. We get surprised all the time when our compounds go into animals, and every time it happens, it shows how little we really know.
No, the system we have isn't pretty, and it sure isn't cheap, but there's nothing yet that can replace it. In the meantime, the rats die or the people do. I don't have a hard time choosing.
+ TrackBacks (0) | Category: Animal Testing
May 8, 2006
So, let's say that some jungle-extract natural product has shown good activity in some drug screens. What next?
The first thing to do is define "good activity". Screens can be done at several levels. The most abstract is against a purified protein (enzyme inhibition, for example), and that can be a long way from anything useful in a living system. You'd get excited if you had activity against an enzyme that a lot of people are interested in but that no one's been able to hit well - protein tyrosine phosphatase 1B, for example, a fine diabetes target that's evaded all attempts so far at useful small-molecule inhibition.
A more stringent test is against living cells. Activity there shows you that you at least can deal with cell membranes (penetrating them or hitting targets on their surface) and that you can get the desired effect in the presence of all the other potential binding sites that a living system presents. Cancer assays are often done with living tumor cells, with the readout being slowed growth or outright cell death. (That latter one isn't as good as it might sound, since most things that will kill cancerous cells will kill every other kind of cell, too).
But let's assume that this natural product has shown something that's really worth following up. You're going to need more of it for animal models and toxicity testing. Now comes a tough decision: make the compound, or extract it? The former is often out of the question, since many of the structures we're talking about are so complex. Academic groups make them, of course, for the sheer challenge of it, and often talk about how their routes have opened up new possibilities for analogs of the original compound. But how often does that really happen? No one's going to do med-chem analoging in the context of a thirty-step synthesis.
The back-to-nature option isn't always available, either. Most natural products are found in such low concentrations that insane amounts of the source material would be needed. In other cases, particularly with marine natural products, the source organism may have disappeared from its original collection site over the years. Some of these things have never been found again.
The most successful natural product drugs have been those that have a high-volume plant source somewhere in their pipeline. The Vinca alkaloids, for example, come from the fast-growing Madagascar periwinkle. (You could say the same thing about opium and cocaine, too, I guess, which have been successful in their way). You can't always find a good source of the final compound, though.
The next best thing is to find an advanced intermediate that a plant produces in quantity. That's the foundation for the modern steroid industry - it turned out that you could have Mexican yams make most of the steroid backbone for you, which is a wild story that's been told many times. A more recent example is taxol. Extracting the final compound from the Pacific yew tree just wasn't feasible on scale, not least because it was found in the bark and couldn't be obtained without killing the trees. Folks were looking at a yield of about a half-gram of drug per forty-foot tree - mind you, that's still easier than making the compound from scratch. But an advanced intermediate could be extracted from the needles, and Holton's group worked out a synthetic route from it.
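To put that half-gram figure in perspective, the arithmetic is short and brutal (the per-tree number is from the paragraph above; everything else is just division):

```python
# At roughly 0.5 g of taxol per mature Pacific yew, how many trees
# would have to come down for each kilogram of drug?
grams_per_tree = 0.5
trees_per_kg = 1000 / grams_per_tree
print(trees_per_kg)   # 2000.0
```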
Actually, that semisynthetic route might be the best one overall, because it gives you a handle to generate analogs (the way the total synthesis routes do, in theory). Natural products are rarely given the kind of exhaustive structure-activity workout that smaller man-made molecules get - or if they do get one, it takes years to slowly work through the possibilities. You won't be able to ring all the changes based on what you get from the plant, but it's better than nothing. Taxol, for example, has had a great deal of work done on it, mostly in attempts to make the damn stuff more soluble. As it is, it has to be dosed in a vehicle that's almost worse than the drug for toxicity. That's a cruel trick of Nature, all right - insoluble compounds are something we can make on our own.
+ TrackBacks (0) | Category: Drug Development
May 7, 2006
It's been too long since I added another to my list of "Lowe's Laws of the Lab". They're something I came up with in graduate school, late one night in the lab. Looking back on them, I can tell I was in a bit of a bad mood when I put them together, but that doesn't narrow things down very much. I wasn't at my best for four or five years there, from what I can see.
Today's law is: You are in real trouble if someone knows more about your project than you do. That's a realization that hits people at some point in their graduate school career - preferably not much past the midpoint. It marks the transition from being a student to being a working scientist. After all, when you're still a student, other people are expected to know more about what you're doing than you do yourself; you're supposed to be learning from them.
But that has to change at some point. It's not that you suddenly get as smart or as experienced as the better grad students or post-docs in the group, let alone your PhD advisor. More talented people might be better at your project than you if they devoted all their time to it, but they're not doing that: you are. No, you get to where you know the ins and outs of your own project, your corner of the research world, better than anyone else. With that comes the realization that no one else is going to get your project done for you, and no one else is going to get you out of grad school. If you don't reach that level of involvement and expertise, something has gone wrong, and things will continue to go wrong for you.
That's because you need that experience if you go on to a further career in research. If you're going to be any good at your work, you have to be willing to become the expert on what you're doing, and not rely on others to have things figured out. Because what if they don't? This happens rather often, which is another valuable lesson that grad school is supposed to teach you. (Independent work isn't just for PhDs, either. Experienced Master's level employees at a drug company are expected to work more and more on their own as time goes on, too, and will be considered more valuable the more that they can do so).
I don't mean that it's a good thing to bull around the place, telling everyone that you know best what to do and to get out of the way. You never stop learning the research trade; anyone who thinks they've seen it all is mistaken. But I am saying that the opposite sort of behavior is a very bad sign. "What do you think about this?" is a fine question to ask people, but it should never shade over into "Tell me what to do". And "I don't know; that's his department" or "I never got around to understanding that part" are statements that should get any lab head or project leader removed from their position. If you don't know these things, who will?
+ TrackBacks (0) | Category: Lowe's Laws of the Lab
May 4, 2006
Well, you'll no doubt have heard that Ariad won its case against Eli Lilly today. The jury found that Ariad's patent was valid, that Lilly had infringed it, and that Ariad was owed royalties, retroactively and in the future. This isn't the sort of case (or the sort of universe) where the jurors would get interviewed by Larry King and Oprah, but I would very much like to know what this jury knew, or believed that they knew, about intellectual property and cell biology. Ah well, at times like this, my wife reminds me that I thought that O.J. Simpson would be convicted, too.
I'll not rant; my position on this has been as clear as I can make it for some time. I hope that Ariad and its shareholders enjoy their victory while they can. Said shareholders should realize, though, that it's very unlikely that one dime will be coming to the company until several other legal proceedings finish up. First off, Lilly will appeal this verdict. Then there's the reexamination of Ariad's patent that Lilly requested. And there's another Lilly trial - a bench trial this time, no jury - on whether Ariad's patent is enforceable. And let's not forget Amgen's recently filed suit in Delaware.
I find it very unlikely that all of these will go Ariad's way, since at least three of them will be judged by people who know what they're talking about. In the meantime, I haven't covered the short position I spoke about the other day, even though I'm about $1500 to the bad right now. The stock didn't take off like a skyrocket today, and I think that the legal uncertainty around today's verdict will keep it from doing so over the next few months. I have time and collateral. Who knows, as things go on I might short some more, and I'll post it if I do. Court of Appeals for the Federal Circuit, over to you.
+ TrackBacks (0) | Category: Patents and IP
May 3, 2006
So, as mentioned, the DC Circuit Court of Appeals came down with an interesting ruling (PDF available here). Here's the background of the case, as summarized in the majority opinion:
The Abigail Alliance for Better Access to Developmental Drugs ("the Alliance") seeks to enjoin the Food and Drug Administration ("FDA") from continuing to enforce a policy barring the sale of new drugs that the FDA has determined, after Phase I trials on human beings, are sufficiently safe for expanded human testing (hereafter "post-Phase I investigational new drugs"). More specifically, the Alliance seeks access to potentially life-saving post-Phase I investigational new drugs on behalf of mentally competent, terminally ill adult patients who have no alternative to government approved treatment options . . .
The Alliance contends that the FDA's policy violates the substantive due process rights to privacy, liberty, and life of its terminally ill members. The complaint presents the question of whether the Due Process Clause protects the right of terminally ill patients to decide, without FDA interference, whether to assume the risks of using potentially life-saving investigational new drugs that the FDA has yet to approve for commercial marketing but that the FDA has determined, after Phase I clinical human trials, are safe enough for further testing on a substantial number of human beings. . .
As you may be able to tell from the direction this is taking, the majority opinion says that yes, the FDA does violate due process in these cases. They're reasoning from the Glucksberg case, in which the Supreme Court laid down some guidelines for such claims, but not being a con-law scholar, I'm not qualified to address this line of thinking. They also work from precedents which hold that patients have a due process right to refuse life-saving treatments, and hold that there's a similar right to access potentially life-saving ones. "In both cases", the majority opinion says, "the key is the patient's right to make her decision about her life free from government interference. . .the Alliance seeks for its members the same right of access enjoyed by those terminally ill patients lucky enough to secure a spot in Phase II trials. Accordingly, we hold that the district court erred in dismissing the Alliance's complaint . . ."
I'm torn by this line of reasoning. My libertarian streak likes the talk about being free from government interference, but my drug-industry experience keeps suggesting some practical difficulties. (For one thing, if we're telling the government to get lost, why mandate Phase I trials in the first place?) There's a pretty strong dissenting opinion from the recently appointed Judge Griffith that goes into these problems. First off, he has a problem with the line of reasoning that the majority used:
. . .It does not help the majority's cause that the Supreme Court has rejected several similar challenges. . .To be sure, the Supreme Court has not addressed the constitutional argument raised by the Alliance. But contrary to the tradition asserted by the majority, there is a tradition of courts rejecting arguments that the Constitution provides an affirmative right of access to particular medical treatments reasonably prohibited by the Government.
It's the last paragraph of his dissent that I find rather persuasive, though:
The majority's new right to procure and use experimental drugs raises a number of vexing questions that are now constitutional issues, potentially insulated from the tug and pull of the political process. If a terminally ill patient has such a right, are patients with serious medical conditions entitled to the benefit of the same logic and corresponding access? If an indigent cannot afford potentially life-saving treatment, would the Constitution mandate access to such care under the right recognized by the majority? Can a patient access any drug . . .if she believes, in consultation with a physician, it is potentially life-saving? Would the majority's right guarantee access to federally-funded stem cell research and treatment? Perhaps most significantly, what potential must a treatment have in order for the Constitution to mandate access? Because the majority does not answer this last question, the District Court faces an impossible task on remand.
These are just the sort of problems that I think are glossed over by Judges Ginsburg and Rogers. Where do we draw the line? There are all sorts of things that make it through Phase I that wipe out in Phase II and beyond for lack of efficacy. It's all very well to talk about potentially life-saving therapies, but that potential is, in many cases, pretty damn well hidden. What sort of Phase I study is enough to trigger this right to treatment? (And who pays for it, for that matter, and how is that figure arrived at?) And it's important to realize that Phase I only studies acute safety, for the most part. Keep in mind that compounds drop out in Phases II and III for safety problems that only showed up in larger samples and longer trials. The threat of lawsuits is bad enough already, with drugs that have made it through a lot more than Phase I. How are we going to fare with even earlier compounds?
Judge Griffith is correct in seeing this as a practice that can only expand. The demand is certainly there. I'm more willing than I was a few years ago to see what a safety-trial-only system for pharmaceuticals would look like, but this isn't, to my mind, the way to get there. We're not running Phase I trials in such a way that they can stand on their own for drugs to go right into patients - perhaps not even patients who are dying. If we want to change that, let's change it - but from the ground up, not by going through a hybrid regime that might give us the worst of both approaches.
+ TrackBacks (0) | Category: Clinical Trials
Since I made a big point out of this in a comment to the last Ariad post, I wanted to update things. The jury is out - literally - on the case versus Lilly right now, and I notice the stock creeping up a bit today.
So, I am now short Ariad stock, 1000 shares at 5.56. This isn't a recommendation that anyone else should run out and do the same, but I thought it was important that I mention it since I've written about the company. I'll post again on the topic after the verdict comes in (which will be when I close out this position, one way or another, I should think).
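For anyone keeping score at home, marking a short to market is simple arithmetic. The entry price and share count are from this post; the current prices in the example below are hypothetical, just to show the mechanics:

```python
def short_pnl(entry_price: float, current_price: float, shares: int) -> float:
    """Mark-to-market P&L on a short position: you profit when the price falls."""
    return (entry_price - current_price) * shares

# 1000 shares shorted at $5.56 (from the post); prices below are hypothetical.
print(short_pnl(5.56, 4.50, 1000))   # price falls: roughly +1060
print(short_pnl(5.56, 7.06, 1000))   # price rises: roughly -1500
```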
+ TrackBacks (0) | Category: Business and Markets | Patents and IP
I'll address this more in tomorrow's post, but everyone should know that there was a potentially very interesting ruling yesterday by the DC Circuit Court of Appeals. Do patients have a right to drugs after Phase I? We might be heading back to the days before efficacy testing was required. . .read this post at Marginal Revolution to get up to speed.
I've spoken about this issue a bit before. Back in 2002 I was very doubtful about the whole idea, but my opinion has been slowly changing. I didn't expect that this is how things would change, but you should never underestimate the courts. More anon.
+ TrackBacks (0) | Category: Drug Development
May 2, 2006
The natural products topic (which I'll return to in a couple of days) has me starting up a companion to the "Things I Won't Work With" category. This is the first in the new category of "Things I'm Glad I Don't Do".
I mentioned ciguatoxin, and I notice that one of the comments to that post was from someone who'd had a brush with the stuff. My sympathies - it's supposed to be really awful. A number of warm-water fish species can have dangerous concentrations of the compound in them, and it's probably one of the most common non-bacterial sources of food poisoning. The thing is, the fish themselves don't make the stuff. They concentrate it from marine algae, which produce a lot of extravagantly crazy molecules.
So if you want some ciguatoxin yourself, you fool, you, a good source is an organism near the top of the food chain. Moray eels turn out to be a good bet. But you don't just turn one of them upside down over a beaker and squeeze his tail. No, the isolation is a bit more involved:
The moray eels (ca. 4000 kg) were collected from the Tuamotu Archipelago and from the Island of Tahiti in French Polynesia. The viscera (125 kg) were homogenized and extracted with two volumes of acetone twice. After filtration, the extract was left at -20 °C for 1 day to precipitate oily residue. The supernatant was evaporated to dryness and partitioned between diethyl ether and water. The ether layer was condensed and suspended in aqueous 80% MeOH, followed by defatting with hexane. The methanolic layer was condensed, dissolved in acetone. . .
The prep goes on in this vein, through six different columns, one after the other. Now, imagine joining this research group (which was Yasumoto's, in Japan). It's your first day in the lab, and here comes one of the post-docs carrying a couple of blenders in his arms. Behind him, another one is wheeling in the bags of frozen eel guts. It's moray margarita time, and will be for some time to come.
One other aspect of this isolation deserves comment, because I don't think you could do it like this today. In the final column or two, the paper outlines a brutal but effective method for cutting fractions to get the ciguatoxin: take a sample from each cut of the column and inject it into a mouse. If it doesn't die immediately, that fraction doesn't have any ciguatoxin in it. Gloves recommended.
+ TrackBacks (0) | Category: Things I'm Glad I Don't Do
I've been out of town for the last few days, and just got back around midnight last night. So there will be no new post today - I'll have my hands full just staggering around my lab trying to figure out what's going on. The Wonder Drug Factory seems to have done just fine in my absence.
I told my lab associates that they were under orders not to discover anything while I was gone, so I could be back in time to take the credit. Motivational speeches like this are important. I can't quite reach the levels of a colleague of mine, though, who used to lean out of his office and call out "Work! Work! The harder you work, the faster I get promoted!"
+ TrackBacks (0) | Category: Blog Housekeeping