Lagniappe (science, business, and culture)
Thursday, August 29, 2002
More "How Not to Do It"
Several of us started in on a "stupid lab tricks" conversation at lunch the other day. When chemists get together, we always know that that topic's available if we run out of things to talk about. Anyone with reasonable organic lab experience has a stock of favorites.
One of mine is a fellow grad student who was trying to save money by recycling acetone, the wash acetone that he used to clean out his flasks. He had a four-liter round bottom with a side-stopper on it, rigged up to a big distillation head, and the thing was always cooking away.
Of course, the acetone in the pot got nastier and nastier as time went on, as he kept refilling it with whatever gorp he washed out of his dirty glassware. During the time I knew it, it was a deep, opaque chocolate brown with sort of purple overtones - not the usual color you look for in your acetone supply. The stuff he distilled off was pretty decent, but even so. . .
Well, eventually this guy took a vacation (for the first time since I'd joined the group.) He took off for a few days, and turned off the still before he left. There it sat, just as ugly in repose, until he came back and flipped on the power to the heating mantle. All was quiet, for a while.
I was right around the corner when I heard it: a loud "PING-whUUUrrrsh - splattt!" This alarming noise was followed by a really fearsome stench, a knock-you-back brew of who knew how many stinky carbonyl byproducts (no doubt including, as fellow organic chemists reading this have figured, a generous helping of mesityl oxide.) I charged through this to find that the still had blown the side-arm stopper clear across the room - the pinging sound was its ricochet off the wall - and the foul brown concoction had come geysering out after it. The far wall was a Rorschach blot of dripping slime, the source of the eye-crossing aroma.
The problem - as those who've made similar mistakes well know - was when the heating was turned off. All the grunge in the pot, for the first time in months, finally had a chance to settle down to the bottom of the flask. Where it coated the pile of boiling chips down there with resinous mung, rendering them useless. Which allowed the whole shebang to superheat once the power was turned back on, until something finally oozed aside long enough to allow the first bubble to form. And then, as Louis said, après moi, le déluge.
A solid majority of lab-accident stories start out "We had this solvent still. . ." That's why you won't find any of them in any large industrial environment. It's just not worth the opportunity to add to the story file!
Wednesday, August 28, 2002
Consequences of Aneuploidy
Man, with headlines like that, I can't think of why I'm not pulling in thousands of hits a day. Anyway, I wanted to follow up on yesterday's posting by emphasizing that aneuploidy hasn't been ignored for all these years. It's just that the chicken-and-egg question about its role in cancer is heating up.
For example, the "micronucleus test" is often done alongside the Ames test. It's a direct measurement for this sort of chromosome breakage and malformation. The "micronucleus test" can be done with various cell lines, but one popular method is to administer the compounds to rodents and check their red blood cells (erythrocytes.) This looks at the effects on the precursor bone marrow stem cells, and depends on the odd fact that erythrocytes have their nucleus removed while they're developing.
If severe genomic damage occurs, odds and ends of chromosomes often end up lumped together in a separate "micronucleus" floating around detached from the main one. The micronucleus doesn't get pushed out of the erythrocyte, though, and it ends up standing out dramatically in a finished red blood cell. You can do such tests on other cells in culture as well, but those variations haven't been as well validated as the in vivo one.
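For what it's worth, the statistics behind scoring such an assay are straightforward. Here's a rough sketch (with made-up counts, not real assay data) of comparing micronucleated-erythrocyte frequencies between a treated group and a vehicle control with a simple two-proportion z test:

```python
import math

def micronucleus_freq(mn_cells, total_cells):
    """Fraction of scored erythrocytes containing a micronucleus."""
    return mn_cells / total_cells

def two_proportion_z(mn1, n1, mn2, n2):
    """Two-proportion z statistic: is the treated MN frequency
    higher than the control frequency?"""
    p1, p2 = mn1 / n1, mn2 / n2
    p = (mn1 + mn2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

# Hypothetical counts: 2000 erythrocytes scored per group.
z = two_proportion_z(38, 2000, 12, 2000)  # treated vs. vehicle control
print(round(z, 2))
```

With these invented numbers the statistic comes out well above the usual significance cutoff, which is the sort of clear-cut readout the in vivo version of the test is prized for.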
If there are human cells with a tendency towards induced aneuploidy (as Duesberg and others claim) then there are some new possibilities for a useful screening test. One hitch might be that cultured cell lines often forget their origins after a while and act differently - not being exposed to all the various extracellular signals they're used to probably explains a lot of this. It's worth a look, though, particularly if aneuploidy does turn out to be an early event in carcinogenesis.
Tuesday, August 27, 2002
Aneuploidy, or What're A Few Chromosomes, More or Less?
My recent posts about the Ames test (see July 29 and July 30) went into some detail about its use for estimating the mutagenicity of a compound. And mutagenicity is of course a bad thing, because increased DNA damage is generally held to be a factor in carcinogenesis.
But just how that works is the subject of some major disagreement. To wit: How many mutations does it take? How quickly do cells mutate under normal conditions, anyway, and do different types mutate at different rates? How much are those rate differences influenced by the environment? Are all cancers caused this way?
These questions all bear on a current hot topic: do cells become cancerous due to some specific small mutations, or because of wide-ranging genomic instability (aneuploidy)? The latter hypothesis is having its inning recently, as witness some major articles in the July 26 issue of Science. This genomic instability is, by all accounts, a real junkpile of broken and reglued chromosomes. It really makes you respect what you can do to the basic machinery of a cell and have it still function. Aneuploidy is found in a wide range of cancer cells - but is it a cause of cancer, or is it an effect?
The specific-mutation hypothesis has had a deservedly long run. Many candidate genes have been discovered, and they've been the subject of massive research efforts. At this point, there's little doubt that they're important, although their importance varies greatly between tumor types. Typically, these genes code for proteins that are important for cell growth - either positive factors or negative ones. Mutations that switch positive growth factors permanently on can lead to a cancer, as can those that wipe out the braking functions of a negative growth factor.
An example of the former is the ras oncogene, important in bladder tumors, among others: a single DNA base switch leads to an amino acid substitution (valine for glycine) in the expressed protein. That sends this protein into a permanent "on" state in its cellular signaling cascade, with out-of-control growth the result. And a good example of a negative-control mutation is p53, a protein whose normal function is in one of the cell-cycle checkpoints. It's part of the error-checking machinery. If p53 is functioning correctly, cells with many types of DNA damage are prevented from going on through cell division. Several mutations have been shown to impair p53 function, though, and these are found in a wide variety of aggressive tumor lines.
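The arithmetic of that ras mutation is simple enough to sketch. The snippet below is a toy illustration, with only a hand-picked fragment of the codon table, showing how one G-to-T base change turns a glycine codon into a valine codon:

```python
# Partial codon table -- just enough entries for this illustration.
CODON_TABLE = {"GGC": "Gly", "GTC": "Val", "GAC": "Asp", "GCC": "Ala"}

def point_mutation(codon, pos, new_base):
    """Swap a single base in a codon; return (old, new) amino acids."""
    mutated = codon[:pos] + new_base + codon[pos + 1:]
    return CODON_TABLE[codon], CODON_TABLE[mutated]

before, after = point_mutation("GGC", 1, "T")  # G -> T at the second base
print(before, "->", after)
```

One base out of three changes, and the protein's behavior changes completely - which is why single-point mutagenicity is worth all the assay effort described above.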
And it's not just single point mutations like these, either. Fusion proteins (front part from one gene, back part from another, for example) can lead to the same problems, as can mutations that don't change a protein's structure, but change the amount of it that's expressed in the cell. It usually takes more than one of any of these mutations to really set off a tumor.
Where things start to get messy is when you try to figure out where those changes are coming from, and if they're always associated with cancer. The aneuploidy advocates believe that some cell types are prone to genomic instability, giving them a much greater chance of racking up enough mutations to become cancerous. Even that's subject to dispute. Does that mean instability toward point mutations, or toward large-scale chromosome breakage and shuffling? There's evidence pointing both ways.
While many chemical carcinogens are known to produce mutations, others don't seem to cause any. (Those are presumably the ones that an Ames test would miss, a prospect that keeps toxicologists up at night.) Peter Duesberg's group at Berkeley claims that such compounds do cause aneuploidy, though, and that this state is one of the early events in tumor development. And yes, that's the same Duesberg whose HIV theories keep the adjective "controversial" glued to his name. There's a group at Johns Hopkins whose work supports this sequence of events, too - and work from Harvard that argues against it. It's going to be a while before anyone gets this all sorted out.
I think it's likely that the answer is going to turn out to be a mix. Gross chromosomal disturbance would certainly seem likely to cause either cell death or conversion to a cancerous state. Making this a one-size-fits-all precondition, though, seems like overreaching. But that goes for the small-mutation crowd, too. They have some good examples on their side, but there are plenty of cancers that don't fit into the established categories very well. It seems quite plausible that some cell types are more susceptible to chromosomal abnormalities than others. At the very least, some would be expected to be more susceptible to smaller mutational changes. At some point between these two, the explanations start to converge.
As a medicinal chemist, what I'm interested in are new drug targets. That's where aneuploidy is still a bit young, as a well-investigated hypothesis, to offer me anything to work on. Are there particular enzymes that could be targeted to reduce genomic instability, active-site switches to throw on or block? No one knows yet. I'll watch the debates with interest, but the parties involved should call folks like me when those questions are closer to being answered.
Monday, August 26, 2002
Muddying the Water for Fun and Profit
I've been meaning to comment on some recent reports in the Wall Street Journal about the lengths that stock analysts have gone to get information on clinical trials. The main example was one David Risk of Sterling Financial (primarily a short-selling outfit, and quite sceptical of official company information.) Back in February, he signed on as a patient in a trial of a sleep-disorder drug from Neurocrine Biosciences, saying that he fit the profile that they were looking for. After his acceptance, he spent his time quizzing everyone he could buttonhole, then bailed and issued a "sell" on the stock. This was based on one verbal report of a bad reaction in one patient.
Other examples in the article had analysts calling the physician in charge of a trial, pretending to be fellow MDs, and asking for details on enrolling patients (while really trolling for inside data.) One Boston outfit, Leerink Swann & Co., pays physicians involved in clinical trials to have "discussions" with analysts (who pay Leerink Swann, of course.) These discussions supposedly don't violate confidentiality agreements, but I'd like to know what useful information could change hands in a conversation that didn't.
This sort of thing strikes me as being over the line. And the thing is, I like selling stocks short. I'm a bear by temperament; my facial expression in the stock market is a permanently raised eyebrow. Investors should view company press releases with suspicion, because most of the time it's fully deserved. Biotech drips with hype and falsely raised expectations. But that doesn't justify this behavior, which is indefensible on several grounds. Legally, the Sterling analyst entered the trial under false pretences, and he had to violate his non-disclosure agreements to write the report he did. If someone wants to make a case out of that, they probably could. I could add that he wasted the time of the administrators of the trial, and that these things are hard enough to run without jokers joining in.
On the scientific side, it's really idiotic to grab onto individual data points the way he did. As it turned out, the patient with the bad reaction to the Neurocrine test drug also tested positive for opiates, and was kicked out of the trial for violating its protocol. His case probably had no bearing on whether the drug was working or not, or how safe it was. It's a recurring pattern, though: the same analyst put out a strongly negative report on a Regeneron clinical candidate for obesity because one patient came down with Guillain-Barre syndrome during the trials. Did this have anything to do with the drug? Causality's a tough question, but the patient had had a recent flu vaccination and an upper-respiratory infection (both of which are risk factors for G-B.) No other patients have had the syndrome. There seems to be no reason to assume a connection between the two.
I can't stress this enough: finding out if a drug is safe is very difficult. Finding out if a drug is effective is very difficult. And that's if you're the one running the clinical trials. The only data that mean anything are those from rigorously controlled studies, done on as many patients as possible. And once the numbers come in, you have to sit down for an extended session of head-banging statistics to be sure that you know what they mean. Sure, you can go around picking out tiny bits of positive news (like some companies do) or tiny bits of negative data (as these examples have done.) But both of these are dangerous, stupid, and irresponsible. The people in the WSJ's article go on about how they're just trying to "uncover the truth." The truth is, they're just as bad as any deceptive PR department.
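To put a number on why single adverse events mean so little: if zero patients out of n show an event, the classic "rule of three" says the true event rate could still be as high as roughly 3/n. A quick sketch (patient counts invented for illustration):

```python
def rule_of_three_upper(n):
    """If 0 of n patients show an adverse event, an approximate 95%
    upper confidence bound on the true event rate is 3/n."""
    return 3 / n

def exact_upper(n, alpha=0.05):
    """Exact one-sided 95% upper bound for 0 events in n patients:
    solve (1 - p)**n = alpha for p."""
    return 1 - alpha ** (1 / n)

# Hypothetical: 200 patients, no cases of the syndrome observed.
print(f"{rule_of_three_upper(200):.4f}")  # 0.0150
print(f"{exact_upper(200):.4f}")          # 0.0149
```

Read it the other way around, too: seeing one case in a couple hundred patients is entirely compatible with background incidence, which is exactly why a single Guillain-Barre report shouldn't move anyone's stock model.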
More Cosmology, You Say?
A fellow scientist from Princeton, who knows the field a lot better than I do (no strenuous feat, that) has posted more about the MIT preprint I mentioned on the 20th. Check it out if you want to hear more reliable opinions than mine!
Sunday, August 25, 2002
Back For Some More
Well, I'm back from a few days vacation with the kids. If there was any major science story breaking in the news, I sure didn't hear about it. Not that I could hear very much over the background level of my 2- and (almost) 4-year-olds, mind you.
Tomorrow, I can already tell, will be one of those "where-am-I" days at work, since it seems like I've been gone a lot longer than I really have. I'm sure there will be no shortage of folks helping to orient me, unfortunately.
I'm assuming that my company wasn't bought/sold/broken up for scrap while I was away. August isn't much of a Wall St. deal-making season (since so many grand pooh-bahs are off vacationing, of course,) but things will probably start percolating around as we head into fall. When I count up all the rumors I've heard about various companies, it's almost equivalent to just picking from the whole combinatorial set. I'm trying to think of a match that I haven't heard speculation about. . .GlaxoSmithKline and Denny's? Lilly and Carnival Cruise Lines? It's been that jumpy.
Needless to say, most of the rumors haven't made much sense to me - no, not those two, the real, well, more real rumors. That doesn't mean that some of them won't come true, unfortunately for those involved. I'm deeply sceptical of most mergers, even ones that bring in a big seller (like Pfizer's acquisitions of Warner-Lambert and Pharmacia.) Those should have been called Pfizer's acquisitions of Lipitor and Celebrex, frankly. As far as I can see, Pfizer's headed for the Red Queen's Race (from Through the Looking-Glass) - having to run as fast as you can just to stay in one place. And that's a good merger. The bad ones have amazing potential for harm.
Wednesday, August 21, 2002
I'll be off on another short vacation for the rest of this week, so there will be no new postings until Sunday night (EST.) Here's hoping that some good news, for once, breaks when I'm not around to comment on it!
Tuesday, August 20, 2002
Here Comes the Pharmaceutical News Again
AstraZeneca's revelation yesterday was quite disturbing. Their experimental cancer therapy (Iressa) doesn't seem to offer any added benefits when given in chemotherapy combinations, at least in non-small cell lung cancer. That's a type of tumor that the approach really should have worked in, and one that the compound had shown some efficacy in as a monotherapy. The data were particularly unexpected, given that preclinical models had shown that the compound should have added to the effects of standard chemotherapy agents (taxol, cisplatin, etc.) So much for the model systems.
This news sure didn't do AstraZeneca any good, but it really torpedoed the stock of competitor OSI, whose similar compound (Tarceva, developed with Genentech) is a much greater percent of that company's future. Two others in the field, Abgenix and everyone's favorite, Imclone, also declined.
The reporting on this story in the general press has been pretty poor (but with the Wall St. Journal doing a much better job than the New York Times, for example.) The problem is that it's a fairly complex biological issue, and some of the key issues are still unresolved.
All of these companies are nominally targeting EGFR, the epidermal growth factor receptor. But they're doing it in different ways, and there's more than one type of EGFR, too. The two best understood are HER1 (technically the only one that's really called EGFR) and HER2 (there's a HER3 and a HER4, too.) When activated, these stimulate cell growth and proliferation. The activating molecules can be just floating around, but in some cases the tumor cells are thought to make their own activating ligands, a particularly vicious mechanism. An added complication is that the active form of the receptor is actually a dimer of two HER types, and they seem to be able to mix-and-match. This fact is surely significant, but the details are still obscure.
HER2 is known to be overexpressed in many breast cancers, and overexpression of either it or HER1 seems to make a tumor more difficult to treat. How much the overexpression levels correlate with activity isn't well understood, either - the "tone" of the system seems to vary in different sorts of cancers.
One way that drug companies have gone after these targets is through antibodies. The antibodies bind, presumably, to a big swath of the surface of the receptor, blocking it from being activated. There's a possible bonus here, since this sort of antibody technique also seems to be toxic to the cells. There's a commercial HER2 antibody from Genentech, Herceptin, which seems to have decent activity (alone or in combination) in HER2-rich tumors. Imclone's Erbitux is an antibody to HER1, and Abgenix is the other antibody player.
The other way to attack these receptors is through small molecules. No one's ever found anything reasonable that'll do the job of an antibody (that is, physically blocking the receptor.) There is a target downstream, though. The receptor phosphorylates itself to activate, and this kinase activity is suitable for a small molecule inhibitor. That's Iressa, OSI's Tarceva, and some others from GSK, Novartis, Pfizer, and others.
These drugs vary in their potency against HER1 and HER2. Iressa and Tarceva are pretty similar - more HER1 than HER2. (Thus the swoon in OSI's stock, which I'd say was justified, given the data.) GSK's GW-2016 is very potent against both, and Pfizer's CI-1033 is not only potent, but binds irreversibly. That's normally not a mechanism you'd want to see, but in oncology all the gloves are off.
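To make the selectivity talk concrete, here's how you'd compare HER1/HER2 potencies as a ratio - with invented IC50 numbers, emphatically not the published values for these compounds:

```python
# Hypothetical IC50 values in nM, for illustration only.
ic50 = {
    "Iressa":  {"HER1": 30, "HER2": 1500},
    "Tarceva": {"HER1": 20, "HER2": 1000},
    "GW-2016": {"HER1": 10, "HER2": 10},
}

def selectivity(compound):
    """HER2/HER1 IC50 ratio: values well above 1 mean the compound
    is a much better HER1 inhibitor (lower IC50 = more potent)."""
    v = ic50[compound]
    return v["HER2"] / v["HER1"]

for name in ic50:
    print(f"{name}: HER2/HER1 ratio = {selectivity(name):.0f}")
```

A ratio near 1, as in the GW-2016 row of this toy table, is what "very potent against both" looks like; a large ratio is the Iressa/Tarceva profile described above.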
As far as comparing the antibody therapies with the small molecules, though - well, the Wall St. Journal today suggested that Imclone could be in major trouble. I think that that's getting ahead of the science. The kinase inhibitors and the antibodies have differences in mechanism, HER selectivity aside. This certainly wasn't good news for Imclone, but it wasn't the red alert that it was for OSI and Genentech.
I should have picked this week to be Heavy Mystery Week instead of last. According to Nature Science Update, there's a paper coming out from a group at MIT on the implications of the (lately revived) cosmological constant. (For those not following the story, this is the finding from some long and difficult astronomical measurements designed to measure the rate of the Universe's expansion. To the intense surprise of almost everyone, the rate turned out to be accelerating.)
I cannot completely comprehend the preprint, since I don't have the mathematical tools. But their argument seems to be that the structure of our universe is a lot more unlikely than we've even realized. Inflating universes like ours would appear to be extremely rare (although this might just be our "extreme temporal provincialism" to use one of their phrases.) And the anthropic principle won't help:
Inflationary starting points are very rare in time. That in itself is not a problem. Most of the rest of time during which nothing interesting is happening can, from an anthropic point of view, be thrown away. We can also throw away large fluctuations which lead to un-livable conditions. The danger is that there are too many possibilities which are anthropically acceptable, but not like our universe.
They get to the point of wondering if there's a true cosmological constant at all, which would seem to be throwing down the gauntlet to the observational folks (are your data correct?) and to other theorists (if it isn't the cosmological constant, then what is it?)
Monday, August 19, 2002
They All Get Real At Some Point
In every drug development project, there are put-up-or-shut-up moments. Those are referred to by more traditional project planners as "milestones," but everyone knows what they really are: the times when you walk across the fraying rope bridge, looking nervously at the depths below.
A key moment is when your team has finally made some real advances on the chemical leads you started the project with. When you finally have more potent compounds, with better (and longer-lasting) blood levels, it's time to see if they're more effective in the animals.
They had better be. Because if they aren't, you're going to have some explaining to do. If making the compounds more active and getting more of them into the body doesn't help them, then what are you going to fix now? It's hard to go to the powers that be and say "Well, we're just going to make some more different compounds and. . .well. . .sort of, y'know, hope for the best."
Not that that's not the best course, sometimes. It can be impossible to know why a particular drug bombs out (although you should make every effort to find the reason.) If you're on a good target, or the mood is forgiving, you can make the case that a different drug structure might avoid the whatever-it-was that made the last one a failure.
But without some believable theory, that strategy only works for a while. At some point, the compounds have to show that they're good, and the biological rationale has to show that it's good. If you can't point to some other problem (bad animal model? wrong species?) then you'd better be ready to fold up the tent.
Maybe some new data will show up eventually that sheds some light on why things didn't work; maybe it'll always just be one of those mysteries. The point is, no one on a drug project can allow themselves to get too attached. There's always another one to work on, and they certainly don't all work.
Sunday, August 18, 2002
And I Still Have a Couple of Those Shirts, Too
I found some old pictures taken in my grad-school lab the other day, and sat down to look them over. It had been at least nine years since I'd seen them (and it's some fifteen years since they were taken.) One thing that strikes me is how similar my hood looks - a mess then, a mess now. I'm not an orderly person in the lab, and it doesn't look like that's going to change. I don't spend nearly as much time in the hood as I used to, but it's still not pretty.
Another thing that stands out is how similar the equipment all looks. Of course, the stuff I have in industry is newer, and certainly of better quality overall, but the technology (at least in the hood) is nearly identical. One stir plate is pretty much like another, ditto the separatory funnels (which haven't changed in over a century,) flasks, vacuum manifolds, etc.
My current hood is home to a couple of multi-well shaker plates with heating/cooling blocks on them, and it's true that we didn't really have those back in the 1980s. But I wouldn't have used them much in grad school, anyway, since I was doing total synthesis of a natural product. I didn't have a lot of parallel reactions to set up; it was all going down one path most of the time. Actually, in some ways it was all going down one drain most of the time, now that I think about it, but that's another story.
We get a lot of use out of the multi-well blocks now, though, when it's time to hang a dozen or two amides, ureas, sulfonamides, what-have-you off some defenseless nitrogen. That's the process known and loved throughout medicinal chemistry as "methyl, ethyl, propyl, butyl, futile." (Some variants of that joke have a "brutyl" group inserted, too.)
The innovations have come more noticeably in analytical techniques: fancy new NMR techniques on a more routine basis, cheaper and more robust LC-mass spectrometer combinations, and so on. The NMR machines that I started grad school with would only be fit to be gutted and used as beer coolers today, for example.
Now, down the hall in biology, well, don't get me started. Those folks have done all the changing since the mid-80s. We've had plenty of new reactions and techniques added to the repertoire over in chemistry, but nothing compared to what's gone on in molecular biology. And perhaps that's one of the sources of the fix that the drug industry finds itself in these days - because all the drugs have to come from the chemists, one way or another.
Friday, August 16, 2002
Nature Stood Me Up
. . .not for the first time either, and doubtless not for the last. For anyone stopping by to get an update on my experiment, well, not very much happened. The controls all worked about like I figured they would, but the real experimental cases refused to do anything. And the one run that perhaps moved off the starting point was the one that I least expected to see anything from.
My colleague is going to set up another run in a few days, this one under more forcing conditions (in case I've underestimated the ease with which things should occur.) I can come up with a couple of other hypotheses to check as well (as I alluded to this morning.)
But what this means is that (even if my whole concept is right) it isn't going to be easy. It's not like the door is just swinging open to this new field I'm picturing. I'd wondered if it was just because no one had thought to give it a push. I still think the basic idea is sound (although I still have scant evidence for that belief,) but it's an open question how far it can be generalized. If it were as general as I'd hoped, this experiment probably would have worked, frankly.
So, if the door isn't just opening up, the next thing to do is pick the damn lock. I've been uncharacteristically quiet since I got the data today, while I absorbed it and thought about what to do next. But I feel things getting back to normal already. I've got another system to try next week (and I'm working on another new one after that.) And I've had a decent idea to address today's failure, just while sitting here tonight. I'll be back on Monday morning, trying something new. Fortes fortuna juvat.
Here It Goes
The research idea I've been alluding to recently gets a key test today. I've got two values for one of the variables (high and low, essentially) and four values for another. And I've thought up two control experiments for each of those eight cases, which narrows things down quite a bit: under this experimental protocol, good news would have a very good chance of actually being good news.
But bad news would have a pretty good chance of actually being bad news, too. I can think of more ways to see a false-negative than a false-positive, but I'm not sure how likely some of those really are. If nothing happens today, I've got some plans for a second run which would address some of those.
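For the curious, the bookkeeping on that design is easy to enumerate: eight experimental cases (two levels of one variable times four of the other), each paired with its two controls. A sketch with placeholder names for the variable levels, since the actual variables aren't being disclosed here:

```python
from itertools import product

# Two levels of one variable, four of another -- names are placeholders.
levels_a = ["low", "high"]
levels_b = ["b1", "b2", "b3", "b4"]
controls = ["control-1", "control-2"]

runs = []
for a, b in product(levels_a, levels_b):
    runs.append((a, b, "experimental"))   # the real case
    for c in controls:
        runs.append((a, b, c))            # its paired controls

print(len(runs))  # 8 experimental + 16 control = 24 runs
```

Two controls per cell is what buys the claim above: a positive result in an experimental run, with both of its controls behaving, has a very good chance of being real.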
Of course, I'd rather not find myself in the familiar research position of looking out the window, wondering what went wrong. I'll know by later this afternoon. A colleague from another department is setting things up and will collect the data, and I won't be very far from my phone during that time, I can tell you. Waiting for this stuff is nerve-wracking - it feels like I've asked the physical world out on a date and I'm waiting to hear if it'll accept.
Thursday, August 15, 2002
Great Moments in Legal Reasoning
I can't resist passing on this argument, made unsuccessfully by Schering-Plough. It's a tactic that's been tried before, and has never worked. But desperate times demand such measures, I suppose, although - well, you be the judge.
SGP's Clarinex is, as I've mentioned, extremely similar to their Claritin. It's an active metabolite, so if you've taken Claritin, you've taken Clarinex. That's the whole point, according to Schering. If you allow other companies to sell Claritin as a generic, then when people take it, their bodies will turn that into Clarinex.
Which is, of course, a patented, proprietary substance. Which these patients are breaking the law by producing - led into this illegal act by the actions of a generic drug maker. Shocking!
This didn't fly. For this and other reasons I'll go into next week, Schering ended up on the losing side (by a summary judgment) and generic Claritin (loratadine) moved much closer to the market.
[Disclosure added on 8/16: notwithstanding the tone of the above, I actually own some SGP stock, back from the days when it was a good performer.]
Week of Mystery Wrap-Up
Chad Orzel has some follow-up comments to my mysteries-of-physics stuff, titled "Shut Up and Calculate, Already." And since he's a physicist, his views and those of Doug Turnbull (see yesterday's post) and such should carry a lot more weight than mine. He's had the what's-holding-all-this-up feeling about the mathematical underpinnings of everything as well, it seems.
While I have the eyes of some physics folks - hey, when's that quantum mechanical gravitational theory going to be ready, anyway? Just kidding! There are plenty of similarly nasty questions you can ask me, too, as I'm well aware - ranging from "Where's my instant-youth pill?" down to "How come I have to wait for my cold to go away?" Actually, I answered that last one back on March 26th.
Wednesday, August 14, 2002
Our Friend the Phosphate Group, Redux
By the way, just to introduce some medicinal chemistry into this week's postings, I should point out that there's another way in which kinases outnumber phosphatases: the number of inhibitors known. It's true that we went a long time without good structural classes of compounds to inhibit kinases, but the dam burst some years back.
Now we've beaten several classes of heterocyclic structures completely into the ground, and the patent landscape looks like Yasgur's farm after they got finished holding Woodstock. But we do have kinase inhibitors, and plenty of 'em. So where are the phosphatase blockers?
Look around the literature, and you see all sorts of odd and funky structures, but no unifying themes. The first outfit that finds a general drug-friendly structural template to go after these targets will have quite a franchise on its hands.
Conservation of Conservation
The quantum-mysteries postings sure do bring in the mail, I have to say. Several letters pointed out that the 2-, 6-, etc. electron totals follow from quantized angular momentum and spin states. That's the explanation I remembered, although I'd have been hard pressed to lay it out in an e-mail as well as everyone I heard from.
But that pushes the questions down a level. You then can turn around and ask why angular momentum is quantized, and so on. Doug Turnbull, over at Beauty of Gray, anticipated this by pointing out that this is a consequence of underlying symmetry. Space is rotationally invariant (more technically, invariant under the symmetry operation of the rotation operator,) so angular momentum is conserved. Similar symmetry invariances lurk in the math to point out other conservation laws.
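For the record, the chain of reasoning Doug points to can be stated compactly. In quantum mechanical terms, if the Hamiltonian is unchanged by rotations, it commutes with the angular momentum operator, and the expectation value of angular momentum is then constant in time by the Heisenberg equation of motion (a sketch, not a derivation):

```latex
\[
[H, L_z] = 0
\;\Longrightarrow\;
\frac{d\langle L_z\rangle}{dt}
  = \frac{i}{\hbar}\,\big\langle [H, L_z] \big\rangle = 0,
\qquad
L_z\,\lvert l, m\rangle = m\hbar\,\lvert l, m\rangle,
\quad m = -l, \dots, l .
\]
```

The quantized eigenvalues in the second relation are where those 2-6-etc. electron counts ultimately come from, once spin is added in.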
I think, though, that all these mysteries start to converge on the famous one of "the unreasonable effectiveness of mathematics." Eventually, it comes down to wondering why the universe has mathematically reducible laws at all - and unless we get another universe (or a few) to compare ours to, it'll be difficult to answer that one.
Then there are the questions about how some of the various constants we have might have ended up the way they are. I could go on the familiar path about how different things would be if various ones were changed just a small amount, but that leads off into the Anthropic Principle, which is a ball of tar I don't particularly wish to get stuck in. It's one of those topics with no definite end, and I don't think I'll be changing my tag line to "philosophy, business, and culture" any time soon.
Hah! See what happens when the med-chem and pharma news is slow?
Tuesday, August 13, 2002
Our Friend the Phosphate Group
I've had several pieces of mail about yesterday's posting, none of them threatening me, which is always nice. I thought I'd address some of the issues raised, since pharma business news is pretty slow right now. (Given what it's been like recently when it's flowing freely, slow's not so bad.)
As for phosphorylation, I've had some folks write to talk about the importance of phosphate cleavages for cellular energy production, and about the conformational effects of phosphorylation. All that's well taken - but I guess what I was getting at yesterday is that (for example) sulfation would seem to be a perfectly reasonable way to modify proteins. Why didn't life end up using it?
Perhaps the phosphate energy part is the key. That's such a basic mechanism that enzymes to handle phosphate groups must be archaic indeed. It could be that evolution just found a use for them, since they were there anyway, and that competing methods of post-translational modification (like sulfation) never got off the ground. Of course, there's always glycosylation - wonder when that kicked in, evolutionarily?
As for why there are two electrons in an s orbital, I've had several suggestions that boil down to "it's because of quantum mechanics" or "it's because of the Pauli exclusion principle." I have to admit, I used pretty much the same answers when I was asked the question myself. But the same arguments apply to p orbitals, with their 6 electrons. So, to rephrase the question, how come some orbitals have 2, and others have 6? (Note that I'm ignoring the d and f orbitals - things are bad enough as they are.)
As I understand quantum mechanics, it does a fantastic job of telling us about the behavior of small quantized particles (or whatever they are, 'cause they aren't particles, really, and they aren't waves either.) But it doesn't tell us why the universe is made grainy like that to start with; it just tells us what the grains do.
The same problems apply to the other physical laws discussed in Feynman's lectures. Angular momentum is conserved, across a very wide range of definitions of angular momentum. How come? How come entropy always increases? More powerful theories, which we can hardly even imagine, might shed light on these sorts of questions, but for now, we're really stuck.
All we can say is that that's the way we found 'em. They are that way because that's the way they are. Of course, similar explanations used to be applied to questions we now have better answers for, like the motion of the planets, black-body radiation, and so on. Ptolemy would have been amazed by Newton and Kepler, and Newton would have been shocked by Einstein. The answers to the next level of theory will be a while coming, and I'm sure they're going to make us all have to sit down for a while.
Monday, August 12, 2002
Don't Know Much About. . .Much
Re-reading Richard Feynman's The Character of Physical Law, I've encountered many places in his lecture series where he frankly discussed things that no one understood the reason for. Symmetry-breaking in the weak nuclear force, for example - it's important, it was instant Nobel Prize material, but why does it happen? Why in just that one of the basic forces?
It reminds me of when I was tutoring one summer in graduate school. I had someone say to me: "OK, I've learned that the innermost level of electrons has two electrons in it, the "s" shell. Why two? Why not three? Why not six?"
That stopped me in my tracks, to a degree that I can still recall. And it still does. It brought home to me the depths below the ice that we skate on. I was ready for all kinds of questions from students, and I thought I was handling them pretty well, if I did say so myself. But why are there two electrons in an s orbital? Not only do we not know, we don't even know how to begin answering the question.
And there are plenty of those waiting for anyone who takes the time to think about them. Why are there transition metals - yes, I know about the orbitals and all that, but why did they end up like that, giving us a periodic table with long stretches of almost-identical elements? Why don't we have something the size of fluorine that's electropositive (man, could we use that element in medicinal chemistry.) Why are there so many more kinases than phosphatases, and how'd we end up using phosphorylation so much in living systems, anyway? And so on. . .
Feynman didn't shy away from such things, or our ignorance of them. When you see him, with his extraordinary feel for the subject, say that he doesn't know why something is - well, you can be pretty sure that no one else knows, either. And you can resolve to admit your own ignorance of the really hard stuff: like why there are two electrons down there and not three.
Sunday, August 11, 2002
Life of the Party
My earlier gloss of the Journal of Medicinal Chemistry as "Jay Med Chem" prompts me to provide a complete guide to faking your way into sounding like a professional organic chemist. Not that that's the road to fame and fortune, but you never know when it'll come in handy.
The first lesson is the lingo of the literature. Journals are often referred to in shorthand: The Journal of the American Chemical Society is known as "Jay-Ay-Cee-Ess," or, more commonly "Jacks." Listening to chemists talking, you'd think some guy named Jack ran a prestigious journal. The Journal of Organic Chemistry is, similarly, "Jay Oh Cee," but never "Jock" and certainly not "Joke." There are journals that deserve that last nickname, but not JOC.
Other "Journal" contractions are made the same way: "Jay Het Chem," "Jay Fizz Chem," and the like. That latter one doesn't come up in organic chemistry conversations much, so be advised. The Royal Society journals would be a mouthful if they weren't known by their nicknames - imagine having to say "Journal of the Chemical Society Perkin Transactions I" instead of "Perkin One."
Another common name for journals is "Bulletin. . .etc.," but that doesn't lend itself to much shortening. I have to confess, though, that every time I look up a paper in the BCSJ I think "More bull from the Chemical Society of Japan. . . " Another Japanese journal, Chemical and Pharmaceutical Bulletin (their version, roughly, of J. Med. Chem.) comes in for similar mental abuse, not fully deserved.
Tetrahedron Letters is "Tet Lett" (but its longer stablemate Tetrahedron is never "Tet.") Organic Letters is pretty new, but I've already heard it as "Org Letters" (but not "Org Lett," for some reason.) Synthetic Communications, in those rare times it comes up in conversation, is "Syn Com," and Chemical Communications is "Chem Com." Noting all this, the folks over at Synthesis cut to the chase a few years ago when they named their new short-communications journal Synlett.
Some of the titles are hard to deal with. I've never heard Tetrahedron Asymmetry referred to as "Tet Asym," but that's because I've hardly ever heard the journal referred to at all.
If you're going to fake your way through a medicinal chemistry conversation, be sure to drop some more biology-oriented journals into your mix. Many of the better ones have one-word names that don't have to be contracted. Science, Nature, and Cell speak for themselves, for example. But a quick nod to "Pee Enn Ay Ess" for the Proceedings of the National Academy of Sciences or "Jay Bee Cee" for the Journal of Biological Chemistry will establish your credentials. Note that that last one, despite the name, is a torrent of densely packed biology from start to finish, with not much chemistry in sight.
That should do the trick. At some later date, I'll get everyone outside the profession up to speed on the acronymic jargon of the lab itself, which led to the following conversation one day at my lunch table:
"I used DDQ in THF to try to take off my PMB, but the THP keeps coming off, too."
(From the far end of the table) "BFD."
Thursday, August 08, 2002
Better Them Than Me
Roche and their partner, Trimeris, are developing an anti-HIV compound (called T-20) that has some remarkable features. It's the first to target gp41, the viral envelope protein that carries out a key binding step in HIV's mode of infection, for one thing. But to be more precise, it has 106 more interesting things about it - that's the total number of synthetic steps in the manufacturing process.
My synthetic-organic readers are probably saying just what everyone in the field does when they first hear about that one - it's usually a variation of "no (procreating) way," followed by laughter. Those of us who've done natural product synthesis get the shivers, remembering those 20- or 30-step build-the-pyramids projects from our pasts. Trying to extrapolate those experiences up the asymptotic curve of awfulness to 106 steps is a scary exercise.
Fortunately, the compound is a peptide (36 amino acids,) which means that the chemistry is well-established and can be done largely by machine. It also means that you don't do all 106 steps in a row. Well, you could try, if you had a good excuse like having had a big recent dose of LSD or something, but it's not recommended.
For starters, it probably wouldn't work. Peptide synthesizers work by hanging one end of the chain off a solid resin support. As the molecule gets longer and snakier, it tends to ball up on you. Eventually, the far end, the one that is getting new amino acids added to it, isn't sticking out into the solution any more, but is tucked back into its own folds. End of synthesis, by default. The second problem with having that many linear steps is a mathematical one. A to-the-hundredth-power term in an equation is a bomb waiting to go off, and if your synthetic yields aren't just about perfect, you're going to be in bad shape.
For example, if every step works in 95% yield (which would be a mighty fine result across 106 steps,) then you're going to come staggering across the finish line with a 0.4% overall yield. What if you average an 85% yield? Admittedly, a peptide synthesizer will beat that, but it had damn well better, because that pace will leave you with 3-millionths of a percent yield. Ugly things, those exponents.
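If you want to watch the exponent do its work, the arithmetic takes only a couple of lines. (This is my own quick sketch of the textbook formula, nothing to do with Roche's actual process numbers.)

```python
# Overall yield of a linear synthesis: the per-step yield
# raised to the power of the number of steps.
def overall_yield(step_yield: float, steps: int) -> float:
    return step_yield ** steps

print(f"95% per step, 106 steps: {overall_yield(0.95, 106):.2%}")  # roughly 0.4%
print(f"85% per step, 106 steps: {overall_yield(0.85, 106):.1e}")  # a few parts in 10**8
```

The moral is that going from 95% to 85% per step doesn't cost you a factor of ten overall - it costs you about five orders of magnitude.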
Roche, not being a gang of idiots, will certainly be making T-20 in small chunks, then stitching those together. The tricky part is figuring out where to break up a molecule that size. What combination of fragment assembly schemes has the best chance of high reproducible yields? The life expectancies of humans being what they are, you can't try all the permutations and pick the best. Some educated guessing using known pitfalls of protein design has apparently given them a route that works.
But it's going to be a honker of a synthesis, no matter how well it goes, because it's going to have to be done on monstrous scale. Peptides, as I never tire of pointing out, have every chance of making crappy drugs, and this is worse than most. You can't take a 36 amino-acid peptide orally and expect much to happen; your digestive tract will rip it to shreds like it does every other protein. T-20 has to be injected, twice a day, 100 mg per shot. Roche is going to have to make thousands of kilos of this stuff, which should handily break any records for peptide synthesis. It already looks like it'll be tough, at least at first.
While T-20 works quite well, it's going to be extremely costly to produce. Roche has reportedly already spent nearly $500 million on the manufacturing facility in Colorado. T-20, then, is also going to be extremely costly to take. There's room to doubt how long it'll take to pay off, especially for Trimeris. If resistance to the drug shows up ahead of expectations, it could never pay off at all. (The companies are developing some related peptides that might get around this problem, which is a good strategy.)
I'm still in awe of the decision to go ahead with this drug. It's a major risk, and I can only hope that it has major rewards for both the patients and the companies. And I can also be glad that I'm not having to keep those peptide synthesizers out in Boulder stocked with solvents!
Monday, August 05, 2002
Lighter Than Normal
I see from my traffic stats that people are on vacation these days (and I think that sites all over the blogging world are noticing the same thing.) I'm still grindstoning away for now, though, and that's why things are going to be a bit sparse this week.
One reason is the experiments I spoke about yesterday. If things go well, I might have some answers by the end of this week, and I'm already preparing for the next round. On top of that is the drug project my lab's assigned to. We're nearing some crucial stages in that, too, which is leading to a lot more work than usual (with a real deadline attached, which medicinal chemists always dread.)
The home front keeps me busy as well. My nearly-four-year-old son has been requesting a lab catalog to look at all the equipment, and I'm trying to teach the two-year-old to say "diastereomeric." OK, I'm not, yet. Maybe when I have some more spare time.
I've been rereading "The Character of Physical Law" by Richard Feynman, and came across a favorite quote from it, where he's speaking about early discoveries in gravitation and the motion of the planets:
"This process has developed into an avalanche of discoveries, each new discovery permits the tools for much more discovery, and this is the beginning of the avalanche which has gone on now for 400 years in a continuous process, and we are still avalanching along at high speed."
So, if you don't see anything from me for a day here and there, just assume that I'm doing my best to keep the rocks bouncing along.
Sunday, August 04, 2002
Close to the Vest
Another line in one of the aforementioned Paul Orwin posts rang true for me. He was discussing some new ideas in antibacterial research, then brought himself up short as he got close to his own work: "In the highly competitive world of academic science, even a weblog is no place to divulge current research tidbits."
The same thing goes, with pistachio nuts and a cherry on top, for industrial research. I've hardly said a word about the actual work I do during the day, and I don't plan to, either. It's a pity, since it's been an interesting project with a lot of twists and turns, and it would have been a good illustration of what med-chem research is like day-by-day. But I'd have been fired long ago if I tried to do that, and rightly so. Like all other pharma companies, no one hears a word about what we're up to until we're darn good and ready to tell 'em.
That's what makes information such a strange commodity in the business. The Journal of Medicinal Chemistry ("Jay-Med-Chem" to its friends) can be an interesting read, but only in a historical sense. Projects you read about there are either well along in the clinic, or well buried out back with grass growing over them. The same goes for presentations at conferences. When I see a poster from a drug company with a good crowd around it at a meeting, I always think of someone attracting birds by throwing stale bread on the ground.
I've been guilty of crowding around 'em, too, though. I've come back from meetings bursting with the latest news from other companies, as given in their presentations. But we all have to remind ourselves that these breaking headlines are like light from distant stars. Who knows what's happening there now?
This all applies to the research project that I've alluded to over the past few months, of course. It's not directly aimed at a single therapeutic target, but it's an idea of potential usefulness, and my employer has every right to expect me to keep quiet about it. After all, I'm using their facilities to try to make it work. So all I can do is speak in generalities for now, with the hope that, if it pans out, anyone who's interested can read about it in a patent or publication. (Of course, I do have some readers at the company itself, and they've called me up at times to ask me what the heck I'm talking about. I can ease the suspense for them, not that this stuff seems to be keeping anyone up at night besides me.)
This work is on my mind because I'm nearing another crucial set of experiments, as I alluded to on July 24th. All that remains is working out some analytical methods so I can be sure that I know what I'm looking at - and I can tell you, it's a real strain for me not to just go ahead and run the things without doing that first. I could always just put the stuff in the freezer, I mutter to myself, and when I get the analysis worked out, they'd just be there waiting for me.
But that's no way to work. It shouldn't take that much longer to have a well-controlled experiment that I can actually follow. There's another new one coming up right behind that one, and I can hardly wait to get it ready to go, too. Then it'll be time for the "nonspecific elated noises" I promised when I first had this idea (see April 28, also May 2 and May 3 if you're interested.) Or, perhaps it'll be time for some Botox-worthy furrows in my brow, as I try to figure out what went wrong and why. . .
Bacteria and Their Enemies
I've been remiss in not mentioning this until now, but Paul Orwin has a couple of very good posts on the subject of antibacterials and drug resistance. I know people working in the anti-infectives area, and it's become an increasingly hard place to work over the years. Good drug targets are hard to find, because the bacteria keep weaseling their way around most of them. The discovery of a new antibiotic class is really something to celebrate, but no one gets to break the party hats out much any more.
Thursday, August 01, 2002
Transkaryotics versus Amgen
This deserves a longer post than I have time to give it, but here goes. These two companies have been at it for some time now in court. Amgen, of course, has patents on EPO, its big money-maker. TKTX has been trying to evade them, on the grounds that Amgen's patent covers recombinant protein - while their method induces human cells to produce it on their own. No foreign DNA, no overexpression - just what was there already.
As you can imagine, Amgen hasn't been too impressed by that reasoning, which Transkaryotics is ready to apply to the products of several other companies. AMGN won a round in the US, and had won in England as well - but an appeals court partially reversed that on Wednesday (the US ruling is on appeal as well.)
The latest ruling holds that TKTX did not infringe on Amgen. But it doesn't invalidate Amgen's patents (which was one goal of the legal fight,) nor does it completely address the validity of Transkaryotic's patent claims, either. It's just that the claims, as issued, don't appear to infringe. Not much, but it's the best news that TKTX has had on this front in a while.
Update on Alzheimer's Vaccines
The amyloid vaccine approach to treating Alzheimer's (which I last wrote about in reference to Elan's bad news) is back again. An Israeli company, Mindset, has started work on some vaccine candidates developed at NYU. These are thought to be less allergenic than the ones used in Elan's trials, and I hope they're right. I'm glad someone has the nerve to take another crack at this - I didn't expect it so soon after Elan's disconcerting results.
That Ariad Patent (Again)
The latest issue of The Scientist has an article about the Ariad / Eli Lilly suit. It covers the academic end of the legal problems that Ariad is causing. There are plenty of university labs working on NF-kb signaling - do they all have to cease and desist because Ariad seems to own it? As long as they aren't commercializing drugs, probably not. But what if their research leads someone else to do so?
This is only one aspect of the confusion this type of intellectual property claim can cause. There's a disturbing quote in the article from a technology licensing person at MIT, which I'm going to follow up on myself. More to come on Harvard and MIT's role in all this. . .
Breathing And Aging
I've written about the idea that aging is related to oxidative damage (most recently on June 3.) There's a lot of support for it, and the documented life-extending properties of caloric restriction are thought by many to be tied into this hypothesis. CR has worked in (for example) fruit flies and rodents, and some slow-moving experiments suggest that it works in larger animals up to primates. The less you eat, the less you metabolize, the fewer reactive oxygen species you generate, and the less damage you do to your cellular machinery. It makes a lot of sense.
Too much sense, I suppose. As is the relentless way of science, the waters have been thoroughly muddied by a report in the July 18 issue of Nature (a summary is here.) [Note added on August 1: these links don't seem to work without a subscription to Nature. I'll see if I can find free ones and post those.] These researchers studied yeast, using the number of divisions a cell can go through as a measure of lifespan. Some of their earlier work supported the CR trend, since they found that yeast grown in reduced-glucose conditions went on dividing for 30% longer.
They were even able to tie this effect to the presence of a particular gene, SIR2. That codes for a histone deacetylase, which puts it squarely in the gene transcription / regulation area. There's a lot to write about those, of course, but for those who don't follow the field, the short story is that DNA needs to be wound and unwound from histone proteins in order to be transcribed into RNA. Acetylation and deacetylation of the histones is one of the key switches for those processes, although there are others. And there are things that regulate them, and things that regulate the regulators, and things that modulate the effect of the regulation of the thing that cancels the actions of the. . .ah, cell biology. Nothing like it.
But when they looked at things more closely, they found that the calorically-restricted yeast actually have three times the respiration rate! So much for the simple hypothesis. A closer look showed that a key factor is the way that yeast can switch back and forth between respiration (when there's enough oxygen) and fermentation (when there isn't.) Carbon dioxide is the waste product of the first process, which yields more bang for the buck. And as the world well knows, ethanol is the waste product of the second one. Grow yeast under conditions where they do both, and you've got beer. When there's just enough food to survive, it seems, the yeast switch entirely over to respiration for its greater efficiency.
As it turns out, not only do the CR yeast respire faster, but if you mutate them to where they can't respire at all, caloric restriction doesn't increase life span. So what about all the foul free radicals produced by all that respiration? The CR yeast were shown to be more sensitive to free radical sources (which usually means that their existing machinery for detoxifying such things is already stretched near its limit) but these cells showed no increase in the usual suspects (like superoxide dismutase.)
The free-radical production and protection pathways are clearly more complicated than they seemed. (And, of course, anyone who buys superoxide dismutase tablets at a health food store is clearly a fool, but that was obvious long before this paper came out.) What would make the story neat and clean is if the SIR2 histone deacetylase turned out to be regulating unknown genes that are involved in detoxifying free radicals. That's probably too rational, but it's the first thing to check.
Does any of this apply to larger things than yeast? No one knows yet if mammals on CR diets respire more (although I'll bet that folks are checking as we speak.) SIR2 works the same way in roundworms (C. elegans, the biologist's friend,) and there are homologous genes in higher animals. It's part of some highly conserved metabolic pathway, whatever it is. If we can get our hands on it, there may be hope for an extended life span during which we could actually have a pizza every so often.