
Jan 10

Collaborations, pufferfish, sea squirt, and database quality

Once in a while a collaboration takes you into a whole new domain; in the past few years I feel like I have jumped into the ocean.

One of my long-time collaborators, Matthew Krasowski from the University of Iowa, has been working on the evolution of nuclear receptors. Last year he published a paper on the evolution of FXR, VDR, and PXR in the pufferfish and other species, in collaboration with many co-authors including Seth Kullman and Erin Kollitz at NC State. What is unique about Matt's highly collaborative approach is that he uses small molecules, in this case bile salts and synthetic ligands, to probe the biological activity profiles of these receptors and understand their evolution. He also included collaborators who developed computational models of the proteins (Ni Ai) and the ligands (me) to illustrate differences between the binding pockets of the receptors. Pufferfish and zebrafish were found to have very different bile salt profiles, and the receptor selectivity of each species matched its endogenous ligands. Interestingly, the pufferfish bile salt profile and receptor selectivity were similar to those of humans.

Fast forward a year, and Andrew Fidler and his group at the Cawthron Institute in New Zealand collaborated with Matt to look at the activation of sea squirt (Ciona) nuclear receptors by natural and synthetic toxins, including microalgal biotoxins. This continued some earlier pharmacophore modeling we had done of the Ciona VDR/PXRα. Two biotoxins activated this receptor, and both are much larger than the ligands previously identified as activators. One pesticide, esfenvalerate, was also found to activate it at higher concentrations. The paper proposed that the receptor evolved to bind molecules in the sea squirt diet and that, because of their ecological niche, sea squirts may be useful biosensors.

All of these studies depend on good-quality sources of data, whether that is protein sequences for homology modeling, X-ray structures for docking, or small molecules for pharmacophore modeling. This blog has extensively covered the need for a gold-standard molecule database. Antony Williams and I were asked to put an online editorial together for Drug Discovery Today; it became available today, and in it we describe the long-term cost of inferior database quality. This topic resonates with me more and more as I go through collaborations like those described above. We are continually building on past data, and for our future work the foundation has to be good. I am grateful to all the collaborators for opening my eyes to new topics and the world around us. The opportunity to integrate experimental and computational approaches is, I think, giving us new insights, but we need to make sure our discoveries are put into good databases for future scientists.
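
To make the curation point concrete, here is a minimal sketch of the kind of checks even a small quality pass can run over a molecule database. It is a toy example of my own (hypothetical records, using the open-source RDKit toolkit), not the method from the editorial: flag SMILES strings that fail to parse, and flag duplicate structures whose reported IC50 values disagree by more than 10-fold.

```python
# Toy curation pass over small-molecule records (hypothetical data):
# flag SMILES that fail to parse, and flag duplicate structures whose
# reported IC50 values disagree by more than 10-fold. Requires RDKit.
from collections import defaultdict

from rdkit import Chem

# Hypothetical records: (name, SMILES, IC50 in nM).
records = [
    ("aspirin",     "CC(=O)Oc1ccccc1C(=O)O", 1200.0),
    ("aspirin_dup", "OC(=O)c1ccccc1OC(C)=O",   85.0),  # same structure, different SMILES
    ("bad_entry",   "C1CC(",                   50.0),  # unparseable SMILES
]

by_key = defaultdict(list)
for name, smiles, ic50 in records:
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        print(f"REJECT {name}: unparseable SMILES {smiles!r}")
        continue
    # InChIKey is a canonical identity, so differently written SMILES
    # for the same structure land in the same group.
    by_key[Chem.MolToInchiKey(mol)].append((name, ic50))

for inchikey, entries in by_key.items():
    ic50s = [value for _, value in entries]
    if len(entries) > 1 and max(ic50s) / min(ic50s) > 10:
        print(f"CHECK {inchikey}: >10-fold IC50 spread among duplicates {entries}")
```

Grouping by InChIKey is the useful trick here: alternative SMILES for the same structure collapse into one record, which is where many silent duplicates and discordant values hide.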

2 comments


  1. Joe Olechno says:

    Great blog and great editorial in “Drug Discovery Today”! What is the impact on database quality when the dose-response (IC50) values are in error due to the mechanism of transfer?

    It appears that an unintended and unforeseen consequence of assay miniaturization is that standard liquid handling steps are more likely to sow errors than when assay volumes were relatively large. However, cost and time guarantee that miniaturization must continue.

    Are SAR experiments being confounded because errors of three orders of magnitude are going into the databases? It seems that IC50 values can be easily corrupted by the mechanism of transfer.
    For examples of errors in IC50 values driven by liquid handling see: US Patent 7,718,653; or American Drug Discovery 3(3): 24-30.
    For the impact of leachates from plastic tips see: Science (2008) 322(5903): 917; Clinical Chem. (2009) 55: 1883.
    For the impact of carry-over of active compounds see: Matson et al. (2009) 14: 476, Table 2.

    All of these papers suggest that some of the data going into the databases is wrong and may be compromising the quality of those databases. Since the strength of the database affects the strength of the SAR program, and ultimately the company, it is worth noting that a single process change can address all of these problems.

    The problems that arise from underestimated potency, plastic leachates and carry-over can be dramatically reduced or eliminated with acoustic liquid handling. This technology eliminates physical contact with the liquid being transferred, which gets rid of leachates and carry-over. The researchers in the US patent cited above used acoustic liquid handling to obtain improved measures of potency; apparently the drug-like compounds were being pulled from solution by the plastic pipette tips, leading to erroneously high IC50 values.

    If IC50 values are in error by 100- or 1000-fold, how useful is the database? What if only 20% of the data is so corrupted? What if it is 2%, or 0.2%? How robust will the database be if large errors are incorporated into the data?

  2. sean says:

    Joe,
    Many thanks for the comment; few would be aware of this, so I very much appreciate the summary. Do others have additional comments like this? To make the scale of the problem concrete, I have sketched a quick simulation below.
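
    Below is a minimal back-of-the-envelope sketch of Joe's question, using made-up random potencies rather than real assay data: if some fraction of IC50 values are off by 100-1000-fold (a 2-3 unit shift in pIC50), how much does the rank ordering of compounds degrade?

    ```python
    # Toy simulation: corrupt a fraction of pIC50 values by the
    # equivalent of a 100-1000x IC50 error and measure how the rank
    # ordering of compounds degrades. Random data, illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    true_pic50 = rng.normal(6.0, 1.0, n)  # "true" potencies on a log scale

    for frac in (0.002, 0.02, 0.20):
        observed = true_pic50.copy()
        corrupted = rng.random(n) < frac
        # A 100-1000x overestimate of IC50 is a 2-3 unit drop in pIC50.
        observed[corrupted] -= rng.uniform(2.0, 3.0, corrupted.sum())
        # Spearman rank correlation, computed directly from the ranks.
        ranks_true = true_pic50.argsort().argsort()
        ranks_obs = observed.argsort().argsort()
        r = np.corrcoef(ranks_true, ranks_obs)[0, 1]
        print(f"{frac:5.1%} corrupted -> rank correlation {r:.3f}")
    ```

    The absolute numbers depend entirely on the assumed error model, but the shape of the result makes the point: at low corruption rates the aggregate correlation can still look healthy while every individual comparison involving a corrupted value is badly wrong, which is arguably the more insidious failure mode for an SAR program.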
