
Fusion models for Atomic and molecular STructures (FAST)

Predicting accurate protein-ligand binding affinity is important in drug discovery. This code implements a fusion network model that combines a spatial graph CNN (SG-CNN) and a 3D CNN to improve binding affinity prediction. The code is written in Python with TensorFlow and PyTorch.

Getting Started

Prerequisites

Running the application

Data format

The implemented networks take a 3D atomic representation as input, stored in Hierarchical Data Format (HDF5). Each complex/pocket entry comprises a list of atoms with their features, including the atoms' 3D coordinates (x, y, z) and associated features such as atomic number and charge. For more detail, please refer to the paper in the citation section below.
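To make the layout concrete, here is a minimal sketch of storing one complex as a per-atom array in an HDF5 file with h5py. The dataset name ("1abc"), the feature columns, and their order are illustrative assumptions, not the exact schema FAST uses; refer to the paper for the real format.

```python
# Sketch: one dataset per complex, rows = atoms,
# columns = (x, y, z, atomic number, charge). Column choice is assumed.
import os
import tempfile

import h5py
import numpy as np

atoms = np.array([
    [0.0, 0.0, 0.0, 6.0, -0.12],   # a carbon atom at the origin
    [1.4, 0.0, 0.0, 8.0, -0.40],   # an oxygen atom 1.4 Angstroms away
], dtype=np.float32)

path = os.path.join(tempfile.mkdtemp(), "example_complex.hdf")
with h5py.File(path, "w") as f:
    f.create_dataset("1abc", data=atoms)   # "1abc" stands in for a complex ID

with h5py.File(path, "r") as f:
    loaded = f["1abc"][:]

print(loaded.shape)   # (2, 5): 2 atoms x 5 features
```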

3D-CNN

To train or test the 3D-CNN, run model/3dcnn/main_3dcnn_pdbbind.py. Here is an example command to test a pre-trained 3D-CNN model:

python main_3dcnn_pdbbind.py --main-dir "pdbbind_3dcnn" --model-subdir "pdbbind2016_refined" --run-mode 5 --external-hdftype 3 --external-testhdf "eval_set.hdf" --external-featprefix "eval_3dcnn" --external-dir "pdbbind_2019"
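A 3D CNN consumes a voxelized version of the atomic data: atom positions are binned into a fixed-size 3D grid. The sketch below illustrates the idea only; the grid dimensions, resolution, and features are assumptions, not FAST's actual preprocessing.

```python
# Sketch of voxelizing atoms into an occupancy grid for a 3D CNN.
# Grid size and 1-Angstrom resolution are illustrative assumptions.
import numpy as np

grid_dim, resolution = 8, 1.0                  # 8 x 8 x 8 voxels
grid = np.zeros((grid_dim,) * 3, dtype=np.float32)

atoms = np.array([[1.2, 2.7, 3.1],             # x, y, z in Angstroms
                  [4.9, 4.9, 0.2]])
for x, y, z in atoms:
    i, j, k = (int(c // resolution) for c in (x, y, z))
    if 0 <= i < grid_dim and 0 <= j < grid_dim and 0 <= k < grid_dim:
        grid[i, j, k] += 1.0                   # occupancy; real inputs would
                                               # add one channel per feature

print(grid.sum())   # 2.0: both atoms landed inside the grid
```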

SG-CNN

To train or test the SG-CNN, run model/sgcnn/src/train.py or model/sgcnn/src/test.py.

For an example training script, see model/sgcnn/scripts/train_pybel_pdbbind_2016_general_refined.sh
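The "spatial graph" view treats atoms as nodes and connects pairs of atoms that lie within a distance cutoff. This construction, and the cutoff value used, are illustrative assumptions here, not the SG-CNN's exact graph-building code:

```python
# Sketch: build a binary adjacency matrix from 3D atom coordinates
# using a distance cutoff (assumed value; no self-loops).
import numpy as np

coords = np.array([
    [0.0, 0.0, 0.0],
    [1.5, 0.0, 0.0],
    [10.0, 0.0, 0.0],   # far away: no edge to the first two atoms
])
cutoff = 4.0

diff = coords[:, None, :] - coords[None, :, :]   # pairwise displacement
dist = np.linalg.norm(diff, axis=-1)             # pairwise distances
adj = (dist < cutoff) & ~np.eye(len(coords), dtype=bool)

print(adj.astype(int))   # atoms 0 and 1 are connected; atom 2 is isolated
```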

Fusion

To train or test the fusion model, run model/fusion/main_fusion_pdbbind.py. Here is an example command:

python main_fusion_pdbbind.py --main-dir "pdbbind_fusion" --fusionmodel-subdir "pdbbind2016_fusion" --run-mode 3 --external-csvfile "eval_3dcnn.csv" --external-3dcnn-featfile "eval_3dcnn_fc10.npy" --external-sgcnn-featfile "eval_sgcnn_feat.npy" --external-outprefix "eval_fusion" --external-dir "pdbbind_2019"
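Late fusion of this kind takes the feature files exported by the two base models (the .npy files passed above), concatenates them per complex, and feeds the result to a regression head. The feature sizes and the linear head below are illustrative assumptions, not FAST's actual fusion architecture:

```python
# Sketch of late fusion: concatenate per-complex features from two
# models, then apply a (hypothetical) linear head to predict affinity.
import numpy as np

rng = np.random.default_rng(0)
n_complexes = 4
feat_3dcnn = rng.normal(size=(n_complexes, 10))   # assumed feature width
feat_sgcnn = rng.normal(size=(n_complexes, 16))   # assumed feature width

fused = np.concatenate([feat_3dcnn, feat_sgcnn], axis=1)   # (4, 26)

w = rng.normal(size=(fused.shape[1],))   # stand-in for learned weights
b = 0.0
pred_affinity = fused @ w + b

print(pred_affinity.shape)   # (4,): one prediction per complex
```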

Pre-trained weights (checkpoint files)

We trained all of the networks above on the PDBBind 2016 dataset. Specifically, we used the general and refined sets for training and validation, and evaluated the models on the core set (see sample_data/core_test.hdf).

The checkpoint files for the models are made available under the Creative Commons BY 4.0 license; see the License section below for its terms. The files can be found here: ftp://gdo-bioinformatics.ucllnl.org/fast/pdbbind2016_model_checkpoints/.

Contributing

To contribute to FAST, please send us a pull request with develop as the destination branch on the repository.

Versioning

0.1

Authors

FAST was created by Hyojin Kim ([email protected]), Derek Jones ([email protected]), Jonathan Allen ([email protected]).

Other contributors

This project was supported by the American Heart Association (AHA) project (PI: Felice Lightstone).

Citing FAST

If you need to reference FAST in a publication, please cite the following paper:

Derek Jones, Hyojin Kim, Xiaohua Zhang, Adam Zemla, William D. Bennett, Dan Kirshner, Sergio Wong, Felice Lightstone, and Jonathan E. Allen, "Improved Protein-Ligand Binding Affinity Prediction with Structure-Based Deep Fusion Inference", arXiv, 2020.

License

FAST is distributed under the terms of the MIT license. All new contributions must be made under this license. See LICENSE in this directory for the terms of the license.
SPDX-License-Identifier: MIT
LLNL-CODE-808183

Checkpoint files are provided under the Creative Commons BY 4.0 license. See LICENSE-CC-BY in this directory for the terms of the license.
LLNL-MI-813373
