LigGPT: Molecular Generation using a Transformer-Decoder Model

2021 
The application of deep learning techniques to the de novo generation of molecules, termed inverse molecular design, has been gaining enormous traction in drug design. Representing molecules in SMILES notation as strings of characters enables the use of state-of-the-art models from Natural Language Processing, such as Transformers, for molecular design. Inspired by the Generative Pre-Training (GPT) model, which has been shown to generate meaningful text, we train a Transformer-Decoder on the next-token prediction task using masked self-attention to generate drug-like molecules. We show that our model, LigGPT, outperforms previously proposed machine learning frameworks for molecular generation in terms of producing valid, unique, and novel molecules. Furthermore, we demonstrate that the model can be trained conditionally to optimize multiple properties of the generated molecules. We also show that the model can generate molecules with desired scaffolds as well as desired molecular properties, by passing these structures as conditions, which has potential applications in lead optimization in addition to de novo molecular design. Using saliency maps, we highlight the interpretability of the model's generative process.
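To make the described approach concrete, the sketch below shows a minimal decoder-only Transformer trained on next-token prediction over tokenized SMILES strings, with a causal mask implementing the masked self-attention the abstract refers to. This is a generic illustration, not the authors' LigGPT implementation: the class name `SmilesGPT`, all hyperparameters, and the random stand-in data are hypothetical. Conditioning on properties or scaffolds, as the abstract describes, could be realized by prepending condition tokens to the input sequence.

```python
# Minimal sketch of a GPT-style decoder for SMILES next-token prediction.
# Hyperparameters and names are illustrative, not the LigGPT configuration.
import torch
import torch.nn as nn


class SmilesGPT(nn.Module):
    def __init__(self, vocab_size, d_model=256, n_heads=8, n_layers=8, max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        # Decoder-only model: an encoder stack plus a causal mask yields
        # masked self-attention over previously generated tokens.
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        b, t = idx.shape
        x = self.tok_emb(idx) + self.pos_emb[:, :t, :]
        # Causal mask: True entries are blocked, so each position attends
        # only to itself and earlier tokens.
        mask = torch.triu(
            torch.ones(t, t, dtype=torch.bool, device=idx.device), diagonal=1
        )
        x = self.blocks(x, mask=mask)
        return self.head(x)  # (b, t, vocab_size) logits over the next token


# Training step: inputs and targets are shifted by one position, so the
# model learns to predict each SMILES token from its predecessors.
model = SmilesGPT(vocab_size=100)
tokens = torch.randint(0, 100, (4, 64))  # stand-in for tokenized SMILES
logits = model(tokens[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), tokens[:, 1:].reshape(-1)
)
loss.backward()
```

At sampling time, one would feed a start token (optionally preceded by condition tokens), repeatedly sample from the predicted next-token distribution, and append the result until an end token is produced, yielding a complete SMILES string.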