
Combination of deep neural network with attention mechanism enhances the explainability of protein contact prediction
  • Chen Chen, University of Missouri
  • Tianqi Wu, University of Missouri - Columbia
  • Zhiye Guo, University of Missouri
  • Jianlin Cheng, University of Missouri

Abstract

Deep learning has emerged as a revolutionary technology for protein residue-residue contact prediction since the 2012 CASP10 competition. Considerable advancements in the predictive power of deep learning-based contact prediction have been achieved since then. However, little effort has been put into interpreting these black-box deep learning methods. Algorithms that can interpret the relationship between predicted contact maps and the internal mechanisms of deep learning architectures are needed to explore the essential components of contact inference and to improve their explainability. In this study, we present an attention-based convolutional neural network for protein contact prediction, which consists of two attention mechanism-based modules: sequence attention and regional attention. Our benchmark results on the CASP13 free-modeling (FM) targets demonstrate that the two attention modules, added on top of existing typical deep learning models, exhibit a complementary effect that contributes to predictive improvements. More importantly, the inclusion of the attention mechanism provides interpretable patterns that contain useful insights into the key fold-determining residues in proteins. We expect the attention-based model to provide a reliable and practically interpretable technique that helps break the current bottlenecks in explaining deep neural networks for contact prediction. The source code of our method is available at https://github.com/jianlin-cheng/InterpretContactMap.
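To illustrate the kind of attention mechanism the abstract describes, the sketch below implements a generic scaled dot-product attention over per-residue features, the building block underlying a "sequence attention" module. This is a minimal, hypothetical illustration in NumPy, not the authors' actual architecture; the function and weight names (`sequence_attention`, `w_query`, `w_key`) are assumptions, and the interpretability claim in the paper rests on inspecting the resulting (L, L) attention map.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sequence_attention(features, w_query, w_key):
    """Scaled dot-product attention over residue positions.

    features: (L, d) matrix of per-residue features.
    Returns the attended features (L, d) and the (L, L)
    attention map, whose rows are probability distributions
    over residue positions and can be inspected for
    interpretability.
    """
    q = features @ w_query                      # queries, (L, d)
    k = features @ w_key                        # keys, (L, d)
    scores = (q @ k.T) / np.sqrt(q.shape[-1])   # scaled similarity, (L, L)
    weights = softmax(scores, axis=-1)          # attention map, rows sum to 1
    return weights @ features, weights

# Toy example with random features for an 8-residue "protein".
rng = np.random.default_rng(0)
L, d = 8, 4
feats = rng.normal(size=(L, d))
wq = rng.normal(size=(d, d))
wk = rng.normal(size=(d, d))
out, attn = sequence_attention(feats, wq, wk)
```

A "regional attention" module would apply the same idea over local 2-D patches of the contact-map feature tensor rather than over the whole sequence; in both cases the learned attention weights, not just the prediction, are the interpretable output.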

Peer review status: UNDER REVIEW

05 Sep 2020  Submitted to PROTEINS: Structure, Function, and Bioinformatics
10 Sep 2020  Assigned to Editor
10 Sep 2020  Submission Checks Completed
14 Sep 2020  Reviewer(s) Assigned