AMHF-TP: Multifunctional therapeutic peptides prediction based on multi-granularity hierarchical features

Shouheng Tuo, YanLing Zhu, Jiangkun Lin, Jiewei Jiang

Quant. Biol. ›› 2025, Vol. 13 ›› Issue (1): e73 ›› DOI: 10.1002/qub2.73

METHOD


Abstract

Multifunctional therapeutic peptides (MFTP) hold immense potential in diverse therapeutic contexts, yet their prediction and identification remain challenging because traditional methodologies suffer from long training times, limited sample sizes, and poor generalization. To address these issues, we present AMHF-TP, a method for MFTP recognition that uses attention mechanisms and multi-granularity hierarchical features to enhance performance. AMHF-TP comprises four key components: a transfer learning module that leverages pretrained models to extract atomic compositional features of MFTP sequences; a convolutional neural network and self-attention module that refine feature extraction from amino acid sequences and their secondary structures; a hypergraph module that constructs a hypergraph to represent complex similarities between MFTP sequences; and a hierarchical feature extraction module that integrates multimodal peptide sequence features. Compared with leading methods, AMHF-TP achieves superior precision, accuracy, and coverage, underscoring its effectiveness and robustness in MFTP recognition. Comparative analyses of the separate hierarchical models against the combined model, as well as against five contemporary models, confirm AMHF-TP's strong performance and stability in recognition tasks.
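The multi-branch design described above can be sketched in miniature. Everything in the snippet below is an illustrative assumption, not the authors' implementation: random vectors stand in for pretrained-model embeddings, toy dimensions replace the real ones, and the hyperedge assignments are synthetic. It shows, in NumPy, how per-residue features might be pooled with self-attention, aggregated over a hypergraph of sequence relationships, and fused into one vector feeding a multi-label sigmoid head.

```python
# Hypothetical sketch of the multi-branch fusion idea (not the authors' code).
import numpy as np

rng = np.random.default_rng(0)

def self_attention_pool(X):
    """Scaled dot-product self-attention over one sequence (L x d),
    followed by mean pooling to a single d-dim feature vector."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)             # row-wise softmax
    return (A @ X).mean(axis=0)

def hypergraph_aggregate(F, H):
    """One round of hypergraph message passing. F: node features (n x d);
    H: incidence matrix (n x m), H[i, e] = 1 if peptide i is in hyperedge e.
    Hyperedges average their members; nodes average their hyperedges."""
    De = np.clip(H.sum(axis=0, keepdims=True), 1.0, None)  # hyperedge degrees
    Dn = np.clip(H.sum(axis=1, keepdims=True), 1.0, None)  # node degrees
    edge_feat = (H / De).T @ F                    # (m x d) hyperedge features
    return (H / Dn) @ edge_feat                   # back to (n x d) node features

n, L, d, m, n_labels = 5, 12, 8, 3, 21            # toy sizes, chosen arbitrarily
seqs = rng.standard_normal((n, L, d))             # stand-in for residue embeddings
H = (rng.random((n, m)) > 0.5).astype(float)
H[H.sum(axis=1) == 0, 0] = 1.0                    # every node joins >= 1 hyperedge

attn_feat = np.stack([self_attention_pool(s) for s in seqs])  # (n x d)
hg_feat = hypergraph_aggregate(seqs.mean(axis=1), H)          # (n x d)
fused = np.concatenate([attn_feat, hg_feat], axis=1)          # (n x 2d)

W = rng.standard_normal((2 * d, n_labels)) * 0.1
probs = 1.0 / (1.0 + np.exp(-(fused @ W)))        # multi-label sigmoid head
```

In a trained model each branch would be learned end-to-end and the fusion would be hierarchical rather than a flat concatenation, but the data flow (per-sequence attention pooling, hypergraph aggregation, fused multi-label output) follows the structure the abstract describes.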

Keywords

deep learning / hypergraph / multifunctional therapeutic peptides / multi-granularity hierarchical features

Cite this article

Shouheng Tuo, YanLing Zhu, Jiangkun Lin, Jiewei Jiang. AMHF-TP: Multifunctional therapeutic peptides prediction based on multi-granularity hierarchical features. Quant. Biol., 2025, 13(1): e73. DOI: 10.1002/qub2.73



RIGHTS & PERMISSIONS

© The Author(s). Quantitative Biology published by John Wiley & Sons Australia, Ltd on behalf of Higher Education Press.
