Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/139906
Type: Journal article
Title: Prediction of Multiple Types of RNA Modifications via Biological Language Model
Author: Zhang, Y.
Ge, F.
Li, F.
Yang, X.
Song, J.
Yu, D.-J.
Citation: IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2023; 20(5):3205-3214
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Issue Date: 2023
ISSN: 1545-5963 (print)
1557-9964 (online)
Statement of Responsibility: Ying Zhang, Fang Ge, Fuyi Li, Xibei Yang, Jiangning Song, and Dong-Jun Yu
Abstract: It has been demonstrated that RNA modifications play essential roles in multiple biological processes. Accurate identification of RNA modifications in the transcriptome is critical for providing insights into their biological functions and mechanisms. Many tools have been developed for predicting RNA modifications at single-base resolution, but they employ conventional feature engineering that centres on feature design and feature selection, processes that require extensive biological expertise and may introduce redundant information. With the rapid development of artificial intelligence technologies, end-to-end methods are favourably received by researchers. Nevertheless, for nearly all of these approaches, each well-trained model is suitable only for a specific RNA methylation modification type. In this study, we present MRM-BERT, which feeds task-specific sequences into the powerful BERT (Bidirectional Encoder Representations from Transformers) model and fine-tunes it, achieving performance competitive with state-of-the-art methods. MRM-BERT avoids repeated de novo training of the model and can predict multiple RNA modifications, such as pseudouridine, m6A, m5C, and m1A, in Mus musculus, Arabidopsis thaliana, and Saccharomyces cerevisiae. In addition, we analyse the attention heads to reveal the high-attention regions underlying the predictions, and conduct saturated in silico mutagenesis of the input sequences to discover potential changes in RNA modifications, which can better assist researchers in their follow-up research.
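
The following is a minimal Python sketch of the two ideas the abstract describes: fine-tuning a pretrained BERT encoder with a classification head on k-mer-tokenised RNA sequence windows, and then running saturated in silico mutagenesis of an input sequence against the fine-tuned model. The checkpoint name, 3-mer tokenisation, window length, and hyperparameters are illustrative assumptions using the Hugging Face transformers library, not the authors' released MRM-BERT code.

import torch
from torch.optim import AdamW
from transformers import BertTokenizerFast, BertForSequenceClassification

def to_kmers(seq, k=3):
    # Represent an RNA sequence as overlapping k-mer "words" for the tokenizer.
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

# Placeholder checkpoint (assumption): MRM-BERT starts from a pretrained BERT and
# fine-tunes it per modification type rather than training each model de novo.
name = "bert-base-uncased"
tokenizer = BertTokenizerFast.from_pretrained(name)
model = BertForSequenceClassification.from_pretrained(name, num_labels=2)

# Toy training data: sequence windows centred on a candidate site
# (label 1 = modified, 0 = unmodified).
seqs = ["AUGGCAUCGUAGCUAGCAUGG", "CGUAGCUAGGCAUCGAUCGAU"]
labels = torch.tensor([1, 0])
batch = tokenizer([to_kmers(s) for s in seqs], padding=True, return_tensors="pt")

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few gradient steps on the toy batch
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

def predict(seq):
    # Probability that the window carries the modification, under the fine-tuned model.
    model.eval()
    enc = tokenizer(to_kmers(seq), return_tensors="pt")
    with torch.no_grad():
        return torch.softmax(model(**enc).logits, dim=-1)[0, 1].item()

def saturated_mutagenesis(seq):
    # Mutate every position to each alternative base and record the change in
    # predicted modification probability, as the abstract describes.
    base = predict(seq)
    return {(i, ref, alt): predict(seq[:i] + alt + seq[i + 1:]) - base
            for i, ref in enumerate(seq) for alt in "ACGU" if alt != ref}

effects = saturated_mutagenesis(seqs[0])
print(max(effects, key=lambda k: abs(effects[k])))  # most disruptive single mutation

In practice one would fine-tune on the full benchmark datasets for each modification type and species; the toy two-sequence batch above only illustrates the training loop and the mutagenesis scan.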
Keywords: RNA modification; deep learning; self-attention mechanism; BERT; biological language model
Rights: © 2023 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information.
DOI: 10.1109/tcbb.2023.3283985
Grant ID: http://purl.org/au-research/grants/nhmrc/1127948
http://purl.org/au-research/grants/nhmrc/1144652
http://purl.org/au-research/grants/arc/LP110200333
http://purl.org/au-research/grants/arc/DP120104460
Published version: http://dx.doi.org/10.1109/tcbb.2023.3283985
Appears in Collections: Molecular and Biomedical Science publications

Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.