
Deciphering 3’UTR Mediated Gene Regulation Using Interpretable Deep Representation Learning.

Researchers

Journal

Advanced Science

Modalities

mRNA 3′UTR sequences

Models

3UTRBERT (BERT-based language model)

Abstract

The 3′ untranslated regions (3′UTRs) of messenger RNAs contain many important cis-regulatory elements that are under functional and evolutionary constraints. It is hypothesized that these constraints are analogous to the grammar and syntax of human languages and can be modeled by advanced natural language processing techniques such as Transformers, which have been highly effective in modeling complex protein sequences and structures. Here, 3UTRBERT is described: an attention-based language model built on Bidirectional Encoder Representations from Transformers (BERT). 3UTRBERT is pre-trained on aggregated 3′UTR sequences of human mRNAs in a task-agnostic manner; the pre-trained model is then fine-tuned for specific downstream tasks such as identifying RNA-binding protein (RBP) binding sites and m6A RNA modification sites, and predicting RNA subcellular localization. Benchmark results show that 3UTRBERT generally outperformed other contemporary methods on each of these tasks. More importantly, the self-attention mechanism within 3UTRBERT allows direct visualization of the semantic relationships between sequence elements and effectively identifies regions with important regulatory potential. It is expected that the 3UTRBERT model can serve as a foundational tool for analyzing various sequence-labeling tasks within 3′UTRs, thus enhancing the decipherability of post-transcriptional regulatory mechanisms.

© 2024 The Author(s). Advanced Science published by Wiley‐VCH GmbH.
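To make the approach concrete, below is a minimal sketch of the two ideas the abstract describes: overlapping k-mer tokenization of a 3′UTR sequence and BERT masked-language-model pre-training, with attention maps exposed for the kind of visualization the paper performs. It uses the Hugging Face `transformers` library; the k-mer size, the toy vocabulary, the `tokenize` helper, and the model dimensions are illustrative assumptions, not the published 3UTRBERT hyperparameters.

```python
# A minimal sketch, assuming k-mer tokenization over an RNA alphabet and a
# small BERT encoder; values below are illustrative, not 3UTRBERT's settings.
from itertools import product

import torch
from transformers import BertConfig, BertForMaskedLM

K = 3  # assumed k-mer size; the paper evaluates several values of k

# Toy vocabulary: special tokens plus all 4^K k-mers over {A, C, G, U}.
special = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"]
kmers = ["".join(p) for p in product("ACGU", repeat=K)]
vocab = {tok: i for i, tok in enumerate(special + kmers)}

def tokenize(seq):
    """Slide a window of size K over the sequence, yielding overlapping k-mers."""
    ids = [vocab["[CLS]"]]
    ids += [vocab.get(seq[i:i + K], vocab["[UNK]"]) for i in range(len(seq) - K + 1)]
    ids.append(vocab["[SEP]"])
    return ids

# A deliberately tiny BERT encoder so the sketch runs anywhere.
config = BertConfig(
    vocab_size=len(vocab),
    hidden_size=128,
    num_hidden_layers=2,
    num_attention_heads=4,
    intermediate_size=256,
)
model = BertForMaskedLM(config)

# Masked-language-model step on a short 3'UTR fragment: hide one k-mer and
# score the model on reconstructing it from its bidirectional context.
input_ids = torch.tensor([tokenize("AUGGCUUACGUAGCUAAGC")])
labels = input_ids.clone()
masked_pos = 5
labels[:, :masked_pos] = -100            # -100 = ignored by the MLM loss
labels[:, masked_pos + 1:] = -100
input_ids[0, masked_pos] = vocab["[MASK]"]

out = model(input_ids=input_ids, labels=labels, output_attentions=True)
print(f"MLM loss: {out.loss.item():.3f}")

# out.attentions holds one (batch, heads, seq, seq) map per layer -- the raw
# material for the attention-based visualizations described in the abstract.
print(len(out.attentions), tuple(out.attentions[0].shape))
```

For a downstream task such as RBP binding-site prediction, the pre-trained encoder would presumably be reloaded with a task-specific head (e.g., `BertForSequenceClassification`) and fine-tuned on labeled sequences, mirroring the pre-train-then-fine-tune workflow the abstract outlines.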
