Multiple RNA Modifications Can Be Simultaneously Predicted, Researchers Say

The ability to predict and interpret modifications of ribonucleic acid (RNA) has been a welcome advance in biochemistry research. Existing predictive approaches, however, have a key drawback: they can predict only a single type of RNA modification at a time and offer little interpretation of their results.

Researchers from Xi’an Jiaotong-Liverpool University, led by Dr Jia Meng, have addressed this issue by developing a model that supports 12 RNA modification types, greatly expanding RNA research prediction and interpretation.

“To the best of our knowledge, these are the only widely occurring RNA modifications that can be profiled transcriptome-wide with existing base-resolution technologies, which are highly desired characteristics of RNA modification for reliable large-scale prediction,” Dr Meng said.

Transcriptomes are the set of all RNA transcripts in a cell. By analysing these sequences, researchers can understand which genes are turned “on” or “off” in the cells and related tissues.

The research proved the efficacy of their MultiRM model—described by Dr Meng as an attention-based multi-label neural network approach for integrated prediction and interpretation of RNA modifications from the primary RNA sequence. Attention mechanisms weigh the contributions of inputs to optimize the process of learning target results.
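As a rough illustration of how an attention mechanism weighs the contributions of inputs, the sketch below computes softmax attention weights over per-position features and pools them into a summary vector. This is a generic toy example, not the MultiRM architecture itself; all names and shapes here are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(features, scores):
    """Weigh per-position features by attention scores and pool them.

    features: (seq_len, dim) hidden states, one per sequence position
    scores:   (seq_len,) relevance score for each position
    Returns the attention-weighted summary and the weights themselves;
    the weights can be inspected to see which positions drove a prediction.
    """
    weights = softmax(scores)      # non-negative, sum to 1
    summary = weights @ features   # (dim,) weighted average of features
    return summary, weights

# Toy example: 5 positions, 3-dimensional features
rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 3))
scores = np.array([0.1, 2.0, 0.1, 0.1, 0.1])  # position 1 scores highest
summary, w = attention_pool(feats, scores)
print(w.argmax())  # → 1, the position with the largest attention weight
```

Because the weights sum to one, the model effectively distributes a fixed budget of "attention" across positions, which is what makes the learned weights interpretable.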

The study, scheduled to be published in July, states that the primary purpose of the MultiRM study was to establish an interpretable predictor that could achieve state-of-the-art accuracy in identifying these RNA modifications from primary RNA sequences.

The prediction method helps researchers understand the sequence-dependent mechanisms of RNA modification, cuts wet-lab experiment costs and provides insights into the regulatory circuit of RNA metabolism.

The approach is still primarily for fundamental research, Dr Meng said. The tool, however, may help scientists design more efficient RNA therapeutics.

“It is hard to predict which diseases will benefit from the research, but studies indicate the enzymes of m6A RNA methylation play a key role in leukaemia, lung cancer and breast cancer,” he said.

The multiple, simultaneous predictions were achieved using a multitask-learning framework that integrates the prediction tasks for all 12 RNA modifications into a single prediction task, Dr Meng said.
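One common way to fold several prediction tasks into one, which may resemble the framework described above, is to feed a single shared sequence representation into one sigmoid output head per modification and train all heads with one multi-label loss. The sketch below illustrates that pattern with numpy; the dimensions, weights and helper names are illustrative assumptions, not the published model.

```python
import numpy as np

N_MODS = 12  # the 12 RNA modification types predicted jointly

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def multilabel_predict(shared_repr, W, b):
    """One shared representation feeds 12 sigmoid heads, so every
    modification type is scored in a single forward pass."""
    logits = shared_repr @ W + b   # (N_MODS,)
    return sigmoid(logits)         # independent probability per modification

def multilabel_bce(probs, labels, eps=1e-9):
    """Binary cross-entropy summed over the 12 labels: one training
    objective that couples all 12 tasks (multitask learning)."""
    return -np.sum(labels * np.log(probs + eps)
                   + (1 - labels) * np.log(1 - probs + eps))

rng = np.random.default_rng(1)
repr_dim = 8
shared = rng.normal(size=repr_dim)       # encoder output for one sequence
W = rng.normal(size=(repr_dim, N_MODS))  # one output column per modification
b = np.zeros(N_MODS)

probs = multilabel_predict(shared, W, b)
labels = (rng.random(N_MODS) < 0.3).astype(float)  # toy ground truth
loss = multilabel_bce(probs, labels)
```

Using sigmoids rather than a single softmax is what allows one site to carry several modifications at once, since each label is scored independently.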

“Few existing tools focused on the interpretation of the model,” he said. “The MultiRM model provides a more comprehensive view of the epitranscriptomes and discovers underlying mechanisms of the prediction results.”

The study also produced a surprising finding: the RNA modifications show strong, significant positive associations with one another, including those originating from different nucleotides. This suggests there are regions intensively modified by multiple RNA modifications, which are likely to be key regulatory components of the epitranscriptome layer of gene regulation.

How MultiRM Was Built

MultiRM uses a deep-learning framework on the TensorFlow platform (www.tensorflow.org). The researchers’ approach enables accommodation of the shared structure of different modifications while fully exploiting their distinct features.

Because some modifications are more abundant than others, additional algorithms were used to address the resulting class imbalance in the multi-label training data. Other machine-learning algorithms were implemented to provide MultiRM’s baseline benchmark.

The MultiRM model has implications for other researchers. The team developed a web server and made it freely accessible to the research community, along with the data, code and trained model. The server takes an RNA sequence as input and returns the predicted RNA modification sites together with the key sequence contents that drive the positive predictions.

The study is titled “Attention-based multi-label neural networks for integrated prediction and interpretation of twelve widely occurring RNA modifications”.

In addition to Dr Meng, other XJTLU researchers were Zitao Song, Daiyun Huang, Bowen Song, Kunqi Chen, Yiyou Song, Gang Liu, Jionglong Su, João Pedro de Magalhães and Daniel J. Rigden.
