Predicting gene expression levels from DNA sequences and post-transcriptional information with transformers

Article
Publication Date:
2022
Citation:
Predicting gene expression levels from DNA sequences and post-transcriptional information with transformers / Pipoli, Vittorio; Cappelli, Mattia; Palladini, Alessandro; Peluso, Carlo; Lovino, Marta; Ficarra, Elisa. - In: COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE. - ISSN 0169-2607. - 225:(2022), pp. 107035-107044. [10.1016/j.cmpb.2022.107035]
Abstract:
Background and objectives: In recent years, the prediction of gene expression levels has become crucial due to its potential clinical applications. In this context, Xpresso and other methods based on Convolutional Neural Networks and Transformers were first proposed for this aim. However, all these methods embed the data with a standard one-hot encoding algorithm, resulting in extremely sparse matrices. In addition, post-transcriptional regulation processes, which are of utmost importance for gene expression, are not considered in the model.

Methods: This paper presents Transformer DeepLncLoc, a novel method to predict mRNA abundance (i.e., gene expression levels) by processing gene promoter sequences, treating the problem as a regression task. The model exploits a transformer-based architecture and introduces the DeepLncLoc method to perform the data embedding. Since DeepLncLoc is based on the word2vec algorithm, it avoids the sparse-matrix problem.

Results: Post-transcriptional information related to mRNA stability and transcription factors is included in the model, leading to significantly improved performance compared to state-of-the-art works. Transformer DeepLncLoc reached an R² of 0.76, compared to 0.74 for Xpresso.

Conclusion: The Multi-Headed Attention mechanism that characterizes the transformer methodology is suitable for modeling interactions between DNA locations, outperforming recurrent models. Finally, the integration of transcription factor data into the pipeline leads to substantial gains in predictive power.
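The Methods and Results above describe the pipeline only at a high level. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' released code: promoter sequences are split into k-mers and embedded as dense vectors (a learned embedding stands in here for DeepLncLoc's pre-trained word2vec vectors), encoded with a transformer, fused with post-transcriptional features, and passed to a regression head evaluated with R². The k-mer length, all dimensions, feature counts, and the toy data are assumptions.

# Minimal sketch of the pipeline described in the abstract (assumptions, not
# the authors' implementation): k-mer embedding -> transformer encoder ->
# fusion with post-transcriptional features -> regression head.
import torch
import torch.nn as nn

K = 6                       # k-mer length (assumed)
VOCAB = 4 ** K              # all possible DNA k-mers
EMB_DIM = 64                # embedding size (assumed)
N_POST = 10                 # number of post-transcriptional features (assumed)

class PromoterExpressionRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        # In the paper the k-mer vectors come from a word2vec model
        # (DeepLncLoc); a learned nn.Embedding stands in for them here.
        self.embed = nn.Embedding(VOCAB, EMB_DIM)
        layer = nn.TransformerEncoderLayer(
            d_model=EMB_DIM, nhead=4, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Sequential(
            nn.Linear(EMB_DIM + N_POST, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, kmer_ids, post_feats):
        x = self.encoder(self.embed(kmer_ids))   # (B, L, EMB_DIM)
        x = x.mean(dim=1)                         # pool over sequence positions
        x = torch.cat([x, post_feats], dim=1)     # fuse post-transcriptional info
        return self.head(x).squeeze(1)            # predicted expression level

def r2_score(y_true, y_pred):
    # Coefficient of determination, the evaluation metric quoted in Results.
    ss_res = ((y_true - y_pred) ** 2).sum()
    ss_tot = ((y_true - y_true.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

# Toy usage on random data, only to show the tensor shapes involved.
model = PromoterExpressionRegressor()
kmers = torch.randint(0, VOCAB, (8, 200))        # 8 promoters, 200 k-mers each
post = torch.randn(8, N_POST)
y = torch.randn(8)
pred = model(kmers, post)
print("R^2 on random data:", float(r2_score(y, pred)))

The multi-headed self-attention in the encoder is what the Conclusion refers to: every promoter position can attend to every other, which is why this architecture can model long-range interactions between DNA locations that recurrent models handle less well.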
CRIS Type:
Journal article
Keywords:
Attention; DNA; Gene-expression; Prediction; Transcription-factors; Transformers; Base Sequence; Gene Expression; RNA, Messenger; DNA; Transcription Factors
Author list:
Pipoli, Vittorio; Cappelli, Mattia; Palladini, Alessandro; Peluso, Carlo; Lovino, Marta; Ficarra, Elisa
UNIMORE authors:
FICARRA ELISA
LOVINO MARTA
PIPOLI VITTORIO
Link to the full record:
https://iris.unimore.it/handle/11380/1289669
Link to the full text:
https://iris.unimore.it//retrieve/handle/11380/1289669/641381/Transformers.pdf
Published in:
COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE