Hyena architecture enables fast and efficient protein language modeling

Abstract: The emergence of self-supervised deep language models has revolutionized natural language processing, and these models have recently been extended to biological sequence analysis. Traditional language models, primarily based on Transformer architectures, demonstrate substantial effective...

Bibliographic Details
Main Authors: Yiming Zhang, Bian Bian, Manabu Okumura
Format: Article
Language: English
Published: Wiley 2025-03-01
Series: iMetaOmics
Online Access: https://doi.org/10.1002/imo2.45