Recurrent Linear Transformer

Date: Dec 19, 2023
Event: Amii AI Seminar

In this seminar presented by the Alberta Machine Intelligence Institute (Amii), Subhojeet Pramanik of the University of Alberta explains how the Recurrent Linear Transformer addresses limitations of the transformer architecture by proposing a recurrent alternative to the self-attention mechanism. By avoiding the context-dependent inference and high computational costs of standard attention, the approach supports longer contexts at reduced computational complexity compared with the vanilla self-attention mechanism. Evaluated in pixel-based reinforcement learning environments, it outperforms the state-of-the-art GTrXL, running at least 40% faster (in FPS) while using more than 50% less memory, and it achieves over a 37% improvement in performance on the harder tasks.
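The exact architecture from the talk is not reproduced here, but the key idea behind recurrent alternatives to self-attention can be illustrated with the generic linear-attention recurrence: instead of recomputing attention over the full history, a fixed-size running state is updated one step at a time. The sketch below is a minimal NumPy illustration under that assumption; the feature map (ELU + 1) and the function names are illustrative choices, not the seminar's method.

```python
import numpy as np

def elu_plus_one(x):
    # Positive feature map phi(x) = ELU(x) + 1, a common choice in
    # linear-attention work (an assumption here, not the talk's choice).
    return np.where(x > 0, x + 1.0, np.exp(x))

def recurrent_linear_attention(queries, keys, values):
    """Process a sequence step by step with a fixed-size state.

    queries, keys: (T, d_k); values: (T, d_v).
    The running state (S, z) replaces the O(T^2) attention matrix of
    softmax attention, so per-step cost is independent of context length.
    """
    T, d_k = queries.shape
    d_v = values.shape[1]
    S = np.zeros((d_k, d_v))   # running sum of outer(phi(k_t), v_t)
    z = np.zeros(d_k)          # running sum of phi(k_t), for normalization
    outputs = np.empty((T, d_v))
    for t in range(T):
        q = elu_plus_one(queries[t])
        k = elu_plus_one(keys[t])
        S += np.outer(k, values[t])
        z += k
        outputs[t] = (q @ S) / (q @ z + 1e-6)  # normalized attention output
    return outputs

# Example: a 16-step sequence with 8-dim queries/keys and 8-dim values.
rng = np.random.default_rng(0)
out = recurrent_linear_attention(rng.normal(size=(16, 8)),
                                 rng.normal(size=(16, 8)),
                                 rng.normal(size=(16, 8)))
print(out.shape)  # (16, 8)
```

Because the state has a fixed size regardless of sequence length, memory and per-step compute stay constant, which is the property behind the speed and memory gains reported above.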

Subhojeet Pramanik
Senior ML Engineer at Costimate