In this seminar presented by the Alberta Machine Intelligence Institute (Amii), Subhojeet Pramanik of the University of Alberta explains how the Recurrent Linear Transformer addresses limitations of the transformer architecture by proposing a recurrent alternative to the self-attention mechanism.