
Resources

Lattice Recurrent Unit

Lattice Recurrent Units (LRUs) are a new family of models that address the challenge of learning deep multi-layer recurrent models with limited resources without affecting the generalisability of sequence models. LRU models achieve this goal by creating distinct (but coupled) flows of information inside the units: a first flow along the time dimension and a second flow along the depth dimension. They also offer symmetry in how information can flow horizontally and vertically.
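The sketch below illustrates one plausible way such a cell could be written, assuming a GRU-style gating scheme applied over a two-dimensional lattice (time x depth). The state names, the concatenation-based coupling, and the class `LatticeRecurrentCell` are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal sketch of a lattice recurrent cell with coupled time/depth flows.
# Assumes GRU-like reset/update gates and a projected (candidate) state.
import torch
import torch.nn as nn


class LatticeRecurrentCell(nn.Module):
    def __init__(self, size):
        super().__init__()
        # Gates act on the concatenation of the two incoming states,
        # so the time and depth flows are distinct but coupled.
        self.reset = nn.Linear(2 * size, 2 * size)
        self.update = nn.Linear(2 * size, 2 * size)
        self.project = nn.Linear(2 * size, 2 * size)

    def forward(self, h_time, h_depth):
        # h_time: state arriving along the time dimension (from step t-1)
        # h_depth: state arriving along the depth dimension (from layer l-1)
        s = torch.cat([h_time, h_depth], dim=-1)
        r = torch.sigmoid(self.reset(s))      # reset gate
        z = torch.sigmoid(self.update(s))     # update gate
        c = torch.tanh(self.project(r * s))   # projected (candidate) state
        new = z * s + (1 - z) * c             # GRU-style interpolation
        # Split back into the two outgoing flows: one along time, one along depth.
        h_time_out, h_depth_out = new.chunk(2, dim=-1)
        return h_time_out, h_depth_out
```

Stacking these cells on a grid, where each cell passes one output to the next time step and the other to the next layer, gives the symmetric horizontal/vertical information flow described above.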

We analyze the effects of decoupling three different components of our LRU model: the Reset Gate, the Update Gate, and the Projected State. We evaluate this family of new LRU models on computational convergence rates and statistical efficiency.
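One way to think about these ablations is as a small configuration, where each flag decides whether a component is shared (coupled) across the time and depth flows or given separate parameters (decoupled). The flag and class names below are hypothetical, used only to make the variants concrete.

```python
# Hedged sketch of the three decoupling ablations as a configuration object.
from dataclasses import dataclass


@dataclass
class LRUConfig:
    decouple_reset_gate: bool = False       # separate reset gates per flow
    decouple_update_gate: bool = False      # separate update gates per flow
    decouple_projected_state: bool = False  # separate candidate states per flow


# Example: the variant that decouples only the update gate.
config = LRUConfig(decouple_update_gate=True)
```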

Our results show that LRU models achieve better empirical computational convergence rates and statistical efficiency, along with learning more accurate language models.

Links

Github

Project Website