This page collects materials for understanding modern architectures that may replace transformers for many tasks over the next couple of years. They come from the team whose members created FlashAttention, the Hyena variants, Mamba, and "Based".