meta: This paper uses "landmarks" in the text as a way to enable random access to arbitrary parts of the context; it is not a "landmark... transformer paper" in the sense the post title may suggest.
The paper's actual title is 'Landmark Attention: Random-Access Infinite Context Length for Transformers'. OP, please don't use marketing-speak when writing submission titles on HN; it's an unhelpful distraction from the subject matter.