In short, all modern large language models (LLMs) are built from the same algorithmic building blocks. The architectural differences between Llama 2 and the original 2017 Transformer were not invented by Meta, and all of them are publicly documented, since open-access publishing is the norm in computer science. So, even though Yi-34B adopts Llama 2's architecture, Meta's model did not give 01.AI access to any previously inaccessible innovation.
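
To make the point concrete, here is a minimal sketch, assuming the differences at issue are the ones commonly cited for Llama-2-style models: RMSNorm in place of LayerNorm (Zhang & Sennrich, 2019), the SwiGLU feed-forward block (Shazeer, 2020), and rotary position embeddings (Su et al., 2021), with grouped-query attention (Ainslie et al., 2023) in the larger variants. Each comes from a publicly available paper that predates Llama 2; the toy NumPy implementations below are illustrative only, not Meta's or 01.AI's code.

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    """RMSNorm (Zhang & Sennrich, 2019): pre-normalization used instead of LayerNorm."""
    return x / np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps) * weight

def swiglu(x, w_gate, w_up, w_down):
    """SwiGLU feed-forward (Shazeer, 2020), replacing the original ReLU MLP."""
    silu = lambda z: z / (1.0 + np.exp(-z))  # SiLU/swish gating nonlinearity
    return (silu(x @ w_gate) * (x @ w_up)) @ w_down

def rotary_embed(x, positions, base=10000.0):
    """Rotary position embeddings (Su et al., 2021), replacing added sinusoidal encodings."""
    d = x.shape[-1]
    inv_freq = 1.0 / (base ** (np.arange(0, d, 2) / d))
    angles = positions[:, None] * inv_freq[None, :]   # (seq_len, d/2) rotation angles
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[..., 0::2], x[..., 1::2]               # rotate pairs of dimensions
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(8, 64))                      # toy (seq_len, hidden) activations
    print(rms_norm(x, np.ones(64)).shape)             # (8, 64)
    print(rotary_embed(x, np.arange(8)).shape)        # (8, 64)
```

None of these components require access to Meta's weights or code to implement; they are a few lines each once the published papers are in hand, which is the sense in which the building blocks are "public."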