The language pattern comes up frequently in many domains. Unfortunately, I have so far only seen half-assed implementations: at best, people built a shallow embedding (an interpreter) of their DSL; no one ever wrote a true compiler.
I think the reason for this can be found in the ideal encoding of the input language: in its most concise formulation, a language is a recursive algebraic datatype. Handling that input requires a recursive algorithm that deals with all corner cases. Engineers I have worked with tend not to see their input language as an ADT, though. They focus on a few particular use cases, and if there is a specification at all, it is often both too large and incomplete. On top of that comes the hesitation to implement a complete algorithm. Nearly every time, some "corner case" is ignored because "nobody uses it", and we end up with an implementation that is factually incorrect and incomplete.
You don't even have to squint to see that a BNF description is a sum-of-products data type. With that, functions on those types write themselves, and you're a type check away from consistency.
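To make that concrete, here is a minimal sketch in Haskell (the grammar and names are hypothetical, purely for illustration): each BNF alternative becomes one constructor of an ADT, and the interpreter is a total recursive function over it. With GHC's -Wincomplete-patterns, the compiler flags any constructor you "forgot" to handle, so corner cases can't silently disappear.

```haskell
{-# OPTIONS_GHC -Wincomplete-patterns #-}

-- A toy expression language: each BNF alternative is one
-- constructor, i.e. the grammar is a sum of products.
data Expr
  = Lit Int            -- expr ::= <int>
  | Neg Expr           --        | '-' expr
  | Add Expr Expr      --        | expr '+' expr
  | Mul Expr Expr      --        | expr '*' expr

-- The interpreter is a recursive function covering every case.
-- Dropping any equation triggers an incomplete-pattern warning,
-- so "nobody uses it" cases are caught at compile time.
eval :: Expr -> Int
eval (Lit n)   = n
eval (Neg e)   = negate (eval e)
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b
```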