The first thing wrong with ASN.1 is that you have to serialize and deserialize it. Operationally that's inconvenient, and it shows that the format was designed to optimize for network bandwidth. But these days, squeezing the most out of each bit is, in most cases, not a defensible design decision.
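To make the size argument concrete, here's a minimal sketch comparing the two encodings of the same value. The DER is hand-rolled for a single INTEGER (real code would use an ASN.1 library), and the XML element name is made up for illustration:

    # Minimal sketch: DER vs. XML size for one integer field.
    # Hand-rolled DER for illustration only; real code would use an ASN.1 library.

    def der_integer(value: int) -> bytes:
        """Encode a non-negative integer as a DER INTEGER (tag 0x02, short-form length)."""
        body = value.to_bytes((value.bit_length() + 8) // 8, "big")
        return bytes([0x02, len(body)]) + body

    der = der_integer(299792458)
    xml = b"<speedOfLight>299792458</speedOfLight>"   # hypothetical element name

    print(len(der), der.hex())     # 6 bytes: 020411de784a -- compact, but opaque
    print(len(xml), xml.decode())  # 38 bytes -- several times bigger, but self-describing

The DER form wins on bytes, which is exactly the trade-off the format was built around; the XML form is the one you can actually read in a packet capture.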
Then, once you deserialize it, what you have is still just a printable version of ASN.1. Sure, it's unambiguous, rigidly defined, and standardized. It's also gouge-your-eyeballs-out horrible to try to do anything with.
Say you get an XML message over the wire with a bit flipped. If you look at it, you have a good chance of figuring out what went wrong, editing one character, and processing it as normal. If you get an ASN.1 message in the same condition, it's pretty much game over (though there may be specialized tools that could save you).
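As a rough illustration of that failure mode (hypothetical payloads, DER bytes hand-encoded), a flipped bit in XML usually just turns one character into another, while the same flip in DER can land on a length octet and derail the parse of everything after it:

    # Hypothetical payloads; DER bytes hand-encoded for illustration.
    xml = bytearray(b"<port>8080</port>")
    xml[3] ^= 0x01                 # 'r' -> 's': "<post>8080</port>", an obvious one-character fix
    print(xml.decode("ascii", errors="replace"))

    # DER for SEQUENCE { INTEGER 42, UTF8String "hi" }
    der = bytearray(bytes.fromhex("300702012a0c026869"))
    der[3] ^= 0x01                 # INTEGER length 01 -> 00: the value byte 0x2a now reads
    print(der.hex())               # as the next tag, and every later field decodes as garbage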
Say you get an XML message and you don't know the schema. You look at it, and you can still see what's going on. Get an ASN.1 message without the schema and you can be totally sunk. (If I recall correctly, ASN.1 allows schemas that are private, that is, not specified in any published standard.)
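To see why, here's a sketch of what a schema-less look at a DER message buys you: a generic tag-length-value walk recovers the structure and the raw bytes, but none of the field names. The message below is a made-up SEQUENCE, and the walker only handles short-form lengths:

    def walk_der(data: bytes, depth: int = 0) -> None:
        """Dump DER tag-length-value triples; short-form lengths only."""
        i = 0
        while i < len(data):
            tag, length = data[i], data[i + 1]
            value = data[i + 2 : i + 2 + length]
            constructed = bool(tag & 0x20)
            print("  " * depth + f"tag 0x{tag:02x} len {length}"
                  + ("" if constructed else f" value {value.hex()}"))
            if constructed:
                walk_der(value, depth + 1)   # recurse into SEQUENCE / SET contents
            i += 2 + length

    # SEQUENCE { INTEGER 42, UTF8String "hi" } -- nothing in the bytes says whether
    # 42 is an age, a port, or an error code; that lives in the (possibly private) schema.
    walk_der(bytes.fromhex("300702012a0c026869"))

With XML, the element names travel with the data; with ASN.1, the meaning stays behind in a schema you may never get to see.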