They understand best whatever was used during their training. For OpenAI's GPT models we can't know for sure, since they no longer disclose training details, but there are good reasons to assume markdown (or something closely related) featured heavily.
Well, for one, there's their chat markup language, ChatML (i.e. the format they used for chat/instruction tuning). They stopped publishing it last year, so we don't know what it looks like anymore, though I doubt it has changed much. Also, if you work with their models a lot on e.g. document processing, you'll find that markdown tends to work better in context than, say, HTML. I've heard similar observations from people at other companies.
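For reference, the last publicly documented version of ChatML (as it shipped with gpt-3.5-turbo in early 2023) looked roughly like this; the current internal format may differ:

```
<|im_start|>system
You are a helpful assistant.
<|im_end|>
<|im_start|>user
Summarize the following document.
<|im_end|>
<|im_start|>assistant
```

Note that the message bodies themselves are free text, which is where markdown-formatted content would naturally live.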
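One practical consequence: if your source documents are HTML, it can pay to convert them to markdown before putting them in the context. Here's a rough stdlib-only sketch of that preprocessing step, assuming a hypothetical `html_to_markdown` helper (in practice a dedicated library like html2text or markdownify is more robust):

```python
from html.parser import HTMLParser


class HTMLToMarkdown(HTMLParser):
    """Very rough HTML-to-markdown converter for prompt preprocessing.

    Handles only a handful of common tags; illustrative, not production-ready.
    """

    def __init__(self):
        super().__init__()
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            # Map heading level to the matching number of '#' characters.
            self.out.append("\n" + "#" * int(tag[1]) + " ")
        elif tag == "li":
            self.out.append("\n- ")
        elif tag in ("strong", "b"):
            self.out.append("**")
        elif tag in ("em", "i"):
            self.out.append("*")
        elif tag == "p":
            self.out.append("\n\n")

    def handle_endtag(self, tag):
        if tag in ("strong", "b"):
            self.out.append("**")
        elif tag in ("em", "i"):
            self.out.append("*")

    def handle_data(self, data):
        self.out.append(data)


def html_to_markdown(html: str) -> str:
    parser = HTMLToMarkdown()
    parser.feed(html)
    return "".join(parser.out).strip()


doc = "<h2>Results</h2><p>The model was <strong>better</strong> on:</p><ul><li>tables</li><li>lists</li></ul>"
print(html_to_markdown(doc))
```

The markdown version is typically both shorter (fewer tokens spent on tags and attributes) and closer to what the model likely saw in training.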