
They understand best whatever format was used during their training. For OpenAI's GPTs we don't really know, since they don't disclose anything anymore, but there are good reasons to assume they used markdown or something closely related.

Just out of curiosity, what are some of those good reasons?

It's clear enough that they can consume and produce markdown, but is the suggestion here that they've seen more markdown than XML?

I'd have guessed, possibly naively, that they fed in more straight HTML, but I'd be interested to know why that's unlikely to be the case.


Well, for one, their chat markup language, ChatML (i.e. the format they used for chat/instruction tuning). But they closed the source on that last year, so we don't know what it looks like anymore; I doubt it has changed much, though. Also, when you work with their models a lot, e.g. for document processing, you'll find that markdown tends to work better in context than, say, HTML. I've heard similar observations from people at other companies.
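
For reference, a minimal sketch of the last publicly documented version of ChatML (from OpenAI's repo, circa early 2023). The <|im_start|>/<|im_end|> tokens are from that spec; the markdown-flavored document payload is just my own illustration:

    <|im_start|>system
    You are a helpful assistant.
    <|im_end|>
    <|im_start|>user
    Summarize the following document:

    # Quarterly Report
    ## Revenue
    - Q1: up 12% year over year
    <|im_end|>
    <|im_start|>assistant

Notice that both the chat scaffolding and the payload lean on lightweight plain-text markup rather than HTML-style tags, which fits the markdown hypothesis.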
