A few weeks ago, we were tasked with getting the llama3 model to do function calling and building an API compatible with OpenAI's python client.
While we're still improving our dataset and FastAPI backend server (which can't be open-sourced), we're happy to share with the community an open-source project showing how to properly fine-tune the llama3 model to support function calling, following OpenAI's best practices.
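To give a sense of what "compatible with OpenAI's python client" means in practice, here's a rough sketch of a client call with tools against a local endpoint; the base_url, api_key, model name, and the example weather tool are illustrative placeholders, not taken from the repo:

    # Sketch only: endpoint URL, model name, and the weather tool are hypothetical placeholders.
    from openai import OpenAI

    # Point the official OpenAI client at a local server instead of api.openai.com
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

    # Declare a tool using OpenAI's function-calling schema
    tools = [{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    response = client.chat.completions.create(
        model="llama3-function-calling",  # placeholder model name
        messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
        tools=tools,
        tool_choice="auto",
    )

    # A model fine-tuned for function calling should return a tool call rather than plain text
    print(response.choices[0].message.tool_calls)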
We hope this helps other people with similar tasks or needs; happy to answer any questions.
https://github.com/michaelnny/Llama3-FunctionCalling/blob/ma...