Hey HN!
I’ve just released a Swift macro called FunctionCalling that simplifies making your Swift functions callable from LLM (Large Language Model) services such as OpenAI’s ChatGPT and Anthropic’s Claude.
Have you ever been frustrated with writing repetitive boilerplate code to expose your Swift functions to these LLM services? FunctionCalling removes that hassle. With just two annotations, your functions are instantly ready to be called from these services. Here’s a quick overview:
- Annotation Power: Simply decorate your functions with `@CallableFunction` within a `@FunctionCalling`-marked struct or class.
- Auto-Generated Code: The macro handles all the heavy lifting, generating the necessary boilerplate so you don’t have to.
- LLM Integration: It works seamlessly with popular LLMs, allowing you to focus on the logic while the macro takes care of the API integration.
Check out this code snippet to see how easy it is:
```swift
@FunctionCalling(service: .claude)
struct MyFunctionTools: ToolContainer {
    @CallableFunction
    func getStockPrice(ticker: String) async throws -> String {
        // Fetch and return the stock price for `ticker`
        return "0.00" // placeholder so the snippet compiles
    }
}
```
The macro will auto-generate the code to make this function callable externally. No more JSON headaches!
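For context on the JSON headache: this is roughly the tool definition you would otherwise assemble by hand for Claude’s tool-use API, written here as a Swift dictionary. The field names follow Anthropic’s tool-use format; the macro’s actual generated code may look different.
```swift
// Illustrative only: the hand-written tool schema the macro is meant to
// spare you from. The library's actual generated types may differ.
let getStockPriceTool: [String: Any] = [
    "name": "getStockPrice",
    "description": "Fetch and return the stock price for a ticker symbol",
    "input_schema": [
        "type": "object",
        "properties": [
            "ticker": ["type": "string"]
        ],
        "required": ["ticker"]
    ] as [String: Any]
]
```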
- Supports popular LLM services
- No more manual JSON creation
- Instant API-ready functions (rough end-to-end flow sketched below)
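To give a sense of the intended end-to-end flow, here is a minimal sketch. The property and method names below (`allTools`, `execute(methodName:parameters:)`) are hypothetical placeholders, not the library’s confirmed API; see the repo README for the real surface.
```swift
// Hypothetical usage sketch; `allTools` and `execute(methodName:parameters:)`
// are placeholder names, not the library's confirmed API.
let container = MyFunctionTools()

// 1. Pass the auto-generated tool definitions along with your Claude request.
let tools = container.allTools

// 2. When the model responds with a tool call, dispatch it back to the
//    matching Swift function and return the result to the model.
let result = try await container.execute(
    methodName: "getStockPrice",
    parameters: ["ticker": "AAPL"]
)
```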
If you’re building tools or services that interact with LLMs or other APIs, this could save you a ton of time.
Check out the GitHub repo here[1] and let me know what you think! Feedback and contributions are more than welcome.
[1]: https://github.com/fumito-ito/FunctionCalling