One of the design principles of sqlc is that SQL queries should be static in application code so that you know exactly what SQL is running on your database. It turns out you can get pretty far operating under this constraint, although there are some annoyances.
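For illustration, the contrast looks roughly like this (a hedged Python sketch, not sqlc's own codegen; the table and column names are made up):

    # Dynamic SQL: the final query text depends on runtime input, so you
    # can't tell what will hit the database just by reading the code.
    def find_users_dynamic(filters: dict) -> tuple[str, list]:
        clauses = " AND ".join(f"{column} = ?" for column in filters)
        return f"SELECT id, name FROM users WHERE {clauses}", list(filters.values())

    # Static SQL, the sqlc principle: the query is a fixed string known at
    # build time; only the parameter values vary at run time.
    FIND_USER_BY_ID = "SELECT id, name FROM users WHERE id = ?"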
Riza, Inc. (https://riza.io) | SWEs and DevRel Engineers | Full-time or part-time | San Francisco
We use WASM to provide isolated runtimes for executing untrusted code, mostly generated by LLMs. Our customers do things like extract data from log lines at run time by asking claude-3-5-sonnet to generate a parsing function on the fly and then sending it to us for execution.
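A rough sketch of that flow (the endpoint and payload shape here are hypothetical stand-ins, not Riza's actual API):

    import requests

    def parse_log_line(line: str, llm_generated_code: str) -> dict:
        # The LLM has just produced `llm_generated_code`, a Python parsing
        # function. Rather than exec()ing untrusted code in your own
        # process, send it to an isolated runtime for execution.
        resp = requests.post(
            "https://api.example.com/v1/execute",  # hypothetical endpoint
            json={"language": "python", "code": llm_generated_code, "input": line},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()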
* Our hosted and self-hosted runtime service (Rust, WASM)
* Integrations and demos with adjacent frameworks and tools (Python / JavaScript / TypeScript)
* New products
We have seed money, but the whole company is currently just me and Kyle working out of a converted warehouse on Alabama St. We’re second-time founders, so we know the risk we’re asking you to take and we’re prepared to compensate accordingly. Send me an email at andrew at riza dot io or pop into our Discord (https://discord.gg/4P6PUeJFW5) and say hi.
Why do we have to "get there"? Humans use calculators all the time, so why not hook every LLM up to a calculator or code interpreter as a tool to use in these exact situations?
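Concretely, hooking up a calculator tool looks something like this (an OpenAI-style function-calling schema; the eval-based calculator is a toy for the sketch):

    calculator_tool = {
        "type": "function",
        "function": {
            "name": "calculate",
            "description": "Evaluate an arithmetic expression exactly.",
            "parameters": {
                "type": "object",
                "properties": {"expression": {"type": "string"}},
                "required": ["expression"],
            },
        },
    }

    def handle_tool_call(expression: str) -> str:
        # Toy evaluator; a production tool would use a real expression
        # parser instead of eval().
        return str(eval(expression, {"__builtins__": {}}))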
I would argue that most SOTA models do know that they don't know this, as evidenced by the fact that, when you give them a code interpreter as a tool, they choose to use it to write a script that counts the number of letters rather than try to come up with an answer on their own.
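For the classic letter-counting question, the script a model writes is usually just something like:

    word = "strawberry"
    print(word.count("r"))  # -> 3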
Yes, we are doing this at Riza[0] (via WASM). I'd love to have folks try our downloadable CLI, which wraps isolated Python/JS runtimes (also Ruby/PHP, but LLMs don't seem to write those very well). Shoot me an email[1] or say hi in Discord[2].
Plug in a code interpreter as a tool and the model will write Python or JavaScript to solve this and get it right 100% of the time. (Full disclosure: I work on a product called Riza that you can use as a code interpreter tool for LLMs.)
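The loop is roughly this (a minimal local stand-in for the sandbox, not any particular product's API; a real code interpreter tool would also isolate filesystem and network access):

    import subprocess, sys

    def run_in_sandbox(code: str) -> str:
        # Stand-in sandbox: run the model's code in a fresh interpreter
        # process and hand its stdout back to the model.
        result = subprocess.run(
            [sys.executable, "-c", code],
            capture_output=True, text=True, timeout=10,
        )
        return result.stdout.strip()

    print(run_in_sandbox('print("strawberry".count("r"))'))  # -> 3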