Show HN: ServerlessAI – Build, scale, and monetize AI apps without back end (serverlessai.dev)
42 points by HeavenFox 3 hours ago | 13 comments
Hello HN,

I’ve always loved building frontend-only apps—those you can prototype over a weekend, host for free on GitHub Pages, and scale to millions of users. Unfortunately, AI-enabled apps complicate things, as exposing your OpenAI key to the world is obviously a no-go. This also means mobile developers often have to run their own servers.

That’s why I built ServerlessAI, an API gateway that lets you securely call multiple AI providers directly from the client side using OpenAI-compatible APIs. You can authenticate users through any identity provider, like Google or Apple, and set per-user request or spending quotas. You can also define an allowlist of endpoints and models. To monetize, you can apply different quotas to different user tiers.
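
Conceptually, the client-side integration looks something like this (a simplified sketch: the gateway URL and token wiring here are illustrative, and the tutorials below show the real setup):

    // Simplified sketch; the baseURL and token handling are illustrative.
    import OpenAI from "openai";

    // Token returned by your identity provider (Google, Apple, etc.).
    const userIdToken = "<user ID token>";

    const client = new OpenAI({
      baseURL: "https://api.serverlessai.dev/v1", // gateway instead of api.openai.com
      apiKey: userIdToken,                        // per-user identity, not your OpenAI key
      dangerouslyAllowBrowser: true,              // fine here: no provider secret ships to the client
    });

    const reply = await client.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "Hello!" }],
    });
    console.log(reply.choices[0].message.content);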

To start, I recommend checking out our tutorials, where we walk you through building a complete, deployment-ready AI app in 5 minutes. We’ve got tutorials for React, Next.js, and iOS: https://serverlessai.dev/docs/tutorials/

Our long-term vision is to offer the best toolkit for AI developers at every stage of their project’s lifecycle. If OpenAI / Anthropic / etc. are AWS, we want to be the Supabase / Upstash / etc. We are building optional out-of-the-box tools for authentication and payment management, so you can roll out your prototype faster. In the future, we want to provide the best prompt engineering tools for fine-tuning, A/B testing, and backtesting, as well as the best observability tools.

We’d love to hear your feedback. Thanks for stopping by!






So, I'll be honest, I don't understand this market. I get that one can be profitable selling shovels during the gold rush, sure. But I have trouble understanding who is knowledgeable/dedicated enough to try to get their AI app going, but would pay to abstract/outsource this part of the chain.

(I suppose, relatedly, I have trouble understanding why anyone would just sort of presume OpenAI would forever be the best backend here as well?)


Please hire a real artist; those graphics on the home page are disturbing.

This is cool, congrats on launching!

How is it different from Puter AI, which offers auth + free inference?

https://docs.puter.com/AI/chat/


Founder of Puter here. Thank you very much for mentioning us!

If you don't want to pay for this service, keytrustee.org does this for free.

To potentially save you some headache, take a look at serverless.com and weigh the likelihood they come after you about that name if you are planning on making this a business.

(And yes, I hate their name too. I honestly don't know how defensible an entire technology term actually is. It also results in terrible Googling.)


This is a great idea. You should market to app devs as well.

I would also build this on top of firebase marketplace: https://extensions.dev


Thank you! We do believe app developers will find this valuable, and we are also working on IAP integration. Sadly, I am not an app developer, so if anyone has any suggestions on how I can serve this community better, I'm eager to hear from you! My email is in my HN profile.

Will look into Firebase Marketplace! That is a great suggestion!


Hah! Nice idea! I built something with a similar mindset, but instead of calling cloud AI providers, my aim is to provide a complete, self-hostable AI platform: https://github.com/singulatron/singulatron

I know that might sound like putting the server back into serverless. But I would say it's being your own serverless provider: once you have the platform installed on your servers, you can build frontend-only AI apps on top.

Hope you don't mind the self-plug. Your approach definitely has a ton of advantages when starting out (no infra to manage, etc.).


Great idea! I like the ergonomics of this on the developer side: it's easy to add, and it puts the onus on the developer to have a robust auth system that keeps users from creating thousands of accounts to get unlimited LLM access.

One challenge with frontend-only apps is that a proprietary prompt will be exposed, unless you offer prompt templating or prompt mapping on your side, i.e. the frontend sends prompt: Template_123 and this maps to the actual prompt somehow. Prompting is still important, and probably will be for a while, so having those internals externally visible could be sensitive. Something along these lines is what I mean (see the sketch below).
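
Just a sketch; the endpoint, field names, and template ID are made up:

    // Hypothetical: the client only ever sees a template ID and variables;
    // the gateway holds the actual prompt text and fills it in server-side.
    const userIdToken = "<user token>";

    const response = await fetch("https://api.example-gateway.dev/v1/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${userIdToken}`,
      },
      body: JSON.stringify({
        model: "gpt-4o-mini",
        prompt_template: "Template_123",          // resolved to the real prompt on the server
        variables: { topic: "quarterly report" }, // interpolated into the hidden prompt
      }),
    });
    console.log(await response.json());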


I disagree. These sorts of pricing structures quickly rule out consumer usage because of the high number of users. Have flat usage fees and get rid of the per-user limits. Having both doesn’t make sense unless you want to turn away any app that has many users.

I’d also recommend they clean up the copy of what they offer (expand on the why).

Other than that, looks cool.


Thanks for the feedback! Pricing is something we are iterating on. Our intention is that we only make money if our customers make money - which is why only authenticated users count towards the limit.

If pricing is preventing anyone from using our product, please shoot me an email (address in my HN profile); we'd love to hear about your use case!

Good call on the marketing copy! We will do some revisions!


Thanks for the feedback! We are building prompt templates right now. Besides the security benefits you mentioned, this can also enable developers to tweak prompts without redeployment, run A/B tests, and evaluate different models. It's an incredibly powerful tool!
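
To give a rough sense of the shape we have in mind (names and fields here are only a sketch, not the final API): a template lives on our side with variants, and resolution happens at request time.

    // Rough sketch of server-side template resolution (not the final API):
    // prompts can change, and variants can be tested, without redeploying the app.
    type Template = { variants: { id: string; text: string }[] };

    const templates: Record<string, Template> = {
      Template_123: {
        variants: [
          { id: "A", text: "Summarize {{input}} in two sentences." },
          { id: "B", text: "Give a one-paragraph summary of {{input}}." },
        ],
      },
    };

    function resolvePrompt(templateId: string, userId: string, vars: Record<string, string>): string {
      const t = templates[templateId];
      // Deterministic A/B assignment so each user always sees the same variant.
      const hash = [...userId].reduce((sum, ch) => sum + ch.charCodeAt(0), 0);
      const variant = t.variants[hash % t.variants.length];
      return variant.text.replace(/\{\{(\w+)\}\}/g, (_, key) => vars[key] ?? "");
    }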



