
This is neat, and I'm glad to see more work in this area.

My issue is that it’s hard to buy credits for something specific like this, especially when my phone does it for free. So it’s tough to compete with Apple.

I hope more work in this area gets us closer to local AI that can do this without needing a service, as I would gladly pay $10 one time (or sponsor an OSS dev) to be able to do this for the rest of my life.

I've really enjoyed running Stable Diffusion locally. Even on my crappy machine, it's nice not to have to worry about credits, and there's no ticking clock hanging over my exploration.
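For anyone who hasn't tried it, the local setup really is only a few lines with Hugging Face's diffusers library. A rough sketch (the model name, dtype, and step count are just examples; tune them for your own GPU):

    import torch
    from diffusers import StableDiffusionPipeline

    # Load a Stable Diffusion checkpoint in half precision to fit smaller GPUs
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    )
    pipe = pipe.to("cuda")
    pipe.enable_attention_slicing()  # trades some speed for lower VRAM use

    # Generate as many images as you like -- no credits, no clock
    image = pipe("a watercolor sketch of a lighthouse", num_inference_steps=25).images[0]
    image.save("lighthouse.png")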




I rather like the freemium model, where basic use is free and you pay for the API or high-res. 2MP isn't bad, that's about 1400x1400, but I'd like to see 1500 or 2000, personally. I think they'll have a tough time transitioning to paid-only.

Credits are annoying, but no one has really cracked micropayments--it's too expensive to take $1 payments.

I like how Replicate does it: just take a card and bill for usage, then you can decide to comp it or defer the charges when usage is low.

Going local is fun and practical for hobby use, but for business use an API makes more sense. Let someone else deal with hardware.


This.

I understand why these are SaaS offerings from a business POV, but don’t understand why there aren’t more options to run locally. Are gaming GPUs like a 3080 not powerful enough?


SaaS makes more money and fits into the 4HourWorkWeek “make recurring revenue from people” playbook.

Building a thing and selling it seems like it will make less money to me.

It's funny: on a larger scale, I used to hate enterprise software. It took a year to install, had to be patched, and you had to run servers and stuff. But you paid a big amount up front, then something like 10-20%/year for maintenance, and that was it. Now so many things are SaaS and cost $500-1,000+/user/year, and it's not just the cost but the planning and gatekeeping; making it available to more users can be expensive. I kind of miss the simplicity of budgeting $1M and being done. Now each year it's figuring out who really needs it, cleaning up expired accounts, and being stingy about whether another team can use it or not.

One of my favorite things was how easy it was to scale and share with new users.


> Are gaming GPUs like a 3080 not powerful enough?

It really depends on the model. Just cherry-picking memory as a capacity dimension first: the SAM model from Meta ships as a roughly 2.4GB checkpoint with about 636 million parameters, and that trained model fits just fine on a 12GB 3080 Ti. How fast it can compute predictions on a single 3080 Ti is a different story; in SAM's case it does well, but that ultimately depends on how complex the given model is (not the only variable, but a big one).
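As a concrete (hedged) example of how little code that takes: here's roughly what running SAM locally looks like with Meta's segment-anything package, assuming you've downloaded the ViT-H checkpoint; the image path and prompt point are made up.

    import numpy as np
    import torch
    from PIL import Image
    from segment_anything import sam_model_registry, SamPredictor

    # Load the ~2.4GB ViT-H checkpoint onto the GPU
    sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
    sam.to("cuda")
    predictor = SamPredictor(sam)

    # Embed the image once, then prompt with a single foreground point
    image = np.array(Image.open("photo.jpg").convert("RGB"))
    predictor.set_image(image)
    masks, scores, _ = predictor.predict(
        point_coords=np.array([[500, 375]]),
        point_labels=np.array([1]),  # 1 = foreground point
    )

    print(f"{torch.cuda.memory_allocated() / 1e9:.1f} GB of VRAM in use")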

> don’t understand why there aren’t more options to run locally

I think it's likely that you haven't been looking in the right places for local solutions. The deep learning space is very well represented in open source at the moment, across a wide set of verticals: language models, computer vision, speech recognition, voice synthesis, etc. You don't always get the white-glove UX that SaaS can offer, but that's true of much of the rest of the OSS world as well.
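To make that concrete with the speech-recognition vertical: with the transformers library, a local Whisper model is a few lines (a sketch; the model size, device index, and file name are just placeholders).

    from transformers import pipeline

    # Download a Whisper checkpoint once, then transcribe entirely offline
    # (decoding the audio file requires ffmpeg on the system)
    asr = pipeline(
        "automatic-speech-recognition",
        model="openai/whisper-small",
        device=0,  # first CUDA GPU; use device=-1 for CPU
    )

    result = asr("meeting_recording.wav")
    print(result["text"])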

EDIT: Wanted to note that I use both a 3080 Ti and my M2 Max for a variety of DL tasks (both for training and inference).


(I'm a dev on the project.) We've not decided on the exact term of the credits, but they will be long-lasting, so you can pay $5 for 250 images and use that over the course of a few years. We'd make them non-expiring, but that creates an unbounded liability.


Thanks. I appreciate unexpiring credits and think that’s a super reasonable price.

Again, thanks for your work. I don’t want to criticize and am glad you built this. I just like to voice this opinion in case it helps, in any small way, to increase the odds of more local software.


Something like that seems eminently reasonable. Low dollar amount for enough uses that I don't need to think too much every time I press the execute button. Reasonable expiration window. No subscription which I generally prefer. (Though I'd note that Photoshop is getting very close to doing this sort of thing and a Photoshop + Lightroom subscription is actually pretty reasonable--$20/mo--if you use them a lot. That's the sort of price point that a lot of standalone generative AI tools are going to be up against.)


Generally, things with a "Lifetime Guarantee" mean "lifetime of the company," or even "lifetime of the specified product line/family/version," if you look at the fine print.



