Are there any other FDWs that do ML inference?

Remember, this is not plain file serving -- this is actually invoking the XGBoost library, which performs complex mathematical operations. The user does not get data from disk; they get inference results.

Unless you know of another solution that can invoke XGBoost (or some other inference library), I don't see anything "embarrassingly overkill" there.
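
For readers unfamiliar with the pattern being discussed: here is a minimal sketch of how a foreign data wrapper can serve inference results, assuming the Multicorn extension (which lets you write PostgreSQL FDWs in Python) and a saved XGBoost model. The class name, the 'model_path' option, and the column layout are illustrative assumptions, not the project's actual implementation.

  # Hypothetical sketch: a Multicorn-based PostgreSQL FDW that answers queries
  # by running XGBoost inference rather than reading rows from disk.
  import numpy as np
  import xgboost as xgb
  from multicorn import ForeignDataWrapper

  class XGBInferenceFDW(ForeignDataWrapper):
      def __init__(self, options, columns):
          super().__init__(options, columns)
          self.columns = columns
          # 'model_path' is an assumed foreign-table option naming a saved model.
          self.booster = xgb.Booster()
          self.booster.load_model(options["model_path"])

      def execute(self, quals, columns):
          # Treat simple equality predicates in the WHERE clause as feature values,
          # e.g. SELECT prediction FROM model WHERE f0 = 1.5 AND f1 = 0.2;
          features = {q.field_name: float(q.value) for q in quals if q.operator == "="}
          feature_cols = sorted(c for c in self.columns if c != "prediction")
          dmatrix = xgb.DMatrix(np.array([[features.get(c, 0.0) for c in feature_cols]]))
          score = float(self.booster.predict(dmatrix)[0])
          # Yield a single row: the echoed features plus the inference result.
          row = {c: features.get(c, 0.0) for c in feature_cols}
          row["prediction"] = score
          yield row

On the SQL side you would register the wrapper with CREATE EXTENSION multicorn and the usual CREATE SERVER / CREATE FOREIGN TABLE statements, after which a plain SELECT against the foreign table triggers inference rather than a disk read.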




My issue isn't with the inference step or even the reading step; it's with the fetching step.


How are you doing online ML inference without fetching data?



