
Ask HN: How did you architect your web application that uses machine learning? - wuliwong
I have recently begun incorporating machine learning to tune some parameters used in a web application. I am quite new to working with machine learning, but it looks like the ML portion of this project will grow substantially. Currently I write small ML scripts in Python and just manually update the Ruby on Rails application with the results.

How have you architected your application that incorporates machine learning?
======
CabSauce
I've only built relatively small, POC applications, but I've been using Flask
to host an API that gets called by whatever application needs it.

It depends on how often you want to update your model, how complex it is, how
much data it requires for inference, etc. I think splitting up the
services/code will be helpful if we get to the point where multiple teams are
working on this stuff.

------
mtmail
Before ML we used
[https://en.wikipedia.org/wiki/Genetic_programming](https://en.wikipedia.org/wiki/Genetic_programming)
to find better parameters for matching millions of objects (about 30
attributes) against a list of thousands of human-verified (positive and
negative) matches. It ran every night and, every couple of days, would email
us a list of parameters better than the previous ones. We added those to a
configuration file.
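The nightly evolutionary search described above might look roughly like the sketch below: a population of parameter vectors is scored against the human-verified matches, and the best half is mutated each generation. The verified examples, scoring rule, and two-parameter setup are all hypothetical stand-ins for the 30-attribute system mentioned.

```python
import random

random.seed(0)  # deterministic for illustration

# Hypothetical verified matches: (attribute vector, is_match label).
VERIFIED = [
    ([1.0, 0.0], True),
    ([0.0, 1.0], False),
    ([0.9, 0.1], True),
]

def fitness(params):
    # Count how many verified examples the parameter vector classifies
    # correctly under a simple weighted-score threshold.
    correct = 0
    for attrs, is_match in VERIFIED:
        score = sum(p * a for p, a in zip(params, attrs))
        if (score > 0.5) == is_match:
            correct += 1
    return correct

def evolve(generations=50, pop_size=20):
    population = [[random.uniform(-1, 1) for _ in range(2)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]
        # Refill the population with mutated copies of the best half.
        children = [[p + random.gauss(0, 0.1) for p in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

A cron job running something like this nightly, emailing `best` whenever it beats the current configuration, matches the workflow the commenter describes.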

------
ganeshkrishnan
We use big data and ML for our project. The training happens on a separate
machine running Scala, Spark, and deeplearning4j. This app reads from our
database, creates the model, and then saves it to a network drive. This
happens every Sunday night.

The web application reloads the ML model and then makes predictions based on
real-time data.

There are also plenty of predictions that do not require real-time inference,
for example inventory levels, sales for the next few days, etc. In those
cases we train on the training machine and write the results back to the DB
using Spark JDBC. The web app just displays this data.

We have scaled up pretty well; however, in the future we will move to Apache
Kafka for real-time data transfer and prediction.

------
chudi
We have a number of batch jobs that update the models every night, then a
separate API to serve the main app that draws the UI.

Sometimes your ML model doesn't need a cron job and you can update your
results in near real-time, but the principle still holds: an API separate
from the main app.

~~~
wuliwong
This seems to be what I'm hearing over and over, and luckily it is pretty
close to how I assumed it would work.

I think I am going to build an API inside my Rails app, which the Python
scripts then use to access data and update parameters. I still have a lot of
work ahead designing the models themselves, but the overall software
architecture now makes some sense to me.
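On the Python side, that plan could be sketched as below. The endpoint paths (`/training_data`, `/ml_parameters`), payload shape, and base URL are hypothetical; the Rails app would expose whatever routes fit its data.

```python
import json
import urllib.request

# Hypothetical base URL of the Rails app's internal API.
BASE_URL = "http://localhost:3000/api/v1"

def build_update_request(parameters, base_url=BASE_URL):
    """Build the POST request that sends tuned parameters back to Rails."""
    data = json.dumps({"parameters": parameters}).encode("utf-8")
    return urllib.request.Request(
        base_url + "/ml_parameters",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def fetch_training_data(base_url=BASE_URL):
    # Pull the rows the ML script trains on from the Rails read endpoint.
    with urllib.request.urlopen(base_url + "/training_data") as resp:
        return json.loads(resp.read())

def push_parameters(parameters, base_url=BASE_URL):
    # Send the newly tuned parameters back; Rails persists them.
    with urllib.request.urlopen(build_update_request(parameters, base_url)) as resp:
        return resp.status
```

The ML scripts stay decoupled from the Rails schema: they only see JSON over HTTP, so either side can evolve independently.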

Thanks.

------
chudi
There was also this great Ask HN thread from a few months ago:

[https://news.ycombinator.com/item?id=13821217](https://news.ycombinator.com/item?id=13821217)

~~~
wuliwong
Thanks, that's perfect.

