
Amazon SageMaker – Build, train, and deploy machine learning models at scale - irs
https://aws.amazon.com/sagemaker
======
gk1
This blog post also includes screenshots:
[https://aws.amazon.com/blogs/aws/sagemaker/](https://aws.amazon.com/blogs/aws/sagemaker/)

------
michaelbarton
If anyone from AWS is in this forum, could you comment on whether the custom
Docker training in SageMaker can also be used for general optimisation of any
dockerised objective function, e.g. Bayesian hyperparameter optimisation?

In the blog post example there is this python code:

    
    
        def train(channel_input_dirs, hyperparameters, output_data_dir,
                  model_dir, num_gpus, hosts, current_host):
    

Would I also write some kind of similar function for scoring the result of the
training?

To provide some context, I work in bioinformatics, where some of our
algorithms have hundreds of parameters. This is not ML where we want to
classify or predict, but rather optimise the parameters for a given objective
function. If SageMaker allows general optimisation in an AWS Lambda-like way,
that would be very useful.
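For what it's worth, here is a minimal sketch of what that might look like: a `train` entry point in the style of the blog-post signature that, instead of fitting a model, just evaluates an objective for the hyperparameters it was handed and reports the score. The `objective` function and the `objective_value=` log line are purely illustrative stand-ins, not anything SageMaker defines.

```python
# Hypothetical sketch: treating a parameter-optimisation problem as a
# SageMaker-style training job. The parameter names of train() mirror the
# blog-post example; the objective itself is a toy stand-in.

def objective(params):
    """Stand-in for a dockerised objective function
    (e.g. an assembly-quality score in bioinformatics)."""
    # Toy quadratic with its best (highest) score at x=3.0, y=-1.0.
    x = params.get("x", 0.0)
    y = params.get("y", 0.0)
    return -((x - 3.0) ** 2 + (y + 1.0) ** 2)

def train(channel_input_dirs=None, hyperparameters=None,
          output_data_dir=None, model_dir=None,
          num_gpus=0, hosts=None, current_host=None):
    """Entry point shaped like the blog-post example.

    Rather than training a model, it evaluates the objective for the
    hyperparameters a tuner chose and logs the resulting score."""
    score = objective(hyperparameters or {})
    # A tuning service would typically scrape a line like this from logs.
    print("objective_value=%.6f" % score)
    return score

if __name__ == "__main__":
    train(hyperparameters={"x": 2.5, "y": -1.5})
```

So "scoring the result of the training" collapses into the same function: the run's only job is to compute and emit the objective value.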

~~~
tomfaulhaber
Engineer on the SageMaker team here.

There are no restrictions on the types of algorithms that you can optimize
using the HyperParameterOptimization service.

SageMaker is designed for machine learning, which means it's optimized for
algorithms that process a lot of data to develop a model, where each run of
the algorithm may generate an objective function value (or potentially many
such values, since the value may change during training).

If this structure fits your problem, SageMaker could be useful to you even if
your problem isn't strictly "machine learning."
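A sketch of the "many values per run" shape described above, under the assumption that the tuner scrapes a metric out of the job's log output with a regex (the metric name `validation-objective` and the log format here are made up for illustration):

```python
# Assumed workflow: print the objective once per epoch so a tuner can
# track it over the course of the run by matching a regex against logs.
import re

def run_training(epochs=5):
    """Stand-in training loop that emits one metric line per epoch."""
    history = []
    value = 100.0
    for epoch in range(epochs):
        value *= 0.5  # stand-in for a loss that improves over time
        line = "epoch %d validation-objective: %.4f" % (epoch, value)
        print(line)
        history.append(line)
    return history

def scrape(lines):
    """What a tuner effectively does: pull the metric out of the logs."""
    pat = re.compile(r"validation-objective: ([0-9.]+)")
    return [float(m.group(1)) for m in map(pat.search, lines) if m]

if __name__ == "__main__":
    values = scrape(run_training())
    print("best objective seen:", min(values))
```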

~~~
michaelbarton
That's great. Thank you for following up so quickly. Could you point me to
where I could read about this, or see an example?

I took a quick search of the documentation but couldn't see anything on a
cursory pass:

[http://docs.aws.amazon.com/search/doc-search.html?searchPath...](http://docs.aws.amazon.com/search/doc-search.html?searchPath=documentation-guide&searchQuery=HyperParameterOptimization)

~~~
tomfaulhaber
Sorry for the very delayed reply here.

The hyperparameter optimization feature is still in preview (though the rest
of SageMaker is in general availability). We'll be putting up a page within
the next week or so for you to request access.

------
kernel_sanders
SageMaker

{1G}

Human Druid - Sage

{G}, Tap: Create 0/1 Plant Token named Seed of Knowledge

Sacrifice {X} Plants: Look at the top X cards of opponent's library

1/1

~~~
orangejewce
I thought MTG too.

------
xtracto
I wonder how this compares to offerings like DataRobot and the like.

~~~
GFischer
"The process of incorporating machine learning into an application often
involves a team of experts tuning and tinkering for months with inconsistent
setups. Businesses and developers want an end-to-end, development to
production pipeline for machine learning"

This seems to be a complete pipeline, including deployment to production.
Edit: DataRobot too, I misread. Edit2: DataRobot seems to offer a "codeless"
approach. Thanks for sharing, I'll check it out.

I was also wondering about the difference from Microsoft Azure Machine
Learning Studio (which I haven't used yet).

