diff --git a/README.rst b/README.rst
index 8f0a913b82..1a6e3dfd40 100644
--- a/README.rst
+++ b/README.rst
@@ -30,7 +30,8 @@ Table of Contents
 5. `Chainer SageMaker Estimators <#chainer-sagemaker-estimators>`__
 6. `AWS SageMaker Estimators <#aws-sagemaker-estimators>`__
 7. `BYO Docker Containers with SageMaker Estimators <#byo-docker-containers-with-sagemaker-estimators>`__
-8. `BYO Model <#byo-model>`__
+8. `SageMaker Automatic Model Tuning <#sagemaker-automatic-model-tuning>`__
+9. `BYO Model <#byo-model>`__
 
 
 Getting SageMaker Python SDK
@@ -263,6 +264,56 @@ Please refer to the full example in the examples repo:
 
 The example notebook is located here:
 ``advanced_functionality/scikit_bring_your_own/scikit_bring_your_own.ipynb``
 
+
+SageMaker Automatic Model Tuning
+--------------------------------
+
+All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs. A hyperparameter tuning job runs multiple training jobs that differ by their hyperparameter values, searching for the values that produce the best-performing model. The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter tuning jobs. You can read more about SageMaker Automatic Model Tuning in the `AWS documentation `__.
+
+Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jobs instead of using an estimator to start training jobs:
+
+.. code:: python
+
+    from sagemaker.tuner import HyperparameterTuner, ContinuousParameter
+
+    # Configure HyperparameterTuner
+    my_tuner = HyperparameterTuner(estimator=my_estimator,  # previously-configured Estimator object
+                                   objective_metric_name='validation-accuracy',
+                                   hyperparameter_ranges={'learning-rate': ContinuousParameter(0.05, 0.06)},
+                                   metric_definitions=[{'Name': 'validation-accuracy', 'Regex': r'validation-accuracy=(\d\.\d+)'}],
+                                   max_jobs=100,
+                                   max_parallel_jobs=10)
+
+    # Start the hyperparameter tuning job
+    my_tuner.fit({'train': 's3://my_bucket/my_training_data', 'test': 's3://my_bucket/my_testing_data'})
+
+    # Deploy the best-performing model
+    my_predictor = my_tuner.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')
+
+    # Make a prediction against the SageMaker endpoint
+    response = my_predictor.predict(my_prediction_data)
+
+    # Tear down the SageMaker endpoint
+    my_tuner.delete_endpoint()
+
+Each ``HyperparameterTuner`` instance also provides an analytics object, which presents useful information about the hyperparameter tuning job, such as a pandas dataframe summarizing the associated training jobs:
+
+.. code:: python
+
+    # Retrieve the analytics object
+    my_tuner_analytics = my_tuner.analytics()
+
+    # Look at a summary of the associated training jobs
+    my_dataframe = my_tuner_analytics.dataframe()
+
+For more detailed examples of running hyperparameter tuning jobs, see: https://github.com/awslabs/amazon-sagemaker-examples.
+
+For more detailed explanations of the classes mentioned, see:
+
+- `API docs for HyperparameterTuner and parameter range classes `__.
+- `API docs for analytics classes `__.
+
+
 FAQ
 ---
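
Aside: the ``Regex`` entry in the ``metric_definitions`` example in this diff is applied to the training job's log output to extract objective-metric values. A minimal, stdlib-only sketch of how such a pattern behaves — the log line below is illustrative, not real SageMaker output:

```python
import re

# Same pattern as in the metric_definitions example: captures a value
# like 0.8734 following "validation-accuracy=" in a training log line.
pattern = re.compile(r'validation-accuracy=(\d\.\d+)')

# Hypothetical log line emitted by a training container
log_line = 'epoch=3 train-loss=0.41 validation-accuracy=0.8734'

match = pattern.search(log_line)
metric_value = float(match.group(1)) if match else None
print(metric_value)  # 0.8734
```

Note the raw string (``r'...'``): without it, ``\d`` relies on Python passing unknown escapes through unchanged, which newer Python versions warn about.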