
Conversation

@benwtrent (Member) commented Mar 30, 2020:

A new field called `inference_config` is now added to the trained model config object. This new field allows for default inference settings from analytics or some external model builder.

The inference processor can still override whatever is set as the default in the trained model config.
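For illustration, a minimal sketch of how the two settings interact (the model id `my_model` and the results field names are placeholders, not taken from this PR; other required parts of the model config and processor are omitted). A default stored on the trained model config might look like:

"inference_config": {
  "regression": {
    "results_field": "predicted_value"
  }
}

while an inference processor that references the model can still pass its own inference_config, which takes precedence over the stored default:

{
  "inference": {
    "model_id": "my_model",
    "inference_config": {
      "regression": {
        "results_field": "my_results"
      }
    }
  }
}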

Docs preview:

@elasticmachine (Collaborator):

Pinging @elastic/ml-core (:ml)

@benwtrent (Member, Author):

@elasticmachine update branch

elasticmachine and others added 3 commits March 30, 2020 09:51
…of github.com:benwtrent/elasticsearch into feature/ml-inference-add-infer-config-to-model-config
@benwtrent
Copy link
Member Author

@elasticmachine update branch

@elastic deleted a comment from elasticmachine Mar 31, 2020
<5> Optionally, a human-readable description
<6> Optionally, an object map containing metadata about the model
<7> Optionally, an array of tags to organize the model
<8> The default inference config to use with the model. Must match the underlying
Contributor:

I think the new object needs to be added in the definitions section too (e.g. https://www.elastic.co/guide/en/elasticsearch/reference/master/put-inference.html#ml-put-inference-trained-model)

@davidkyle (Member) left a comment:

Great change! LGTM

One request: could we now make the inference config for the inference processor optional? I.e., remove the need for:

"inference_config": {
    "regression": {}
}
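That is, with a default carried on the model config, a hypothetical simplified processor could in principle look like the sketch below (illustrative only; this change is not made in this PR):

{
  "inference": {
    "model_id": "my_model"
  }
}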

public class ClassificationConfigTests extends AbstractXContentTestCase<ClassificationConfig> {

    public static ClassificationConfig randomClassificationConfig() {
        return new ClassificationConfig(randomBoolean() ? null : randomIntBetween(-1, 10),
Member:

Is -1 valid for numTopClasses?

Member Author:

Yes, indicates "_all"
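In other words, a classification config such as the following sketch (illustrative, not from this PR) asks for probabilities for all classes rather than a fixed top N:

"inference_config": {
  "classification": {
    "num_top_classes": -1
  }
}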


boolean isNoop(ClassificationConfig originalConfig) {
    return (resultsField == null || originalConfig.getResultsField().equals(resultsField))
Member:

Flip the test around so you call equals on the one you've proved to be non-null.

resultsField == null || resultsField.equals(originalConfig.getResultsField())

I know getResultsField() can't return null as it has a default value, but it seems sensible to reverse it anyway. The same applies to topClassesResultsField below.

Member Author:

roger

Regression regression = ((Regression)analytics.getAnalysis());
return new RegressionConfig(RegressionConfig.DEFAULT_RESULTS_FIELD,
    regression.getBoostedTreeParams().getNumTopFeatureImportanceValues() == null ?
        0 :
Member:

Let RegressionConfig pick the default value (it does this in the constructor anyway)

RegressionConfig(RegressionConfig.DEFAULT_RESULTS_FIELD,
                    regression.getBoostedTreeParams().getNumTopFeatureImportanceValues())

Member:

Same with ClassificationConfig

@benwtrent (Member, Author):

> One request: could we now make the inference config for the inference processor optional?

Part of me is still reticent to do that. It is, right now, the only indication of the "inference type". I suppose we can remove it in the future for sure. I don't think we should do it in this PR though.

@benwtrent (Member, Author):

@elasticmachine update branch

@lcawl (Contributor) left a comment:

Documentation LGTM

@benwtrent (Member, Author):

@elasticmachine update branch

@benwtrent merged commit 4e1ff31 into elastic:master Apr 2, 2020
@benwtrent deleted the feature/ml-inference-add-infer-config-to-model-config branch April 2, 2020 14:34
benwtrent added a commit to benwtrent/elasticsearch that referenced this pull request Apr 2, 2020
…54421)

A new field called `inference_config` is now added to the trained model config object. This new field allows for default inference settings from analytics or some external model builder.

The inference processor can still override whatever is set as the default in the trained model config.
benwtrent added a commit that referenced this pull request Apr 2, 2020
…4421) (#54647)

* [ML] add new inference_config field to trained model config (#54421)

A new field called `inference_config` is now added to the trained model config object. This new field allows for default inference settings from analytics or some external model builder.

The inference processor can still override whatever is set as the default in the trained model config.

* fixing for backport