[7.x][ML] ML Model Inference Ingest Processor (#49052) #49257
Merged
This backports the ML model inference ingest processor to 7.x, bringing together the following related feature PRs:
[ML][Inference] Adjust inference configuration option API (#47812)
[ML][Inference] adds logistic_regression output aggregator (#48075)
[ML][Inference] Adding read/del trained models (#47882)
[ML][Inference] Adding inference ingest processor (#47859)
[ML][Inference] fixing classification inference for ensemble (#48463)
[ML][Inference] Adding model memory estimations (#48323)
[ML][Inference] adding more options to inference processor (#48545)
[ML][Inference] handle string values better in feature extraction (#48584)
[ML][Inference] Adding _stats endpoint for inference (#48492)
[ML][Inference] add inference processors and trained models to usage (#47869)
[ML][Inference] add new flag for optionally including model definition (#48718)
[ML][Inference] adding license checks (#49056)
[ML][Inference] Adding memory and compute estimates to inference (#48955)
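For context, an ingest pipeline using the new `inference` processor might be defined along these lines. This is a minimal sketch, not taken from this PR: the model ID `my_model` is a placeholder, and exact option names (e.g. `field_map`, `target_field`, the shape of `inference_config`) may differ across 7.x versions.

```json
{
  "description": "Enrich incoming documents with predictions from a trained ML model",
  "processors": [
    {
      "inference": {
        "model_id": "my_model",
        "target_field": "ml.inference",
        "inference_config": {
          "regression": {}
        },
        "field_map": {}
      }
    }
  ]
}
```

When a document is indexed through this pipeline, the processor runs the trained model against the document's fields and writes the prediction under `target_field`.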