
[ML] Get trained models: pre-packed models are not affected by from and size #51543

@dolaru

Description


Spotted in 7.6.0

We currently ship with one pre-packed model: `lang_ident_model_1`. When doing a `GET _ml/inference`, that model is always prepended to the list of `trained_model_configs` in the response.

This also means that the `from` and `size` query parameters are not respected. The pre-packed model will always be the first model in the list, unless a `model_id` is specified.

Expected:
In `GET _ml/inference` responses, pre-packed models should be treated the same way as the models stored in `.ml-inference*`.
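A minimal sketch of the observed vs. expected pagination behavior (in Python, purely illustrative; the actual implementation is Java inside Elasticsearch, and the function names here are hypothetical). The observed behavior is consistent with `from`/`size` being applied only to the stored models before the pre-packed model is prepended:

```python
def get_models_observed(stored, prepacked, from_=0, size=10):
    # Observed: from/size paginate only the stored models, then the
    # pre-packed model is prepended, so it shows up on every page.
    return prepacked + stored[from_:from_ + size]

def get_models_expected(stored, prepacked, from_=0, size=10):
    # Expected: pre-packed and stored models are merged first, and
    # from/size paginate the combined, sorted list.
    combined = sorted(prepacked + stored)
    return combined[from_:from_ + size]

stored = ["model_a", "model_b", "model_c"]
prepacked = ["lang_ident_model_1"]

# With from=1, the pre-packed model should be skipped, but it isn't:
print(get_models_observed(stored, prepacked, from_=1, size=2))
# → ['lang_ident_model_1', 'model_b', 'model_c']
print(get_models_expected(stored, prepacked, from_=1, size=2))
# → ['model_a', 'model_b']
```

Note that the observed variant can also return more than `size` models, since the pre-packed model is added on top of an already-full page.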

Metadata


Labels

:ml (Machine learning), >bug
