Add metadata of evaluation metrics to models #1908

@rogancarr

Description

Currently, we can use ML.NET to evaluate the performance of a model. However, when the model is passed to a third party, the evaluation metrics must be passed separately.
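For example, with today's API the metrics object lives only in the code that ran the evaluation; the saved `model.zip` carries none of it. A rough sketch of that workflow (binary classification assumed; `ModelInput`, the column names, and the file paths are placeholders):

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Data;

public class ModelInput
{
    [LoadColumn(0)] public bool Label { get; set; }
    [LoadColumn(1)] public float Feature1 { get; set; }
    [LoadColumn(2)] public float Feature2 { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var mlContext = new MLContext();

        IDataView trainData = mlContext.Data.LoadFromTextFile<ModelInput>("train.csv", hasHeader: true, separatorChar: ',');
        IDataView testData  = mlContext.Data.LoadFromTextFile<ModelInput>("test.csv",  hasHeader: true, separatorChar: ',');

        var pipeline = mlContext.Transforms.Concatenate("Features", "Feature1", "Feature2")
            .Append(mlContext.BinaryClassification.Trainers.SdcaLogisticRegression());

        ITransformer model = pipeline.Fit(trainData);

        // Evaluate on held-out data: the metrics object exists only here, in memory.
        var metrics = mlContext.BinaryClassification.Evaluate(model.Transform(testData));
        Console.WriteLine($"AUC: {metrics.AreaUnderRocCurve:F3}, F1: {metrics.F1Score:F3}");

        // model.zip is written with no record of those metrics,
        // so they have to be handed to the model's consumer separately.
        mlContext.Model.Save(model, trainData.Schema, "model.zip");
    }
}
```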

If you consider the evaluation metrics** to be properties of the model, then it makes sense to include them in the model. I expect this to be helpful for deployment, productionization, debugging, etc. It would also be nice to have these properties of the model visible in an IDE and accessible programmatically.

Related to #511

** e.g. over a dataset representative of the expected distribution of data to be seen by the model
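To make the request concrete, continuing from the sketch above, something along these lines is what I have in mind. The `SetEvaluationMetadata` / `GetEvaluationMetadata` members are purely hypothetical and do not exist in Microsoft.ML today; they only illustrate the kind of surface area being asked for:

```csharp
// Hypothetical sketch only: neither SetEvaluationMetadata nor
// GetEvaluationMetadata exists in Microsoft.ML.
var metrics = mlContext.BinaryClassification.Evaluate(model.Transform(testData));

// Attach the evaluation results to the model so they travel with the artifact.
model.SetEvaluationMetadata(metrics);                        // hypothetical
mlContext.Model.Save(model, trainData.Schema, "model.zip");

// A third party loading model.zip could then read the metrics directly,
// without needing the original evaluation data or a separate report.
ITransformer loaded = mlContext.Model.Load("model.zip", out DataViewSchema inputSchema);
var storedMetrics = loaded.GetEvaluationMetadata();          // hypothetical
Console.WriteLine($"Reported AUC at training time: {storedMetrics.AreaUnderRocCurve:F3}");
```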

Labels: enhancement (New feature or request)
