Should "model" be "trt_model" here? #2932

@choosehappy

Description

This might be a copy-paste error:

out = model(data)

The code just above this line is:

        trt_model = torchtrt.dynamo.compile(
            exp_program,
            inputs=[input_tensor],
            enabled_precisions={torch.float8_e4m3fn},
            min_block_size=1,
            debug=False,
        )

and then a comment saying:

        # Inference compiled Torch-TensorRT model over the testing dataset

but then inference appears to be run with the original, uncompiled model instead of trt_model:

            out = model(data)
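If the intent of the example is to benchmark the compiled module, the fix would be calling `trt_model(data)` here. A minimal stand-in sketch (plain Python, no TensorRT required; `fake_compile` and the toy `model` below are hypothetical illustrations, not the Torch-TensorRT API) shows why the two calls are not interchangeable:

```python
def fake_compile(model):
    """Pretend-compile: returns a new callable wrapping `model`,
    standing in for torchtrt.dynamo.compile (hypothetical)."""
    def trt_model(data):
        # Tag outputs so we can tell which callable actually ran.
        return ("compiled", model(data))
    return trt_model

def model(data):
    # Toy stand-in for the original (uncompiled) PyTorch model.
    return data * 2

trt_model = fake_compile(model)

# Inference over a toy "testing dataset" must go through the
# compiled callable, i.e. `out = trt_model(data)`, otherwise the
# original model is silently benchmarked instead.
outputs = [trt_model(data) for data in [1, 2, 3]]
print(outputs)  # [('compiled', 2), ('compiled', 4), ('compiled', 6)]
```

Calling `model(data)` in the loop would still produce numerically plausible outputs, which is what makes this kind of copy-paste slip easy to miss.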
