TorchScript model can't have bfloat16 inputs / outputs in 24.09 #7853

@MatthieuToulemont

Description

I am using the Triton container from 24.09 and want to serve TorchScript models with BFLOAT16 inputs, but I am getting the following error:

failed to load 'UnflattenSequence' version 1: Internal: unsupported datatype TYPE_BF16 for input 'latent_as_sequence_input__0' for model 'UnflattenSequence'"
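For reference, this is the kind of model configuration that triggers the error. The model name and input name come from the error message above; the platform, dims, and everything else are illustrative assumptions, not the actual config:

```
name: "UnflattenSequence"
platform: "pytorch_libtorch"
input [
  {
    name: "latent_as_sequence_input__0"
    data_type: TYPE_BF16   # rejected at load time by the 24.09 PyTorch backend
    dims: [ -1, -1 ]       # illustrative shape
  }
]
```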

Given that TorchScript supports bfloat16, shouldn't this be supported as well?
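To back up that point, here is a minimal sketch (not the issue author's actual model) showing that a scripted module runs fine with bfloat16 tensors in plain PyTorch; the limitation appears to be in the Triton backend's datatype mapping, not in TorchScript itself:

```python
import torch

# A tiny scripted module, named after the model in the error message
# purely for illustration: it unflattens dim 0 of its input.
class UnflattenSequence(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x.unflatten(0, (2, -1))

scripted = torch.jit.script(UnflattenSequence())

# TorchScript executes the bfloat16 input without complaint and
# preserves the dtype on the output.
out = scripted(torch.zeros(4, 8, dtype=torch.bfloat16))
print(out.dtype, tuple(out.shape))  # torch.bfloat16 (2, 2, 8)
```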

Metadata

    Labels

    enhancement (New feature or request), module: backends (Issues related to the backends), pytorch (PyTorch or LibTorch related)
