
Getting Uncertainty results by using modified model output #342

@nehiridil

Description

Hey! I want to get uncertainty scores using the model output as a parameter. As far as I understand, we cannot make any modifications to the output generated by the LLM before it is scored. Is there a way to do that?

I would like to get the results from the LLM, process them, and then run uncertainty quantification.
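The decoupled workflow described above (generate, post-process, then quantify uncertainty) can be sketched in a library-agnostic way. The helper below is hypothetical, not part of any particular library's API; it only assumes you kept the per-token log-probabilities from the generation step and computes a simple length-normalized negative log-likelihood as the uncertainty score:

```python
def sequence_uncertainty(token_logprobs):
    """Length-normalized negative log-likelihood of a generated sequence.

    token_logprobs: per-token log-probabilities (floats <= 0) for the
    tokens that survived post-processing.
    Returns a score >= 0; higher means the model was less confident.
    """
    if not token_logprobs:
        raise ValueError("need at least one token log-probability")
    return -sum(token_logprobs) / len(token_logprobs)

# Suppose the LLM returned these per-token log-probs for the
# (already post-processed) answer:
logprobs = [-0.1, -0.3, -0.05, -1.2]
score = sequence_uncertainty(logprobs)  # 0.4125
```

Because the score is computed purely from stored log-probabilities, the generation call and the uncertainty computation can live in separate steps, with arbitrary processing of the text in between.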

Thanks!

Metadata

Labels

question (Further information is requested)
