Support for Bedrock Application Inference Profiles #7809
-
I would also accept a simple ENV var that makes it possible to bypass the model-provider parsing while this accommodation is pending.
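For anyone wanting that stopgap, a minimal sketch of what such a bypass could look like; the env var name `BEDROCK_SKIP_MODEL_PARSING` and the `resolveModel` helper are both hypothetical, not part of LangChain JS:

```ts
// Hypothetical sketch of the requested escape hatch. Neither the env var
// name (BEDROCK_SKIP_MODEL_PARSING) nor this helper exists in LangChain JS.
function resolveModel(model: string): string {
  if (process.env.BEDROCK_SKIP_MODEL_PARSING === "true") {
    // Trust the caller's value verbatim, e.g. a full application
    // inference profile ARN.
    return model;
  }
  // Otherwise fall back to the existing region/provider parsing of
  // IDs like "us.anthropic.claude-3-5-sonnet-20240620-v1:0".
  if (!model.includes(".")) {
    throw new Error(`Unrecognized Bedrock model ID: ${model}`);
  }
  return model;
}
```

The idea is just to give callers an explicit opt-out of the ID parsing rather than changing the default behavior.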
-
Delivered #7822 for this. Wound up adding a new field …
-
You can consider using …
-
Feature request
Please add support for using Application Inference Profiles as the `model` in the Bedrock client.

Motivation
My organization has adopted Application Inference Profiles to track costs and usage within a shared AWS account. I therefore need to be able to supply the Application Inference Profile ARN as the `model`.

Proposal (If applicable)
I see that support for Cross Region Inference was added in 29c5b8c, but it expects a region-optional model ID as the `model`. I need support for Application Inference Profiles if I am to continue using LangChain JS. Per the docs, this requires permitting the full ARN of the application inference profile as the model name. I therefore need the filter to accommodate an inference profile ARN as the model name. I will attempt to contribute this myself, as I have an urgent need for it.