docs/integrations/llms/bedrock/ #28620
Replies: 3 comments
-
Error while trying to use custom imported models in Bedrock
-
When I tried out a custom imported model (DeepSeek 70B) on Bedrock using LangChain, I ran into an error.
To replicate this:

```python
from langchain_aws import BedrockLLM

deepseek_70b_llm = BedrockLLM(
    aws_access_key_id="my-access-key-id",
    aws_secret_access_key="my-secret-access",
    aws_session_token="my-session-token",
    region_name="us-east-1",  # the langchain-aws parameter is `region_name` (the original used `region`)
    provider="deepseek",
    model_id="custom-imported-model-ARN",  # ARN like 'arn:aws:bedrock:...' obtained by provisioning the custom model
    # note: model_kwargs are sent in the request body, so "max_retries" is likely
    # not a valid model parameter here
    model_kwargs={"temperature": 0.75, "max_tokens": 512, "max_retries": 3},
    streaming=True,
)

deepseek_70b_llm.invoke("Who wrote the general relativity paper?")
```
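As a sanity check, it can help to confirm that the imported model responds outside LangChain by calling the Bedrock Runtime InvokeModel API directly with the same ARN. Below is a minimal sketch with boto3; the request-body fields (`prompt`, `max_gen_len`, `temperature`) are assumptions — a custom imported model expects the native schema of its base architecture (here, a Llama-style format for a DeepSeek 70B distill), so adjust them to match your import:

```python
import json

import boto3

# Assumes the same credentials/region as above; the model ARN is the one
# returned when the custom model import job completed.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# ASSUMED body schema: Llama-style native format. Custom imported models do
# not share a Bedrock-wide request schema.
body = {
    "prompt": "Who wrote the general relativity paper?",
    "max_gen_len": 512,
    "temperature": 0.75,
}

response = client.invoke_model(
    modelId="custom-imported-model-ARN",  # ARN like 'arn:aws:bedrock:...'
    body=json.dumps(body),
)
print(json.loads(response["body"].read()))
```

If this direct call works but the `BedrockLLM` call does not, the problem is in how the wrapper formats the request for the `deepseek` provider rather than in the model itself.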
-
Even though the query is valid, NeMo Guardrails classifies it correctly and sends it on to the LLM, but there is no response from the LLM, and I get the same error every time. The relevant output:

```
Fetching 5 files: 100%|██████████| 5/5 [00:00<?, ?it/s]
Processing query: 'Expalin about the generation of carbon emissions in UK'
23:52:39.397 | Invocation Params {'model_id': 'anthropic.claude-3-5-sonnet-20240620-v1:0', 'provider': 'anthropic', 'stream': True, 'trace': None, 'guardrailIdentifier': None, 'guardrailVersion': None, '_type':
LLM Prompt (13e98..) - self_check_input
User
LLM Completion (13e98..)
23:52:41.869 | Output Stats None
```
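One thing worth ruling out: `anthropic.claude-3-5-sonnet-20240620-v1:0` is a chat completion model, so a stall like this can come from wrapping it in a completion-style class. A minimal sketch (the config path and rail names are assumptions) that injects a LangChain `ChatBedrock` instance into NeMo Guardrails explicitly:

```python
from langchain_aws import ChatBedrock
from nemoguardrails import LLMRails, RailsConfig

# Chat-style wrapper for the Claude 3.5 Sonnet model shown in the log above.
chat_llm = ChatBedrock(
    model_id="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
)

# "./config" is assumed to hold your config.yml and rails (e.g. self_check_input).
config = RailsConfig.from_path("./config")
rails = LLMRails(config, llm=chat_llm)  # pass the LangChain chat model directly

response = rails.generate(
    messages=[{"role": "user", "content": "Explain the generation of carbon emissions in the UK"}]
)
print(response)
```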
-
docs/integrations/llms/bedrock/
You are currently on a page documenting the use of Amazon Bedrock models as text completion models. Many popular models available on Bedrock are chat completion models.
https://python.langchain.com/docs/integrations/llms/bedrock/
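For chat completion models such as the Claude model in the log above, the chat interface is the better fit. A minimal sketch, assuming langchain-aws is installed and AWS credentials are configured in the environment:

```python
from langchain_aws import ChatBedrockConverse

llm = ChatBedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
)

# Chat models accept a plain string or a list of messages rather than a raw prompt.
response = llm.invoke("Who wrote the general relativity paper?")
print(response.content)
```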