Feat: Support AWS Bedrock base models #25
base: main
Conversation
Hey @nirga, quick question about integrating Stability AI models in our Hub. I'm looking at AWS Bedrock's Stable Diffusion 3.5 integration (from their model catalog), and I'm not sure about the best format to implement it.
Would appreciate your thoughts on this. Thanks!
I think it should be in a new API @detunjiSamuel
Hey @nirga, I have completed this PR and would appreciate your review.
Please let me know if any additional information or changes are needed. Thank you!
Hey @detunjiSamuel, can you take a look at the tests? They seem to be failing.
@nirga, thanks for pointing that out. Thanks again!
Hi @nirga, I wanted to follow up on this PR, as it's been a few weeks since the last update. As mentioned previously, I've addressed the test failures by using AWS Smithy to replay the test responses, which resolved the CI issues. Everything is passing and ready for review.
AWS Bedrock Provider Integration
Added support for AWS Bedrock as a new LLM provider:
Key Changes
Added Bedrock provider implementation with model-specific handlers:
Testing Notes
All tests pass using AWS credentials in us-east-1/2 regions
Verified error handling for invalid credentials/models
Tested non-streaming responses (base models in Bedrock don't seem to expose streaming response types)
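Since each Bedrock base model family expects its own request-body schema, the handlers build model-specific payloads. A minimal sketch of that dispatch (field names assumed from the published Bedrock request schemas; verify against the current docs):

```python
import json


def build_bedrock_body(model_id: str, prompt: str) -> str:
    """Build a model-specific JSON request body for invoke_model.

    The field names below are assumptions based on the per-model
    request schemas in the Bedrock documentation.
    """
    if model_id.startswith("amazon.titan-text"):
        body = {
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": 512},
        }
    elif model_id.startswith("stability."):
        # Stable Diffusion models take a text prompt and return images.
        body = {"prompt": prompt}
    else:
        raise ValueError(f"unsupported model family: {model_id}")
    return json.dumps(body)


print(build_bedrock_body("amazon.titan-text-express-v1", "Hello"))
```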
Review notes
The model ID from the AWS link does not work consistently. Instead, use the Inference profile ARN or Inference profile ID from the cross-region reference tab as your model_id.

Issue: #20
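To illustrate the note above, here is a small hypothetical helper that checks whether a model_id looks like a cross-region inference profile ID or a full inference-profile ARN (patterns assumed from the format shown in the Bedrock console):

```python
import re


def is_inference_profile(model_id: str) -> bool:
    """Return True if model_id looks like an inference profile reference.

    Accepts either a full inference-profile ARN or a region-prefixed
    profile ID such as "us.stability.sd3-5-large-v1:0" (prefix pattern
    is an assumption, not an exhaustive rule).
    """
    if model_id.startswith("arn:aws:bedrock:") and ":inference-profile/" in model_id:
        return True
    return bool(re.match(r"^(us|eu|apac)\.[a-z0-9-]+\.", model_id))


print(is_inference_profile("us.stability.sd3-5-large-v1:0"))  # region-prefixed ID
```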
/claim #20