Caching AI SDK Models #317
Conversation
🦋 Changeset detected — latest commit: b8a1dfc. The changes in this PR will be included in the next version bump.
There are a couple of things I'm not happy with in this PR. Overall, the implementation is solid and the UI is nice. But:

It doesn't really make sense to me to have caching without tracing, or tracing without caching. For a great AI SDK integration, you really just want to call one function, wrap your model, and be good to go. We can have options to disable tracing or disable caching, but having a single entry-point function seems indispensable. My thinking is that inside scorers, we would want to prevent tracing but allow caching, since scorer traces don't make sense in the trace view.
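The single-function shape suggested above could be sketched as follows. This is a hypothetical illustration, not the PR's actual implementation: the names (`wrapModel`, `Model`, `WrapOptions`) are made up, the cache is a plain in-memory map, and the real AI SDK model interface is asynchronous and richer than this.

```typescript
// Hypothetical sketch: one wrapper that combines caching and tracing,
// each individually switchable. Names and shapes are illustrative only.

interface Model {
  generate(prompt: string): string;
}

interface WrapOptions {
  cache?: boolean; // default true
  trace?: boolean; // default true
}

interface TraceEvent {
  prompt: string;
  cached: boolean;
}

function wrapModel(
  model: Model,
  opts: WrapOptions = {}
): { model: Model; traces: TraceEvent[] } {
  const { cache = true, trace = true } = opts;
  const store = new Map<string, string>();
  const traces: TraceEvent[] = [];

  const wrapped: Model = {
    generate(prompt: string): string {
      const hit = cache && store.has(prompt);
      // Record a trace event unless tracing is disabled.
      if (trace) traces.push({ prompt, cached: hit });
      if (hit) return store.get(prompt)!;
      const result = model.generate(prompt);
      if (cache) store.set(prompt, result);
      return result;
    },
  };
  return { model: wrapped, traces };
}
```

Inside a scorer, the caller would then pass `{ trace: false }` to keep cache hits while suppressing trace events, matching the behavior described above.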
Fixes #309