Can't use o3-mini with aider 0.73. Error "litellm.NotFoundError: OpenAIException - Error code: 404". o1-mini works fine #3097
Comments
Same here. Exact same config options as @it-sha, except on macOS. Exact same error.
Only Tier 3 to Tier 5 have access. You can check which models your key can see with:
curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"
Yup, this is the problem, as evidenced by it being an OpenAIException rather than an Aider issue.
Thanks for the answer. It looks like I am not on that tier. Is there some option to get access to o3-mini-high? OpenRouter does not provide it, but maybe some other provider does?
Azure is a possibility, but that also requires special access.
I spent money to upgrade to Tier 3 and get access, only to find out that upgrading tiers does not guarantee access. From an OpenAI employee:
@ahalekelly I feel your pain. Thank you for sharing your experience. @Klohto Thanks for the advice. Unfortunately, Azure imposes severe limitations (4k tokens for input/output; details: https://docs.github.com/en/github-models/prototyping-with-ai-models#rate-limits). I found that Glama.ai provides access to o3-mini-high via API (https://glama.ai/models/o3-mini-high), but I can't find a way to use it with Aider. Could you give me some advice on how to do that?
See https://aider.chat/docs/llms/openai-compat.html. Set OPENAI_API_BASE=https://glama.ai/api/gateway/openai/v1, then just call the model as openai/o3-mini-high.
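Put together, that might look like the following (a minimal sketch, not verified against Glama's current gateway; the key placeholder is an assumption, and the model name is taken from the comment above):

```bash
# Point aider's OpenAI-compatible client at the Glama gateway.
export OPENAI_API_BASE=https://glama.ai/api/gateway/openai/v1
export OPENAI_API_KEY=<your Glama API key>   # key issued by Glama, not by OpenAI

# The openai/ prefix tells aider to treat it as an OpenAI-compatible endpoint.
aider --model openai/o3-mini-high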
@Klohto Thank you a lot! I will try it.
I have access to
Please advise on how to use it. Thank you.
Same error as @SeaDude for
Thanks for trying aider and filing this issue. The fix is available in the main branch. You can get it by installing the latest version from GitHub.
If you have a chance to try it, let me know if it works better for you. The underlying cause is that the model settings for any provider's o3 need to disable temperature, since the o3 models reject the temperature parameter.
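A hedged sketch of both steps (the install command and repo URL are assumptions based on aider's usual instructions, and use_temperature is the model-settings field aider's advanced model settings docs describe; adjust the model name to match the provider prefix you use):

```bash
# Install the latest development version of aider from the main branch (assumed repo URL).
python -m pip install --upgrade git+https://github.com/Aider-AI/aider.git

# Tell aider not to send a temperature parameter for this model by adding a
# minimal model-settings entry in the project directory.
cat > .aider.model.settings.yml <<'EOF'
- name: openai/o3-mini-high
  use_temperature: false
EOF
```

Aider looks for .aider.model.settings.yml in the working directory (and your home directory), so the entry takes effect the next time you launch it.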
@paul-gauthier checked with
Thanks for trying aider and filing this issue. This doc may be helpful:
I changed nothing except installing the main branch with the command above, and the temperature error is now gone for me; it appears to be working as expected.
Use DeepSeek.
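For reference, a minimal sketch of switching to DeepSeek (assumes you have a DeepSeek API key; the "deepseek" model shortcut and DEEPSEEK_API_KEY variable are per aider's DeepSeek docs):

```bash
# DEEPSEEK_API_KEY is read by aider when using DeepSeek models.
export DEEPSEEK_API_KEY=<your DeepSeek API key>

# "deepseek" is aider's shortcut for DeepSeek's chat model.
aider --model deepseek
```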
I'm labeling this issue as stale because it has been open for 2 weeks with no activity. If there are no additional comments, I will close it in 7 days. Note: A bot script made these updates to the issue.
Issue
Aider: 0.73.0
Model: openai/o3-mini
Model o1-mini works fine
o3-mini returns the error shown in the title: litellm.NotFoundError: OpenAIException - Error code: 404.
Environment:
Microsoft Windows 11 Pro
$ python3 --version
Python 3.11.9
Aider was installed using one of the one-liners from:
https://aider.chat/2025/01/15/uv.html#one-liners
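For reference, a representative uv-based install looks like this (a sketch only; the exact one-liner is in the linked post, and the Python version pin here is an assumption):

```bash
# Install uv, then use it to install aider as an isolated CLI tool.
python -m pip install uv
uv tool install --force --python python3.12 aider-chat
```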
If you need any additional info, please let me know.