
Can't use o3-mini with aider 0.73. Error "litellm.NotFoundError: OpenAIException - Error code: 404". o1-mini works fine #3097

Open
it-sha opened this issue Feb 1, 2025 · 17 comments
Labels: question (Further information is requested), stale

Comments

@it-sha

it-sha commented Feb 1, 2025

Issue

Aider: 0.73.0
Model: openai/o3-mini

Model o1-mini works fine

$ aider --model o1-mini
───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────── 
Aider v0.73.0
Main model: o1-mini with whole edit format
Weak model: gpt-4o-mini
Git repo: .git with 42 files
Repo-map: using 4096 tokens, files refresh
───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────── 
> hi

Understood. I'll follow your guidelines for any future code changes.

o3-mini returns error messages:

$ aider --model o3-mini
─────────────────────────────────────────────────────────────────────────────────────────────── 
Aider v0.73.0
Main model: o3-mini with diff edit format
Weak model: gpt-4o-mini
Git repo: .git with 42 files
Repo-map: using 4096 tokens, files refresh
───────────────────────────────────────────────────────────────────────────────────────────────
> hi


litellm.NotFoundError: OpenAIException - Error code: 404 - {'error': {'message': 'The model 
`o3-mini` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 
'param': None, 'code': 'model_not_found'}}



$ aider --list-models openai/
─────────────────────────────────────────────────────────────────────────────────────────────── 
Models which match "openai/":
...
- openai/o1
- openai/o1-2024-12-17
- openai/o1-mini
- openai/o1-mini-2024-09-12
- openai/o1-preview
- openai/o1-preview-2024-09-12
- openai/o3-mini
- openai/o3-mini-2025-01-31
$ aider --model openai/o3-mini
─────────────────────────────────────────────────────────────────────────────────────────────── 
Aider v0.73.0
Main model: openai/o3-mini with diff edit format
Weak model: gpt-4o-mini
Git repo: .git with 42 files
Repo-map: using 4096 tokens, files refresh
─────────────────────────────────────────────────────────────────────────────────────────────── 
> hi


litellm.NotFoundError: OpenAIException - Error code: 404 - {'error': {'message': 'The model 
`o3-mini` does not exist or you do not have access to it.', 'type': 'invalid_request_error',    
'param': None, 'code': 'model_not_found'}}

───────

Environment:
Microsoft Windows 11 Pro

$ python3 --version
Python 3.11.9

Aider installed by this way:
https://aider.chat/2025/01/15/uv.html#one-liners

If you need any additional info please let me know

@it-sha it-sha changed the title Can't use o3-mini with aider 0.73. Error "litellm.NotFoundError: OpenAIException - Error code: 404" Can't use o3-mini with aider 0.73. Error "litellm.NotFoundError: OpenAIException - Error code: 404". o1-mini works fine Feb 1, 2025
@KalenJosifovski

Same here. Exact same config options as @it-sha, except on macOS. Exact same error.

@Klohto

Klohto commented Feb 1, 2025

Only Tier 3 to Tier 5 have access.

curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"

@gitkenan

gitkenan commented Feb 1, 2025

Only Tier 3 to Tier 5 have access.

curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"

Yup, this is the problem, as evidenced by it being an OpenAIException and not an Aider issue.

@it-sha
Author

it-sha commented Feb 1, 2025

Thanks for the answer. It looks like I am not on that tier.

Is there some option to get access to o3-mini-high? OpenRouter does not provide it, but maybe some other provider does?

@Klohto

Klohto commented Feb 1, 2025

Azure is a possibility, but that also requires special access.

@ahalekelly

I spent money to upgrade to Tier 3 and get access, only to find out that upgrading tiers does not guarantee access. From an OpenAI employee:

The “select developers on API usage tiers 3-5” that the blog post mentions were chosen on Weds, meaning, if you reached tier 3 or higher after that, you won’t have access today. It’s not automatic, and I’m sorry that this wasn’t clearer earlier.

And that said, being in one of these tiers does not guarantee access immediately. For safety and security reasons, I can’t disclose all the eligibility criteria—but the “select”, established developers within those tiers that receive access first are those that are known to be using the models in good ways.

We’ll be doing more passes next week and so forth—both in giving access to more developers within tiers 3-5 and adding any new developers who recently reached tier 3 (and you’ll receive an email once o3-mini’s been enabled for your account).

@it-sha
Author

it-sha commented Feb 3, 2025

@ahalekelly I feel your pain. Thank you for sharing your experience.
I hear that there is more than one condition for getting access. One of them was the tier, and the second one was 14 days of usage at that tier.
Image
UPDATE: Sorry, it was mentioned in your link as well: 14+ days is not a guarantee of access either.

@Klohto Thanks for the advice. Unfortunately, Azure comes with severe limits (4k tokens for input/output; details: https://docs.github.com/en/github-models/prototyping-with-ai-models#rate-limits).

I found that Glama.ai provides access to o3-mini-high via API (https://glama.ai/models/o3-mini-high), but I can't find a way to use it with Aider.

Maybe you could give me some advice on how to do that?

@Klohto

Klohto commented Feb 3, 2025

I found that Glama.ai provides access to o3-mini-high via API (https://glama.ai/models/o3-mini-high) but I can't find a way to use it with Aider

https://aider.chat/docs/llms/openai-compat.html

OPENAI_API_BASE=https://glama.ai/api/gateway/openai/v1
OPENAI_API_KEY=GLAMA_API_KEY

then just call openai/o3-mini-high
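
Putting that together, a minimal sketch on the command line (assuming the Glama gateway is OpenAI-compatible and your Glama key is in GLAMA_API_KEY):

# point aider at the Glama OpenAI-compatible endpoint; the key goes in OPENAI_API_KEY
export OPENAI_API_BASE=https://glama.ai/api/gateway/openai/v1
export OPENAI_API_KEY="$GLAMA_API_KEY"
aider --model openai/o3-mini-high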

@it-sha
Author

it-sha commented Feb 3, 2025

@Klohto Thank you a lot! I will try it.

@SeaDude

SeaDude commented Feb 4, 2025

I have access to o3-mini but I get the error:

litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 
'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 
'unsupported_parameter'}}

Please advise on how to use aider with o3-mini-*

Thank you

@vladiliescu

Same error as @SeaDude for azure/o3-mini. azure/o1-mini works.

Aider v0.73.0
Main model: azure/o3-mini with whole edit format
Weak model: azure/gpt-4o
Git repo: .git with 44 files
Repo-map: using 4096 tokens, auto refresh
────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
> hi


litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported
parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param':
'temperature', 'code': 'unsupported_parameter',

@paul-gauthier
Collaborator

Thanks for trying aider and filing this issue.

The fix is available in the main branch. You can get it by installing the latest version from github:

aider --install-main-branch

# or...

python -m pip install --upgrade --upgrade-strategy only-if-needed git+https://github.com/Aider-AI/aider.git

If you have a chance to try it, let me know if it works better for you.

This is because model settings need to be set for any provider's o3 models to disable use of the temperature parameter.
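
For reference, a rough sketch of the equivalent per-model override, assuming the .aider.model.settings.yml file and use_temperature field from aider's advanced model settings docs (verify the field names against your aider version):

# write a per-model override so aider stops sending temperature for o3-mini
cat > .aider.model.settings.yml <<'EOF'
- name: openai/o3-mini
  use_temperature: false
EOF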

@it-sha
Author

it-sha commented Feb 4, 2025

@paul-gauthier
after aider --install-main-branch

Image

checked with

# .env
OPENAI_API_BASE=https://glama.ai/api/gateway/openai/v1
OPENAI_API_KEY=GLAMA_API_KEY

then just call openai/o3-mini-high

@paul-gauthier
Collaborator

Thanks for trying aider and filing this issue. This doc may be helpful:

https://aider.chat/docs/config/reasoning.html
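
That page also describes a reasoning-effort setting; a sketch of using it with o3-mini, assuming the flag from that doc and a recent enough aider:

aider --model openai/o3-mini --reasoning-effort high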

@github-actions github-actions bot added the question (Further information is requested) label Feb 5, 2025
@bradsjm

bradsjm commented Feb 6, 2025

Thanks for trying aider and filing this issue.

The fix is available in the main branch. You can get it by installing the latest version from github:

aider --install-main-branch

# or...

python -m pip install --upgrade --upgrade-strategy only-if-needed git+https://github.com/Aider-AI/aider.git

If you have a chance to try it, let me know if it works better for you.

This is because model settings need to be set for any provider's o3 models to disable use of the temperature parameter.

I changed nothing except installing the main branch with that command, and the temperature error is now gone; it appears to be working as expected.

@yaskin

yaskin commented Feb 7, 2025

use deepseek


I'm labeling this issue as stale because it has been open for 2 weeks with no activity. If there are no additional comments, I will close it in 7 days.

Note: A bot script made these updates to the issue.

@github-actions github-actions bot added the stale label Feb 21, 2025