fix: omit temperature parameter when not explicitly set for OpenAI Compatible providers #7188
Conversation
I reviewed my own code and found it surprisingly tolerable. Almost like I knew what I was doing.
```diff
-const handlerWithModel = new GroqHandler({ apiModelId: modelId, groqApiKey: "test-groq-api-key" })
+const handlerWithModel = new GroqHandler({
+	apiModelId: modelId,
+	groqApiKey: "test-groq-api-key",
```
Is this intentional? The temperature is set inline here (line 119) while in other tests it's part of the options object. Could we standardize this for better consistency?
```typescript
	requestOptions.temperature = this.options.modelTemperature
} else if (deepseekReasoner) {
	// DeepSeek Reasoner has a specific default temperature
	requestOptions.temperature = DEEP_SEEK_DEFAULT_TEMPERATURE
```
The special handling for DeepSeek Reasoner models is well-implemented. Could we add a comment explaining why this model requires a specific default temperature? This would help future maintainers understand the reasoning behind this special case.
I wonder if we don't need to send this anymore since we are letting the provider set it
…mpatible providers

- Modified `OpenAiHandler` to only include temperature when `modelTemperature` is defined
- Modified `BaseOpenAiCompatibleProvider` to only include temperature when `modelTemperature` is defined
- Added tests to verify temperature is omitted when undefined
- Updated existing tests to explicitly set temperature where needed

This allows backend services (LiteLLM, vLLM) to use their configured default temperatures instead of being forced to use temperature=0 when "Use custom temperature" is unchecked.

Fixes #7187
Compare 82062c2 to a80ca11
- Remove temperature parameter expectations from provider tests
- Tests now expect temperature to be omitted when not explicitly set
- Aligns with PR #7188 changes to fix OpenAI Compatible provider behavior
The requestOptions parameter was added after PR #7188, so tests need to be updated to not expect it after the revert
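To illustrate what the updated specs now check, here is a hedged sketch of the expectation (not the repo's actual test code; `buildPayload` and the model id are hypothetical stand-ins):

```typescript
// Hypothetical stand-in for the handler's payload construction: the
// `temperature` key is added only when the user explicitly set one.
function buildPayload(modelTemperature?: number): Record<string, unknown> {
	const payload: Record<string, unknown> = { model: "some-model" } // placeholder model id
	if (modelTemperature !== undefined) {
		payload.temperature = modelTemperature
	}
	return payload
}

// Updated expectations: no `temperature` key when unset, present when set.
const unset = buildPayload()
const set = buildPayload(0.7)
console.assert(!("temperature" in unset))
console.assert(set.temperature === 0.7)
```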
Summary
This PR fixes an issue where the temperature parameter was always being sent with a value of 0.0 when "Use custom temperature" was unchecked in the OpenAI Compatible provider settings. This prevented backend services (like LiteLLM/vLLM) from using their configured default temperatures.
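A minimal reconstruction of the pre-fix behavior (an assumption for illustration; the original code's exact shape isn't shown in this PR excerpt):

```typescript
// Assumed pre-fix pattern: a fallback default meant the `temperature` key was
// always present, so the backend never got to apply its own configured default.
const modelTemperature: number | undefined = undefined // "Use custom temperature" unchecked
const oldRequest = { temperature: modelTemperature ?? 0 }
// oldRequest always carries temperature: 0 (greedy decoding)
```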
Problem
When configuring RooCode with an OpenAI Compatible Provider pointing at a local LLM (LiteLLM → vLLM stack), every request included `temperature: 0.0` (greedy decoding), even when "Use custom temperature" was unchecked.

Solution
Modified the temperature handling in two key places:

- `OpenAiHandler`: `temperature` is only included when `modelTemperature` is explicitly defined
- `BaseOpenAiCompatibleProvider`: `temperature` is only included when `modelTemperature` is explicitly defined

Special case: DeepSeek Reasoner models still get their specific default temperature when not explicitly set.
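Putting the two rules and the DeepSeek special case together, the conditional logic can be sketched as follows (a simplified sketch based on the diff above; the value of `DEEP_SEEK_DEFAULT_TEMPERATURE` here is an assumption):

```typescript
const DEEP_SEEK_DEFAULT_TEMPERATURE = 0.6 // assumed value, for illustration only

interface HandlerOptions {
	modelTemperature?: number
}

// Sketch of the post-fix logic: include `temperature` only when explicitly
// set, fall back to the DeepSeek Reasoner default for that model family,
// and otherwise omit the key entirely so the backend's default applies.
function buildRequestOptions(options: HandlerOptions, deepseekReasoner: boolean) {
	const requestOptions: { temperature?: number } = {}
	if (options.modelTemperature !== undefined) {
		requestOptions.temperature = options.modelTemperature
	} else if (deepseekReasoner) {
		requestOptions.temperature = DEEP_SEEK_DEFAULT_TEMPERATURE
	}
	return requestOptions
}
```

With this shape, `buildRequestOptions({}, false)` returns an object with no `temperature` key at all, which is what lets LiteLLM/vLLM apply their own configured defaults.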
Changes
- `src/api/providers/openai.ts`: updated to conditionally include `temperature`
- `src/api/providers/base-openai-compatible-provider.ts`: updated to conditionally include `temperature`

Testing
Fixes #7187
Important
Fixes temperature parameter handling in OpenAI Compatible providers to only include it when explicitly set, allowing backend defaults to be used.
- `OpenAiHandler` and `BaseOpenAiCompatibleProvider` now only include `temperature` in requests if `modelTemperature` is explicitly set.
- Updates `src/api/providers/openai.ts` and `src/api/providers/base-openai-compatible-provider.ts` to conditionally include `temperature`.
- Tests verify `temperature` is omitted when undefined and included when set, in `openai.spec.ts`, `groq.spec.ts`, and other test files.