
Conversation


@roomote roomote bot commented Aug 18, 2025

Summary

This PR fixes an issue where the temperature parameter was always being sent with a value of 0.0 when "Use custom temperature" was unchecked in the OpenAI Compatible provider settings. This prevented backend services (like LiteLLM/vLLM) from using their configured default temperatures.

Problem

When configuring RooCode with an OpenAI Compatible Provider pointing at a local LLM (LiteLLM → vLLM stack):

  • If "Use custom temperature" was unchecked, all requests were sent with temperature: 0.0 (greedy decoding)
  • This overrode any backend-configured defaults (e.g., LiteLLM's configured temperature)
  • Users expected that unchecking the option would allow the backend to use its own defaults

Solution

Modified the temperature handling in two key places:

  1. OpenAiHandler - Only includes temperature in the request when modelTemperature is explicitly defined
  2. BaseOpenAiCompatibleProvider - Only includes temperature when modelTemperature is explicitly defined

Special case: DeepSeek Reasoner models still get their specific default temperature when not explicitly set. A sketch of the combined logic follows.
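A minimal sketch of the intended behavior, under stated assumptions: the helper name buildRequestOptions and the simplified option types are illustrative rather than the actual source, and the DeepSeek default value shown is assumed for this example only.

    // Illustrative sketch only — not the actual provider code.
    const DEEP_SEEK_DEFAULT_TEMPERATURE = 0.6 // assumed value for illustration

    interface ProviderOptions {
        modelTemperature?: number
    }

    function buildRequestOptions(options: ProviderOptions, deepseekReasoner: boolean): { temperature?: number } {
        const requestOptions: { temperature?: number } = {}
        if (options.modelTemperature !== undefined) {
            // Explicitly set (including 0): pass it through to the request
            requestOptions.temperature = options.modelTemperature
        } else if (deepseekReasoner) {
            // DeepSeek Reasoner keeps its specific default even when unset
            requestOptions.temperature = DEEP_SEEK_DEFAULT_TEMPERATURE
        }
        // Otherwise the temperature key is omitted entirely,
        // letting the backend (e.g. LiteLLM/vLLM) apply its own default.
        return requestOptions
    }

    // e.g. buildRequestOptions({}, false) -> {} (no temperature key sent)
    // e.g. buildRequestOptions({ modelTemperature: 0 }, false) -> { temperature: 0 }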

Changes

  • Modified src/api/providers/openai.ts to conditionally include temperature
  • Modified src/api/providers/base-openai-compatible-provider.ts to conditionally include temperature
  • Added comprehensive tests to verify the new behavior
  • Updated existing tests that were expecting temperature to always be present

Testing

  • ✅ Added tests to verify temperature is omitted when undefined
  • ✅ Added tests to verify temperature is included when explicitly set (including 0); a sketch of both checks follows this list
  • ✅ Updated existing tests to work with the new behavior
  • ✅ All tests passing
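For illustration, a hedged, vitest-style sketch of what the omitted-vs-included checks might look like. The local buildRequestOptions stand-in is simplified for this example and is not taken from openai.spec.ts or groq.spec.ts.

    import { describe, it, expect } from "vitest"

    // Simplified local stand-in for the provider logic under test
    function buildRequestOptions(options: { modelTemperature?: number }): { temperature?: number } {
        return options.modelTemperature !== undefined ? { temperature: options.modelTemperature } : {}
    }

    describe("temperature handling", () => {
        it("omits temperature when modelTemperature is undefined", () => {
            expect(buildRequestOptions({})).not.toHaveProperty("temperature")
        })

        it("includes temperature when explicitly set, including 0", () => {
            expect(buildRequestOptions({ modelTemperature: 0 }).temperature).toBe(0)
        })
    })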

Fixes #7187


Important

Fixes temperature parameter handling in OpenAI Compatible providers to only include it when explicitly set, allowing backend defaults to be used.

  • Behavior:
    • OpenAiHandler and BaseOpenAiCompatibleProvider now only include temperature in requests if modelTemperature is explicitly set.
    • Special case for DeepSeek Reasoner models to use a specific default temperature when not set.
  • Code Changes:
    • Modified src/api/providers/openai.ts and src/api/providers/base-openai-compatible-provider.ts to conditionally include temperature.
  • Testing:
    • Added tests to verify temperature is omitted when undefined and included when set in openai.spec.ts, groq.spec.ts, and other test files.
    • Updated existing tests to align with new behavior.
    • All tests passing.

This description was created by Ellipsis for 0b3c58b.

@roomote roomote bot requested review from mrubens, cte and jr as code owners August 18, 2025 15:23
@dosubot dosubot bot added the size:L and bug labels Aug 18, 2025

@roomote roomote bot left a comment


I reviewed my own code and found it surprisingly tolerable. Almost like I knew what I was doing.

const handlerWithModel = new GroqHandler({ apiModelId: modelId, groqApiKey: "test-groq-api-key" })
const handlerWithModel = new GroqHandler({
    apiModelId: modelId,
    groqApiKey: "test-groq-api-key",


Is this intentional? The temperature is set inline here (line 119) while in other tests it's part of the options object. Could we standardize this for better consistency?
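For instance, the standardized form might look like this (a hedged example: it assumes modelTemperature is accepted in the handler's options object, and 0.5 is an arbitrary value chosen for illustration):

    const handlerWithModel = new GroqHandler({
        apiModelId: modelId,
        groqApiKey: "test-groq-api-key",
        modelTemperature: 0.5, // hypothetical: set via options rather than inline
    })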

    requestOptions.temperature = this.options.modelTemperature
} else if (deepseekReasoner) {
    // DeepSeek Reasoner has a specific default temperature
    requestOptions.temperature = DEEP_SEEK_DEFAULT_TEMPERATURE


The special handling for DeepSeek Reasoner models is well-implemented. Could we add a comment explaining why this model requires a specific default temperature? This would help future maintainers understand the reasoning behind this special case.

Collaborator


I wonder if we don't need to send this anymore since we are letting the provider set it

@hannesrudolph hannesrudolph added the Issue/PR - Triage label Aug 18, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Aug 19, 2025
@hannesrudolph hannesrudolph added the PR - Needs Preliminary Review label and removed the Issue/PR - Triage label Aug 19, 2025
…mpatible providers

- Modified OpenAiHandler to only include temperature when modelTemperature is defined
- Modified BaseOpenAiCompatibleProvider to only include temperature when modelTemperature is defined
- Added tests to verify temperature is omitted when undefined
- Updated existing tests to explicitly set temperature where needed

This allows backend services (LiteLLM, vLLM) to use their configured default temperatures
instead of being forced to use temperature=0 when "Use custom temperature" is unchecked.

Fixes #7187
@daniel-lxs daniel-lxs force-pushed the fix/openai-compatible-temperature-7187 branch from 82062c2 to a80ca11 on August 19, 2025 at 23:55
- Remove temperature parameter expectations from provider tests
- Tests now expect temperature to be omitted when not explicitly set
- Aligns with PR #7188 changes to fix OpenAI Compatible provider behavior
@daniel-lxs daniel-lxs moved this from PR [Needs Prelim Review] to PR [Needs Review] in Roo Code Roadmap Aug 20, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Aug 20, 2025
@mrubens mrubens merged commit 090737c into main Aug 21, 2025
16 checks passed
@mrubens mrubens deleted the fix/openai-compatible-temperature-7187 branch August 21, 2025 09:14
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Aug 21, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Aug 21, 2025
daniel-lxs added a commit that referenced this pull request Sep 2, 2025
…penAI Compatible providers (#7188)"

This reverts commit 090737c.
daniel-lxs added a commit that referenced this pull request Sep 2, 2025
The requestOptions parameter was added after PR #7188, so tests need to be updated to not expect it after the revert
Successfully merging this pull request may close these issues: Temperature setting not applied correctly when using OpenAI Compatible Provider with LocalLLM URL.