
【Feature】: add GLM-47 tool parser and support thinking/non-thinking mode toggle #151

Merged

xyDong0223 merged 1 commit into baidu:main from astrophel0:support-glm47-think-fc on Jan 30, 2026

Conversation

@astrophel0
Contributor

PR Description

1. Background

Currently, vLLM does not support Function Calling for the GLM-47 (GLM4 series) models. This PR introduces a dedicated tool parser for GLM-47 and adds a control mechanism for its hybrid "thinking" (reasoning) mode.

2. Changes

  • New Tool Parser: Added the glm47 tool parser. Users can enable it by passing the command-line argument:
    --tool-call-parser glm47
  • Thinking Mode Toggle: Added support for switching between thinking and non-thinking modes via chat_template_kwargs (see the request sketch after this list).
    • Control Parameter: "chat_template_kwargs": {"enable_thinking": true/false}.
    • Default Behavior: Thinking mode is enabled by default.
  • Bug Fix (Multi-turn Conversation):
    • Issue: The previous logic relied on string matching (checking if </think> existed in the prompt) to determine the reasoning state, which was unreliable in multi-turn dialogues.
    • Fix: Replaced the string-matching logic with explicit control via the enable_thinking parameter, ensuring consistent behavior across complex conversation histories.
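
As a minimal sketch of how these two knobs would be used together from a client, assuming the plugin keeps vLLM's OpenAI-compatible server and its chat_template_kwargs passthrough (the model id, URL, and any launch flags beyond --tool-call-parser glm47 are placeholders, not part of this PR):

```python
# Sketch: the server is assumed to be started with the new parser, e.g.
#   vllm serve <path-to-glm-4.7-model> --tool-call-parser glm47
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

resp = client.chat.completions.create(
    model="glm-4.7",  # placeholder model id
    messages=[{"role": "user", "content": "What is 17 * 23?"}],
    extra_body={
        # Disable the reasoning ("thinking") block for this request only;
        # omitting the key keeps the default (thinking enabled).
        "chat_template_kwargs": {"enable_thinking": False},
    },
)
print(resp.choices[0].message.content)
```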

3. Test Plan

  • Verified Function Calling performance with GLM-47 models using the --tool-call-parser glm47 flag (a sketch of such a check follows this list).
  • Tested multi-turn chat completions to ensure enable_thinking correctly toggles the model's reasoning output without state confusion.
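
A hedged sketch of the kind of function-calling check described above, again against the OpenAI-compatible endpoint; the tool schema and model id are illustrative placeholders:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# Placeholder tool definition used only to exercise the glm47 parser.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="glm-4.7",  # placeholder model id
    messages=[{"role": "user", "content": "What's the weather in Beijing?"}],
    tools=tools,
)

# If the glm47 parser extracts the call correctly, tool_calls is populated
# instead of raw tool-call tags being left in message.content.
print(resp.choices[0].message.tool_calls)
```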

Notes: This PR improves the robustness of GLM4 integration by moving away from prompt-parsing heuristics to explicit parameter control.

@liwei109
Collaborator

@kurkol Please review this PR.

@kurkol
Contributor

kurkol commented Jan 26, 2026

Thanks for your work on adapting the GLM-4.7 model. However, during the adaptation, please avoid overwriting entire vLLM source files. Instead, use localized replacements or override specific class methods to maximize the vllm-kunlun plugin’s compatibility with different vLLM versions. You can refer to other adaptation PRs for concrete examples.

@kurkol
Contributor

kurkol commented Jan 26, 2026

In particular, you can follow the approach used in this PR, which replaces the relevant class methods: #75
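
For illustration only, a generic sketch of the method-override pattern being suggested here (patching a single method on an upstream class from the plugin instead of copying the whole file); the module, class, and method names are hypothetical and are not the actual code in #75:

```python
# Hypothetical names throughout; the point is the pattern, not the exact hook.
import upstream_module  # stands in for the relevant vLLM module

_original_render_prompt = upstream_module.ChatRenderer.render_prompt

def _patched_render_prompt(self, request, *args, **kwargs):
    # Adjust only what the plugin needs, then delegate to the original method
    # so the rest of the upstream logic is untouched.
    tmpl_kwargs = getattr(request, "chat_template_kwargs", None) or {}
    kwargs.setdefault("enable_thinking", tmpl_kwargs.get("enable_thinking", True))
    return _original_render_prompt(self, request, *args, **kwargs)

# Applied once, at plugin import time.
upstream_module.ChatRenderer.render_prompt = _patched_render_prompt
```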

astrophel0 force-pushed the support-glm47-think-fc branch from 696e007 to db9bb28 on January 27, 2026 at 03:49
@astrophel0
Contributor Author

> In particular, you can follow the approach used in this PR, which replaces the relevant class methods: #75

To keep the PR focused, I have only modified two methods within this class to support toggling between thinking and non-thinking modes. This ensures minimal impact on the existing logic.

@kurkol
Contributor

kurkol commented Jan 30, 2026

> In particular, you can follow the approach used in this PR, which replaces the relevant class methods: #75

> To keep the PR focused, I have only modified two methods within this class to support toggling between thinking and non-thinking modes. This ensures minimal impact on the existing logic.

OK, then just fix the file naming — it should be glm, not gim.

Signed-off-by: zhangzhenyi <zhangzhenyi@baidu.com>
astrophel0 force-pushed the support-glm47-think-fc branch from db9bb28 to 6710fcf on January 30, 2026 at 07:15
@astrophel0
Contributor Author

> In particular, you can follow the approach used in this PR, which replaces the relevant class methods: #75

> To keep the PR focused, I have only modified two methods within this class to support toggling between thinking and non-thinking modes. This ensures minimal impact on the existing logic.

> OK, then just fix the file naming — it should be glm, not gim.

done

xyDong0223 merged commit 803f0e5 into baidu:main on Jan 30, 2026
1 of 2 checks passed
liwei109 pushed a commit that referenced this pull request Feb 1, 2026
Signed-off-by: zhangzhenyi <zhangzhenyi@baidu.com>
liwei109 added a commit that referenced this pull request Feb 1, 2026
Signed-off-by: zhangzhenyi <zhangzhenyi@baidu.com>
Co-authored-by: Li Wei <liwei.109@outlook.com>