more misc builder improvements #2592
base: main
Conversation
Walkthrough

This pull request adds Wave AI platform integration features across the frontend and backend. Changes include: adding a scrollToBottom capability to the AI panel input, implementing a "Send Output to AI" button in the builder that transmits filtered build output to the AI model, and gating configuration data fetching based on the active tab. On the backend, the changes extend the chat system to capture and propagate platform metadata (PlatformInfo) throughout the AI message conversion pipeline for both Anthropic and OpenAI providers. Additionally, the app initialization flow in the builder controller is reordered to start the app immediately after a successful build rather than after output emission. The BuilderAppGenerator callback signature is updated to return platform information as a third value.

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings, 1 inconclusive)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 0
🧹 Nitpick comments (1)
frontend/builder/builder-buildpanel.tsx (1)
85-102: Consider extracting the filtering logic to reduce duplication.

The line filtering logic appears twice:

- Lines 88-90 in `handleSendToAI`
- Lines 100-102 for `filteredLines`

While both are functional, you could reduce duplication by either:

- Extracting a `getFilteredLines(lines: string[], includeDebug: boolean)` helper
- Reusing the memoized `filteredLines` in `handleSendToAI` (though you'd need to slice it)

That said, the current duplication is minimal and the code is clear, so this is optional.
Example refactor:

```diff
+const getFilteredLines = (lines: string[], includeDebug: boolean) => {
+    return includeDebug
+        ? lines
+        : lines.filter((line) => !line.startsWith("[debug]") && line.trim().length > 0);
+};
+
 const handleSendToAI = useCallback(() => {
     const currentShowDebug = globalStore.get(model.showDebug);
     const currentOutputLines = globalStore.get(model.outputLines);
-    const filtered = currentShowDebug
-        ? currentOutputLines
-        : currentOutputLines.filter((line) => !line.startsWith("[debug]") && line.trim().length > 0);
+    const filtered = getFilteredLines(currentOutputLines, currentShowDebug);
     const linesToSend = filtered.slice(-200);
     // ... rest of the function
 }, [model]);

-const filteredLines = showDebug
-    ? outputLines
-    : outputLines.filter((line) => !line.startsWith("[debug]") && line.trim().length > 0);
+const filteredLines = getFilteredLines(outputLines, showDebug);
```
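For clarity, here is the suggested helper as a standalone, runnable sketch. The name and signature are the reviewer's suggestion, not code that exists in the PR:

```typescript
// Standalone sketch of the proposed helper (name/signature are a suggestion).
const getFilteredLines = (lines: string[], includeDebug: boolean): string[] => {
    return includeDebug
        ? lines
        : lines.filter((line) => !line.startsWith("[debug]") && line.trim().length > 0);
};

// Debug lines and blank lines are dropped unless includeDebug is set.
const sample = ["[debug] compiling", "build ok", "   ", "server listening"];
console.log(getFilteredLines(sample, false)); // ["build ok", "server listening"]
console.log(getFilteredLines(sample, true).length); // 4
```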
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (10)
- frontend/app/aipanel/aipanelinput.tsx (2 hunks)
- frontend/app/aipanel/waveai-model.tsx (2 hunks)
- frontend/builder/builder-buildpanel.tsx (2 hunks)
- frontend/builder/tabs/builder-configdatatab.tsx (2 hunks)
- frontend/types/gotypes.d.ts (1 hunks)
- pkg/aiusechat/anthropic/anthropic-convertmessage.go (1 hunks)
- pkg/aiusechat/openai/openai-convertmessage.go (1 hunks)
- pkg/aiusechat/uctypes/usechat-types.go (2 hunks)
- pkg/aiusechat/usechat.go (5 hunks)
- pkg/buildercontroller/buildercontroller.go (1 hunks)
🧰 Additional context used
🧠 Learnings (3)
📚 Learning: 2025-11-01T00:57:23.025Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2504
File: frontend/app/aipanel/aipanel-contextmenu.ts:15-16
Timestamp: 2025-11-01T00:57:23.025Z
Learning: In the waveterm codebase, types defined in custom.d.ts are globally available and do not require explicit imports. Backend types defined in gotypes.d.ts are also globally available.
Applied to files:
frontend/types/gotypes.d.ts
📚 Learning: 2025-10-14T06:30:54.763Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2430
File: frontend/app/aipanel/aimessage.tsx:137-144
Timestamp: 2025-10-14T06:30:54.763Z
Learning: In `frontend/app/aipanel/aimessage.tsx`, the `AIToolUseGroup` component splits file operation tool calls into separate batches (`fileOpsNeedApproval` and `fileOpsNoApproval`) based on their approval state before passing them to `AIToolUseBatch`. This ensures each batch has homogeneous approval states, making group-level approval handling valid.
Applied to files:
frontend/builder/builder-buildpanel.tsx
📚 Learning: 2025-10-15T03:21:02.229Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2433
File: pkg/aiusechat/tools_readfile.go:197-197
Timestamp: 2025-10-15T03:21:02.229Z
Learning: In Wave Terminal's AI tool definitions (pkg/aiusechat/tools_*.go), the Description field should not mention approval requirements even when ToolApproval returns ApprovalNeedsApproval. This prevents the LLM from asking users for approval before calling the tool, avoiding redundant double-approval prompts since the runtime will enforce approval anyway.
Applied to files:
pkg/aiusechat/usechat.go
🧬 Code graph analysis (2)
frontend/builder/builder-buildpanel.tsx (1)
frontend/app/aipanel/waveai-model.tsx (1)
WaveAIModel (43-582)
pkg/aiusechat/usechat.go (1)
pkg/wavebase/wavebase.go (1)
GetSystemSummary (376-383)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: Build for TestDriver.ai
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: Analyze (go)
🔇 Additional comments (11)
frontend/types/gotypes.d.ts (1)
1223-1224: LGTM! Clean telemetry field additions.

The new optional fields for Wave AI chat tracking are properly typed and follow the existing naming conventions.
frontend/builder/tabs/builder-configdatatab.tsx (1)
72-72: LGTM! Smart optimization to gate data fetching.

The change to only fetch config/data when the "configdata" tab is active prevents unnecessary API calls and improves performance. The dependency array is correctly updated to include `activeTab`.

Also applies to: 148-158
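The gating pattern itself is simple to state in isolation. The sketch below is not the component code; `makeGatedFetcher` and the tab name are illustrative stand-ins for the decision the component makes inside its `useEffect` (whose dependency array includes `activeTab`):

```typescript
// Hedged sketch of the gating pattern: only trigger the fetch
// when the tab in question is actually the active one.
type Fetcher = () => Promise<void>;

function makeGatedFetcher(fetchConfigData: Fetcher, gateTab: string = "configdata") {
    return async (activeTab: string): Promise<boolean> => {
        if (activeTab !== gateTab) {
            return false; // tab not visible: skip the API call entirely
        }
        await fetchConfigData();
        return true;
    };
}
```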
pkg/buildercontroller/buildercontroller.go (1)
264-275: Clarify the purpose of the 1-second delay.

The reordering to start the app immediately after a successful build is good, but the 1-second sleep at line 275 lacks explanation.
Could you document why this delay is necessary? Is it to:
- Allow the app process to stabilize?
- Ensure the port is fully bound?
- Give time for initial output to be captured?
Consider adding a comment explaining the rationale, especially since this is a critical part of the build flow.
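If the sleep exists to wait for something observable (e.g. the port binding), one hedged alternative is a bounded readiness poll rather than a fixed delay. The controller itself is Go; this is an illustrative TypeScript sketch, with `check` standing in for whatever readiness probe applies:

```typescript
// Bounded readiness poll: resolves true as soon as check() succeeds,
// false once timeoutMs elapses. Avoids always paying a fixed 1s sleep.
async function waitUntilReady(
    check: () => Promise<boolean>,
    timeoutMs = 5000,
    intervalMs = 100
): Promise<boolean> {
    const deadline = Date.now() + timeoutMs;
    while (Date.now() < deadline) {
        if (await check()) {
            return true;
        }
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
    return false;
}
```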
frontend/app/aipanel/aipanelinput.tsx (1)
20-20: LGTM! Clean scrollToBottom implementation.

The new method is straightforward and correctly scrolls the textarea to its bottom. Good addition to support the programmatic scrolling feature.
Also applies to: 47-52
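The core of such a method reduces to setting `scrollTop` to `scrollHeight`. A minimal sketch, using a stand-in interface (ours, not the PR's) so the logic is checkable without a DOM:

```typescript
// Minimal sketch: scroll a textarea-like element to its bottom.
// Scrollable is an illustrative stand-in for the DOM element type.
interface Scrollable {
    scrollTop: number;
    scrollHeight: number;
}

function scrollToBottom(el: Scrollable | null): void {
    if (el) {
        el.scrollTop = el.scrollHeight;
    }
}
```

On a real textarea, assigning `scrollHeight` to `scrollTop` clamps to the maximum scroll offset, which is exactly the bottom.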
pkg/aiusechat/openai/openai-convertmessage.go (1)
221-223: LGTM! Consistent PlatformInfo injection.

The PlatformInfo block injection follows the same pattern as existing TabState and AppStaticFiles handling. The ordering and formatting are consistent.
frontend/app/aipanel/waveai-model.tsx (1)
304-304: LGTM! Well-designed optional scrolling behavior.

The optional `scrollToBottom` parameter is a clean extension of the `appendText` API. The 10ms delay ensures the DOM updates before scrolling, which is a sensible approach.

Also applies to: 321-323
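The shape of that API can be sketched as follows; the function and callback names here are illustrative assumptions, not the actual `waveai-model.tsx` code, with `onAppended` standing in for the model's scroll trigger:

```typescript
// Sketch: append text, then optionally schedule a follow-up action after
// a short delay so the DOM has a chance to re-render with the new content.
function appendText(current: string, text: string, onAppended?: () => void): string {
    const next = current + text;
    if (onAppended) {
        setTimeout(onAppended, 10); // the 10ms delay noted in the review
    }
    return next;
}
```

The key design point is that the scroll is opt-in per call site, so existing callers of `appendText` are unaffected.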
frontend/builder/builder-buildpanel.tsx (1)
118-123: LGTM! Clean integration of "Send Output to AI" feature.

The button properly integrates with the WaveAIModel and leverages the new scrollToBottom capability. The feature will help users quickly send build output to AI for debugging.
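The transmission side amounts to taking the tail of the filtered output, per the `slice(-200)` in the handler quoted earlier. A hedged sketch (`buildOutputMessage` is an illustrative name, not a function in the PR):

```typescript
// Illustrative sketch: cap the payload at the last maxLines filtered lines
// before handing it to the AI panel, keeping the most recent output.
function buildOutputMessage(lines: string[], maxLines = 200): string {
    return lines.slice(-maxLines).join("\n");
}
```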
pkg/aiusechat/uctypes/usechat-types.go (1)
452-452: All BuilderAppGenerator implementations have been correctly updated for the new signature.

Verification confirms that the implementation at usechat.go:694-696 uses the correct signature `func() (string, string, string, error)`, and the usage at usechat.go:415 properly unpacks all four return values: `appGoFile, appStaticFiles, platformInfo, appErr`. The helper function `generateBuilderAppData` also returns `(string, string, string, error)`, matching the required signature. Only one assignment exists in the codebase, and it is correctly implemented.

pkg/aiusechat/anthropic/anthropic-convertmessage.go (1)
75-104: LGTM! Clean multi-block injection pattern.

The refactoring from a single AppGoFile block to individual blocks for PlatformInfo, AppStaticFiles, and AppGoFile is well-implemented. Each field is checked independently and only appended when non-empty, following the same pattern as the TabState injection above.
pkg/aiusechat/usechat.go (2)
414-421: LGTM! Generator signature correctly updated.

The BuilderAppGenerator call now captures and assigns the new platformInfo return value alongside the existing appGoFile and appStaticFiles.
865-872: Verify that including the local username in platform info sent to AI is intentional and properly governed.

The implementation unconditionally collects and sends the current user's username to AI services (Anthropic, OpenAI) as part of the `<PlatformInfo>` context block. This is clearly intentional based on the code structure, but it is:
- Undocumented: No comments explain why username collection is necessary
- Non-configurable: No flags, settings, or opt-out mechanism exists
- Unannounced: No user-facing documentation indicates their username is transmitted to external AI services
Please confirm with your product and privacy teams:
- Is this username transmission explicitly approved and intentional?
- Should this be documented for users or configurable/opt-in?
- Is this appropriate for the builder mode context, or should it be gated behind user consent?
No description provided.