Cost: Reduce LLM Token Usage in Log/Event Analysis #739

Open
Tracked by #463
elliotxx opened this issue Jan 16, 2025 · 0 comments
Labels: area/ai (AI-related features), help wanted (Extra attention is needed), medium (Requires a moderate level of project knowledge and skills, but does not require deep core technical knowledge), priority/important-longterm (P2: Important over the long term, but may not be staffed and/or may need multiple releases to complete)
Milestone: v0.7.0

Comments

elliotxx (Collaborator) commented Jan 16, 2025

What would you like to be added?

Implement frontend pre-processing for logs and events to extract key information before sending it to the LLM API, reducing token consumption and improving cost efficiency.

Why is this needed?

Currently, the log and event aggregators send their full content to the LLM APIs (even with length limits in place), which causes unnecessary token consumption.

This improvement covers:

  1. Frontend Pre-processing:
  • Extract key information from logs/events
  • Filter out redundant data
  • Identify critical patterns
  2. Implementation Details:
  • Add frontend utility functions for data extraction (see the sketches below)
  • Implement pattern recognition for common log formats
  • Create configurable filtering rules
  3. Expected Benefits:
  • Reduced token consumption
  • Lower API costs
  • Faster response times
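
As a rough illustration of items 1 and 2, here is a minimal TypeScript sketch of a rule-based log pre-processor, assuming the frontend can run a regex pass over raw log text before calling the LLM API. The names (`preprocessLogs`, `FilterRule`, `ExtractedEntry`) and the default patterns are hypothetical, not existing project APIs:

```typescript
// Hypothetical sketch of a frontend log pre-processing utility.
// All names and default patterns are illustrative, not existing project APIs.

interface FilterRule {
  pattern: RegExp; // lines matching this pattern are kept
  label?: string;  // optional tag attached to the extracted entry
}

interface ExtractedEntry {
  line: string;
  count: number;   // how many times an identical (normalized) line appeared
  label?: string;
}

// Default rules: keep error/warning lines and common failure keywords.
const defaultRules: FilterRule[] = [
  { pattern: /\b(error|err|fatal|panic)\b/i, label: 'error' },
  { pattern: /\b(warn|warning)\b/i, label: 'warning' },
  { pattern: /\b(timeout|refused|denied|oomkilled|crashloopbackoff)\b/i, label: 'critical' },
];

// Normalize a line so repeated occurrences collapse into one entry:
// strip timestamps and long numeric IDs that differ between repetitions.
function normalize(line: string): string {
  return line
    .replace(/\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}(\.\d+)?Z?/g, '<ts>')
    .replace(/\b\d{3,}\b/g, '<n>')
    .trim();
}

export function preprocessLogs(
  raw: string,
  rules: FilterRule[] = defaultRules,
  maxLines = 200,
): ExtractedEntry[] {
  const seen = new Map<string, ExtractedEntry>();

  for (const line of raw.split('\n')) {
    const rule = rules.find((r) => r.pattern.test(line));
    if (!rule) continue; // drop lines that match no rule (redundant data)

    const key = normalize(line);
    const existing = seen.get(key);
    if (existing) {
      existing.count += 1; // deduplicate repeated messages, keeping a count
    } else {
      seen.set(key, { line: line.trim(), count: 1, label: rule.label });
    }
  }

  // Keep only the first maxLines entries to stay within a token budget.
  return [...seen.values()].slice(0, maxLines);
}
```

Deduplicating repeated lines into a single entry with a count is typically where most of the token savings come from, since failing workloads tend to emit the same message many times.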

This optimization will make our AI features more cost-effective while maintaining analysis quality.
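
The event aggregator could be handled the same way. A minimal sketch, assuming Kubernetes-style events with a simplified shape; `summarizeEvents` and its parameters are illustrative, not an existing API:

```typescript
// Hypothetical sketch for event pre-processing: aggregate Kubernetes-style
// events by reason and involved object so the LLM receives a compact summary
// instead of every raw event. The Event shape below is simplified.

interface Event {
  type: string;           // 'Normal' | 'Warning'
  reason: string;         // e.g. 'BackOff', 'FailedScheduling'
  involvedObject: string; // e.g. 'Pod/my-app-6f7d9'
  message: string;
  count?: number;
}

export function summarizeEvents(events: Event[], maxEntries = 50): string {
  const grouped = new Map<string, { event: Event; total: number }>();

  for (const e of events) {
    // Skip routine 'Normal' events; anomalies are what the analysis needs.
    if (e.type === 'Normal') continue;

    const key = `${e.reason}|${e.involvedObject}`;
    const increment = e.count ?? 1;
    const entry = grouped.get(key);
    if (entry) {
      entry.total += increment;
    } else {
      grouped.set(key, { event: e, total: increment });
    }
  }

  // One compact line per group, e.g.:
  // "Warning BackOff Pod/my-app-6f7d9 (x12): Back-off restarting failed container"
  return [...grouped.values()]
    .slice(0, maxEntries)
    .map(
      ({ event, total }) =>
        `${event.type} ${event.reason} ${event.involvedObject} (x${total}): ${event.message}`,
    )
    .join('\n');
}
```

A compact summary like this would then be sent to the LLM in place of the raw event list.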

@elliotxx elliotxx added the area/ai (AI-related features) and priority/important-soon (P1: Must be staffed and worked on either currently, or very soon, ideally in time for the next release) labels Jan 16, 2025
@elliotxx elliotxx added this to the v0.7.0 milestone Jan 16, 2025
@elliotxx elliotxx self-assigned this Jan 16, 2025
@github-actions github-actions bot mentioned this issue Jan 16, 2025
37 tasks
@elliotxx elliotxx added the priority/important-longterm (P2: Important over the long term, but may not be staffed and/or may need multiple releases to complete) and help wanted (Extra attention is needed) labels and removed the priority/important-soon (P1: Must be staffed and worked on either currently, or very soon, ideally in time for the next release) label Jan 16, 2025
@elliotxx elliotxx removed their assignment Jan 20, 2025
@elliotxx elliotxx added the medium (Requires a moderate level of project knowledge and skills, but does not require deep core technical knowledge) label Feb 2, 2025