Labels
enhancement (New feature or request)
Description
Is your feature request related to a problem? Please describe.
Right now, the system makes two separate LLM calls to generate the user’s about description and the SEO content. Both prompts use exactly the same input data; only the output structure differs. This leads to unnecessary token consumption, higher costs, and poor scaling.
Describe the solution you'd like
Combine both prompts into a single LLM call and use a unified structured output. This reduces token usage, avoids duplicate prompts, and keeps the workflow clean and scalable.
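A minimal sketch of what the combined call could look like, assuming an OpenAI-style chat client. The model name, prompt text, and function name are placeholders; the project’s actual client, prompt wording, and data shapes may differ.

```python
import json
from openai import OpenAI  # assumes an OpenAI-style SDK; swap in the project's actual client

client = OpenAI()

# One prompt that asks for both outputs in a single structured JSON response.
UNIFIED_PROMPT = (
    "Given the user profile data, return a JSON object with two keys: "
    "'about_description' (the user's about text) and 'seo' (an object with "
    "'title', 'meta_description', and 'keywords')."
)

def generate_about_and_seo(profile_data: dict) -> dict:
    """Single LLM call replacing the two separate about/SEO calls."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": UNIFIED_PROMPT},
            {"role": "user", "content": json.dumps(profile_data)},
        ],
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)
```

Since the input data is identical for both outputs, the only real change is the prompt and the response parsing; callers that previously consumed the two separate results can read them from the two keys of the single returned object.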
Describe alternatives you've considered
- Using a single LLM request
- Reducing token usage at scale
- Removing duplicate or repeated prompt logic
- Defining a more structured, future-proof prompt format (a possible schema is sketched below)
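As a sketch of the last alternative, the unified output could be validated against an explicit schema so both pieces of content are checked in one place. The field names below are illustrative, not the project’s actual ones; Pydantic is an assumption.

```python
from pydantic import BaseModel

class SeoContent(BaseModel):
    title: str
    meta_description: str
    keywords: list[str]

# Unified response schema: one object covering both the about text and the SEO content.
class UnifiedProfileContent(BaseModel):
    about_description: str
    seo: SeoContent

# Usage: validate the parsed JSON from the single LLM call once.
# content = UnifiedProfileContent.model_validate(generate_about_and_seo(profile_data))
```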