
Commit 20f96cb

adding message to response body
1 parent cb51ef7 commit 20f96cb

File tree

3 files changed: +7 -39 lines changed


backend/app/api/docs/llm/get_llm_call.md

Lines changed: 0 additions & 34 deletions
@@ -2,43 +2,9 @@ Retrieve the status and results of an LLM call job by job ID.
 
 This endpoint allows you to poll for the status and results of an asynchronous LLM call job that was previously initiated via the POST `/llm/call` endpoint.
 
-### Path Parameters
-
-**`job_id`** (required, UUID) - The unique identifier of the job returned when the LLM call was created.
-
-### Response
-
-The endpoint returns an `LLMJobPublic` object containing:
-
-- **`job_id`** (UUID) - The unique identifier of the job
-- **`status`** (string) - Current status of the job. Possible values:
-  - `PENDING` - Job has been created and is waiting to be processed
-  - `PROCESSING` - Job is currently being processed
-  - `SUCCESS` - Job completed successfully
-  - `FAILED` - Job failed during processing
-- **`llm_response`** (object | null) - The complete LLM response when status is `SUCCESS`, containing:
-  - `response` - Normalized LLM response with provider_response_id, conversation_id, provider, model, and output
-  - `usage` - Token usage information (input_tokens, output_tokens, total_tokens)
-- **`error_message`** (string | null) - Error details if the job failed, otherwise null
-- **`job_inserted_at`** (datetime) - Timestamp when the job was created
-- **`job_updated_at`** (datetime) - Timestamp when the job was last updated
-
-### Usage
-
-1. Create an LLM call using POST `/llm/call` to receive a `job_id`
-2. Use this endpoint to poll for the job status
-3. When the status is `SUCCESS`, the `llm_response` field will contain the complete LLM response
-4. When the status is `FAILED`, check the `error_message` field for details
-
-### Polling Strategy
-
-- Poll this endpoint periodically until `status` is either `SUCCESS` or `FAILED`
-- Use exponential backoff (e.g., 1s, 2s, 4s, 8s) to reduce server load
-- Stop polling when status is terminal (`SUCCESS` or `FAILED`)
 
 ### Notes
 
 - This endpoint returns both the job status AND the actual LLM response when complete
 - LLM responses are also delivered asynchronously via the callback URL (if provided)
 - Jobs can be queried at any time after creation
-- The endpoint returns a 404 error if the job_id does not exist
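The polling strategy deleted from the doc above can be sketched as a small client loop. This is a hypothetical helper, not part of this commit; `fetch_status` stands in for whatever HTTP call retrieves GET `/llm/call/{job_id}` and returns the parsed job dict:

```python
import time

TERMINAL_STATUSES = {"SUCCESS", "FAILED"}


def poll_llm_job(fetch_status, base_delay=1.0, max_attempts=8):
    """Poll an LLM job until it reaches a terminal status.

    `fetch_status` is any zero-argument callable returning the job dict;
    it is injected here so the loop stays independent of the HTTP client.
    """
    delay = base_delay
    for _ in range(max_attempts):
        job = fetch_status()
        if job["status"] in TERMINAL_STATUSES:
            # llm_response is populated on SUCCESS, error_message on FAILED
            return job
        time.sleep(delay)
        delay *= 2  # exponential backoff: 1s, 2s, 4s, 8s, ...
    raise TimeoutError("job did not reach a terminal status")
```

Injecting the fetch callable also makes the backoff loop trivial to unit-test with a canned sequence of statuses.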

backend/app/api/routes/llm.py

Lines changed: 6 additions & 4 deletions
@@ -75,7 +75,10 @@ def llm_call(
     if not job:
         raise HTTPException(status_code=404, detail="Job not found")
 
-    message = "Your response is being generated and will be delivered via callback."
+    if request.callback_url:
+        message = "Your response is being generated and will be delivered via callback."
+    else:
+        message = "Your response is being generated"
 
     job_response = LLMJobImmediatePublic(
         job_id=job.id,
@@ -85,8 +88,6 @@ def llm_call(
         job_updated_at=job.updated_at,
     )
 
-    # message = "Your response is being generated and will be delivered via callback." if request.callback_url else "Your response is being generated. Use the job_id to poll for results."
-
     return APIResponse.success_response(data=job_response)
 
 
@@ -116,7 +117,8 @@ def get_llm_call_status(
     llm_calls = get_llm_calls_by_job_id(session=session, job_id=job_id)
 
     if llm_calls:
-        # Get the first (latest) LLM call
+        # Get the first LLM call from the list which will be the only call for the job id
+        # since we initially won't be using this endpoint for llm chains
         llm_call = llm_calls[0]
 
         llm_response = LLMResponse(
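The branch added to `llm_call` above can be isolated as a pure function for illustration (`build_status_message` is a hypothetical helper, not in the diff; the two message strings are taken verbatim from it):

```python
def build_status_message(callback_url):
    """Pick the immediate-response message based on delivery mode.

    Mirrors the branch added in this commit: with a callback_url the
    result will be delivered via callback; otherwise the caller is
    expected to poll for results using the returned job_id.
    """
    if callback_url:
        return "Your response is being generated and will be delivered via callback."
    return "Your response is being generated"
```

Keeping the message choice separate from the job-creation logic would let the two strings be tested without constructing a job.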

backend/app/models/llm/response.py

Lines changed: 1 addition & 1 deletion
@@ -111,12 +111,12 @@ class LLMJobBasePublic(SQLModel):
 
     job_id: UUID
     status: str  # JobStatus from job.py
-    message: str
 
 
 class LLMJobImmediatePublic(LLMJobBasePublic):
     """Immediate response after creating an LLM job."""
 
+    message: str
     job_inserted_at: datetime
     job_updated_at: datetime
 
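The effect of moving `message` down the hierarchy can be sketched with plain dataclasses standing in for SQLModel (the dataclass form is an assumption for illustration; field names come from the diff):

```python
from dataclasses import dataclass, fields
from datetime import datetime
from uuid import UUID


@dataclass
class LLMJobBasePublic:
    # After this commit the base shape no longer carries a message,
    # so status-polling responses built on it stay message-free.
    job_id: UUID
    status: str  # JobStatus from job.py


@dataclass
class LLMJobImmediatePublic(LLMJobBasePublic):
    """Immediate response after creating an LLM job."""

    message: str  # only the creation-time response explains what to do next
    job_inserted_at: datetime
    job_updated_at: datetime
```

Pushing `message` into the subclass means only the immediate POST `/llm/call` response advertises the delivery mode, matching the new conditional in `llm_call`.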