Added dashboard and otel viewing commands #324
base: main
Conversation
public static async Task<string> GetTrace(string requestId)
{
    // Endpoint URL with the request ID
    string url = $"https://spanretriever1-webapp.whitehill-dc9ec001.westus.azurecontainerapps.io/getspan/{requestId}";
Eventually, how will this get retrieved? What will the actual endpoint be for the user to use?
The web service will be deployed as an Azure Container App, so the endpoint will be public; authentication will happen within the web service to determine whether the user can access the spans.
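For context, fetching a span from that endpoint directly could look like the sketch below. This is purely illustrative: the request id value is made up, and since the PR says authentication happens inside the web service, the auth mechanism shown in the comment is an assumption.

```shell
# Hypothetical sketch of hitting the getspan endpoint directly.
# REQUEST_ID is a made-up example value.
REQUEST_ID="0123456789abcdef"
BASE_URL="https://spanretriever1-webapp.whitehill-dc9ec001.westus.azurecontainerapps.io"
URL="$BASE_URL/getspan/$REQUEST_ID"
echo "$URL"
# The actual call; the auth header is an assumption, since the web
# service itself decides whether the caller can access the spans:
# curl -sSf -H "Authorization: Bearer $TOKEN" "$URL"
```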
var request = JsonConvert.DeserializeObject<ExportTraceServiceRequest>(data, settings);

var otlpEndpoint = "http://localhost:4317";
Will it always be port 4317?
Yes, the dashboard will always be listening on port 4317, given how it is currently launched:

docker run --rm -it -p 18888:18888 -p 4317:18889 -d --name aspire-dashboard mcr.microsoft.com/dotnet/aspire-dashboard:8.0.0
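If a configurable port ever becomes desirable, note that only the host side of the `-p` mapping would need to change; the container still listens on 18889 internally. A minimal sketch, where `OTLP_HOST_PORT` is a hypothetical option name:

```shell
# Sketch: parameterize the host-side OTLP port. 4317 is only the host
# mapping; the dashboard container listens on 18889 internally.
OTLP_HOST_PORT="${OTLP_HOST_PORT:-4317}"
CMD="docker run --rm -it -p 18888:18888 -p ${OTLP_HOST_PORT}:18889 -d --name aspire-dashboard mcr.microsoft.com/dotnet/aspire-dashboard:8.0.0"
echo "$CMD"
# eval "$CMD"   # uncomment to actually launch the container
```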
Is that the port that the docs/samples recommend? Should we offer it on another port via an option, or is this probably fine?
These files under the "proto/opentelemetry/proto" directory... are they copies from somewhere? Where from? Is there a more "not-make-a-copy" way to get them (like via some package manager or something)? The worry I have is that they'll get out of date.
These are copied from the OpenTelemetry repository.

I looked into automatically pulling the proto files, but there doesn't seem to be any clear way. I checked in with Peter, and he said that they should be backward compatible, so any recent version of the proto files should be good enough.
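Absent a package, one way to keep the copies from drifting would be a small sync script pinned to a tag of the opentelemetry-proto repository. A sketch under assumptions: the tag and file list below are illustrative, and the raw-URL layout mirrors that repo's `opentelemetry/proto/...` structure.

```shell
# Sketch: refresh vendored proto files from a pinned opentelemetry-proto
# tag instead of hand-copying them. Tag and file list are illustrative.
OTEL_PROTO_TAG="v1.1.0"
BASE="https://raw.githubusercontent.com/open-telemetry/opentelemetry-proto/${OTEL_PROTO_TAG}"
FILES="opentelemetry/proto/common/v1/common.proto
opentelemetry/proto/trace/v1/trace.proto"
for f in $FILES; do
  echo "would fetch $BASE/$f -> proto/$f"
  # mkdir -p "proto/$(dirname "$f")" && curl -sSf "$BASE/$f" -o "proto/$f"
done
```

Pinning to a tag keeps the refresh reproducible while still making "bump the proto version" a one-line change.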
I made various comments... Let me know if any of them are unclear.
try
{
    using (FileStream fs = new FileStream(filepath, mode))
Should use FileHelpers.AppendAllText or FileHelpers.WriteAllText (they handle other scenarios, like: if you pass in `-` it'll use STDOUT, or if you pass in `@blah` it'll put it in the `.ai/data` directory that it finds...).
Console.WriteLine(startResult);

// Wait before fetching logs if needed
System.Threading.Thread.Sleep(3000);
Is there a reason it's 3s instead of 2s or 4s? Is there a more deterministic way to determine that the code has waited long enough?
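One deterministic alternative would be to poll a readiness condition with a timeout instead of sleeping a fixed 3s. A generic sketch; the concrete check (e.g. probing the dashboard's HTTP port) is an assumption, not something the PR implements:

```shell
# Sketch: poll a check command until it succeeds or a deadline passes,
# instead of sleeping for a fixed number of seconds.
wait_for() {
  timeout_s="$1"; shift
  deadline=$(( $(date +%s) + timeout_s ))
  until "$@"; do
    [ "$(date +%s)" -ge "$deadline" ] && return 1
    sleep 0.2
  done
}
# e.g.: wait_for 10 curl -sf http://localhost:18888   # hypothetical check
```

The fixed sleep then becomes an upper bound (the timeout) rather than a wait every caller always pays.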
@@ -35,7 +35,7 @@ public static bool DispatchParseCommand(INamedValueTokens tokens, ICommandValues
var parsed = Program.DispatchParseCommand(tokens, values);
if (parsed || values.DisplayHelpRequested()) return parsed;
Intentional?
Added the following commands:

ai tool dashboard start
ai tool dashboard stop
^^^ starts or stops the dashboard (via docker commands)

ai chat assistant trace get --request-id XXXX
^^^ given a specific request id, show the OTEL payload

ai chat assistant trace get --request-id XXXX --output-file YYY
^^^ given a specific request id, show the OTEL payload and write it to a file

ai chat assistant trace get --request-id XXXX --dashboard
^^^ given a specific request id, show the OTEL payload and automatically add it to the dashboard
ai ... --output-request-id requestid.txt
ai ... --output-add-request-id requestid.txt
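Putting the pieces together, a session might look like the dry-run sketch below. The `ai` invocations are echoed via a stub rather than executed, and the chat flags and request id value are made-up examples:

```shell
# Dry-run sketch of the workflow; `ai` is stubbed to echo its arguments.
ai() { echo "ai $*"; }

ai tool dashboard start
ai chat --question "hello" --output-request-id requestid.txt  # chat flags illustrative
REQUEST_ID="example-request-id"   # in practice, read back from requestid.txt
ai chat assistant trace get --request-id "$REQUEST_ID" --dashboard
ai tool dashboard stop
```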