
Added dashboard and otel viewing commands #324

Open · wants to merge 6 commits into main

Conversation

catherine-m-zhang commented Jul 26, 2024

Added the following commands:

ai tool dashboard start
ai tool dashboard stop
^^^ starts or stops the dashboard (via docker commands)

ai chat assistant trace get --request-id XXXX
^^^ given a specific request ID, shows the OTEL payload

ai chat assistant trace get --request-id XXXX --output-file YYY
^^^ given a specific request ID, shows the OTEL payload and writes it to the given file

ai chat assistant trace get --request-id XXXX --dashboard
^^^ given a specific request ID, shows the OTEL payload and automatically adds it to the dashboard

ai ... --output-request-id requestid.txt
ai ... --output-add-request-id requestid.txt

@catherine-m-zhang catherine-m-zhang requested a review from robch as a code owner July 26, 2024 16:10
ai-cli.sln (review thread, outdated, resolved)
src/ai/ai-cli.csproj (review thread, outdated, resolved)
public static async Task<string> GetTrace(string requestId)
{
// Endpoint URL with the request ID
string url = $"https://spanretriever1-webapp.whitehill-dc9ec001.westus.azurecontainerapps.io/getspan/{requestId}";
Collaborator:

Eventually, how will this get retrieved? What will the actual endpoint be for the user to use?

Author:

The web service will be deployed as an Azure Container App, so the endpoint will be public; authentication will happen within the web service to determine whether the user can access the spans.
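For illustration, a minimal sketch of what this retrieval could look like as a client call, assuming the base URL and /getspan/{requestId} route shown in the GetTrace snippet above. This is Python rather than the PR's C#, and the helper names (build_span_url, get_trace) are invented here, not part of the PR:

```python
import urllib.request

# Base URL taken verbatim from the PR's GetTrace snippet.
SPAN_ENDPOINT = "https://spanretriever1-webapp.whitehill-dc9ec001.westus.azurecontainerapps.io"

def build_span_url(request_id: str) -> str:
    """Return the /getspan URL for a given request ID (route from the snippet)."""
    return f"{SPAN_ENDPOINT}/getspan/{request_id}"

def get_trace(request_id: str, timeout: float = 10.0) -> str:
    """Fetch the OTEL payload for a request ID. Performs a real network call."""
    with urllib.request.urlopen(build_span_url(request_id), timeout=timeout) as resp:
        return resp.read().decode("utf-8")
```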


var request = JsonConvert.DeserializeObject<ExportTraceServiceRequest>(data, settings);

var otlpEndpoint = "http://localhost:4317";
Collaborator:

will it always be port 4317?

Author:

Yes, the dashboard will always be listening on port 4317, given how it is currently launched:
docker run --rm -it -p 18888:18888 -p 4317:18889 -d --name aspire-dashboard mcr.microsoft.com/dotnet/aspire-dashboard:8.0.0

Collaborator:

Is that the port that the docs/samples recommend? Should we offer it on another port via an option, or is this probably fine?

Collaborator:

These files under the "proto/opentelemetry/proto" directory... are they copies from somewhere? Where from? Is there a more "not-make-a-copy" way to get them (via some package manager or similar)? The worry I have is that they'll get out of date.

Author:

These are copied from the OpenTelemetry repository. I looked into automatically pulling the proto files, but there doesn't seem to be any clear way. I checked in with Peter, and he said they should be backward compatible, so any recent version of the proto files should be good enough.

@robch (Collaborator) left a comment:

I made various comments... Let me know if any of them are unclear.


try
{
using (FileStream fs = new FileStream(filepath, mode))
Collaborator:

Should use FileHelpers.AppendAllText or FileHelpers.WriteAllText (they handle other scenarios: if you pass in -, it'll use STDOUT; if you pass in @blah, it'll put it in the .ai/data directory that it finds...)

Console.WriteLine(startResult);

// Wait before fetching logs if needed
System.Threading.Thread.Sleep(3000);
Collaborator:

Is there a reason it's 3s instead of 2s or 4s? Is there a more deterministic way to determine that the code has waited long enough?
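One deterministic alternative to the fixed Thread.Sleep(3000) would be to poll until the dashboard's OTLP port (4317, per the docker command earlier in this thread) accepts TCP connections. A Python sketch of that idea; wait_for_port is a hypothetical helper, not code from the PR:

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Return True once host:port accepts a TCP connection, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.2)  # brief back-off before retrying
    return False
```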

@@ -35,7 +35,7 @@ public static bool DispatchParseCommand(INamedValueTokens tokens, ICommandValues

var parsed = Program.DispatchParseCommand(tokens, values);
if (parsed || values.DisplayHelpRequested()) return parsed;

Collaborator:

intentional?
