24 changes: 23 additions & 1 deletion genai-function-calling/README.md
@@ -78,5 +78,27 @@ http://localhost:5601/app/apm/traces?rangeFrom=now-15m&rangeTo=now

![Kibana screenshot](./kibana-trace.png)

## Prerequisites

Docker or Podman is required. You'll also need an OpenAI API-compatible
inference platform and an OpenTelemetry Collector.

First, you need to be in a directory that contains this repository. If you
haven't cloned it yet, you can get a copy like this:
```bash
curl -L https://github.com/elastic/observability-examples/archive/refs/heads/main.tar.gz | tar -xz
cd observability-examples-main/genai-function-calling/
```

### Podman

If you are using [Podman](https://podman.io/) to run Docker containers, export
`HOST_IP`. If you don't, you'll get this error when running the exercises:
> unable to upgrade to tcp, received 500

Here's how to export your `HOST_IP` (a complete run under Podman is sketched just below):
* On macOS: `export HOST_IP=$(ipconfig getifaddr en0)`
* On Ubuntu: `export HOST_IP=$(hostname -I | awk '{print $1}')`
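
With `HOST_IP` exported, bringing up an exercise looks roughly like this. A minimal
sketch, assuming `podman compose` is used as a drop-in for `docker compose` and
taking the `openai-agents` exercise as the example:

```bash
# macOS shown; use the Ubuntu command above on Linux.
export HOST_IP=$(ipconfig getifaddr en0)

# The compose files map the container's "localhost" to ${HOST_IP:-host-gateway},
# so the container can reach services on this machine once HOST_IP is set.
cd openai-agents
podman compose up --build
```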

---
[native]: https://opentelemetry.io/docs/languages/java/instrumentation/#native-instrumentation
[native]: https://opentelemetry.io/docs/languages/java/instrumentation/#native-instrumentation
8 changes: 1 addition & 7 deletions genai-function-calling/openai-agents/Dockerfile
@@ -1,12 +1,6 @@
# Use glibc-based image with pre-compiled wheels for psutil
FROM python:3.12-slim

# TODO: temporary until openai-agents 0.0.5
RUN apt-get update \
&& apt-get install -y --no-install-recommends git \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*

RUN --mount=type=cache,target=/root/.cache/pip python -m pip install --upgrade pip

COPY requirements.txt /tmp
@@ -15,4 +9,4 @@ RUN --mount=type=cache,target=/root/.cache/pip edot-bootstrap --action=install

COPY main.py /

CMD [ "python", "main.py" ]
CMD [ "opentelemetry-instrument", "python", "main.py" ]
2 changes: 1 addition & 1 deletion genai-function-calling/openai-agents/docker-compose.yml
@@ -6,4 +6,4 @@ services:
env_file:
- .env
extra_hosts: # send localhost traffic to the docker host, e.g. your laptop
- "localhost:host-gateway"
- "localhost:${HOST_IP:-host-gateway}"
7 changes: 7 additions & 0 deletions genai-function-calling/openai-agents/env.example
@@ -8,6 +8,13 @@ OPENAI_API_KEY=
# # leave it out, you need to update this to qwen2.5:3b for the tool call to proceed.
# CHAT_MODEL=qwen2.5:0.5b

# Uncomment to use RamaLama instead of OpenAI
# OPENAI_BASE_URL=http://localhost:8080/v1
# OPENAI_API_KEY=unused
# # This works when you supply a major_version parameter in your prompt. If you
# # leave it out, you need to update this to qwen2.5:3b for the tool call to proceed.
# CHAT_MODEL=qwen2.5:0.5b

# Uncomment and complete if you want to use Azure OpenAI Service
## "Azure OpenAI Endpoint" in https://oai.azure.com/resource/overview
# AZURE_OPENAI_ENDPOINT=https://YOUR_RESOURCE_NAME.openai.azure.com/
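The RamaLama block added above assumes an OpenAI-compatible server listening on
`localhost:8080`. One way to provide it, as a hedged sketch (the `ramalama serve`
defaults, including the port, are assumptions rather than anything this change documents):

```bash
# Pull and serve the small model referenced in env.example.
# Port 8080 is assumed to be RamaLama's default serving port.
ramalama serve qwen2.5:0.5b
```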
3 changes: 1 addition & 2 deletions genai-function-calling/openai-agents/requirements.txt
@@ -1,5 +1,4 @@
# TODO: temporary until openai-agents 0.0.5
openai-agents @ git+https://github.com/openai/openai-agents-python.git@main
openai-agents~=0.0.5
httpx~=0.28.1

elastic-opentelemetry~=0.8.0
2 changes: 1 addition & 1 deletion genai-function-calling/semantic-kernel-dotnet/Dockerfile
@@ -1,7 +1,7 @@
ARG DOTNET_VERSION=9.0

FROM mcr.microsoft.com/dotnet/sdk:${DOTNET_VERSION}-alpine AS edot
ARG EDOT_VERSION=1.0.0-beta.1
ARG EDOT_VERSION=1.0.0-beta.2
ARG EDOT_INSTALL=https://github.com/elastic/elastic-otel-dotnet/releases/download/${EDOT_VERSION}/elastic-dotnet-auto-install.sh
ENV OTEL_DOTNET_AUTO_HOME=/edot
WORKDIR /edot
6 changes: 3 additions & 3 deletions genai-function-calling/semantic-kernel-dotnet/app.csproj
@@ -11,9 +11,9 @@
</ItemGroup>

<ItemGroup>
<PackageReference Include="Microsoft.SemanticKernel" Version="1.40.1" />
<PackageReference Include="Microsoft.SemanticKernel.Agents.Core" Version="1.40.1-preview" />
<PackageReference Include="Microsoft.SemanticKernel.Connectors.OpenAI" Version="1.40.1" />
<PackageReference Include="Microsoft.SemanticKernel" Version="1.42.0" />
<PackageReference Include="Microsoft.SemanticKernel.Agents.Core" Version="1.42.0-preview" />
<PackageReference Include="Microsoft.SemanticKernel.Connectors.OpenAI" Version="1.42.0" />
</ItemGroup>

</Project>
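
For reference, the same Semantic Kernel version bump could be made from the CLI
instead of editing the project file by hand; a sketch, with versions taken from
the diff above:

```bash
# Run inside genai-function-calling/semantic-kernel-dotnet/
dotnet add app.csproj package Microsoft.SemanticKernel --version 1.42.0
dotnet add app.csproj package Microsoft.SemanticKernel.Agents.Core --version 1.42.0-preview
dotnet add app.csproj package Microsoft.SemanticKernel.Connectors.OpenAI --version 1.42.0
```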
@@ -6,4 +6,4 @@ services:
env_file:
- .env
extra_hosts: # send localhost traffic to the docker host, e.g. your laptop
- "localhost:host-gateway"
- "localhost:${HOST_IP:-host-gateway}"
7 changes: 7 additions & 0 deletions genai-function-calling/semantic-kernel-dotnet/env.example
@@ -8,6 +8,13 @@ OPENAI_API_KEY=
# # leave it out, you need to update this to qwen2.5:3b for the tool call to proceed.
# CHAT_MODEL=qwen2.5:0.5b

# Uncomment to use RamaLama instead of OpenAI
# OPENAI_BASE_URL=http://localhost:8080/v1
# OPENAI_API_KEY=unused
# # This works when you supply a major_version parameter in your prompt. If you
# # leave it out, you need to update this to qwen2.5:3b for the tool call to proceed.
# CHAT_MODEL=qwen2.5:0.5b

# Uncomment and complete if you want to use Azure OpenAI Service
## "Azure OpenAI Endpoint" in https://oai.azure.com/resource/overview
# AZURE_OPENAI_ENDPOINT=https://YOUR_RESOURCE_NAME.openai.azure.com/
@@ -6,7 +6,7 @@
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
@@ -16,4 +16,4 @@
# under the License.
wrapperVersion=3.3.2
distributionType=only-script
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.7/apache-maven-3.9.7-bin.zip
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.9/apache-maven-3.9.9-bin.zip
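
The Maven 3.9.9 bump above can also be produced with the Maven Wrapper plugin
rather than by editing the properties file directly; a sketch, assuming the
project's existing `./mvnw` script:

```bash
# Run inside genai-function-calling/spring-ai/
# Rewrites the wrapper properties (the file changed above) to point at Maven 3.9.9.
./mvnw wrapper:wrapper -Dmaven=3.9.9
```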
2 changes: 1 addition & 1 deletion genai-function-calling/spring-ai/docker-compose.yml
@@ -6,4 +6,4 @@ services:
env_file:
- .env
extra_hosts: # send localhost traffic to the docker host, e.g. your laptop
- "localhost:host-gateway"
- "localhost:${HOST_IP:-host-gateway}"
7 changes: 7 additions & 0 deletions genai-function-calling/spring-ai/env.example
@@ -8,6 +8,13 @@ OPENAI_API_KEY=
# # leave it out, you need to update this to qwen2.5:3b for the tool call to proceed.
# CHAT_MODEL=qwen2.5:0.5b

# Uncomment to use RamaLama instead of OpenAI
# OPENAI_BASE_URL=http://localhost:8080/v1
# OPENAI_API_KEY=unused
# # This works when you supply a major_version parameter in your prompt. If you
# # leave it out, you need to update this to qwen2.5:3b for the tool call to proceed.
# CHAT_MODEL=qwen2.5:0.5b

# Uncomment and complete if you want to use Azure OpenAI Service
## "Azure OpenAI Endpoint" in https://oai.azure.com/resource/overview
# AZURE_OPENAI_ENDPOINT=https://YOUR_RESOURCE_NAME.openai.azure.com/
2 changes: 1 addition & 1 deletion genai-function-calling/spring-ai/mvnw
@@ -8,7 +8,7 @@
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an