Commit d1df2e4
feat: split API and provider specs into separate llama-stack-api pkg
Extract API definitions, models, and provider specifications into a standalone llama-stack-api package that can be published to PyPI independently of the main llama-stack server.

Motivation

External providers currently import from llama-stack, which overrides the installed version and causes dependency conflicts. This separation allows external providers to:

- Install only the type definitions they need, without server dependencies
- Avoid version conflicts with the installed llama-stack package
- Be versioned and released independently

This enables us to re-enable the external provider module tests that were previously blocked by these import conflicts.

Changes

- Created the llama-stack-api package with minimal dependencies (pydantic, jsonschema)
- Moved APIs, models, provider datatypes, strong_typing, and schema_utils
- Updated all imports from llama_stack.* to llama_stack_api.*
- Preserved git history by using git mv for moved files
- Configured a local editable install for the development workflow
- Updated linting and type-checking configuration for both packages
- Rebased on top of the upstream src/ layout changes

Testing

The package builds successfully and can be imported independently. All pre-commit hooks pass with the expected exclusions maintained.

Next Steps

- Publish llama-stack-api to PyPI
- Update external provider dependencies
- Re-enable external provider module tests

Signed-off-by: Charlie Doern <[email protected]>
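The "local editable install for development workflow" mentioned above could be wired up as a uv workspace so the in-tree package satisfies the dependency during development. A minimal sketch only; the member path and table contents are assumptions, not taken from the repository:

```toml
# Root pyproject.toml (sketch): let the in-tree llama-stack-api
# satisfy the llama-stack-api dependency instead of a PyPI release.
[tool.uv.workspace]
members = ["src/llama-stack-api"]

[tool.uv.sources]
llama-stack-api = { workspace = true }
```

With a layout like this, `uv sync` installs the local package in editable mode, while end users installing from PyPI get the published wheel.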
1 parent 1c9a31d commit d1df2e4

457 files changed (+1499, −1177 lines)

.github/workflows/python-build-test.yml

Lines changed: 7 additions & 4 deletions

```diff
@@ -30,13 +30,16 @@ jobs:
           activate-environment: true
           version: 0.7.6

+      - name: Build Llama Stack Spec package
+        working-directory: src/llama-stack-api
+        run: uv build
+
       - name: Build Llama Stack package
-        run: |
-          uv build
+        run: uv build

-      - name: Install Llama Stack package
+      - name: Install Llama Stack package (with spec from local build)
         run: |
-          uv pip install dist/*.whl
+          uv pip install --find-links src/llama-stack-api/dist dist/*.whl

       - name: Verify Llama Stack package
         run: |
```

.pre-commit-config.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -42,7 +42,7 @@ repos:
       hooks:
         - id: ruff
           args: [ --fix ]
-          exclude: ^src/llama_stack/strong_typing/.*$
+          exclude: ^(src/llama_stack/strong_typing/.*|src/llama-stack-api/llama_stack_api/strong_typing/.*)$
        - id: ruff-format

  - repo: https://github.com/adamchainz/blacken-docs
```

docs/docs/api-overview.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -8,7 +8,7 @@ The Llama Stack provides a comprehensive set of APIs organized by stability leve

 These APIs are fully tested, documented, and stable. They follow semantic versioning principles and maintain backward compatibility within major versions. Recommended for production applications.

-[**Browse Stable APIs →**](./api/llama-stack-specification)
+[**Browse Stable APIs →**](./api/llama-stack-apiification)

 **Key Features:**
 - ✅ Backward compatibility guaranteed
@@ -24,7 +24,7 @@ These APIs are fully tested, documented, and stable. They follow semantic versio

 These APIs include v1alpha and v1beta endpoints that are feature-complete but may undergo changes based on feedback. Great for exploring new capabilities and providing feedback.

-[**Browse Experimental APIs →**](./api-experimental/llama-stack-specification-experimental-apis)
+[**Browse Experimental APIs →**](./api-experimental/llama-stack-apiification-experimental-apis)

 **Key Features:**
 - 🧪 Latest features and capabilities
@@ -40,7 +40,7 @@ These APIs include v1alpha and v1beta endpoints that are feature-complete but ma

 These APIs are deprecated and will be removed in future versions. They are provided for migration purposes and to help transition to newer, stable alternatives.

-[**Browse Deprecated APIs →**](./api-deprecated/llama-stack-specification-deprecated-apis)
+[**Browse Deprecated APIs →**](./api-deprecated/llama-stack-apiification-deprecated-apis)

 **Key Features:**
 - ⚠️ Will be removed in future versions
```

docs/docs/building_applications/index.mdx

Lines changed: 1 addition & 1 deletion

```diff
@@ -80,4 +80,4 @@ Build production-ready systems with:
 - **[Getting Started](/docs/getting_started/quickstart)** - Basic setup and concepts
 - **[Providers](/docs/providers/)** - Available AI service providers
 - **[Distributions](/docs/distributions/)** - Pre-configured deployment packages
-- **[API Reference](/docs/api/llama-stack-specification)** - Complete API documentation
+- **[API Reference](/docs/api/llama-stack-apiification)** - Complete API documentation
```

docs/docs/building_applications/playground.mdx

Lines changed: 1 addition & 1 deletion

```diff
@@ -295,4 +295,4 @@ llama stack run meta-reference
 - **[Agents](./agent)** - Building intelligent agents
 - **[RAG (Retrieval Augmented Generation)](./rag)** - Knowledge-enhanced applications
 - **[Evaluations](./evals)** - Comprehensive evaluation framework
-- **[API Reference](/docs/api/llama-stack-specification)** - Complete API documentation
+- **[API Reference](/docs/api/llama-stack-apiification)** - Complete API documentation
```

docs/docs/concepts/apis/external.mdx

Lines changed: 4 additions & 4 deletions

````diff
@@ -58,7 +58,7 @@ External APIs must expose a `available_providers()` function in their module tha

 ```python
 # llama_stack_api_weather/api.py
-from llama_stack.providers.datatypes import Api, InlineProviderSpec, ProviderSpec
+from llama_stack_api.providers.datatypes import Api, InlineProviderSpec, ProviderSpec


 def available_providers() -> list[ProviderSpec]:
@@ -79,7 +79,7 @@ A Protocol class like so:
 # llama_stack_api_weather/api.py
 from typing import Protocol

-from llama_stack.schema_utils import webmethod
+from llama_stack_api.schema_utils import webmethod


 class WeatherAPI(Protocol):
@@ -151,12 +151,12 @@ __all__ = ["WeatherAPI", "available_providers"]
 # llama-stack-api-weather/src/llama_stack_api_weather/weather.py
 from typing import Protocol

-from llama_stack.providers.datatypes import (
+from llama_stack_api.providers.datatypes import (
     Api,
     ProviderSpec,
     RemoteProviderSpec,
 )
-from llama_stack.schema_utils import webmethod
+from llama_stack_api.schema_utils import webmethod


 def available_providers() -> list[ProviderSpec]:
````
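The external-API contract touched in the diff above pairs an `available_providers()` discovery function with a `Protocol` class whose methods are tagged by `webmethod`. A self-contained sketch of that shape follows; since llama-stack-api may not be installed, the `ProviderSpec` dataclass and `webmethod` decorator below are simplified stand-ins for the real types, and the provider names are placeholders:

```python
from dataclasses import dataclass
from typing import Protocol


# Hypothetical stand-in for llama_stack_api.providers.datatypes.ProviderSpec;
# the real class carries more fields (config class, pip packages, etc.).
@dataclass
class ProviderSpec:
    api: str
    provider_type: str
    module: str


def webmethod(route: str, method: str = "GET"):
    """Stand-in for llama_stack_api.schema_utils.webmethod: tags an
    endpoint with its route so the server can register it."""

    def decorator(func):
        func.__webmethod__ = {"route": route, "method": method}
        return func

    return decorator


class WeatherAPI(Protocol):
    @webmethod(route="/weather/locations", method="GET")
    def get_available_locations(self) -> dict[str, list[str]]: ...


def available_providers() -> list[ProviderSpec]:
    # The stack calls this to discover which providers implement the API.
    return [
        ProviderSpec(
            api="weather",
            provider_type="remote::kaze",
            module="llama_stack_provider_kaze",
        )
    ]
```

The point of the split is that an external package like this only needs the lightweight type definitions, not the llama-stack server.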

docs/docs/distributions/building_distro.mdx

Lines changed: 1 addition & 1 deletion

````diff
@@ -65,7 +65,7 @@ external_providers_dir: /workspace/providers.d
 Inside `providers.d/custom_ollama/provider.py`, define `get_provider_spec()` so the CLI can discover dependencies:

 ```python
-from llama_stack.providers.datatypes import ProviderSpec
+from llama_stack_api.providers.datatypes import ProviderSpec


 def get_provider_spec() -> ProviderSpec:
````

docs/docs/providers/external/external-providers-guide.mdx

Lines changed: 1 addition & 1 deletion

````diff
@@ -80,7 +80,7 @@ container_image: custom-vector-store:latest # optional
 All providers must contain a `get_provider_spec` function in their `provider` module. This is a standardized structure that Llama Stack expects and is necessary for getting things such as the config class. The `get_provider_spec` method returns a structure identical to the `adapter`. An example function may look like:

 ```python
-from llama_stack.providers.datatypes import (
+from llama_stack_api.providers.datatypes import (
     ProviderSpec,
     Api,
     RemoteProviderSpec,
````

docs/docs/providers/vector_io/inline_sqlite-vec.mdx

Lines changed: 2 additions & 2 deletions

````diff
@@ -153,7 +153,7 @@ description: |
   Example using RAGQueryConfig with different search modes:

   ```python
-  from llama_stack.apis.tools import RAGQueryConfig, RRFRanker, WeightedRanker
+  from llama_stack_api.apis.tools import RAGQueryConfig, RRFRanker, WeightedRanker

   # Vector search
   config = RAGQueryConfig(mode="vector", max_chunks=5)
@@ -358,7 +358,7 @@ Two ranker types are supported:
 Example using RAGQueryConfig with different search modes:

 ```python
-from llama_stack.apis.tools import RAGQueryConfig, RRFRanker, WeightedRanker
+from llama_stack_api.apis.tools import RAGQueryConfig, RRFRanker, WeightedRanker

 # Vector search
 config = RAGQueryConfig(mode="vector", max_chunks=5)
````
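The `RRFRanker` imported in the diff above implements reciprocal rank fusion, where a document's fused score is Σᵢ 1/(k + rankᵢ) across the result lists it appears in. A minimal sketch of that formula, independent of the real llama-stack-api classes; the default k = 60 here is the conventional choice from the RRF literature, not necessarily the library's default:

```python
def rrf_fuse(rankings: list[list[str]], k: float = 60.0) -> list[str]:
    """Reciprocal rank fusion: each document scores sum(1 / (k + rank))
    over the result lists it appears in (rank is 1-based)."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)


# "b" is ranked 2nd by vector search and 1st by keyword search,
# so it beats "a", which is 1st in one list but 3rd in the other.
fused = rrf_fuse([["a", "b", "c"], ["b", "c", "a"]])
```

This is why hybrid mode can surface documents that neither search mode ranks first on its own.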

docs/openapi_generator/generate.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -16,7 +16,7 @@
 import fire
 import ruamel.yaml as yaml

-from llama_stack.apis.version import LLAMA_STACK_API_V1  # noqa: E402
+from llama_stack_api.apis.version import LLAMA_STACK_API_V1  # noqa: E402
 from llama_stack.core.stack import LlamaStack  # noqa: E402

 from .pyopenapi.options import Options  # noqa: E402
```
