Conversation

derekhiggins (Contributor):

Pass TEXT_MODEL parameter to pytest to enable model override in integration test runs.

Signed-off-by: Derek Higgins <[email protected]>

meta-cla bot added the "CLA Signed" label (managed by the Meta Open Source bot) on Sep 8, 2025.

derekhiggins (Contributor, Author):

required for rebase of #3128

```diff
@@ -232,7 +232,7 @@ if [[ -n "$TEST_SUBDIRS" ]]; then
     EXTRA_PARAMS="$EXTRA_PARAMS --text-model=$TEXT_MODEL --embedding-model=sentence-transformers/all-MiniLM-L6-v2"
 else
     PYTEST_TARGET="tests/integration/"
     EXTRA_PARAMS="$EXTRA_PARAMS --suite=$TEST_SUITE"
```

Contributor:

this was intentionally not done. the model comes as part of the "test suite" definition. I think it is unsatisfactory and am working to make it slightly better. however, can you tell me what broke -- what your use case was?
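
(For readers following along: a minimal sketch of what "the model comes as part of the suite definition" can look like. The class, field, and model names below are assumptions, not the actual contents of tests/integration/suites.py.)

```python
# Hypothetical sketch of a suite definition that carries its own default model;
# field names and values are assumptions, not the exact contents of suites.py.
from dataclasses import dataclass, field


@dataclass
class Suite:
    name: str
    roots: list[str]                          # which test directories to collect
    default_params: dict[str, str] = field(default_factory=dict)


SUITES = {
    "base": Suite(
        name="base",
        roots=["tests/integration"],
        # The model ships with the suite itself, which is why overriding it
        # with a separate --text-model flag conflicted with --suite.
        default_params={"text_model": "ollama/llama3.2:3b-instruct-fp16"},
    ),
}
```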

derekhiggins (Contributor, Author):

I've been keeping the vLLM CI PR (#3128) up to date. One of the reasons the latest rebase fails is that the model name was incorrect.

Collaborator:

The suites need to somehow key off the provider. Maybe detect the provider that's active (I'm already not a fan of --provider)? https://github.com/llamastack/llama-stack/blob/main/tests/integration/suites.py#L28 is only set up for ollama.

We have multiple ways to set the model for testing. What if a suite only specified the set of tests to run, not the models?
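
(One way to picture that suggestion, as a rough sketch with hypothetical names: the suite selects only the tests, and a provider-specific "setup" supplies the models, so the same suite can run against ollama or vllm. None of the names below are the project's actual definitions.)

```python
# Hypothetical sketch of the suggested split: a suite names only the tests to
# run, while a provider-specific setup supplies the models. Names are assumptions.
from dataclasses import dataclass, field


@dataclass
class Suite:
    name: str
    roots: list[str]                  # only the set of tests to run


@dataclass
class Setup:
    name: str
    defaults: dict[str, str] = field(default_factory=dict)   # provider-specific models


SUITES = {"base": Suite(name="base", roots=["tests/integration"])}

SETUPS = {
    "ollama": Setup(name="ollama", defaults={"text_model": "ollama/llama3.2:3b-instruct-fp16"}),
    "vllm": Setup(name="vllm", defaults={"text_model": "vllm/meta-llama/Llama-3.2-1B-Instruct"}),
}

# A run would then combine the two, e.g. `pytest --suite=base --setup=vllm`.
```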

derekhiggins (Contributor, Author):

Thanks, I've closed this and updated the CI PR.

derekhiggins (Contributor, Author):

#3128 has been updated to use the new vLLM setup.
