CI testing via Spack (experimental) #3884
Conversation
I am aware of this feature, and it looks very neat. The problem is if a user has a dependency that isn't in Spack (or is installed through pip, or needs to compile C++ code against dolfinx, as in the case of DOLFINx_MPC), which we would then need to add to the Spack spec (i.e. we would need to install the same thing in Spack) for FEniCS.
That's not correct - Spack can install just the build dependencies, just as we do manually now in the Dockerfiles. Users can then install whatever else they like by hand.
You need to re-concretize to ensure that the same nanobind is used for both dolfinx and an extension. This might cause a reinstall of heavy dependencies. I've experienced this when, for instance, trying to use dolfinx_MPC with a Spack-installed dolfinx.
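The re-concretization step being discussed might look like the following sketch; the package name `py-dolfinx-mpc` and environment name are illustrative, not part of this PR:

```sh
# Hypothetical sketch: building an extension against a Spack-installed dolfinx.
spack env activate myenv
spack add py-dolfinx-mpc@main   # extension depending on dolfinx and nanobind
spack concretize --force        # re-concretize the whole environment so the
                                # extension and dolfinx agree on one nanobind;
                                # this step can trigger rebuilds of heavy deps
spack install
```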
Let's not get sidetracked on Docker image design for now - none of us have even tried the Spack Docker generation yet.
run: |
  . $GITHUB_WORKSPACE/spack-src/share/spack/setup-env.sh
  spack env activate py
  spack load gcc
Is this to ensure jit compilation works?
Yes.
Looks ok as an extra way of testing dolfinx. The only question I have is: why does one have to load gcc in the Python env? Should it be a run-time dependency, or is there a specific test that needs it?
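For context on what `spack load gcc` actually does here, a sketch (the environment name `py` is from the workflow above; the rest is illustrative):

```sh
# `spack load` prepends the package's bin/ directory to PATH.
# Without a C compiler on PATH, DOLFINx's run-time JIT compilation of
# generated kernels fails, even though the package itself is installed.
spack env activate py
spack load gcc    # make gcc visible to the JIT step
which gcc         # should now resolve inside the Spack install tree
```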
.github/workflows/ci.yml
spack info py-fenics-dolfinx
spack env create py dolfinx-src/.github/workflows/spack-config/gh-actions-env-test.yml
spack -e py develop --path $GITHUB_WORKSPACE/dolfinx-src fenics-dolfinx@main
spack -e py develop [email protected]=main
These should be checked out using the GitHub action at the git ref specified in the yml file.
That would be more code for no advantage.
What remains is to get the branch from the refs env file, i.e.
- name: Load environment variables
run: cat .github/workflows/fenicsx-refs.env >> $GITHUB_ENV
- name: Set up env
run: |
....
spack -e py develop fenics-basix@git.${{ env.basix_ref }}=main
The only one we should check out using actions is DOLFINx - the commit hash differs between 'pull request' runs and 'push' runs (GitHub creates internal merge commits for PR triggers). Getting the correct hash ref isn't trivial.
There is an advantage: the GitHub action does very shallow clones and handles authentication even into private forks.
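The checkout-action approach being suggested would look roughly like this; the `repository`, `ref`, and `path` values are illustrative, not taken from this PR:

```yaml
# Sketch: checking out a dependency at the ref loaded from fenicsx-refs.env.
# actions/checkout does a shallow clone by default (fetch-depth: 1) and
# handles authentication, including into private forks.
- name: Checkout Basix
  uses: actions/checkout@v4
  with:
    repository: FEniCS/basix
    ref: ${{ env.basix_ref }}
    path: basix-src
```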
.github/workflows/ci.yml
spack info fenics-dolfinx
spack env create cxx dolfinx-src/.github/workflows/spack-config/gh-actions-env-test.yml
spack -e cxx develop --path $GITHUB_WORKSPACE/dolfinx-src fenics-dolfinx@main
spack -e cxx develop [email protected]=main
Same comment as below on checkout action and yml git ref.
For JIT. It does make technical sense that one would want to load the compiler - one could use different C compilers for JIT.
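A minimal Python sketch of the compiler-selection logic being discussed; the function name is hypothetical and this is not DOLFINx's actual implementation, just an illustration of why the compiler must be on PATH (or named via `CC`) at JIT time:

```python
import os
import shutil


def resolve_jit_compiler(default="cc"):
    """Return the absolute path of the C compiler a JIT layer would invoke.

    Honours the CC environment variable, falling back to `default`, and
    resolves it on PATH. This is why `spack load gcc` matters: it puts a
    compiler on PATH for the run-time JIT compilation step. Returns None
    if no such compiler can be found.
    """
    cc = os.environ.get("CC", default)
    return shutil.which(cc)
```

Because the lookup honours `CC`, switching JIT compilers is just a matter of exporting a different `CC` before running the tests.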
The PR adds experimental 'single source' CI testing via Spack. All other tests are unchanged. The proposal is to merge this PR and monitor how it goes. While experimental, it will not be a merge requirement.
Background
Issues this PR addresses are:
Spack-based CI
This PR addresses the above issues.
This PR exposed a number of shortcomings/bugs in the FEniCS Spack specs, and bugs in dependency Spack specs. This PR uses the specs from a branch of spack-packages. The changes will eventually be upstreamed.
Issues and questions
This PR uses minimal dependency pinning. Should we pin significant dependencies, e.g. PETSc, and let others float?
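Pinning a significant dependency while letting others float might look like this hypothetical `spack.yaml` fragment (the version numbers and package choices are illustrative, not taken from this PR):

```yaml
# Sketch: pin PETSc, let other dependencies float to whatever the
# concretizer selects.
spack:
  specs:
    - py-fenics-dolfinx@main ^petsc@3.22   # pinned: version is illustrative
    - py-numba                             # floats: version left to concretizer
```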
If Spack cannot find a suitable cached dependency, it will re-build and cache it. The re-build of some heavy dependencies, e.g. LLVM (for Numba) and VTK (for pyvista), can be very slow. 'Controlled' rebuilds can be managed and just require patience, or running locally on a powerful system and pushing to the cache. What will be problematic is unexpected, heavy re-builds, since the CI will suddenly become very slow. Version pinning of heavy dependencies may avoid this.
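The 'build locally, push to the cache' workflow mentioned above could be sketched as follows; the mirror name `my-mirror` is illustrative and assumes a buildcache mirror has already been configured:

```sh
# Sketch: rebuild a heavy dependency on a powerful machine, then push the
# binaries to the build cache so CI runs can fetch them instead of rebuilding.
spack install llvm
spack buildcache push my-mirror llvm
```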
If we see too frequent re-builds, we can figure out what to do.
In this PR, all Python test dependencies are installed using Spack.
Advantages:
Disadvantages:
A question for later is: should Python test dependencies be installed via Spack or pip? pip would be faster and require less package maintenance, but would require greater care with Python dependency versions.
The Spack version used is presently the development version. We may want to pin this. There are a few details still to add on specifying FEniCS dependency branches.