
Containerized CI + Cleanup testing #218

Open
dustinswales wants to merge 87 commits into ufs-community:gsl/develop from dustinswales:cleanup_testing

Conversation

@dustinswales
Collaborator

@dustinswales dustinswales commented Feb 24, 2026

This PR updates the CI-based testing system in several ways.

  1. Testing cleanup
    1a) Removed the CI workflow's dependency on the external testing repository (https://github.com/barlage/mpas_testcase).
    1b) Added namelist/stream case files to this repository under testing_and_setup/ufs-community/cases. Previously these lived in the testing repository.
    1c) Removed the CI workflow's dependency on the THREDDS server.
    1d) Created GitHub Release assets with the files from the THREDDS server (e.g., physics tables and MPAS data files).
    1e) Added a new script to fetch GitHub release assets (testing_and_setup/ufs-community/data/get_data.sh).

  2. Images for Containerized CI
    2a) Added Dockerfiles for minimal compiler images (e.g., GNU, Intel oneAPI, NVHPC).
    2b) Added Dockerfiles for dependencies (e.g., netCDF and PnetCDF).
    2c) Added a new CI script that builds images from the Dockerfiles and pushes them to Dockerhub.
    *We have MPAS images for GNU, Intel oneAPI, and NVHPC, but only GNU has been implemented in this PR (Intel oneAPI will be part of a subsequent PR).

  3. Containerized CI
    3a) Updated the CI script to run in a container using the images from Dockerhub (created in 2c).
    3b) Modified the workflows to use the namelists/streams in testing_and_setup/ufs-community/cases.
    3c) Modified the workflow scripts to fetch GitHub release assets (using 1e).
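As a rough sketch, the build-and-push step described in 2a–2c amounts to building an image from a compiler-specific Dockerfile and pushing it to a registry. The registry, Dockerfile path, and image name below are assumptions for illustration, not the actual names used in this PR:

```shell
# Hypothetical sketch of the image build-and-push step (2c).
# REGISTRY, COMPILER, and the Dockerfile path are assumed names.
REGISTRY="docker.io/dustinswales"   # Dockerhub account mentioned later in the thread
COMPILER="gnu"                      # only the GNU image is implemented in this PR
IMAGE="${REGISTRY}/mpas-ci-${COMPILER}:latest"
echo "would build and push ${IMAGE}"
# Uncomment to actually build and push (requires Docker and registry credentials):
# docker build -f docker/Dockerfile.${COMPILER} -t "${IMAGE}" .
# docker push "${IMAGE}"
```

Keeping the compiler name in a variable means the same script can later cover the Intel oneAPI and NVHPC images by looping over compilers.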

Comments

All existing CI tests pass*.
*Note: the MPAS-Dev tests show differences at the HEAD of gsl/develop, which we also see in this feature branch.

To download the release assets for running the regression tests offline:
./testing_and_setup/ufs-community/data/get_data.sh
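The core of such a fetch script presumably boils down to downloading release assets by tag. A minimal sketch, assuming the GitHub release URL pattern (the tag and asset name mirror the tempo_data example quoted later in this thread):

```shell
# Minimal sketch of fetching one GitHub release asset, in the spirit of
# get_data.sh; the tag and asset name follow the tempo_data example.
REPO="ufs-community/MPAS-Model"
TAG="MPAS-v8.3.1-2.14"
ASSET="tempo_data.tar"
URL="https://github.com/${REPO}/releases/download/${TAG}/${ASSET}"
echo "fetching ${URL}"
# Uncomment to actually download and unpack:
# wget -q "${URL}" && tar -xf "${ASSET}"
```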

The CI script that uploads the images to Dockerhub won't work on a pull_request trigger, since it relies on GitHub secrets in the target repository (this is the default behavior, for security reasons).
I have updated this repo with the appropriate secrets, so once this PR is merged, the image-building script will work.

Priority Reviewers


# Get TEMPO data
-wget ${verbose} https://github.com/ufs-community/MPAS-Model/releases/download/MPAS-v8.3.1-2.13/tempo_data.tar
+wget ${verbose} https://github.com/ufs-community/MPAS-Model/releases/download/MPAS-v8.3.1-2.14/tempo_data.tar
Collaborator Author

@clark-evans Note that I only incremented the tempo data here.
The release asset has all the files we need, so we could update all the files every time we update one of them? (I don't love this since each file may have its own release cadence)

Collaborator

Summarizing our offline discussions in case anyone looks back at this: even if not ideal, agree with making a new release with all of the files even if only one of them was updated in that release. I think it's easier for us to keep track of everything that way.

Collaborator

So if we're doing that, I presume lines 60, 69, and 78 should be updated to the v8.3.1-2.14 release rather than the v8.3.1-2.13 release?

Collaborator Author

Only TEMPO has changed from v2.13 -> v2.14, so I only incremented that file here.
If we updated all files here to use v2.14, we would lose track of when the versions of the different data files change.
My thinking (hope) was that this file, and its GitHub history, would help keep track of changing input data files.
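The per-file pinning described here can be sketched as a small lookup table in the script, so each asset's tag changes independently and the git history records exactly when it moved. A hedged sketch — asset names other than tempo_data.tar are hypothetical:

```shell
# Hypothetical per-asset version pinning: each asset keeps its own release
# tag, so bumping one file shows up as a one-line change in git history.
declare -A ASSET_TAG=(
  [tempo_data.tar]="MPAS-v8.3.1-2.14"      # bumped in this PR
  [physics_tables.tar]="MPAS-v8.3.1-2.13"  # hypothetical asset name, unchanged
)
for asset in tempo_data.tar physics_tables.tar; do
  echo "https://github.com/ufs-community/MPAS-Model/releases/download/${ASSET_TAG[$asset]}/${asset}"
done
```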

Collaborator

OK, so we'd treat this as our way of tracking our changes, while the releases would contain everything in case someone external needed to grab the supplemental files? That's fine with me.

Collaborator Author

Correct. That was my intention.

@dustinswales
Collaborator Author

@AndersJensen-NOAA @joeolson42 Can you guys review this PR?
All tests are passing, so you can dive in as deep as you desire.

Collaborator

@joeolson42 joeolson42 left a comment

This is a very slim perusal more than a strict review. Getting this PR committed with bugs is probably not a setback relative to where we are today with the server issues.

Collaborator

@clark-evans clark-evans left a comment

General comment: while I like that each test has its own directory with its own namelist(s), stream_lists, and streams files, there is a lot of duplication between them. Would it be cleaner to place common files into a common directory -- or perhaps two common directories, one for MPAS-Dev and one for ufs-community tests -- from which each set of tests could pull as needed?

@dustinswales
Collaborator Author

General comment: while I like that each test has its own directory with its own namelist(s), stream_lists, and streams files, there is a lot of duplication between them. Would it be cleaner to place common files into a common directory -- or perhaps two common directories, one for MPAS-Dev and one for ufs-community tests -- from which each set of tests could pull as needed?

I like this idea, but I don't have time to address it anytime soon.
Could we move forward as-is and open an issue to distill the case config files?
Or if someone else wants to do it as part of this PR, I'd gladly accept it. :)
It may make sense for a scientist/developer to go through and confirm and document all of the configurations.

@clark-evans
Collaborator

General comment: while I like that each test has its own directory with its own namelist(s), stream_lists, and streams files, there is a lot of duplication between them. Would it be cleaner to place common files into a common directory -- or perhaps two common directories, one for MPAS-Dev and one for ufs-community tests -- from which each set of tests could pull as needed?

I like this idea, but I don't have time to address it anytime soon. Could we move forward as-is and open an issue to distill the case config files? Or if someone else wants to do it as part of this PR, I'd gladly accept it. :) It may make sense for a scientist/developer to go through and confirm and document all of the configurations.

That's fine with me. Some notes for you, me, or whoever else might do this later:

  • The mpasdev and ufscommunity-non-hrrrv5 tests have common stream_list and streams files with each other.
  • The ufscommunity-hrrrv5 tests have common stream_list and streams files with each other.
  • The ufscommunity-hrrrv5 tests have common namelists with each other except for the GFS winter test, which for some reason lacks config_gvf_update = false. Possibly an oversight much earlier in the process?

@clark-evans clark-evans linked an issue Mar 13, 2026 that may be closed by this pull request
@dustinswales
Collaborator Author

@clark-evans @ligiabernardet I wasn't able to get the image to push to ghcr/ufs-community. I'm sure it's something permission-related, but I don't have the time to chase it down. Is it okay if we use my Dockerhub account and I open an issue to revisit this later?
I want to highlight that the only way these images would change is if someone changes the Dockerfiles in this repo and pushes them to gsl/develop (i.e., bypassing our PR processes). So it is "self-contained" and safe.

@clark-evans
Collaborator

@clark-evans @ligiabernardet I wasn't able to get the image to push to ghcr/ufs-community. I'm sure it's something permission-related, but I don't have the time to chase it down. Is it okay if we use my Dockerhub account and I open an issue to revisit this later? I want to highlight that the only way these images would change is if someone changes the Dockerfiles in this repo and pushes them to gsl/develop (i.e., bypassing our PR processes). So it is "self-contained" and safe.

That's fine with me. I'm not worried about safety in this context - just wanting to make sure all is well if something were to happen to you! Want me to open the issue, or are you OK with doing so?

@ligiabernardet
Collaborator

It sounds like a good approach for now. If we need to revisit anything regarding permissions in the repo, please reach out.

Collaborator

@clark-evans clark-evans left a comment

OK with me once my versioning comment is addressed.



Development

Successfully merging this pull request may close these issues.

MP tables as release asset.

5 participants