
Add MALI data mode #6945

Open
wants to merge 14 commits into base: master

Conversation

matthewhoffman (Contributor)

This PR introduces a data mode for MALI that allows GLC to be active while the MALI ice thickness is prescribed from a previous MALI run. This capability is added by introducing a new component variable, MALI_PROGNOSTIC_MODE, which replaces the existing MALI_DYNAMIC variable. MALI_PROGNOSTIC_MODE can take the values PROGNOSTIC (full MALI model), STATIC (ice thickness held at its initial value), and DATA (ice thickness updated monthly from a specified MALI history file). A new compset, GMPAS-JRA1p5-DIB-PISMF-DIS, is added that includes MALI in data mode in a G-case. A datamode history file is added for the mpas.ais8to30km mesh. The DATA and STATIC modes are needed for developing ocean/ice-sheet coupling and for SMB evaluation in BG cases.
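For context, CIME component variables of this kind are typically declared as enumerated entries in the component's config_component.xml. The sketch below only illustrates that pattern; it is not the definition from this PR, and the group, file, and default value shown are assumptions.

<!-- Hypothetical sketch of a MALI_PROGNOSTIC_MODE declaration; the group,
     file, and default value are assumptions, not taken from this PR. -->
<entry id="MALI_PROGNOSTIC_MODE">
  <type>char</type>
  <valid_values>PROGNOSTIC,STATIC,DATA</valid_values>
  <default_value>PROGNOSTIC</default_value>
  <group>run_component_mali</group>
  <file>env_run.xml</file>
  <desc>PROGNOSTIC: full MALI model; STATIC: ice thickness held at its
        initial value; DATA: ice thickness updated monthly from a
        specified MALI history file.</desc>
</entry>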

MALI data mode uses the MALI component but, rather than making prognostic calculations, it reads an ice thickness field from a previous simulation. For now, it is assumed there is a single datamode file per mesh. This could be expanded if needed.

Details:
* replace MALI%STATIC, which is not being used, with MALI%DATA
* set MALI namelist options appropriate for a data mode
* add an input stream for the data source to the MALI streams file when data mode is activated (a sketch of such a stream follows the notes below)
* add MPAS_DISLISIO_JRA1p5 compset (DIS=Data Ice Sheet)
* remove two unused compsets that referenced the %STATIC mode
Further notes:
* This is necessary for the MALI data mode input file, which starts in 2000.
* This suppressed irrelevant errors for the oQU240wLI mesh.
* The previous changes made all the MALI_DYNAMIC settings in one place at the beginning, causing the subsequent options to be overridden later in the file; also handle config_SGH.
* This change supports three modes instead of two:
  * PROGNOSTIC: MALI runs prognostically
  * STATIC: the MALI initial condition is held static over time
  * DATA: MALI thickness is read monthly from a data file
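Since MPAS components configure their I/O through an XML streams file, the data-mode input stream mentioned above would take roughly the form sketched here; the stream name, file path, and variable list are illustrative assumptions, not copied from this PR.

<!-- Hedged sketch of a monthly data-mode input stream for the MALI streams
     file; the name, path, and variable are hypothetical. -->
<stream name="dataModeInput"
        type="input"
        filename_template="/path/to/mali_datamode_history.ais8to30km.nc"
        input_interval="0000-01-00_00:00:00">

    <var name="thickness"/>
</stream>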
@matthewhoffman (Contributor, Author)

A draft version of this PR was discussed at E3SM-Ocean-Discussion#111.

@xylar (Contributor) commented Jan 25, 2025

It seems like I messed up the compsets with STATIC a couple of years back in 4984e6b, making a PISMF and a DISMF version of GMPAS-MALI-DIB-IAF (the latter with incorrect indentation!).

I think we want to remove both of those and instead have JRA1p5 compsets, something like:

GMPAS-JRA1p5-DIB-PISMF-SIS

I think we wouldn't use the DISMF version, so let's not add it. @matthewhoffman, if you like, I can do this and push a commit.

Then, we would want to add a test like this that uses this compset as well.

@xylar (Contributor) commented Jan 25, 2025

I tried to run:

./create_test --wait --walltime 1:00:00 ERS_Ld5.TL319_oQU240wLI_ais8to30.GMPAS-MALI-DIB-IAF-DISMF.chrysalis_intel.mpaso-ocn_glcshelf

and that failed with:

MCT::m_SparseMatrixPlus:: FATAL--length of vector x different from column count of sMat.Length of x =    64800 Number of columns in sMat =  1036800
  1: 001.MCT(MPEU)::die.: from MCT::m_SparseMatrixPlus::initDistributed_()

I think that probably indicates that the necessary coupling files are not there, which is another reason to switch to a JRA test rather than a CORE IAF one.

@jonbob (Contributor) commented Jan 27, 2025

@xylar -- the TL319 resolution is intended for JRA forcing. If you want to use CORE-II, the datm resolution should be T62.

@xylar (Contributor) commented Jan 27, 2025

@jonbob, that makes sense and fits with the need to move away from CORE-IAF to the JRA1p5 forcing for the MALI static mode, as I already suggested above.

@xylar (Contributor) commented Jan 28, 2025

Oh, sorry, I see. I just used the wrong atmosphere grid in my test. My mistake!

@xylar (Contributor) commented Jan 28, 2025

@matthewhoffman, I made the compset and test changes. I successfully ran:

ERS_Ld5.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf
ERS_Ld5.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-SIS.chrysalis_intel.mpaso-ocn_glcshelf

Comment on lines +156 to +157
<alias>GMPAS-JRA1p5-DIB-PISMF-SIS</alias>
<lname>2000_DATM%JRA-1p5_SLND_MPASSI%DIB_MPASO%IBPISMFDATMFORCED_DROF%JRA-1p5_MALI%SIASTATIC_SWAV</lname>
xylar (Contributor):

@matthewhoffman, are you good with the SIS name here?

matthewhoffman (Contributor, Author):

@xylar, yes, I like the SIS name in the compset, and I agree the GLC component should use the %SIASTATIC variant.

@xylar (Contributor) left a review comment:

With the new compsets and tests I added, I think this looks great!

@trhille (Contributor) commented Jan 28, 2025

Testing on Chrysalis:

  • ERS_Ld5.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf
  • ERS_Ld5.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-SIS.chrysalis_intel.mpaso-ocn_glcshelf
  • SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf
  • SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-SIS.chrysalis_intel.mpaso-ocn_glcshelf

@trhille (Contributor) commented Jan 29, 2025

SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf is failing on Chrysalis with a reported memory leak, but everything appears to have run just fine.
From TestStatus:

PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf CREATE_NEWCASE
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf XML
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf SETUP
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf SHAREDLIB_BUILD time=189
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf MODEL_BUILD time=946
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf SUBMIT
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf RUN time=124
FAIL SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf MEMLEAK memleak detected, memory went from 3647.510000 to 4165.320000 in 126 days
PASS SMS_Lm2.TL319_oQU240wLI_ais8to30.GMPAS-JRA1p5-DIB-PISMF-DIS.chrysalis_intel.mpaso-ocn_glcshelf SHORT_TERM_ARCHIVER

From TestStatus.log:

 ---------------------------------------------------
2025-01-28 18:46:17: memleak detected, memory went from 3647.510000 to 4165.320000 in 126 days
 ---------------------------------------------------

But the test duration was 59 days, so I'm not sure where the 126 days is coming from.

The cpl log indicates that the memory highwater was not at the end of the run:

(seq_mct_drv): ===============          SUCCESSFUL TERMINATION OF CPL7-e3sm ===============
(seq_mct_drv): ===============        at YMD,TOD =   20000301       0       ===============
(seq_mct_drv): ===============  # simulated days (this run) =       59.000  ===============
(seq_mct_drv): ===============  compute time (hrs)          =        0.022  ===============
(seq_mct_drv): ===============  # simulated years / cmp-day =      178.142  ===============
(seq_mct_drv): ===============  pes min memory highwater  (MB)    2666.477  ===============
(seq_mct_drv): ===============  pes max memory highwater  (MB)    4174.848  ===============
(seq_mct_drv): ===============  pes min memory last usage (MB)     806.469  ===============
(seq_mct_drv): ===============  pes max memory last usage (MB)    1431.629  ===============

@jonbob, do you have any thoughts on this?
