
Add MALI data mode #111

Conversation

matthewhoffman (Collaborator)

MALI data mode uses the MALI component, but rather than making prognostic calculations it reads an ice thickness field from a previous simulation. For now, it is assumed there is a single data mode file per mesh. This could be expanded if needed. A new variant of LISIO is also added: MPAS_DISLISIO_JRA1p5.

matthewhoffman marked this pull request as draft on October 3, 2024.
matthewhoffman (Collaborator Author)

I placed a data mode file at /global/cfs/cdirs/e3sm/inputdata/glc/mpasli/mpas.ais8to30km/ais_8to30km_datamode.nc on Perlmutter for testing. It currently contains monthly thickness fields from 2015-02-01 to 2058-05-01 but will be updated with a more complete time range when I finish generating it.

Testing with:

./create_test --wait --walltime 0:30:00    ERS_Ld5.TL319_oQU240wLI_ais8to30.MPAS_DISLISIO_JRA1p5.pm-cpu_intel.mpaso-ocn_glcshelf

@jonbob, when I try to run this test, I get an error because MALI is trying to read time 0001-01-01_00:00:00 on init. Does that make sense to you? What year does this JRA compset start with? Or do you think this could be an issue with the calendar and streams handling in the glc driver?

xylar changed the base branch from master to alternate and back on October 3, 2024.
xylar (Collaborator) commented Oct 3, 2024

I believe JRA starts in 1958.

jonbob (Collaborator) commented Oct 3, 2024

JRA does start in 1958, but I don't think that's the issue. By default, tests use a STARTDATE of 0001-01-01. The data models have options for how they handle dates outside their range of actual data, like cycling or extending (basically using the first or last data value). So we have two use cases for JRA forcing: one where we start at 0001-01-01, which lines up with the beginning of the JRA data (1958) and cycles over time; and one where we run inside the JRA date range and the model date is the same as the data date.
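The cycling-versus-extending behavior described above can be sketched in a few lines. This is a hedged illustration, not E3SM code: the `data_year` function, the year constants, and the alignment convention are all my own assumptions for the purpose of showing the two use cases.

```python
# Sketch of how a data component can map a model year onto its data
# range: "cycle" wraps around the data years, "extend" clamps to the
# first/last year. The constants reflect the JRA range discussed above
# (assumed end year; not taken from this thread).
DATA_START, DATA_END = 1958, 2018

def data_year(model_year, align_year=1, mode="cycle"):
    """Return the data year used for a given model year.

    align_year is the model year that corresponds to DATA_START
    (0001-01-01 lining up with 1958 in the first use case above).
    """
    span = DATA_END - DATA_START + 1
    offset = model_year - align_year
    if mode == "cycle":
        return DATA_START + offset % span
    # "extend": use the first or last data value outside the range
    return min(max(DATA_START + offset, DATA_START), DATA_END)

print(data_year(1))     # 1958: model year 1 aligns with the data start
print(data_year(62))    # 1958 again: wrapped after one 61-year cycle
print(data_year(2030, align_year=1958, mode="extend"))  # 2018: clamped
```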

matthewhoffman (Collaborator Author)

Thanks, @jonbob. MALI does not currently have a way to cycle dates on forcing files, so we'll need to talk through whether we need to add that capability or whether we can set up a compset that uses consistent dates for what we need data mode to do. The MALI simulation I am using starts in 2000 and ends in 2300 (but I will probably keep just the first 100 years, or maybe even less).

@xylar and @cbegeman, what compset/forcing and year range do you think we should set up for this? Do you think a JRA-forced G-case is the right configuration for the ocean? If so, do you want to start in 1958 (the JRA start date) or 2000 (the MALI simulation's actual start date)? And does it matter whether the start year is the actual start year or year 0? Based on what we decide, I can adjust xtime in the MALI forcing file to match that convention, which I think is fine since we intend this to be schematic rather than an actual ice-sheet history.
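Adjusting xtime as described is mostly string surgery on the fixed-width stamps MPAS uses. A hedged sketch under my own assumptions (the real edit would be applied to the xtime variable in the NetCDF forcing file, e.g. with netCDF4 or NCO, not to bare strings):

```python
# MPAS components store time as fixed-width strings like
# "2000-01-01_00:00:00". Re-dating a forcing file amounts to shifting
# the 4-digit year field of every stamp by a constant offset.
def shift_xtime_year(xtime, year_offset):
    year = int(xtime[:4]) + year_offset
    return f"{year:04d}{xtime[4:]}"

# e.g. re-date a run starting in 2000 so it starts at the JRA start year
print(shift_xtime_year("2000-01-01_00:00:00", -42))  # 1958-01-01_00:00:00
```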

rljacob commented Oct 3, 2024

Why aren't you doing this by making a dglc model analogous to docn and datm?

jonbob (Collaborator) commented Oct 3, 2024

@rljacob -- it is one of the things we discussed, but we weren't sure who could do that

jonbob (Collaborator) commented Oct 3, 2024

@matthewhoffman -- we can make a testdef that changes the STARTDATE and a few settings for the JRA forcing to make it all consistent

cbegeman (Collaborator) commented Oct 3, 2024

From what @jonbob is saying, it sounds like there wouldn't be a problem starting in 2000 and not changing MALI's dates. JRA seems fine, but I think we should make sure AIS ice runoff is removed in this test, which would not currently be the case (https://github.com/E3SM-Project/E3SM/blob/d4ca7d0606930f53f30eb7ff7dbb92ec83b83b84/components/mpas-ocean/bld/build-namelist#L728-L733).

xylar (Collaborator) commented Oct 3, 2024

I concur, let's start at 2000 unless that causes unexpected issues.

@@ -142,6 +142,11 @@
<lname>2000_DATM%JRA-1p4-2018_SLND_MPASSI%DIB_MPASO%IBDISMFDATMFORCED_DROF%JRA-1p4-2018-AIS0ROF_SGLC_SWAV</lname>
</compset>

<compset>
<alias>GMPAS-JRA1p4-DIB-PISMF-DIS</alias>
<lname>2000_DATM%JRA-1p4-2018_SLND_MPASSI%DIB_MPASO%IBPISMFDATMFORCED_DROF%JRA-1p4-2018-AIS0ROF_MALI%DATA_SWAV</lname>
matthewhoffman (Collaborator Author):

based on discussion, change to JRA-1p4-2018-AIS0ICE option?

matthewhoffman (Collaborator Author):

and switch from 1p4 to 1p5?

Collaborator:

Based on Slack discussion, wait for a PR that fixes things so runoff is removed from G-cases as it is from B-cases. Then, yes, switch to JRA1p5, but do not use any of the -AIS0* options so that MPASO takes care of the zeroing out.

matthewhoffman (Collaborator Author):

Update note: this PR will be rebased, and compset adjustments made, once E3SM-Project#6693 is merged.

Collaborator:

We must not forget to do the rebase once E3SM-Project#6693 goes in, which I think will be soon.

matthewhoffman (Collaborator Author):

Note: I have rebased the PR and updated the compset per the discussion in this thread.

matthewhoffman (Collaborator Author)

Why aren't you doing this by making a dglc model analogous to docn and datm?

@rljacob, to follow up more on this question: we've discussed this option but feel that doing this through the MALI component is critical for a few reasons. This capability serves as a quasi-coupling mode that helps us prepare the ocean model physics for an active, evolving MALI component. By having MALI replay the ice thickness history from a previous MALI simulation, we ensure that everything about the configuration is identical to what MPAS-Ocean and the coupler will be seeing when we finally transition to fully coupled ice-sheet simulations. In particular, the ice-shelf basal melting functionality lives in the coupler itself rather than in a component model, operating on the GLC grid at the OCN timestep, and it is only enabled if GLC is active (and an ocean namelist option is activated). Ensuring that those complex settings work correctly with a new DGLC component would be extra work with little benefit, because the MALI data mode replays previous MALI simulation output as a stepping stone towards full coupling, whereas existing data components replay observations and are meant to represent historical conditions accurately. We don't anticipate extending this capability to historical reanalysis of ice-sheet thickness, in part because such reanalyses don't really exist and are not expected anytime soon.

(Maybe this is more detail than you were asking for, but I wanted to have our offline discussions on this question documented.)

matthewhoffman (Collaborator Author)

Note for this PR: we discussed today that eventually we will need to create a MALI data mode version with the MALI 4km AIS mesh, but that will happen after the SORRM v3r4 ocean mesh based on the MALI initial condition is finalized. At that point, we will run a standalone MALI 4km simulation to generate the data mode history. That will all be part of a later, follow-on PR, and the 8km AIS data mode in this PR will still be useful for development and testing of MPAS-Ocean wetting and drying in the interim.

matthewhoffman marked this pull request as ready for review on October 31, 2024.
matthewhoffman (Collaborator Author)

After help from @jonbob , I got the start time and the JRA forcing time to be 2000-01-01. I've also updated the MALI data file to include monthly thickness fields from 2000-01-01 to 2100-01-01.

@xylar, this PR is now ready for review. A run using the data mode can be created on Perlmutter with:

./create_test --wait  -q debug  --walltime 0:30:00    ERS_Ld5.TL319_oQU240wLI_ais8to30.MPAS_DISLISIO_JRA1p5.pm-cpu_intel.mpaso-ocn_glcshelf

If everyone is happy with it, I still need to put the data mode file on the LCRC server - right now it only exists locally on Perlmutter's inputdata space.

xylar (Collaborator) left a comment:

This looks great to me! I only have one suggestion, and it fits well with moving the data file over to LCRC.

I'd like to run a longer DISLISIO test but probably on Chrysalis once the data file is there, and that can happen once this is on E3SM. I think it's ready to be moved over there.

# in MALI data mode, read an input stream with ice thickness monthly
lines.append(' <stream name="data-mode-input"')
lines.append(' type="input"')
lines.append(f' filename_template="{din_loc_root}/glc/mpasli/{glc_grid}/{grid_prefix}_datamode.nc"')
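For reference, those appended lines render a stream entry roughly like the following once written to the streams file. This is a sketch with the f-string placeholders left symbolic, and it omits any additional attributes the full implementation may set:

```xml
<stream name="data-mode-input"
        type="input"
        filename_template="{din_loc_root}/glc/mpasli/{glc_grid}/{grid_prefix}_datamode.nc"/>
```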
Collaborator:

I think it would be a good idea to give this file a datestamp. That way, it could at least be updated and it would be clear which version was expected.

matthewhoffman (Collaborator Author):

This is added in c09dbd6

MALI data mode uses the MALI component but rather than making prognostic
calculations it reads an ice thickness field from a previous simulation.
For now, it is assumed there is a single datamode file per mesh.  This
could be expanded if needed.

Details:
* replace MALI%STATIC, which is not being used, with MALI%DATA
* set MALI namelist options appropriate for a data mode
* add an input stream for the data file to the MALI streams file when data mode is activated
* add MPAS_DISLISIO_JRA1p5 compset (DIS=Data Ice Sheet)
* remove two unused compsets that referenced %STATIC mode

This is necessary for the MALI data mode input file, which starts in 2000.

This suppresses irrelevant errors for the oQU240wLI mesh.
matthewhoffman force-pushed the matthewhoffman/mali/data-mode branch from 627761c to b68cf33 on January 21, 2025.
xylar changed the base branch from master to alternate and back on January 21, 2025.
The previous commits made all the MALI_DYNAMIC changes in one place at the beginning, causing the subsequent options to be overridden later in the file. Also handle config_SGH.
This change supports three modes instead of two:
* PROGNOSTIC = MALI runs prognostically
* STATIC = the MALI initial condition is held static over time
* DATA = MALI thickness is read monthly from a data file
matthewhoffman force-pushed the matthewhoffman/mali/data-mode branch from 7e403fd to 2989c61 on January 22, 2025.
matthewhoffman (Collaborator Author)

Since Xylar's review, I've made the following changes:

  • rebased
  • updated the compset definition
  • corrected the logic for how MALI_DYNAMIC is applied to namelist options
  • changed the MALI_DYNAMIC attribute to MALI_PROGNOSTIC_MODE, which now takes on 3 values: PROGNOSTIC, STATIC, DATA

Same test as before passes:

./create_test --wait -q debug  --walltime 0:30:00    ERS_Ld5.TL319_oQU240wLI_ais8to30.MPAS_DISLISIO_JRA1p5.pm-cpu_intel.mpaso-ocn_glcshelf

PASS ERS_Ld5.TL319_oQU240wLI_ais8to30.MPAS_DISLISIO_JRA1p5.pm-cpu_intel.mpaso-ocn_glcshelf RUN

@xylar, let me know if you'd like any further changes or if I can move this over to the main repo now. One thing that could receive attention: this PR adds two very similar compsets in different files (the allactive vs. mpas-ocean compset files); we probably don't need both. @xylar, @jonbob, let me know if you have a preference for which to keep. Similarly, there is no compset that includes MALI static mode - should we add one?

@@ -70,6 +71,7 @@ def buildnml(case, caseroot, compname):
elif glc_grid == 'mpas.ais8to30km':
grid_date += '20221027'
grid_prefix += 'ais_8to30km'
datamode_date += '20250121'
Collaborator:

We will presumably add a similar file for mpas.ais4to20km in a follow-up PR? I think that's the one that would be compatible with the SORRMr4 mesh I'm working on and the SORRMr5 mesh that would include a dry region. But we could also make an Icos r8 (?) mesh that could be compatible with this lower res MALI mesh. Forgive me if we've already discussed this and I've lost track.

matthewhoffman (Collaborator Author):

Yes, my plan was to add a datamode file for the recently updated 4km AIS mesh, most likely in a follow-up PR.

xylar (Collaborator) commented Jan 23, 2025

I would suggest we keep GMPAS-JRA1p5-DIB-PISMF-DIS and get rid of MPAS_DISLISIO_JRA1p5. I think this is mostly a variant of a G-case with data MALI instead of no MALI. That still seems like mostly a G-case, and those have traditionally lived in the mpas-ocean space rather than in the "allactive" space. I also think the more verbose name is more helpful: it isn't actually clear to me whether MPAS_DISLISIO_JRA1p5 would have PISMF, DISMF, or snowcapping (though the last is hard to fathom), whereas that is clear in GMPAS-JRA1p5-DIB-PISMF-DIS.

xylar (Collaborator) commented Jan 23, 2025

I think once you get rid of MPAS_DISLISIO_JRA1p5 (assuming we're in agreement), this is ready for E3SM.

matthewhoffman (Collaborator Author)

Moved to E3SM-Project#6945

xylar (Collaborator) commented Jan 25, 2025

I'll close this since it's moved.

xylar closed this pull request on January 25, 2025.