From 121a38d21e1b4e29458de17ae3b9a6353637526d Mon Sep 17 00:00:00 2001 From: Alexander Robinson Date: Mon, 20 Jan 2025 15:33:26 +0100 Subject: [PATCH] Deployed 52501db with MkDocs version: 1.6.0 --- hpc-notes/index.html | 4 ++-- index.html | 2 +- search/search_index.json | 2 +- 3 files changed, 4 insertions(+), 4 deletions(-) diff --git a/hpc-notes/index.html b/hpc-notes/index.html index 4cb2d50..2d13003 100644 --- a/hpc-notes/index.html +++ b/hpc-notes/index.html @@ -124,7 +124,7 @@

Running at PIK on HPC2024 (foote)

module load ncview/2.1.10 module load cdo/2.4.2

-When installing climber-x-exlib (see further below) use the pik script:

+When installing fesm-utils (see Dependencies) use the pik script:

./install_pik.sh ifx
 

To link to data sources, use the following path:

@@ -140,7 +140,7 @@

Running at AWI on albedo

module load cdo/2.2.0 module load python/3.10.4

-When installing climber-x-exlib (see further below) use the awi script (which is a link to the dkrz script):

+When installing fesm-utils (see Dependencies) use the awi script (which is a link to the dkrz script):

./install_awi.sh ifx
 

To link to data sources, use the following path:

diff --git a/index.html b/index.html index 9275dd0..700e885 100644 --- a/index.html +++ b/index.html @@ -266,5 +266,5 @@

Example model domain initialization diff --git a/search/search_index.json b/search/search_index.json index f695e72..075ba5b 100644 --- a/search/search_index.json +++ b/search/search_index.json @@ -1 +1 @@ -{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Yelmo Welcome to Yelmo , an easy-to-use continental ice sheet model. Yelmo is a 3D ice-sheet-shelf model solving for the coupled dynamics and thermodynamics of the ice sheet system. Yelmo can be used for idealized simulations, stand-alone ice sheet simulations and fully coupled ice-sheet and climate simulations. Yelmo has been designed to operate as a stand-alone model or to be easily plugged in as a module in another program. The key to its flexibility is that no variables are defined globally and parameters are defined according to the domain being modeled. In this way, all variables and calculations are stored in an object that entirely represents the model domain. The physics and design of Yelmo are described in the following article: Robinson, A., Alvarez-Solas, J., Montoya, M., Goelzer, H., Greve, R., and Ritz, C.: Description and validation of the ice-sheet model Yelmo (version 1.0), Geosci. Model Dev., 13, 2805\u20132823, https://doi.org/10.5194/gmd-13-2805-2020 , 2020. The Yelmo code repository can be found here: https://github.com/palma-ice/yelmo General model structure - classes and usage yelmo_class The Yelmo class defines all data related to a model domain, such as Greenland or Antarctica. As seen below in the yelmo_class definition, the 'class' is simply a user-defined Fortran type that contains additional types representing various parameters, variables or sets of module variables. type yelmo_class type(yelmo_param_class) :: par ! General domain parameters type(ygrid_class) :: grd ! Grid definition type(ytopo_class) :: tpo ! Topography variables type(ydyn_class) :: dyn ! Dynamics variables type(ymat_class) :: mat ! Material variables type(ytherm_class) :: thrm ! Thermodynamics variables type(ybound_class) :: bnd ! Boundary variables to drive model type(ydata_class) :: dta ! Data variables for comparison type(yregions_class) :: reg ! Regionally aggregated variables end type Likewise the module variables are defined in a similar way, e.g. ytopo_class that defines variables and parameters associated with the topography: type ytopo_class type(ytopo_param_class) :: par ! Parameters type(ytopo_state_class) :: now ! Variables end type Submodules such as ytopo_class include parameter definitions relevant to topography calculations, as well as all variables that define the state of the domain being modeled. Example model domain initialization The below code snippet shows an example of how to initialize an instance of Yelmo inside of a program, run the model forward in time and then terminate the instance. ! === Initialize ice sheet model ===== ! Initialize Yelmo objects (multiple yelmo objects can be initialized if needed) ! In this case `yelmo1` is the Yelmo object to initialize and `path_par` is the ! path to the parameter file to load for the configuration information. This ! command will also initialize the domain grid and load initial topographic ! variables. call yelmo_init(yelmo1,filename=path_par,grid_def=\"file\",time=time_init) ! === Load initial boundary conditions for current time and yelmo state ===== ! These variables can be loaded from a file, or passed from another ! component being simulated. 
Yelmo does not care about the source, ! it only needs all variables in the `bnd` class to be populated. ! ybound: z_bed, z_sl, H_sed, H_w, smb, T_srf, bmb_shlf, T_shlf, Q_geo yelmo1%bnd%z_bed = [2D array] yelmo1%bnd%z_sl = [2D array] yelmo1%bnd%H_sed = [2D array] yelmo1%bnd%H_w = [2D array] yelmo1%bnd%smb = [2D array] yelmo1%bnd%T_srf = [2D array] yelmo1%bnd%bmb_shlf = [2D array] yelmo1%bnd%T_shlf = [2D array] yelmo1%bnd%Q_geo = [2D array] ! Print summary of initial boundary conditions call yelmo_print_bound(yelmo1%bnd) ! Next, initialize the state variables (dyn,therm,mat) ! (in this case, initialize temps with robin method) call yelmo_init_state(yelmo1,time=time_init,thrm_method=\"robin\") ! Run yelmo for eg 100.0 years with constant boundary conditions and topo ! to equilibrate thermodynamics and dynamics ! (impose a constant, small dt=1yr to reduce possibility for instabilities) call yelmo_update_equil(yelmo1,time,time_tot=100.0,topo_fixed=.FALSE.,dt=1.0) ! == YELMO INITIALIZATION COMPLETE == ! Note: the above routines `yelmo_init_state` and `yelmo_update_equil` ! are optional, if the user prefers another way to initialize the state variables. ! == Start time looping and run the model == ! Advance timesteps do n = 1, ntot ! Get current time time = time_init + n*dt ! Update the Yelmo ice sheet call yelmo_update(yelmo1,time) ! Here you may be updating `yelmo1%bnd` variables to drive the model transiently. end do ! == Finalize Yelmo instance == call yelmo_end(yelmo1,time=time) That's it! See Getting started to see how to get the code, compile a test program and run simulations.","title":"Home"},{"location":"#yelmo","text":"Welcome to Yelmo , an easy to use continental ice sheet model. Yelmo is a 3D ice-sheet-shelf model solving for the coupled dynamics and thermodynamics of the ice sheet system. Yelmo can be used for idealized simulations, stand-alone ice sheet simulations and fully coupled ice-sheet and climate simulations. Yelmo has been designed to operate as a stand-alone model or to be easily plugged in as a module in another program. The key to its flexibility is that no variables are defined globally and parameters are defined according to the domain being modeled. In this way, all variables and calculations are store in an object that entirely represents the model domain. The physics and design of Yelmo are described in the following article: Robinson, A., Alvarez-Solas, J., Montoya, M., Goelzer, H., Greve, R., and Ritz, C.: Description and validation of the ice-sheet model Yelmo (version 1.0), Geosci. Model Dev., 13, 2805\u20132823, https://doi.org/10.5194/gmd-13-2805-2020 , 2020. The Yelmo code repository can be found here: https://github.com/palma-ice/yelmo","title":"Yelmo"},{"location":"#general-model-structure-classes-and-usage","text":"","title":"General model structure - classes and usage"},{"location":"#yelmo_class","text":"The Yelmo class defines all data related to a model domain, such as Greenland or Antarctica. As seen below in the yelmo_class defintion, the 'class' is simply a user-defined Fortran type that contains additional types representing various parameters, variables or sets of module variables. type yelmo_class type(yelmo_param_class) :: par ! General domain parameters type(ygrid_class) :: grd ! Grid definition type(ytopo_class) :: tpo ! Topography variables type(ydyn_class) :: dyn ! Dynamics variables type(ymat_class) :: mat ! Material variables type(ytherm_class) :: thrm ! Thermodynamics variables type(ybound_class) :: bnd ! 
Boundary variables to drive model type(ydata_class) :: dta ! Data variables for comparison type(yregions_class) :: reg ! Regionally aggregated variables end type Likewise the module variables are defined in a similar way, e.g. ytopo_class that defines variables and parameters associated with the topography: type ytopo_class type(ytopo_param_class) :: par ! Parameters type(ytopo_state_class) :: now ! Variables end type Submodules such as ytopo_class include parameter definitions relevant to topography calculations, as well as all variables that define the state of the domain being modeled.","title":"yelmo_class"},{"location":"#example-model-domain-intialization","text":"The below code snippet shows an example of how to initialize an instance of Yelmo inside of a program, run the model forward in time and then terminate the instance. ! === Initialize ice sheet model ===== ! Initialize Yelmo objects (multiple yelmo objects can be initialized if needed) ! In this case `yelmo1` is the Yelmo object to initialize and `path_par` is the ! path to the parameter file to load for the configuration information. This ! command will also initialize the domain grid and load initial topographic ! variables. call yelmo_init(yelmo1,filename=path_par,grid_def=\"file\",time=time_init) ! === Load initial boundary conditions for current time and yelmo state ===== ! These variables can be loaded from a file, or passed from another ! component being simulated. Yelmo does not care about the source, ! it only needs all variables in the `bnd` class to be populated. ! ybound: z_bed, z_sl, H_sed, H_w, smb, T_srf, bmb_shlf, T_shlf, Q_geo yelmo1%bnd%z_bed = [2D array] yelmo1%bnd%z_sl = [2D array] yelmo1%bnd%H_sed = [2D array] yelmo1%bnd%H_w = [2D array] yelmo1%bnd%smb = [2D array] yelmo1%bnd%T_srf = [2D array] yelmo1%bnd%bmb_shlf = [2D array] yelmo1%bnd%T_shlf = [2D array] yelmo1%bnd%Q_geo = [2D array] ! Print summary of initial boundary conditions call yelmo_print_bound(yelmo1%bnd) ! Next, initialize the state variables (dyn,therm,mat) ! (in this case, initialize temps with robin method) call yelmo_init_state(yelmo1,time=time_init,thrm_method=\"robin\") ! Run yelmo for eg 100.0 years with constant boundary conditions and topo ! to equilibrate thermodynamics and dynamics ! (impose a constant, small dt=1yr to reduce possibility for instabilities) call yelmo_update_equil(yelmo1,time,time_tot=100.0,topo_fixed=.FALSE.,dt=1.0) ! == YELMO INITIALIZATION COMPLETE == ! Note: the above routines `yelmo_init_state` and `yelmo_update_equil` ! are optional, if the user prefers another way to initialize the state variables. ! == Start time looping and run the model == ! Advance timesteps do n = 1, ntot ! Get current time time = time_init + n*dt ! Update the Yelmo ice sheet call yelmo_update(yelmo1,time) ! Here you may be updating `yelmo1%bnd` variables to drive the model transiently. end do ! == Finalize Yelmo instance == call yelmo_end(yelmo1,time=time) That's it! See Getting started to see how to get the code, compile a test program and run simulations.","title":"Example model domain intialization"},{"location":"dependencies/","text":"Dependencies Yelmo is dependent on the following libraries: NetCDF Library of Iterative Solvers for Linear Systems 'runner' Python library (cxesmc fork) YelmoX is additionally dependent on the following library: FFTW (ver. 3.9+) Installation tips for each dependency can be found below. 
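Before turning to the individual dependencies, a brief aside on the class structure described earlier: because all model state lives inside a single object, fields are read and written through the nested types. The following is a minimal sketch, not taken from the Yelmo source; the field names follow the type definitions and access patterns shown above, while the surrounding program context is assumed.

```fortran
! Sketch: accessing fields of an already-initialized Yelmo instance, yelmo1.
! Field names follow the class definitions above; values are illustrative only.

! Set a boundary variable (bnd submodule): uniform surface mass balance
yelmo1%bnd%smb = 0.1     ! [m/yr], illustrative value

! Read a state variable (tpo submodule): current maximum ice thickness
print *, "Max ice thickness [m]: ", maxval(yelmo1%tpo%now%H_ice)
```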
Installing NetCDF (preferably version 4.0 or higher) The NetCDF library is typically available with different distributions (Linux, Mac, etc). Along with installing libnetcdf , it will be necessary to install the package libnetcdf-dev . Installing the NetCDF viewing program ncview is also recommended. If you want to install NetCDF from source, then you must install both the netcdf-c and subsequently netcdf-fortran libraries. The source code and installation instructions are available from the Unidata website: https://www.unidata.ucar.edu/software/netcdf/docs/getting_and_building_netcdf.html Install LIS and FFTW These packages could be installed individually and linked into the libs directory of Yelmox and Yelmo. However, to ensure the right versions are used, etc., we have now made a separate repository for managing the installation of LIS and FFTW from the versions available in that repository. This repository is managed as part of the Fast Earth System Model Community (FESMC). Please download the code from this repository and see the README for installation instructions: https://github.com/fesmc-utils Installing runner Install runner to your system's Python installation via pip , along with dependency tabulate . pip install https://github.com/cxesmc/runner/archive/refs/heads/master.zip That's it! Now check that system command job is available by running job -h . If the command is not found, it means that the Python bin directory is not available in your PATH. To add it, typically something like this is needed in your .profile or .bashrc file: PATH=${PATH}:${HOME}/.local/bin export PATH","title":"Dependencies"},{"location":"dependencies/#dependencies","text":"Yelmo is dependent on the following libraries: NetCDF Library of Iterative Solvers for Linear Systems 'runner' Python library (cxesmc fork) YelmoX is additionally dependent on the following library: FFTW (ver. 3.9+) Installation tips for each dependency can be found below.","title":"Dependencies"},{"location":"dependencies/#installing-netcdf-preferably-version-40-or-higher","text":"The NetCDF library is typically available with different distributions (Linux, Mac, etc). Along with installing libnetcdf , it will be necessary to install the package libnetcdf-dev . Installing the NetCDF viewing program ncview is also recommended. If you want to install NetCDF from source, then you must install both the netcdf-c and subsequently netcdf-fortran libraries. The source code and installation instructions are available from the Unidata website: https://www.unidata.ucar.edu/software/netcdf/docs/getting_and_building_netcdf.html","title":"Installing NetCDF (preferably version 4.0 or higher)"},{"location":"dependencies/#install-lis-and-fftw","text":"These packages could be installed individually and linked into the libs directory of Yelmox and Yelmo. However, to ensure the right versions are used, etc., we have now made a separate repository for managing the installation of LIS and FFTW from the versions available in that repository. This repository is managed as part of the Fast Earth System Model Community (FESMC). Please download the code from this repository and see the README for installation instructions: https://github.com/fesmc-utils","title":"Install LIS and FFTW"},{"location":"dependencies/#installing-runner","text":"Install runner to your system's Python installation via pip , along with dependency tabulate . pip install https://github.com/cxesmc/runner/archive/refs/heads/master.zip That's it! 
Now check that system command job is available by running job -h . If the command is not found, it means that the Python bin directory is not available in your PATH. To add it, typically something like this is needed in your .profile or .bashrc file: PATH=${PATH}:${HOME}/.local/bin export PATH","title":"Installing runner"},{"location":"example-programs/","text":"Example programs The Yelmo base code provides a static library interface that can be used in other programs, as well as a couple of stand-alone programs for running certain benchmarks. Here we provide more examples of how to use Yelmo: Program template to connect with other models/components. Stand-alone ice sheet with full boundary forcing. In both cases, it is necessary to download the Yelmo repository separately, as well as compile the Yelmo static library (see Getting started ). Program template This is a minimalistic setup that allows you to run Yelmo with no dependencies and a straightforward Makefile. This template can be used to design a new stand-alone Yelmo experiment, or to provide guidance when adding Yelmo to another program. Clone the repository from https://github.com/palma-ice/yelmot Stand-alone ice sheet with full boundary forcing (yelmox) This setup is suitable for glacial-cycle simulations, future simulations or any other typical (realistic) ice-sheet model simulation. Clone the repository from https://github.com/palma-ice/yelmox","title":"Examples"},{"location":"example-programs/#example-programs","text":"The Yelmo base code provides a static library interface that can be used in other programs, as well as a couple of stand-alone programs for running certain benchmarks. Here we provide more examples of how to use Yelmo: Program template to connect with other models/components. Stand-alone ice sheet with full boundary forcing. In both cases, it is necessary to download the Yelmo repository separately, as well as compile the Yelmo static library (see Getting started ).","title":"Example programs"},{"location":"example-programs/#program-template","text":"This is a minimalistic setup that allows you to run Yelmo with no dependencies and a straightforward Makefile. This template can be used to design a new stand-alone Yelmo experiment, or to provide guidance when adding Yelmo to another program. Clone the repository from https://github.com/palma-ice/yelmot","title":"Program template"},{"location":"example-programs/#stand-alone-ice-sheet-with-full-boundary-forcing-yelmox","text":"This setup is suitable for glacial-cycle simulations, future simulations or any other typical (realistic) ice-sheet model simulation. Clone the repository from https://github.com/palma-ice/yelmox","title":"Stand-alone ice sheet with full boundary forcing (yelmox)"},{"location":"getting-started/","text":"Getting started Here you can find the basic information and steps needed to get Yelmo running. Dependencies Yelmo dependencies: LIS YelmoX dependencies: FFTW (for FastIsostasy), FastIsostasy, REMBO1 Job submission: Python3.x, runner See: Dependencies for more details. Directory structure config/ Configuration files for compilation on different systems. input/ Location of any input data needed by the model. libs/ Auxiliary libraries necessary for running the model. libyelmo/ Folder containing all compiled files in a standard way with lib/, include/ and bin/ folders. output/ Default location for model output. par/ Default parameter files that manage the model configuration. src/ Source code for Yelmo. 
tests/ Source code and analysis scripts for specific model benchmarks and tests. Usage Follow the steps below to (1) obtain the code, (2) configure the Makefile for your system, (3) compile the Yelmo static library and an executable program and (4) run a test simulation. 1. Get the code Clone the repository from https://github.com/palma-ice/yelmo : # Clone repository git clone https://github.com/palma-ice/yelmo.git $YELMOROOT git clone git@github.com:palma-ice/yelmo.git $YELMOROOT # via ssh cd $YELMOROOT where $YELMOROOT is the installation directory. If you plan to make changes to the code, it is wise to check out a new branch: git checkout -b user-dev You should now be working on the branch user-dev . 2. Create the system-specific Makefile To compile Yelmo, you need to generate a Makefile that is appropriate for your system. In the folder config , you need to specify a configuration file that defines the compiler and flags, including definition of the paths to the NetCDF and LIS libraries. You can use another file in the config folder as a template, e.g., cd config cp pik_ifort myhost_mycompiler then modify the file myhost_mycompiler to match your paths. Back in $YELMOROOT , you can then generate your Makefile with the provided python configuration script: cd $YELMOROOT python3 config.py config/myhost_mycompiler The result should be a Makefile in $YELMOROOT that is ready for use. 3. Prepare system-specific .runme_config file To use the runme script for submitting jobs, first you need to configure a few options to match the system you are using (so the script knows which queues are available, etc.). To do so, first copy the template config file to your directory: cp .runme/runme_config .runme_config Next, edit the file. If you are running on an HPC with a job submission system via SLURM, then specify the right HPC. So far the available HPCs are defined in the file .runme/queues_info.json . If you have a new HPC, you should add the information here and inform the runme developers to add it to the main repository. You should also specify the account associated with your jobs on the HPC (which usually indicates the resources available to you on the system). Finally, if you have not already, make sure to install the Python runner module via: pip install https://github.com/cxesmc/runner/archive/refs/heads/master.zip See Dependencies for more details if you have trouble. 3. Link to external libraries The external libraries held in the fesm-utils repository need to be linked here for use with Yelmo: ln -s $FESMUSRC ./libs/ Note that $FESMUSRC should be the root directory where fesm-utils was downloaded, and it should be an absolute path. 4. Compile the code Now you are ready to compile Yelmo as a static library: make clean # This step is very important to avoid errors!! make yelmo-static [debug=1] This will compile all of the Yelmo modules and libraries (as defined in config/Makefile_yelmo.mk ), and link them in a static library. All compiled files can be found in the folder libyelmo/ . Once the static library has been compiled, it can be used inside of external Fortran programs and modules via the statement use yelmo . To include/link yelmo-static during compilation of another program, its location must be defined: INC_YELMO = -I${YELMOROOT}/include LIB_YELMO = -L${YELMOROOT}/include -lyelmo Alternatively, several test programs exist in the folder tests/ to run Yelmo as a stand-alone ice sheet. 
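To make the linking step above concrete, a minimal external driver might look like the following sketch. The routine and type names are taken from the usage example on the home page; the parameter-file path, time-stepping values and Makefile details are placeholders, and the boundary-field setup is elided.

```fortran
! Minimal sketch of an external program linked against yelmo-static
! (compiled with the INC_YELMO/LIB_YELMO flags defined above).
! Placeholder values throughout; see the home-page example for full setup.
program yelmo_driver

    use yelmo

    implicit none

    type(yelmo_class) :: yelmo1
    real(8) :: time, time_init, dt
    integer :: n, ntot

    time_init = 0.0
    dt        = 1.0
    ntot      = 100

    ! Initialize the domain from a parameter file (placeholder path)
    call yelmo_init(yelmo1,filename="par/yelmo_EISMINT.nml",grid_def="file",time=time_init)

    ! ... populate yelmo1%bnd fields and call yelmo_init_state here ...

    ! Advance the ice sheet through time
    do n = 1, ntot
        time = time_init + n*dt
        call yelmo_update(yelmo1,time)
    end do

    call yelmo_end(yelmo1,time=time)

end program yelmo_driver
```

The bundled test programs mentioned above provide ready-made alternatives.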
For example, it's possible to run different EISMINT benchmarks, MISMIP benchmarks and the ISIMIP6 INITMIP simulation for Greenland, respectively: make benchmarks # compiles the program `libyelmo/bin/yelmo_benchmarks.x` make mismip # compiles the program `libyelmo/bin/yelmo_mismip.x` make initmip # compiles the program `libyelmo/bin/yelmo_initmip.x` The Makefile additionally allows you to specify debugging compiler flags with the option debug=1 , in case you need to debug the code (e.g., make benchmarks debug=1 ). Using this option, the code will run much slower, so this option is not recommended unless necessary. 5. Run the model Once an executable has been created, you can run the model. This can be achieved via the included Python job submission script runme . The following steps are carried out via the script: The output directory is created. The executable is copied to the output directory. The relevant parameter files are copied to the output directory. Links to the input data paths ( input and ice_data ) are created in the output directory. Note that many simulations, such as benchmark experiments, do not depend on these external data sources, but the links are made anyway. The executable is run from the output directory, either as a background process or it is submitted to the queue via sbatch (the SLURM workload manager). To run a benchmark simulation, for example, use the following command: ./runme -r -e benchmarks -o output/test -n par/yelmo_EISMINT.nml where the option -r implies that the model should be run as a background process. If this is omitted, then the output directory will be populated, but no executable will be run, while -s will instead submit the simulation to the cluster queue system rather than running it in the background. The option -e lets you specify the executable. For some standard cases, shortcuts have been created: benchmarks = libyelmo/bin/yelmo_benchmarks.x mismip = libyelmo/bin/yelmo_mismip.x initmip = libyelmo/bin/yelmo_initmip.x The last two mandatory arguments -o OUTDIR and -n PAR_PATH are the output/run directory and the parameter file to be used for this simulation, respectively. In the case of the above simulation, the output directory is defined as output/test , where all model parameters (loaded from the file par/yelmo_EISMINT.nml ) and model output can be found. It is also possible to modify parameters inline via the option -p KEY=VAL [KEY=VAL ...] . The parameter should be specified with its namelist group and its name. E.g., to change the resolution of the EISMINT benchmark experiment to 10km, use: ./runme -r -e benchmarks -o output/test -n par/yelmo_EISMINT.nml -p ctrl.dx=10 See runme -h for more details on the run script. Test cases The published model description includes several test simulations for validation of the model's performance. The following section describes how to perform these tests using the same model version documented in the article. From this point, it is assumed that the user has already configured the model for their system (see https://palma-ice.github.io/yelmo-docs ) and is ready to compile the model. 1. EISMINT1 moving margin experiment To perform the moving margin experiment, compile the benchmarks executable and call it with the EISMINT parameter file: make benchmarks ./runme -r -e benchmarks -o output/eismint-moving -n par-gmd/yelmo_EISMINT_moving.nml 2. 
EISMINT2 EXPA To perform Experiment A from the EISMINT2 benchmarks, compile the benchmarks executable and call it with the EXPA parameter file: make benchmarks ./runme -r -e benchmarks -o output/eismint-expa -n par-gmd/yelmo_EISMINT_expa.nml 3. EISMINT2 EXPF To perform Experiment F from the EISMINT2 benchmarks, compile the benchmarks executable and call it with the EXPF parameter file: make benchmarks ./runme -r -e benchmarks -o output/eismint-expf -n par-gmd/yelmo_EISMINT_expf.nml 4. MISMIP RF To perform the MISMIP rate factor experiment, compile the mismip executable and call it with the MISMIP parameter file for the three parameter permutations of interest (default, subgrid and subgrid+gl-scaling): make mismip ./runme -r -e mismip -o output/mismip-rf-0 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=0 ydyn.beta_gl_scale=0 ./runme -r -e mismip -o output/mismip-rf-1 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=3 ydyn.beta_gl_scale=0 ./runme -r -e mismip -o output/mismip-rf-2 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=3 ydyn.beta_gl_scale=2 To additionally change the resolution of the simulations, change the parameter mismip.dx , e.g. for the default simulation with 10km resolution , call: ./runme -r -e mismip -o output/mismip-rf-0-10km -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=0 ydyn.beta_gl_scale=0 mismip.dx=10 5. Age profile experiments To perform the age profile experiments, compile the Fortran program tests/test_icetemp.f90 and run it: make icetemp ./libyelmo/bin/test_icetemp.x To perform the different permutations, it is necessary to recompile for single or double precision after changing the precision parameter prec in the file src/yelmo_defs.f90 . The number of vertical grid points can be specified in the main program file, as well as the output filename. 6. Antarctica present-day and glacial simulations To perform the Antarctica simulations as presented in the paper, it is necessary to compile the initmip executable and run with the present-day (pd) and glacial (lgm) parameter values: make initmip ./runme -r -e initmip -o output/ant-pd -n par-gmd/yelmo_Antarctica.nml -p ctrl.clim_nm=\"clim_pd\" ./runme -r -e initmip -o output/ant-lgm -n par-gmd/yelmo_Antarctica.nml -p ctrl.clim_nm=\"clim_lgm\"","title":"Getting started"},{"location":"getting-started/#getting-started","text":"Here you can find the basic information and steps needed to get Yelmo running.","title":"Getting started"},{"location":"getting-started/#dependencies","text":"Yelmo dependencies: LIS YelmoX dependencies: FFTW (for FastIsostasy), FastIsostasy, REMBO1 Job submission: Python3.x, runner See: Dependencies for more details.","title":"Dependencies"},{"location":"getting-started/#directory-structure","text":"config/ Configuration files for compilation on different systems. input/ Location of any input data needed by the model. libs/ Auxiliary libraries necessary for running the model. libyelmo/ Folder containing all compiled files in a standard way with lib/, include/ and bin/ folders. output/ Default location for model output. par/ Default parameter files that manage the model configuration. src/ Source code for Yelmo. 
tests/ Source code and analysis scripts for specific model benchmarks and tests.","title":"Directory structure"},{"location":"getting-started/#usage","text":"Follow the steps below to (1) obtain the code, (2) configure the Makefile for your system, (3) compile the Yelmo static library and an executable program and (4) run a test simulation.","title":"Usage"},{"location":"getting-started/#1-get-the-code","text":"Clone the repository from https://github.com/palma-ice/yelmo : # Clone repository git clone https://github.com/palma-ice/yelmo.git $YELMOROOT git clone git@github.com:palma-ice/yelmo.git $YELMOROOT # via ssh cd $YELMOROOT where $YELMOROOT is the installation directory. If you plan to make changes to the code, it is wise to check out a new branch: git checkout -b user-dev You should now be working on the branch user-dev .","title":"1. Get the code"},{"location":"getting-started/#2-create-the-system-specific-makefile","text":"To compile Yelmo, you need to generate a Makefile that is appropriate for your system. In the folder config , you need to specify a configuration file that defines the compiler and flags, including definition of the paths to the NetCDF and LIS libraries. You can use another file in the config folder as a template, e.g., cd config cp pik_ifort myhost_mycompiler then modify the file myhost_mycompiler to match your paths. Back in $YELMOROOT , you can then generate your Makefile with the provided python configuration script: cd $YELMOROOT python3 config.py config/myhost_mycompiler The result should be a Makefile in $YELMOROOT that is ready for use.","title":"2. Create the system-specific Makefile"},{"location":"getting-started/#3-prepare-system-specific-runme_config-file","text":"To use the runme script for submitting jobs, first you need to configure a few options to match the system you are using (so the script knows which queues are available, etc.). To do so, first copy the template config file to your directory: cp .runme/runme_config .runme_config Next, edit the file. If you are running on an HPC with a job submission system via SLURM, then specify the right HPC. So far the available HPCs are defined in the file .runme/queues_info.json . If you have a new HPC, you should add the information here and inform the runme developers to add it to the main repository. You should also specify the account associated with your jobs on the HPC (which usually indicates the resources available to you on the system). Finally, if you have not already, make sure to install the Python runner module via: pip install https://github.com/cxesmc/runner/archive/refs/heads/master.zip See Dependencies for more details if you have trouble.","title":"3. Prepare system-specific .runme_config file"},{"location":"getting-started/#3-link-to-external-libraries","text":"The external libraries held in the fesm-utils repository need to be linked here for use with Yelmo: ln -s $FESMUSRC ./libs/ Note that $FESMUSRC should be the root directory where fesm-utils was downloaded, and it should be an absolute path.","title":"3. Link to external libraries"},{"location":"getting-started/#4-compile-the-code","text":"Now you are ready to compile Yelmo as a static library: make clean # This step is very important to avoid errors!! make yelmo-static [debug=1] This will compile all of the Yelmo modules and libraries (as defined in config/Makefile_yelmo.mk ), and link them in a static library. All compiled files can be found in the folder libyelmo/ . 
Once the static library has been compiled, it can be used inside of external Fortran programs and modules via the statement use yelmo . To include/link yelmo-static during compilation of another program, its location must be defined: INC_YELMO = -I${YELMOROOT}/include LIB_YELMO = -L${YELMOROOT}/include -lyelmo Alternatively, several test programs exist in the folder tests/ to run Yelmo as a stand-alone ice sheet. For example, it's possible to run different EISMINT benchmarks, MISMIP benchmarks and the ISIMIP6 INITMIP simulation for Greenland, respectively: make benchmarks # compiles the program `libyelmo/bin/yelmo_benchmarks.x` make mismip # compiles the program `libyelmo/bin/yelmo_mismip.x` make initmip # compiles the program `libyelmo/bin/yelmo_initmip.x` The Makefile additionally allows you to specify debugging compiler flags with the option debug=1 , in case you need to debug the code (e.g., make benchmarks debug=1 ). Using this option, the code will run much slower, so this option is not recommended unless necessary.","title":"4. Compile the code"},{"location":"getting-started/#5-run-the-model","text":"Once an executable has been created, you can run the model. This can be achieved via the included Python job submission script runme . The following steps are carried out via the script: The output directory is created. The executable is copied to the output directory The relevant parameter files are copied to the output directory. Links to the input data paths ( input and ice_data ) are created in the output directory. Note that many simulations, such as benchmark experiments, do not depend on these external data sources, but the links are made anyway. The executable is run from the output directory, either as a background process or it is submitted to the queue via sbatch (the SLURM workload manager). To run a benchmark simulation, for example, use the following command: ./runme -r -e benchmarks -o output/test -n par/yelmo_EISMINT.nml where the option -r implies that the model should be run as a background process. If this is omitted, then the output directory will be populated, but no executable will be run, while -s instead will submit the simulation to cluster queue system instead of running in the background. The option -e lets you specify the executable. For some standard cases, shortcuts have been created: benchmarks = libyelmo/bin/yelmo_benchmarks.x mismip = libyelmo/bin/yemo_mismip.x initmip = libyelmo/bin/yelmo_initmip.x The last two mandatory arguments -o OUTDIR and -n PAR_PATH are the output/run directory and the parameter file to be used for this simulation, respectively. In the case of the above simulation, the output directory is defined as output/test , where all model parameters (loaded from the file par/yelmo_EISMINT.nml ) and model output can be found. It is also possible to modify parameters inline via the option -p KEY=VAL [KEY=VAL ...] . The parameter should be specified with its namelist group and its name. E.g., to change the resolution of the EISMINT benchmark experiment to 10km, use: ./runme -r -e benchmarks -o output/test -n par/yelmo_EISMINT.nml -p ctrl.dx=10 See runme -h for more details on the run script.","title":"5. Run the model"},{"location":"getting-started/#test-cases","text":"The published model description includes several test simulations for validation of the model's performance. The following section describes how to perform these tests using the same model version documented in the article. 
From this point, it is assumed that the user has already configured the model for their system (see https://palma-ice.github.io/yelmo-docs ) and is ready to compile the mode.","title":"Test cases"},{"location":"getting-started/#1-eismint1-moving-margin-experiment","text":"To perform the moving margin experiment, compile the benchmarks executable and call it with the EISMINT parameter file: make benchmarks ./runme -r -e benchmarks -o output/eismint-moving -n par-gmd/yelmo_EISMINT_moving.nml","title":"1. EISMINT1 moving margin experiment"},{"location":"getting-started/#2-eismint2-expa","text":"To perform Experiment A from the EISMINT2 benchmarks, compile the benchmarks executable and call it with the EXPA parameter file: make benchmarks ./runme -r -e benchmarks -o output/eismint-expa -n par-gmd/yelmo_EISMINT_expa.nml","title":"2. EISMINT2 EXPA"},{"location":"getting-started/#3-eismint2-expf","text":"To perform Experiment F from the EISMINT2 benchmarks, compile the benchmarks executable and call it with the EXPF parameter file: make benchmarks ./runme -r -e benchmarks -o output/eismint-expf -n par-gmd/yelmo_EISMINT_expf.nml","title":"3. EISMINT2 EXPF"},{"location":"getting-started/#4-mismip-rf","text":"To perform the MISMIP rate factor experiment, compile the mismip executable and call it with the MISMIP parameter file the three parameter permutations of interest (default, subgrid and subgrid+gl-scaling): make mismip ./runme -r -e mismip -o output/mismip-rf-0 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=0 ydyn.beta_gl_scale=0 ./runme -r -e mismip -o output/mismip-rf-1 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=3 ydyn.beta_gl_scale=0 ./runme -r -e mismip -o output/mismip-rf-2 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=3 ydyn.beta_gl_scale=2 To additionally change the resolution of the simulations change the parameter mismip.dx , e.g. for the default simulation with 10km resolution , call: ./runme -r -e mismip -o output/mismip-rf-0-10km -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=0 ydyn.beta_gl_scale=0 mismip.dx=10","title":"4. MISMIP RF"},{"location":"getting-started/#5-age-profile-experiments","text":"To perform the age profile experiments, compile the Fortran program tests/test_icetemp.f90 and run it: make icetemp ./libyelmo/bin/test_icetemp.x To perform the different permutations, it is necessary to recompile for single or double precision after changing the precision parameter prec in the file src/yelmo_defs.f90 . The number of vertical grid points can be specified in the main program file, as well as the output filename.","title":"5. Age profile experiments"},{"location":"getting-started/#6-antarctica-present-day-and-glacial-simulations","text":"To perform the Antarctica simulations as presented in the paper, it is necessary to compile the initmip executable and run with the present-day (pd) and glacial (lgm) parameter values: make initmip ./runme -r -e initmip -o output/ant-pd -n par-gmd/yelmo_Antarctica.nml -p ctrl.clim_nm=\"clim_pd\" ./runme -r -e initmip -o output/ant-lgm -n par-gmd/yelmo_Antarctica.nml -p ctrl.clim_nm=\"clim_lgm\"","title":"6. Antarctica present-day and glacial simulations"},{"location":"hpc-notes/","text":"HPC Notes Running at PIK on HPC2024 (foote) The following modules have to be loaded in order to compile and run the model. For convenience you can also add those commands to your .profile file in your home directory. 
module purge module use /p/system/modulefiles/compiler \\ /p/system/modulefiles/gpu \\ /p/system/modulefiles/libraries \\ /p/system/modulefiles/parallel \\ /p/system/modulefiles/tools module load intel/oneAPI/2024.0.0 module load netcdf-c/4.9.2 module load netcdf-fortran-intel/4.6.1 module load udunits/2.2.28 module load ncview/2.1.10 module load cdo/2.4.2 When installing climber-x-exlib (see further below) use the pik script: ./install_pik.sh ifx To link to data sources, use the following path: datapath=/p/projects/megarun Running at AWI on albedo Load the following modules in your .bashrc or .bash_profile file in your home directory. module load intel-oneapi-compilers/2024.0.0 module load netcdf-c/4.8.1-openmpi4.1.3-oneapi2022.1.0 module load netcdf-fortran/4.5.4-oneapi2022.1.0 module load udunits/2.2.28 module load ncview/2.1.8 module load cdo/2.2.0 module load python/3.10.4 When installing climber-x-exlib (see further below) use the awi script (which is a link to the dkrz script): ./install_awi.sh ifx To link to data sources, use the following path: datapath=/albedo/work/projects/p_forclima Running at DKRZ on levante Load the following modules in your .bashrc file in your home directory. # Tools module load cdo/2.4.0-gcc-11.2.0 module load esmvaltool/2.5.0 module load ncview/2.1.8-gcc-11.2.0 module load git/2.43.3-gcc-11.2.0 module load python3/2023.01-gcc-11.2.0 # Compilers and libs module load intel-oneapi-compilers/2023.2.1-gcc-11.2.0 module load netcdf-c/4.8.1-openmpi-4.1.2-intel-2021.5.0 module load netcdf-fortran/4.5.3-openmpi-4.1.2-intel-2021.5.0 When installing fesm-utils (see Dependencies ) use the dkrz script: ./install_dkrz.sh ifx To link to data sources, use the following path: datapath=/work/ba1442","title":"HPC notes"},{"location":"hpc-notes/#hpc-notes","text":"","title":"HPC Notes"},{"location":"hpc-notes/#running-at-pik-on-hpc2024-foote","text":"The following modules have to be loaded in order to compile and run the model. For convenience you can also add those commands to your .profile file in your home directory. module purge module use /p/system/modulefiles/compiler \\ /p/system/modulefiles/gpu \\ /p/system/modulefiles/libraries \\ /p/system/modulefiles/parallel \\ /p/system/modulefiles/tools module load intel/oneAPI/2024.0.0 module load netcdf-c/4.9.2 module load netcdf-fortran-intel/4.6.1 module load udunits/2.2.28 module load ncview/2.1.10 module load cdo/2.4.2 When installing climber-x-exlib (see further below) use the pik script: ./install_pik.sh ifx To link to data sources, use the following path: datapath=/p/projects/megarun","title":"Running at PIK on HPC2024 (foote)"},{"location":"hpc-notes/#running-at-awi-on-albedo","text":"Load the following modules in your .bashrc or .bash_profile file in your home directory. module load intel-oneapi-compilers/2024.0.0 module load netcdf-c/4.8.1-openmpi4.1.3-oneapi2022.1.0 module load netcdf-fortran/4.5.4-oneapi2022.1.0 module load udunits/2.2.28 module load ncview/2.1.8 module load cdo/2.2.0 module load python/3.10.4 When installing climber-x-exlib (see further below) use the awi script (which is a link to the dkrz script): ./install_awi.sh ifx To link to data sources, use the following path: datapath=/albedo/work/projects/p_forclima","title":"Running at AWI on albedo"},{"location":"hpc-notes/#running-at-dkrz-on-levante","text":"Load the following modules in your .bashrc file in your home directory. 
# Tools module load cdo/2.4.0-gcc-11.2.0 module load esmvaltool/2.5.0 module load ncview/2.1.8-gcc-11.2.0 module load git/2.43.3-gcc-11.2.0 module load python3/2023.01-gcc-11.2.0 # Compilers and libs module load intel-oneapi-compilers/2023.2.1-gcc-11.2.0 module load netcdf-c/4.8.1-openmpi-4.1.2-intel-2021.5.0 module load netcdf-fortran/4.5.3-openmpi-4.1.2-intel-2021.5.0 When installing fesm-utils (see Dependencies ) use the dkrz script: ./install_dkrz.sh ifx To link to data sources, use the following path: datapath=/work/ba1442","title":"Running at DKRZ on levante"},{"location":"jupyter-over-ssh/","text":"How to use Jupyter Notebook over ssh Step 1 On the remote machine, open a Jupyter Notebook instance by running: jupyter notebook --no-browser --port 1235 Here port 1235 is chosen, but another port could be used too. In the remote terminal, this message should appear: http://localhost:1235/?token=LARGERANDOMNUMBER Step 2 Open another terminal on the local machine and run: ssh -L 1235:localhost:1235 user@snowball.fis.ucm.es IMPORTANT: use the same port as chosen in step (1). Step 3 Go back to the remote terminal and copy the link shown into a browser on the local machine. The Jupyter Notebook running on the remote machine should now be open in a local browser. Enjoy!","title":"How to use Jupyter Notebook over ssh"},{"location":"jupyter-over-ssh/#how-to-use-jupyter-notebook-over-ssh","text":"","title":"How to use Jupyter Notebook over ssh"},{"location":"jupyter-over-ssh/#step-1","text":"On the remote machine, open a Jupyter Notebook instance by running: jupyter notebook --no-browser --port 1235 Here port 1235 is chosen, but another port could be used too. In the remote terminal, this message should appear: http://localhost:1235/?token=LARGERANDOMNUMBER","title":"Step 1"},{"location":"jupyter-over-ssh/#step-2","text":"Open another terminal on the local machine and run: ssh -L 1235:localhost:1235 user@snowball.fis.ucm.es IMPORTANT: use the same port as chosen in step (1).","title":"Step 2"},{"location":"jupyter-over-ssh/#step-3","text":"Go back to the remote terminal and copy the link shown into a browser on the local machine. The Jupyter Notebook running on the remote machine should now be open in a local browser. Enjoy!","title":"Step 3"},{"location":"notes/","text":"Notes timeout module Example parameters for using the timeout module. The name of the section should be specified when calling timeout_init . &tm_1D method = \"file\" ! \"const\", \"file\", \"times\" dt = 1.0 file = \"input/timeout_ramp_100kyr.txt\" times = -10, -5, 0, 1, 2, 3, 4, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55 / ! Get output times call timeout_init(tm_1D,path_par,\"tm_1D\",\"small\", time_init,time_end) If we are loading the desired output times from a file, the format is one time per line, or a range of times using the format t0:dt:t1 : 0:10:200 200:20:300 300:50:500 500:100:1000 1000:200:5000 5000:500:10000 10e3:1e3:20e3 20e3:2e3:200e3 200e3:5e3:1e6 Duplicate times will be removed, as well as times outside of the range of time_init and time_end . In this way, once timeout_init is called, we know how many timesteps of output will be generated. This can help confirm that we designed the experiment well, and how much data to expect. 
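As an aside, the expansion of a t0:dt:t1 entry can be illustrated with a few lines of Fortran. This is only a sketch of what the format means, not the timeout module's actual parser:

```fortran
! Illustrative sketch: expand one "t0:dt:t1" range specification into
! individual output times (not the timeout module's actual parser).
program expand_range
    implicit none
    character(len=32) :: spec
    real    :: t0, dt, t1, t
    integer :: i1, i2

    spec = "300:50:500"
    i1 = index(spec, ":")                 ! position of first colon
    i2 = index(spec, ":", back=.true.)    ! position of last colon
    read(spec(1:i1-1), *) t0
    read(spec(i1+1:i2-1), *) dt
    read(spec(i2+1:), *) t1

    t = t0
    do while (t <= t1)
        print *, t                        ! prints 300, 350, 400, 450, 500
        t = t + dt
    end do
end program expand_range
```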
Then during the timeloop, simply use the function timeout_check to determine if the current time should be written to output: if (timeout_check(tm_1D,time)) then !call write_step_1D_combined(yelmo1,hyst1,file1D_hyst,time=time) end if timing module All calls to the intrinsic routine cpu_time() have been replaced by timing calculations performed in the new timing module. This has the benefit of ensuring timing will work properly for parallel and serial programs, and allows us to keep track of multiple timing objectives with one simple object and subroutine. The control of a timing object is handled via timer_step : First, initialize and reset the timer object: call timer_step(tmrs,comp=-1) Then, e.g., within the timeloop, get the timing for isostasy calls and for Yelmo calls: call timer_step(tmrs,comp=0) ! == ISOSTASY ========================================================== call isos_update(isos1,yelmo1%tpo%now%H_ice,yelmo1%bnd%z_sl,time,yelmo1%bnd%dzbdt_corr) yelmo1%bnd%z_bed = isos1%now%z_bed call timer_step(tmrs,comp=1,time_mod=[time-dtt_now,time]*1e-3,label=\"isostasy\") ! Update ice sheet to current time call yelmo_update(yelmo1,time) call timer_step(tmrs,comp=2,time_mod=[time-dtt_now,time]*1e-3,label=\"yelmo\") The option comp tells us which component the timing is being calculated for, and we can additionally provide a label to associate with this component. This is useful for printing a table later. After all components have been calculated, we can print to a summary file: if (mod(time_elapsed,10.0)==0) then ! Print timestep timing info and write log table call timer_write_table(tmrs,[time,dtt_now]*1e-3,\"m\",tmr_file,init=time_elapsed .eq. 0.0) end if The resulting file will look something like this, here for 4 components measured during the time loop: time dt yelmo isostasy climate io total rate 0.000 0.010 0.000 0.000 0.016 0.051 0.067 6.694 0.010 0.010 0.000 0.000 0.025 0.000 0.025 2.533 0.020 0.010 0.000 0.000 0.024 0.000 0.025 2.458 Based on the options supplied, the time units are in [m] and the model time in [kyr] . The rate is then calculated as [m/kyr] - this is the inverse of what we used to measure [kyr/hr] . The rate as defined now is easier to manage in terms of summing the contribution of different components, and so is preferred moving forward. To recover [kyr/hr] , simply take 60/rate. master to main Following updated conventions, the default branch is now called main and the branch master has been deleted. To update a working copy locally that already contains a master branch and therefore points to it as the default branch, the following steps should be applied: Get the branch main . Delete the local branch master . Make sure your local repository sees main as the default branch. # Get all branch information from the origin (github): git fetch --all # Get onto the new default branch: git checkout main # Delete the branch master: git branch -d master # Clean up any branches that no longer exist at origin: git fetch --prune origin # Set the local 'head' to whatever is specified at the origin (which will be main): git remote set-head origin -a Done! Now your local copy should work like normal, with main instead of master . 
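Putting the timeout and timing utilities together, a time loop might combine them roughly as follows. This is a sketch that reuses the names introduced in the snippets above (tmrs, tm_1D, yelmo1, dtt_now, tmr_file); declarations are omitted and the component breakdown and output cadence are illustrative:

```fortran
! Sketch: one time loop combining timer_step, timeout_check and
! timer_write_table, using the names from the snippets above.
call timer_step(tmrs,comp=-1)      ! initialize and reset the timer object

do n = 1, ntot

    time = time_init + n*dtt_now

    call timer_step(tmrs,comp=0)   ! mark the start of this timestep's timing

    call yelmo_update(yelmo1,time)
    call timer_step(tmrs,comp=1,time_mod=[time-dtt_now,time]*1e-3,label="yelmo")

    ! Write output only at the times selected by the timeout module
    if (timeout_check(tm_1D,time)) then
        ! call write_step_1D_combined(...) here
    end if

    ! Periodically append a row to the timing log table
    if (mod(time,10.0) == 0.0) then
        call timer_write_table(tmrs,[time,dtt_now]*1e-3,"m",tmr_file,init=(n .eq. 1))
    end if

end do
```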
Thermodynamics equations Ice column Prognostic equation: \frac{\partial T}{\partial t} = \frac{k}{\rho c} \frac{\partial^2 T}{\partial z^2} - u \frac{\partial T}{\partial x} - v \frac{\partial T}{\partial y} - w \frac{\partial T}{\partial z} + \frac{\Phi}{\rho c} Ice surface boundary condition: T(z=z_{\rm srf}) = {\rm min}(T_{\rm 2m},T_0) Ice base (temperate) boundary condition: T(z=z_{\rm bed}) = T_{\rm pmp} Ice base (frozen) boundary condition: k \frac{\partial T}{\partial z} = k_r \frac{\partial T_r}{\partial z} Note, the following internal Yelmo variables are defined for convenience: Q_{\rm ice,b} = -k \frac{\partial T}{\partial z}; \quad Q_{\rm rock} = -k_r \frac{\partial T_r}{\partial z} Bedrock column Prognostic equation: \frac{\partial T_r}{\partial t} = \frac{k_r}{\rho_r c_r} \frac{\partial^2 T_r}{\partial z^2} Bedrock surface boundary condition: T_r(z=z_{\rm bed}) = T(z=z_{\rm bed}) Bedrock base boundary condition: \frac{\partial T_r}{\partial z} = -\frac{Q_{\rm geo}}{k_r} Equilibrium bedrock In this case, the bedrock temperature profile is prescribed to the equilibrium linear temperature profile. The slope follows: \frac{\partial T_r}{\partial z} = -\frac{Q_{\rm geo}}{k_r} and the bedrock surface temperature is given by the ice temperature at its base: T_r(z=z_{\rm bed}) = T(z=z_{\rm bed}) Active bedrock Yelmo calculates the temperature in the lithosphere along with the ice temperature. This can be achieved by assuming equilibrium conditions in the bedrock, i.e., that the temperature profile in the bedrock is always linear with T_lith_s = T_ice_b and the slope equal to dT/dz = -Q_geo / k_lith . Or, the temperature equation can be solved in the lithosphere together with the temperature in the ice column. The parameter block ytherm_lith controls how the lithosphere is calculated with ytherm_lith.method=['equil','active'] deciding the two cases above. Density of the upper lithosphere Heat capacity of the upper lithosphere In both SICOPOLIS and GRISLI, a value of cp = 1000.0 [J kg-1 K-1] is used (referenced in Rogozhina et al., 2012; Greve, 2005; Greve, 1997). This value is adopted in Yelmo as well. cp = 1000.0 ! [J kg-1 K-1] Heat conductivity of the upper lithosphere Note, Yelmo expects input parameter values in units of [J a-1 m-1 K-1] , while much literature uses [W m-1 K-1] . Given the number of seconds in a year sec_year = 31536000.0 , kt [W m-1 K-1] * sec_year = kt [J a-1 m-1 K-1] . Rogozhina et al. (2012) use kt = 2 [W m-1 K-1] for Greenland: kt = 6.3e7 ! [J a-1 m-1 K-1] This value is supported by L\u00f6sing et al. (2020), who perform a Bayesian inversion for GHF in Antarctica. Assuming exponentially decreasing heat production with depth, lower values of kt are supported (see Fig. 7b). In a study on the global thermal characteristics of the lithosphere, Cammarano and Guerri (2017) adopt an upper crust thermal conductivity of kt = 2.5 [W m-1 K-1] . To do: This study is also potentially relevant: https://link.springer.com/article/10.1186/s40517-020-0159-y . They show ranges on the order of kt = 2-3 [W m-1 K-1] for the Canadian shield. The above value of kt = 2 [W m-1 K-1] = 6.3e7 [J a-1 m-1 K-1] is adopted as the default thermal conductivity of the upper crust in Yelmo. For historical context, see other estimates below. From Greve (1997) and Greve (2005): kt = 9.46e7 ! [J a-1 m-1 K-1] which is equivalent to kt = 3 [W m-1 K-1] . The source of this value is not known. From GRISLI: kt = 1.04e8 ! 
[J a-1 m-1 K-1] which is equivalent to kt = 3.3 [W m-1 K-1] . The source of this value is not known. How to read yelmo_check_kill output The subroutine yelmo_check_kill is used to see if any instability is arising in the model. If so, then a restart file is written at that moment (the earlier in the instability, the better), and the model is stopped with diagnostic output to the log file. Note that pc_eps is the parameter that defines our target error tolerance in the time stepping of ice thickness evolution. At each time step, the diagnosed model error pc_eta is compared with pc_eps . If pc_eta >> pc_eps , this is interpreted as instability and the model is stopped. Margin-front mass balance Following Pollard and DeConto (2012,2016), an ice-margin front melting scheme has been implemented that accounts for the melt rate along the vertical face of ice submerged by seawater. The frontal mass balance ($\dot{f}$, m yr$^{-1}$) is calculated as: \dot{f} = \dot{b}_{\rm eff} \frac{A_f}{A_{\rm tot}} \theta_f where $\dot{b}_{\rm eff}$ is the effective basal mass balance (the mean of the basal mass balance calculated for the ice-free neighbors), $A_{\rm tot}=\Delta x \Delta x$ is the horizontal grid area and $A_f$ is the area of the submerged faces (i.e., the sum of the depth of submerged ice for each face of the grid cell adjacent to an ice-free cell -- potentially four faces in total). $\theta_f=10$ is a scaling coefficient that implies the face mass balance should be ~10 times higher than the basal mass balance (Pollard and DeConto, 2016, appendix). Calving schemes Here is a summary of calving schemes. Lipscomb et al. (2019) c = k_\tau \tau_{\rm ec} where $k_\tau$ (m yr$^{-1}$ Pa$^{-1}$) is an empirical constant and $\tau_{\rm ec}$ (Pa) is the effective calving stress, which is defined by: \tau_{\rm ec}^2 = \max(\tau_1,0)^2 + \omega_2 \max(\tau_2,0)^2 $\tau_1$ and $\tau_2$ are the eigenvalues of the 2D horizontal deviatoric stress tensor and $\omega_2$ is an empirical weighting constant. For partially ice-covered grid cells (with $f_{\rm ice} < 1$), these stresses are taken from the upstream neighbor. The eigenvalues $\tau_1$ and $\tau_2$ are calculated from the depth-averaged (2D) stress tensor $\tau_{\rm ij}$ as follows. Given the stress tensor components $\tau_{\rm xx}$, $\tau_{\rm yy}$ and $\tau_{\rm xy}$, we can solve for the real roots $\lambda$ of the tensor from the quadratic equation: a \lambda^2 + b \lambda + c = 0 where a = 1.0 \\ b = -(\tau_{\rm xx} + \tau_{\rm yy}) \\ c = \tau_{\rm xx}*\tau_{\rm yy} - \tau_{\rm xy}^2 glissade_velo_higher.F90: tau_xz(k,i,j) = tau_xz(k,i,j) + efvs_qp * du_dz ! 2 * efvs * eps_xz tau_yz(k,i,j) = tau_yz(k,i,j) + efvs_qp * dv_dz ! 2 * efvs * eps_yz tau_xx(k,i,j) = tau_xx(k,i,j) + 2.d0 * efvs_qp * du_dx ! 2 * efvs * eps_xx tau_yy(k,i,j) = tau_yy(k,i,j) + 2.d0 * efvs_qp * dv_dy ! 2 * efvs * eps_yy tau_xy(k,i,j) = tau_xy(k,i,j) + efvs_qp * (dv_dx + du_dy) ! 2 * efvs * eps_xy Vertical velocity w = u_b \frac{\partial b}{\partial x} + v_b \frac{\partial b}{\partial y} - \int_b^z \left( \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} \right) dz' Ice margin, calving rates, mass conservation ajr, 2021-06-22 Through v1.42 , ice margins were not fully consistently treated in Yelmo. This has been thoroughly revised. Now the following should be true: The variable f_ice contains information of the ice area fraction of a grid cell. 
If f_ice=0 , no ice is present, if f_ice=1 , the cell is fully ice covered, and for a fractional value, this cell is designated an ice margin point with partial ice cover. To determine f_ice , we need to calculate the \"effective ice thickness\" of a grid point. For floating cells at the margin, the effective ice thickness is equivalent to either the ice thickness or the minimum ice thickness of the neighboring cell, whichever is larger. For grounded cells, the effective ice thickness must at least be that of 1/2 of the minimum ice thickness of a neighboring cell. With effective ice thickness known, f_ice = H_ice / H_eff . Any grid cell with fractional ice cover 0> pc_eps , this is interpreted as instability and the model is stopped.","title":"How to read yelmo_check_kill output"},{"location":"notes/#margin-front-mass-balance","text":"Following Pollard and DeConto (2012,2016), an ice-margin front melting scheme has been implemented that accounts for the melt rate along the vertical face of ice submerged by seawater. The frontal mass balance ($\\dot{f}$, m yr$^{-1}$) is calculated as: \\dot{f} = \\dot{b}_{\\rm eff} \\frac{A_f}{A_{\\rm tot}} \\theta_f where $\\dot{b} {\\rm eff}$ is the effective basal mass balance (the mean of the basal mass balance calculated for the ice-free neighbors), $A {\\rm tot}=\\Delta x \\Delta x$ is the horizontal grid area and $A_f$ is the area of the submerged faces (i.e., the sum of the depth of submerged ice for each face of the grid cell adjacent to an ice-free cell -- potentially four faces in total). $\\theta_f=10$ is a scaling coefficient that implies the face mass balance should be ~10 times higher than the basal mass balance (Pollard and DeConto, 2016, appendix).","title":"Margin-front mass balance"},{"location":"notes/#calving-schemes","text":"Here is a summary of calving schemes.","title":"Calving schemes"},{"location":"notes/#lipscomb-et-al-2019","text":"c = k_\\tau \\tau_{\\rm ec} where $k_\\tau$ (m yr$^{-1}$ Pa$^{-1}$) is an empirical constant and $\\tau_{\\rm ec}$ (Pa) is the effective calving stress, which is defined by: \\tau_{\\rm ec}^2 = \\max(\\tau_1,0)^2 + \\omega_2 \\max(\\tau_2,0)^2 $\\tau_1$ and $\\tau_2$ are the eigenvalues of the 2D horizontal deviatoric stress tensor and $\\omega_2$ is an empirical weighting constant. For partially ice-covered grid cells (with $f_{\\rm ice} < 1$), these stresses are taken from the upstream neighbor. The eigenvalues $\\tau_1$ and $\\tau_2$ are calculated from the depth-averaged (2D) stress tensor $\\tau_{\\rm ij}$ as follows. Given the stress tensor components $\\tau_{\\rm xx}$, $\\tau_{\\rm yy}$ and $\\tau_{\\rm xy}$, we can solve for the real roots $\\lambda$ of the tensor from the quadratic equation: a \\lambda^2 + b \\lambda + c = 0 where a = 1.0 \\\\ b = -(\\tau_{\\rm xx} + \\tau_{\\rm yy}) \\\\ c = \\tau_{\\rm xx}*\\tau_{\\rm yy} - \\tau_{\\rm xy}^2 glissade_velo_higher.F90: tau_xz(k,i,j) = tau_xz(k,i,j) + efvs_qp * du_dz ! 2 * efvs * eps_xz tau_yz(k,i,j) = tau_yz(k,i,j) + efvs_qp * dv_dz ! 2 * efvs * eps_yz tau_xx(k,i,j) = tau_xx(k,i,j) + 2.d0 * efvs_qp * du_dx ! 2 * efvs * eps_xx tau_yy(k,i,j) = tau_yy(k,i,j) + 2.d0 * efvs_qp * dv_dy ! 2 * efvs * eps_yy tau_xy(k,i,j) = tau_xy(k,i,j) + efvs_qp * (dv_dx + du_dy) ! 2 * efvs * eps_xy","title":"Lipscomb et al. 
(2019)"},{"location":"notes/#vertical-velocity","text":"w = u_b \\frac{\\partial b}{\\partial x} + v_b \\frac{\\partial b}{\\partial y} - \\int_b^z \\left( \\frac{\\partial u}{\\partial x} + \\frac{\\partial v}{\\partial y} \\right) dz'","title":"Vertical velocity"},{"location":"notes/#ice-margin-calving-rates-mass-conservation","text":"ajr, 2021-06-22 Through v1.42 , ice margins were not treated fully consistently in Yelmo. This has been thoroughly revised. Now the following should be true: The variable f_ice contains the ice area fraction of a grid cell. If f_ice=0 , no ice is present, if f_ice=1 , the cell is fully ice covered, and for a fractional value, this cell is designated an ice margin point with partial ice cover. To determine f_ice , we need to calculate the \"effective ice thickness\" of a grid point. For floating cells at the margin, the effective ice thickness is equivalent to either the ice thickness or the minimum ice thickness of the neighboring cell, whichever is larger. For grounded cells, the effective ice thickness must be at least 1/2 of the minimum ice thickness of a neighboring cell. With effective ice thickness known, f_ice = H_ice / H_eff . Any grid cell with fractional ice cover 0 < f_ice < 1 is then designated an ice margin point.","title":"Ice margin, calving rates, mass conservation"},{"location":"optimization/","text":"Basal friction optimization A simple optimization program has been developed that attempts to optimize the basal friction field applied in Yelmo so that the errors between simulated and observed ice thickness are minimized. Two important parameters control the optimization process: tau and H_scale . These parameters are designed to change over time with the simulation. tau is set to rel_tau1 from the start of the simulation until rel_time1 . Between rel_time1 and rel_time2 , tau is linearly scaled from the value of rel_tau1 to rel_tau2 . Or, if rel_q > 1 , then the scaling is non-linear with an exponent of rel_q (this helps maintain small values of tau longer, which seems to help keep errors low). Once rel_time2 is reached, relaxation in the model is disabled, and the ice shelves are allowed to freely evolve. Analogously, H_scale is modified over time: it is constant at the value of scale_H1 until scale_time1 , linearly scaled between scale_time1 and scale_time2 , and then constant thereafter at the value of scale_H2 . Increasing the value of H_scale over time helps to avoid oscillations in the optimization procedure as cf_ref approaches the best fit. Finally, after qmax-1 iterations or time=(qmax-1)*time_iter , cf_ref is held constant, and the simulation runs for time_steady years to equilibrate the model with the current conditions. This step minimizes drift in the final result and confirms that the optimized cf_ref field works well.","title":"Basal friction optimization"},{"location":"optimization/#basal-friction-optimization","text":"A simple optimization program has been developed that attempts to optimize the basal friction field applied in Yelmo so that the errors between simulated and observed ice thickness are minimized. Program: tests/yelmo_opt.f90 To compile: make opt To run: ./runme -rs -e opt -o output/test -n par/yelmo_Antarctica_opt.nml The program consists of the following steps:","title":"Basal friction optimization"},{"location":"optimization/#1-spin-up-a-steady-state-ice-sheet-with-constant-forcing-and-fixed-topography","text":"For this step, the restart parameter should be set to yelmo.restart='none' , to ensure that the spin-up is performed with the current parameters. Currently, the program is hard-coded to spin-up the ice sheet for 20 kyr using SIA only, followed by another 10 kyr using the solver of choice, as seen in the following lines of code: call yelmo_update_equil_external(yelmo1,hyd1,cf_ref,time_init,time_tot=20e3,topo_fixed=.TRUE.,dt=5.0,ssa_vel_max=0.0) call yelmo_update_equil_external(yelmo1,hyd1,cf_ref,time_init,time_tot=10e3, topo_fixed=.TRUE.,dt=1.0,ssa_vel_max=5000.0) Note that this spin-up is obtained with a fixed topography set to the present-day observed fields ( H_ice , z_bed ).
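A minimal sketch of the corresponding namelist entry (assuming a &yelmo group, as implied by the yelmo.restart parameter naming; the path shown for Step 2 is illustrative):

&yelmo
    restart = 'none'                      ! Step 1: spin up with the current parameters
    ! restart = 'input/yelmo_restart.nc'  ! Step 2: continue from the Step 1 restart file
/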
After the spin-up finishes, a restart file is written in the output directory with the name yelmo_restart.nc . The simulation will terminate at this point.","title":"1. Spin-up a steady-state ice sheet with constant forcing and fixed topography"},{"location":"optimization/#2-optimization","text":"The restart file from Step 1 should be saved somewhere convenient for the model (like in the input folder). Then the restart parameter should be set to that location: yelmo.restart='PATH_TO_RESTART.nc' . This will ensure that the spin-up step is skipped, and instead the program will start directly with the optimization iterations. The optimization method follows Pollard and DeConto (2012), in that the basal friction coefficient is scaled as a function of the error in elevation. Here we do not modify beta directly; rather, we assume that beta = cf_ref * lambda_bed * N_eff * f(u) . lambda_bed , N_eff and f(u) are all controlled by parameter choices in the .nml file like normal. Thus we are left with a unitless field cf_ref , which for any given friction law varies within the range of about [0:1]. When cf_ref=1.0 , sliding will diminish to near zero, and cf_ref~0.0 (near, but not zero) will give fast sliding. This gives a convenient range for optimization. Parameters that control the total run time are hard-coded: qmax : number of total iterations to run, where qmax-1 is the number of optimization steps, during which cf_ref is updated, and the last step is a steady-state run with cf_ref held constant. time_iter : time to run the model for each iteration before updating cf_ref . time_steady : Time to run the model to steady state with cf_ref held constant (last iteration step). So, the program runs for, e.g., time_iter=500 years with a given initial field of cf_ref (with C_bed and beta updating every time step to follow changes in u/v and N_eff ). At the end of time_iter , the error in ice thickness is determined and used to update cf_ref via the function update_cf_ref_thickness_simple . The model is again run for time_iter years and the process is repeated. Two important parameters control the optimization process: tau and H_scale . The optimization works best when the ice shelves are relaxed to the reference (observed) ice thickness in the beginning of the simulation, and then gradually allowed to freely evolve. tau is the time scale of relaxation, which is applied in Yelmo as yelmo1%tpo%par%topo_rel_tau . A lower value of tau means that the ice shelves are more tightly held to the observed thickness. Likewise, H_scale controls the scaling of the ice thickness error, which determines how to modify cf_ref at each iteration. A higher value of H_scale means that changes to cf_ref will be applied more slowly. These parameters are designed to change over time with the simulation. tau is set to rel_tau1 from the start of the simulation until rel_time1 . Between rel_time1 and rel_time2 , tau is linearly scaled from the value of rel_tau1 to rel_tau2 . Or, if rel_q > 1 , then the scaling is non-linear with an exponent of rel_q (this helps maintain small values of tau longer, which seems to help keep errors low). Once rel_time2 is reached, relaxation in the model is disabled, and the ice shelves are allowed to freely evolve. Analogously, H_scale is modified over time: it is constant at the value of scale_H1 until scale_time1 , linearly scaled between scale_time1 and scale_time2 , and then constant thereafter at the value of scale_H2 .
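This time scaling can be summarized with a small sketch (hedged: the function name scale_par and the exact form are illustrative assumptions, and the actual implementation in tests/yelmo_opt.f90 may differ):

! Illustrative sketch of the piecewise-in-time parameter scaling described above.
function scale_par(time, time1, time2, p1, p2, q) result(p)
    implicit none
    real, intent(in) :: time, time1, time2, p1, p2, q
    real :: p, f
    if (time <= time1) then
        p = p1                               ! constant at p1 until time1
    else if (time >= time2) then
        p = p2                               ! constant at p2 after time2
    else
        f = (time - time1)/(time2 - time1)   ! fraction of the transition, 0 => 1
        p = p1 + (p2 - p1)*f**q              ! q=1: linear; q>1: stays near p1 longer
    end if
end function scale_par

Here tau would correspond to scale_par(time,rel_time1,rel_time2,rel_tau1,rel_tau2,rel_q) and H_scale to scale_par(time,scale_time1,scale_time2,scale_H1,scale_H2,1.0) . The update of cf_ref itself might then take a Pollard and DeConto (2012) style form such as cf_ref = cf_ref * 10.0**(-(H_ice-H_obs)/H_scale) , where the sign convention and exact expression are again assumptions for illustration only.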
Increasing the value of H_scale over time helps to avoid oscillations in the optimization procedure as cf_ref approaches the best fit. Finally, after qmax-1 iterations or time=(qmax-1)*time_iter , cf_ref is held constant, and the simulation runs for time_steady years to equilibrate the model with the current conditions. This step minimizes drift in the final result and confirms that the optimized cf_ref field works well.","title":"2. Optimization"},{"location":"parameters/","text":"Parameters Here important parameter choices pertinent to running Yelmo will be documented. Each section will outline a specific parameter or set of related parameters. The author of each section and the date last updated will appear in the heading, to maintain traceability in the documentation (since code usually changes over time). This is a work in progress! Basal friction Yelmo includes the representation of several friction laws that all take the form: \\tau_b = -\\beta u_b where $\\beta$ is composed of a coefficient $c_b$ and potentially another contribution that depends on $u_b$ too: \\beta = c_b f(u_b) In Yelmo, the field $c_b$ is defined by the variable c_bed and has units of Pa. The term $f(u_b)$ is not output in the model, but it contributes with units of yr m$^{-1}$, so $\\beta$ finally has units of Pa yr m$^{-1}$. When multiplied by $u_b$, we arrive at $\\tau_b$ with units of Pa. Yelmo calculates $c_b$ ( c_bed ) internally as either: c_b = c_{\\rm b,ref} * N_{\\rm eff} or c_b = {\\rm tan}(c_{\\rm b,ref}) * N_{\\rm eff} This is controlled by the user option ytill.is_angle . If ytill.is_angle=True , then $c_{\\rm b,ref}$ (variable cb_ref in the code) is considered as an angle and the latter formulation above is used, following, e.g., Bueler and van Pelt (2015). If ytill.is_angle=False , then cb_ref is used as a scalar field directly. In both cases, this field represents the till or basal properties (roughness, etc.) that are rather independent of how the effective pressure $N_{\\rm eff}$ (variable N_eff ) may be defined. With the variables formulated as above, it is possible to consider cb_ref as a tunable field that can be adjusted to improve model performance on a given domain. This can be achieved, for example, by performing optimization via the ice_optimization module, which adjusts cb_ref as a function of the mismatch of the simulated ice thickness with a target field. Also, cb_ref can either be optimized as a scalar field itself, or as an angle that is input to ${\\rm tan}(c_{\\rm b,ref})$ above. Another possibility is to tune cb_ref as a function of other model or boundary variables. The most common approach is to tune it as a function of the bedrock elevation relative to present-day sea level (e.g., Winkelmann et al., 2011). In Yelmo, this is controlled by the parameter choices in the ytill section, and in particular the parameter ytill.scale=['none','lin','exp'] . When ytill.scale='none' , no scaling function is applied and then cb_ref=ytill.cf_ref everywhere. When ytill.scale='lin' , a linear scaling is applied so that cb_ref goes from ytill.cf_min to ytill.cf_ref for bedrock elevations between ytill.z0 and ytill.z1 (saturating otherwise). Finally, if ytill.scale='exp' , an exponential decay function is applied, such that cb_ref=ytill.cf_ref for z_bed >= ytill.z1 , and decays following a curve that reaches ~30% of its value at z_bed=ytill.z0 . In all cases, values are limited to a minimum of ytill.cf_min .
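The three ytill.scale options can be summarized with a hedged sketch (the function name calc_cb_ref and the exact 'exp' expression are assumptions based on the description above, not necessarily Yelmo's code):

! Illustrative sketch of the ytill.scale options described above.
function calc_cb_ref(scale, z_bed, z0, z1, cf_min, cf_ref) result(cb_ref)
    implicit none
    character(len=*), intent(in) :: scale
    real, intent(in) :: z_bed, z0, z1, cf_min, cf_ref
    real :: cb_ref, f
    select case(trim(scale))
    case("lin")
        ! Linear ramp from cf_min (z_bed <= z0) to cf_ref (z_bed >= z1)
        f = min(1.0, max(0.0, (z_bed - z0)/(z1 - z0)))
        cb_ref = cf_min + (cf_ref - cf_min)*f
    case("exp")
        ! cf_ref for z_bed >= z1, decaying below so that roughly
        ! exp(-1) ~ 30-40% of cf_ref remains at z_bed = z0
        cb_ref = cf_ref * exp( min(0.0, z_bed - z1)/(z1 - z0) )
    case default   ! "none"
        cb_ref = cf_ref
    end select
    cb_ref = max(cb_ref, cf_min)   ! values limited to at least cf_min
end function calc_cb_ref

For example, with hypothetical values z0=-600 , z1=0 , cf_min=0.001 and cf_ref=0.2 , a bedrock elevation of z_bed=-600 m would give roughly exp(-1)*0.2 ~ 0.07 in the 'exp' case.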
Effective pressure Effective pressure ( N_eff , $N_{\\rm eff}$) in Yelmo is currently only used in the basal friction formulation as shown above. It provides a mechanism to alter the basal friction as a function of the state of the ice sheet, separate from $c_{\\rm b,ref}$ ( cb_ref ), which represents the properties of the bed beneath the ice sheet. The calculation of N_eff can be done with several methods: yneff.method = [-1,0,1,2,3,4] -1: Set N_eff external to Yelmo, do not modify it internally. 0: Impose a constant value, N_eff = yneff.const 1: Impose the overburden pressure, N_eff = rho_ice*g*H_ice 2: Calculate N_eff following the Leguy formulation 3: Calculate N_eff as till pressure following Bueler and van Pelt (2015). 4: Calculate N_eff as a 'two-valued' function scaled by f_pmp using yneff.delta.","title":"Parameters"},{"location":"parameters/#parameters","text":"Here important parameter choices pertinent to running Yelmo will be documented. Each section will outline a specific parameter or set of related parameters. The author of each section and the date last updated will appear in the heading, to maintain traceability in the documentation (since code usually changes over time). This is a work in progress!","title":"Parameters"},{"location":"parameters/#basal-friction","text":"Yelmo includes the representation of several friction laws that all take the form: \\tau_b = -\\beta u_b where $\\beta$ is composed of a coefficient $c_b$ and potentially another contribution that depends on $u_b$ too: \\beta = c_b f(u_b) In Yelmo, the field $c_b$ is defined by the variable c_bed and has units of Pa. The term $f(u_b)$ is not output in the model, but it contributes with units of yr m$^{-1}$, so $\\beta$ finally has units of Pa yr m$^{-1}$. When multiplied by $u_b$, we arrive at $\\tau_b$ with units of Pa. Yelmo calculates $c_b$ ( c_bed ) internally as either: c_b = c_{\\rm b,ref} * N_{\\rm eff} or c_b = {\\rm tan}(c_{\\rm b,ref}) * N_{\\rm eff} This is controlled by the user option ytill.is_angle . If ytill.is_angle=True , then $c_{\\rm b,ref}$ (variable cb_ref in the code) is considered as an angle and the latter formulation above is used, following, e.g., Bueler and van Pelt (2015). If ytill.is_angle=False , then cb_ref is used as a scalar field directly. In both cases, this field represents the till or basal properties (roughness, etc.) that are rather independent of how the effective pressure $N_{\\rm eff}$ (variable N_eff ) may be defined. With the variables formulated as above, it is possible to consider cb_ref as a tunable field that can be adjusted to improve model performance on a given domain. This can be achieved, for example, by performing optimization via the ice_optimization module, which adjusts cb_ref as a function of the mismatch of the simulated ice thickness with a target field. Also, cb_ref can either be optimized as a scalar field itself, or as an angle that is input to ${\\rm tan}(c_{\\rm b,ref})$ above. Another possibility is to tune cb_ref as a function of other model or boundary variables. The most common approach is to tune it as a function of the bedrock elevation relative to present-day sea level (e.g., Winkelmann et al., 2011). In Yelmo, this is controlled by the parameter choices in the ytill section, and in particular the parameter ytill.scale=['none','lin','exp'] . When ytill.scale='none' , no scaling function is applied and then cb_ref=ytill.cf_ref everywhere.
When ytill.scale='lin' , a linear scaling is applied so that cb_ref goes from ytill.cf_min to ytill.cf_ref for bedrock elevations between ytill.z0 and ytill.z1 (saturating otherwise). Finally, if ytill.scale='exp' , an exponential decay function is applied, such that cb_ref=ytill.cf_ref for z_bed >= ytill.z1 , and decays following a curve that reaches ~30% of its value at z_bed=ytill.z0 . In all cases, values are limited to a minimum of ytill.cf_min .","title":"Basal friction"},{"location":"parameters/#effective-pressure","text":"Effective pressure ( N_eff , $N_{\\rm eff}$) in Yelmo is currently only used in the basal friction formulation as shown above. It provides a mechanism to alter the basal friction as a function of the state of the ice sheet, separate from $c_{\\rm b,ref}$ ( cb_ref ), which represents the properties of the bed beneath the ice sheet. The calculation of N_eff can be done with several methods: yneff.method = [-1,0,1,2,3,4] -1: Set N_eff external to Yelmo, do not modify it internally. 0: Impose a constant value, N_eff = yneff.const 1: Impose the overburden pressure, N_eff = rho_ice*g*H_ice 2: Calculate N_eff following the Leguy formulation 3: Calculate N_eff as till pressure following Bueler and van Pelt (2015). 4: Calculate N_eff as a 'two-valued' function scaled by f_pmp using yneff.delta.","title":"Effective pressure"},{"location":"remapping/","text":"Remapping Yelmo runs on a Cartesian (x/y) grid. Often input data comes in many formats: global lat/lon grids, projections, and sets of points. It is important to have robust remapping tools. Typically for a given domain, we define a Polar Stereographic projection to be able to convert lat/lon data points onto a Cartesian plane. For Antarctica, for example, the standard projection has the following parameters: int polar_stereographic ; polar_stereographic:grid_mapping_name = \"polar_stereographic\" ; polar_stereographic:straight_vertical_longitude_from_pole = 0. ; polar_stereographic:latitude_of_projection_origin = -71. ; polar_stereographic:angle_of_oblique_tangent = 19. ; polar_stereographic:scale_factor_at_projection_origin = 1. ; polar_stereographic:false_easting = 0. ; polar_stereographic:false_northing = 0. ; Naming files For grids used by Yelmo, we generally use an abbreviation for the domain name followed by the resolution. So for Antarctica, we could have the grids ANT-32KM or ANT-16KM for a 32km or 16km grid, respectively. Data that have been projected onto these grids are saved with the grid name as a prefix followed by a general name that specifies the type of data, e.g., CLIM or TOPO , finally followed by more descriptive information about the specific dataset, e.g., IPSL-14Ma or IPSL-PD-CTRL . For example, the latest topography dataset we use is called the RTopo2.0.1 dataset, so this is processed into a file called ANT-32KM_TOPO-RTOPO-2.0.1.nc . Fields Yelmo needs To drive Yelmo with boundary conditions derived from a climate model, it needs the following fields to be defined on the Polar Stereographic grid: Climatological mean near-surface air temperature [monthly] Climatological mean precipitation [monthly] Surface elevation Sea level Ice thickness Climatological mean 3D ocean temperature [annual] Climatological mean 3D ocean salinity [annual] Oceanic bathymetry Likely these would be processed into two or more separate files, e.g., one for climate CLIM variables and another for ocean OCN variables.
Preprocessing data using cdo As a first step, the Climate Data Operators cdo package is great for most preprocessing steps. It can handle averaging data over time and space, merging data files, extracting individual variables, etc. See the extensive documentation and examples online. For example, it is possible to use the command cdo selvar to extract specific variables from a file: cdo selvar,t2m,precip diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp1.nc If you have several variables in individual files, you can then conveniently merge them into one file using merge (it's better if they have the same shape): # Extract t2m to a temporary file cdo selvar,t2m diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp1.nc # Extract precip to a temporary file cdo selvar,precip diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp2.nc # Merge the two individual variable files into one convenient file cdo merge ipsl_tmp1.nc ipsl_tmp2.nc ipsl_tmp3.nc There are many other useful commands, particularly for computing monthly means ( cdo monmean ... ) and other statistics. Resources: CDO Documentation page: https://code.mpimet.mpg.de/projects/cdo/wiki/Cdo#Documentation CDO User guide: https://code.mpimet.mpg.de/projects/cdo/embedded/cdo.pdf CDO Reference card: https://code.mpimet.mpg.de/projects/cdo/embedded/cdo_refcard.pdf Using cdo for remapping To remap a data file from lat/lon coordinates to our projection, cdo needs a grid description file that describes the target Polar Stereographic projection grid. For example, for a 32km resolution domain, we would use the following file named grid_ANT-32KM.txt : gridtype = projection gridsize = 36481 xsize = 191 ysize = 191 xname = xc xunits = km yname = yc yunits = km xfirst = -3040.000000 xinc = 32.000000 yfirst = -3040.000000 yinc = 32.000000 grid_mapping = crs grid_mapping_name = polar_stereographic straight_vertical_longitude_from_pole = 0.000 latitude_of_projection_origin = -90.000 standard_parallel = -71.000 false_easting = 0.000 false_northing = 0.000 semi_major_axis = 6378137.000 inverse_flattening = 298.25722356 With this file defined, it's easy to perform projections using the cdo remap* commands. To perform a bicubic interpolation, call: cdo remapbic,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-bic.nc Here, remapbic specifies bicubic interpolation and grid_ANT-32KM.txt defines the target grid as above. Then the source dataset is specified and the desired output file ANT-32KM_test.nc . To perform conservative interpolation, replace remapbic with remapcon : cdo remapcon,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-con.nc Conservative interpolation is generally preferred, especially when going from a high resolution to a lower resolution, as it avoids unwanted interpolation artifacts and conserves the quantity being remapped. However, from low resolution to high resolution, conservative interpolation can result in more \"blocky\" fields with abrupt changes in values. Thus, in this case, either bicubic interpolation or conservative interpolation with additional Gaussian smoothing is better. The latter is not supported by cdo , but can be achieved with other tools.
One option for processing may be a conservative remapping, followed by a smoothing step: cdo remapcon,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-con.nc cdo smooth,radius=128km ANT-32KM_test-con.nc ANT-32KM_test-con-smooth.nc The smoothing radius should be chosen such that it is the smallest value possible that removes blocky artifacts from the field. Summary It can be tedious to process data from a climate model into the right format to drive Yelmo. Tools like cdo help to reduce this burden. Other tools like the NetCDF Operators ( NCO ) and numerous Python-based libraries and tools can also be used. It is best to define a script or program with all the processing steps clearly defined. That way, when new data becomes available from the same model, it is easy to process it systematically (and reproducibly) in the same way without any trouble. Remapping restart file Sometimes we may want to restart a simulation at a new resolution - i.e., perform a spinup simulation at relatively low resolution and then continue the simulation at higher resolution. Use cdo to remap the restart file based on the grid definition files. # Define env variables as shortcuts to locations of grid files grid_src=/Users/robinson/models/EURICE/gridding/maps/grid_GRL-32KM.txt grid_tgt=/Users/robinson/models/EURICE/gridding/maps/grid_GRL-16KM.txt # Call remapping cdo remapcon,${grid_tgt} -setgrid,${grid_src} yelmo_restart.nc yelmo_restart_16km.nc Let's do a test. First, run a short 32km Greenland simulation and generate a restart file: ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim0-32km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-32KM\" That simulation should have produced a nice restart file. Let's test a normal 32km simulation that continues from this restart file. ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim1-32km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-32KM\" yelmo.restart=\"../sim0-32km/yelmo_restart.nc\" Ok, now generate a SCRIP map file to interpolate from 32km down to 16km. domain=Greenland grid_name_src=GRL-32KM grid_name_tgt=GRL-16KM nc_src=../ice_data/${domain}/${grid_name_src}/${grid_name_src}_REGIONS.nc cdo gencon,grid_${grid_name_tgt}.txt -setgrid,grid_${grid_name_src}.txt ${nc_src} scrip-con_${grid_name_src}_${grid_name_tgt}.nc Now let's try to run a simulation at 16km, loading the restart file from 32km: ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim2-16km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-16KM\" yelmo.restart=\"../sim0-32km/yelmo_restart.nc\" The simulation is successful! (as of branch alex-dev-2 , revision 1d9783fb ).","title":"Remapping"},{"location":"remapping/#remapping","text":"Yelmo runs on a Cartesian (x/y) grid. Often input data comes in many formats: global lat/lon grids, projections, and sets of points. It is important to have robust remapping tools. Typically for a given domain, we define a Polar Stereographic projection to be able to convert lat/lon data points onto a Cartesian plane. For Antarctica, for example, the standard projection has the following parameters: int polar_stereographic ; polar_stereographic:grid_mapping_name = \"polar_stereographic\" ; polar_stereographic:straight_vertical_longitude_from_pole = 0.
; polar_stereographic:latitude_of_projection_origin = -71. ; polar_stereographic:angle_of_oblique_tangent = 19. ; polar_stereographic:scale_factor_at_projection_origin = 1. ; polar_stereographic:false_easting = 0. ; polar_stereographic:false_northing = 0. ;","title":"Remapping"},{"location":"remapping/#naming-files","text":"For grids used by Yelmo, we generally use an abbreviation for the domain name followed by the resolution. So for Antarctica, we could have the grids ANT-32KM or ANT-16KM for a 32km or 16km grid, respectively. Data that have been projected onto these grids are saved with the grid name as a prefix followed by a general name that specifies the type of data, e.g., CLIM or TOPO , finally followed by more descriptive information about the specific dataset, e.g., IPSL-14Ma or IPSL-PD-CTRL . For example, the latest topography dataset we use is called the RTopo2.0.1 dataset, so this is processed into a file called ANT-32KM_TOPO-RTOPO-2.0.1.nc .","title":"Naming files"},{"location":"remapping/#fields-yelmo-needs","text":"To drive Yelmo with boundary conditions derived from a climate model, it needs the following fields to be defined on the Polar Stereographic grid: Climatological mean near-surface air temperature [monthly] Climatological mean precipitation [monthly] Surface elevation Sea level Ice thickness Climatological mean 3D ocean temperature [annual] Climatological mean 3D ocean salinity [annual] Oceanic bathymetry Likely these would be processed into two or more separate files, e.g., one for climate CLIM variables and another for ocean OCN variables.","title":"Fields Yelmo needs"},{"location":"remapping/#preprocessing-data-using-cdo","text":"As a first step, the Climate Data Operators cdo package is great for most preprocessing steps. It can handle averaging data over time and space, merging data files, extracting individual variables, etc. See the extensive documentation and examples online. For example, it is possible to use the command cdo selvar to extract specific variables from a file: cdo selvar,t2m,precip diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp1.nc If you have several variables in individual files, you can then conveniently merge them into one file using merge (it's better if they have the same shape): # Extract t2m to a temporary file cdo selvar,t2m diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp1.nc # Extract precip to a temporary file cdo selvar,precip diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp2.nc # Merge the two individual variable files into one convenient file cdo merge ipsl_tmp1.nc ipsl_tmp2.nc ipsl_tmp3.nc There are many other useful commands, particularly for computing monthly means ( cdo monmean ... ) and other statistics. Resources: CDO Documentation page: https://code.mpimet.mpg.de/projects/cdo/wiki/Cdo#Documentation CDO User guide: https://code.mpimet.mpg.de/projects/cdo/embedded/cdo.pdf CDO Reference card: https://code.mpimet.mpg.de/projects/cdo/embedded/cdo_refcard.pdf","title":"Preprocessing data using cdo"},{"location":"remapping/#using-cdo-for-remapping","text":"To remap a data file from lat/lon coordinates to our projection, cdo needs a grid description file that describes the target Polar Stereographic projection grid.
For example, for a 32km resolution domain, we would use the following file named grid_ANT-32KM.txt : gridtype = projection gridsize = 36481 xsize = 191 ysize = 191 xname = xc xunits = km yname = yc yunits = km xfirst = -3040.000000 xinc = 32.000000 yfirst = -3040.000000 yinc = 32.000000 grid_mapping = crs grid_mapping_name = polar_stereographic straight_vertical_longitude_from_pole = 0.000 latitude_of_projection_origin = -90.000 standard_parallel = -71.000 false_easting = 0.000 false_northing = 0.000 semi_major_axis = 6378137.000 inverse_flattening = 298.25722356 With this file defined, it's easy to perform projections using the cdo remap* commands. To perform a bicubic interpolation, call: cdo remapbic,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-bic.nc Here, remapbic specifies bicubic interpolation and grid_ANT-32KM.txt defines the target grid as above. Then the source dataset is specified and the desired output file ANT-32KM_test.nc . To perform conservative interpolation, replace remapbic with remapcon : cdo remapcon,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-con.nc Conservative interpolation is generally preferred, especially when going from a high resolution to a lower resolution, as it avoids unwanted interpolation artifacts and conserves the quantity being remapped. However, from low resolution to high resolution, conservative interpolation can result in more \"blocky\" fields with abrupt changes in values. Thus, in this case, either bicubic interpolation or conservative interpolation with additional Gaussian smoothing is better. The latter is not supported by cdo , but can be achieved with other tools. One option for processing may be a conservative remapping, followed by a smoothing step: cdo remapcon,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-con.nc cdo smooth,radius=128km ANT-32KM_test-con.nc ANT-32KM_test-con-smooth.nc The smoothing radius should be chosen such that it is the smallest value possible that removes blocky artifacts from the field.","title":"Using cdo for remapping"},{"location":"remapping/#summary","text":"It can be tedious to process data from a climate model into the right format to drive Yelmo. Tools like cdo help to reduce this burden. Other tools like the NetCDF Operators ( NCO ) and numerous Python-based libraries and tools can also be used. It is best to define a script or program with all the processing steps clearly defined. That way, when new data becomes available from the same model, it is easy to process it systematically (and reproducibly) in the same way without any trouble.","title":"Summary"},{"location":"remapping/#remapping-restart-file","text":"Sometimes we may want to restart a simulation at a new resolution - i.e., perform a spinup simulation at relatively low resolution and then continue the simulation at higher resolution. Use cdo to remap the restart file based on the grid definition files. # Define env variables as shortcuts to locations of grid files grid_src=/Users/robinson/models/EURICE/gridding/maps/grid_GRL-32KM.txt grid_tgt=/Users/robinson/models/EURICE/gridding/maps/grid_GRL-16KM.txt # Call remapping cdo remapcon,${grid_tgt} -setgrid,${grid_src} yelmo_restart.nc yelmo_restart_16km.nc Let's do a test.
First, run a short 32km Greenland simulation and generate a restart file: ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim0-32km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-32KM\" That simulation should have produced a nice restart file. Let's test a normal 32km simulation that continues from this restart file. ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim1-32km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-32KM\" yelmo.restart=\"../sim0-32km/yelmo_restart.nc\" Ok, now generate a SCRIP map file to interpolate from 32km down to 16km. domain=Greenland grid_name_src=GRL-32KM grid_name_tgt=GRL-16KM nc_src=../ice_data/${domain}/${grid_name_src}/${grid_name_src}_REGIONS.nc cdo gencon,grid_${grid_name_tgt}.txt -setgrid,grid_${grid_name_src}.txt ${nc_src} scrip-con_${grid_name_src}_${grid_name_tgt}.nc Now let's try to run a simulation at 16km, loading the restart file from 32km: ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim2-16km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-16KM\" yelmo.restart=\"../sim0-32km/yelmo_restart.nc\" The simulation is successful! (as of branch alex-dev-2 , revision 1d9783fb ).","title":"Remapping restart file"},{"location":"running-antarctica/","text":"Standard YelmoX simulations To run YelmoX, by default we use the program yelmox.f90 . This program currently makes use of snapclim for the climatic forcing and smbpal for the snowpack and surface mass balance calculations. Spin-up simulation for ISMIP6-based runs (ISMIP6, ABUMIP) First make sure your distributions of yelmox and yelmo are up to date. cd yelmo git pull cd .. # In the main yelmox directory, change to the branch 'tfm2021' or 'abumip-2021': git pull git checkout tfm2021 # From main directory of yelmox, also reconfigure to adopt all changes: python3 config.py config/snowball_gfortran # Link to ice_data path as needed: ln -s /media/Data/ice_data ice_data Now compile as normal, but with the yelmox_ismip6 program: make clean make yelmox_ismip6 You are now ready to run some ISMIP6 simulations. If you have a spinup simulation available you can skip the rest of this section. The next step is to get a spin-up simulation ready. To do so, we will run a small ensemble of simulations that apply different calving coefficients ( ytopo.kt ) and shear-regime enhancement factors ( ymat.enh_shear ). Each simulation will run with these parameter values set, while optimizing the basal friction coefficient field cb_ref and the temperature anomalies imposed in different basins tf_corr . To run this ensemble, use the following commands: # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 10' # Define the output folder fldr=tmp/ismip6/spinup_32km_68 # Run the Yelmo ensemble jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3,1.5e-3,2.0e-3,2.5e-3 ymat.enh_shear=1,3 An alternative spinup procedure # First run for 30kyr with topo relax on to spinup thermodynamics...
runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_69 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 spinup_ismip6.equil_method=\"relax\" spinup_ismip6.time_end=30e3 spinup_ismip6.time_equil=30e3 ytopo.kt=1.0e-3,1.5e-3,2.0e-3,2.5e-3 ymat.enh_shear=1 # Testing opt spinup but with 'robin' initial temp profile runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_70 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 ymat.enh_shear=1 opt_L21.cf_max=1.0 # robin-cold but only by -2deg instead of -10deg runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_71 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 ymat.enh_shear=1 opt_L21.cf_max=0.2 opt_L21.cf_init=0.2 runopt='-rs -q short -w 10' fldr=tmp/ismip6/spinup_32km_72 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 opt_L21.cf_max=10,20,40,45 ydyn.beta_u0=100,300 ISMIP6 simulations Make sure you already have a spinup simulation available, and that the parameters of the spinup will match those supplied here. The next step is to run different experiments of interest that restart from the spinup experiment. Some commands for running diagnostic short runs ### Diagnostic short runs ### # Run a 16km spinup run with relaxation runopt='-rs -q priority -w 1' fldr=tmp/ismip6/spinup_16km_72_diag jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 spinup_ismip6.equil_method=\"relax\" yelmo.grid_name=\"ANT-16KM\" spinup_ismip6.time_end=20 spinup_ismip6.dt2D_out=1 # Run with 16km restarting from 32km file # (currently crashes probably because tf_corr cannot be interpolated) runopt='-rs -q priority -w 1' fldr=tmp/ismip6/ismip_32km_68_diag file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} yelmo.grid_name=\"ANT-16KM\" transient_proj.time_end=1920 transient_proj.dt2D_out=1 ytill.is_angle=False ### Actual ISMIP6 commands # Define output folder as a bash variable fldr=tmp/ismip6/ismip_32km_71 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 10' # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" ## First run a steady-state simulation to stabilize everything # Define restart file path as a bash variable, for example, on snowball: file_restart=/p/tmp/robinson/ismip6/spinup_32km_71/0/yelmo_restart.nc # ctrl-0 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl-0 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 transient_proj.time_end=11900 transient_proj.dt1D_out=10 
transient_proj.dt2D_out=200 ${paropt} ## Next, call the Yelmo commands for the individual cases... # Define restart file path as a bash variable, for example, on snowball: #file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc file_restart=/p/tmp/robinson/ismip6/ismip_32km_68/ctrl-0/yelmo_restart.nc # ctrl ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} # exp05 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp05 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} # exp09 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp09 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_95\" marine_shelf.gamma_quad_nl=21000 ${paropt} # exp10 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp10 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_5\" marine_shelf.gamma_quad_nl=9620 ${paropt} # exp13 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp13 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_pigl\" marine_shelf.gamma_quad_nl=159000 ${paropt} ABUMIP Make sure you already have a spinup simulation available. Use the following commands to run the three main experiments of interest. Note that abuk and abum may run much more slowly than abuc . The parameter values applied in the commands below ensure that the model parameters correspond to those used in the restart simulation, although many of them, like ocean temp. anomalies in different basins or calving parameters, are no longer relevant in the ABUMIP context. It is important, however, to specify ydyn.ssa_lat_bc='marine' , as it is relevant for this experiment to apply marine boundary conditions. This is generally not used currently, as it makes the model much less stable. Note that an equilibrium spin-up simulation has already been performed, which gives good agreement with the present-day ice sheet. These results have been saved in a restart file, from which your simulations will begin (see below). # Define restart file path as a bash variable file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc # Define output folder as a bash variable fldr=tmp/ismip6/abumip_32km_68 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 5' debugopt=\"abumip_proj.time_end=20 abumip_proj.dt2D_out=1\" # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" # Call the Yelmo commands...
# ABUC - control experiment ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuc -p abumip.scenario=\"abuc\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUK - Ocean-kill experiment ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuk -p abumip.scenario=\"abuk\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUM - High shelf melt (400 m/yr) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abum -p abumip.scenario=\"abum\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUK - Ocean-kill experiment (MARINE BOUNDARY CONDITIONS) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuk-marine -p abumip.scenario=\"abuk\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} ydyn.ssa_lat_bc=\"marine\" # ABUM - High shelf melt (400 m/yr) (MARINE BOUNDARY CONDITIONS) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abum-marine -p abumip.scenario=\"abum\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} ydyn.ssa_lat_bc=\"marine\" Simulations with hyster # Define restart file path as a bash variable, for example, on snowball: file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc # Define output folder as a bash variable fldr=tmp/ismip6/ramp_32km_68 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 5' # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" # Now run simulation ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/r1 -p ctrl.run_step=\"hysteresis_proj\" yelmo.restart=${file_restart} tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 hysteresis_proj.time_end=30e3 hyster.method='ramp-time' hyster.df_sign=-1 hyster.dt_init=0 hyster.dt_ramp=10e3 hyster.f_min=-10 hyster.f_max=5 ${paropt} # Try a periodic simulation ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/r2 -p ctrl.run_step=\"hysteresis_proj\" yelmo.restart=${file_restart} tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 hysteresis_proj.time_end=100e3 hyster.method='sin' hyster.df_sign=1 hyster.dt_init=0 hyster.dt_ramp=20e3 hyster.f_min=-10 hyster.f_max=5 ${paropt} That's it!","title":"Standard YelmoX simulations"},{"location":"running-antarctica/#standard-yelmox-simulations","text":"To run YelmoX, by default we use the program yelmox.f90 . This program currently makes use of snapclim for the climatic forcing and smbpal for the snowpack and surface mass balance calculations.","title":"Standard YelmoX simulations"},{"location":"running-antarctica/#spin-up-simulation-for-ismip6-based-runs-ismip6-abumip","text":"First make sure your distributions of yelmox and yelmo are up to date. cd yelmo git pull cd ..
# In the main yelmox directory, change to the branch 'tfm2021' or 'abumip-2021': git pull git checkout tfm2021 # From main directory of yelmox, also reconfigure to adopt all changes: python3 config.py config/snowball_gfortran # Link to ice_data path as needed: ln -s /media/Data/ice_data ice_data Now compile as normal, but with the yelmox_ismip6 program: make clean make yelmox_ismip6 You are now ready to run some ISMIP6 simulations. If you have a spinup simulation available you can skip the rest of this section. The next step is to get a spin-up simulation ready. To do so, we will run a small ensemble of simulations that apply different calving coefficients ( ytopo.kt ) and shear-regime enhancement factors ( ymat.enh_shear ). Each simulation will run with these parameter values set, while optimizing the basal friction coefficient field cb_ref and the temperature anomalies imposed in different basins tf_corr . To run this ensemble, use the following commands: # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 10' # Define the output folder fldr=tmp/ismip6/spinup_32km_68 # Run the Yelmo ensemble jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3,1.5e-3,2.0e-3,2.5e-3 ymat.enh_shear=1,3 An alternative spinup procedure # First run for 30kyr with topo relax on to spinup thermodynamics... runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_69 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 spinup_ismip6.equil_method=\"relax\" spinup_ismip6.time_end=30e3 spinup_ismip6.time_equil=30e3 ytopo.kt=1.0e-3,1.5e-3,2.0e-3,2.5e-3 ymat.enh_shear=1 # Testing opt spinup but with 'robin' initial temp profile runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_70 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 ymat.enh_shear=1 opt_L21.cf_max=1.0 # robin-cold but only by -2deg instead of -10deg runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_71 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 ymat.enh_shear=1 opt_L21.cf_max=0.2 opt_L21.cf_init=0.2 runopt='-rs -q short -w 10' fldr=tmp/ismip6/spinup_32km_72 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 opt_L21.cf_max=10,20,40,45 ydyn.beta_u0=100,300","title":"Spin-up simulation for ISMIP6-based runs (ISMIP6, ABUMIP)"},{"location":"running-antarctica/#ismip6-simulations","text":"Make sure you already have a spinup simulation available, and that the parameters of the spinup will match those supplied here. The next step is to run different experiments of interest that restart from the spinup experiment. 
Some commands for running diagnostic short runs ### Diagnostic short runs ### # Run a 16km spinup run with relaxation runopt='-rs -q priority -w 1' fldr=tmp/ismip6/spinup_16km_72_diag jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 spinup_ismip6.equil_method=\"relax\" yelmo.grid_name=\"ANT-16KM\" spinup_ismip6.time_end=20 spinup_ismip6.dt2D_out=1 # Run with 16km restarting from 32km file # (currently crashes probably because tf_corr cannot be interpolated) runopt='-rs -q priority -w 1' fldr=tmp/ismip6/ismip_32km_68_diag file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} yelmo.grid_name=\"ANT-16KM\" transient_proj.time_end=1920 transient_proj.dt2D_out=1 ytill.is_angle=False ###","title":"ISMIP6 simulations"},{"location":"running-antarctica/#actual-ismip6-commands","text":"# Define output folder as a bash variable fldr=tmp/ismip6/ismip_32km_71 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 10' # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" ## First run a steady-state simulation to stabilize everything # Define restart file path as a bash variable, for example, on snowball: file_restart=/p/tmp/robinson/ismip6/spinup_32km_71/0/yelmo_restart.nc # ctrl-0 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl-0 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 transient_proj.time_end=11900 transient_proj.dt1D_out=10 transient_proj.dt2D_out=200 ${paropt} ## Next, call the Yelmo commands for the individual cases... 
# Define restart file path as a bash variable, for example, on snowball: #file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc file_restart=/p/tmp/robinson/ismip6/ismip_32km_68/ctrl-0/yelmo_restart.nc # ctrl ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} # exp05 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp05 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} # exp09 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp09 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_95\" marine_shelf.gamma_quad_nl=21000 ${paropt} # exp10 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp10 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_5\" marine_shelf.gamma_quad_nl=9620 ${paropt} # exp13 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp13 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_pigl\" marine_shelf.gamma_quad_nl=159000 ${paropt}","title":"Actual ISMIP6 commands"},{"location":"running-antarctica/#abumip","text":"Make sure you already have a spinup simulation available. Use the following commands to run the three main experiments of interest. Note that abuk and abum may run much more slowly than abuc . The parameter values applied in the commands below ensure that the model parameters correspond to those used in the restart simulation, although many of them, like ocean temp. anomalies in different basins or calving parameters, are no longer relevant in the ABUMIP context. It is important, however, to specify ydyn.ssa_lat_bc='marine' , as it is relevant for this experiment to apply marine boundary conditions. This is generally not used currently, as it makes the model much less stable. Note that an equilibrium spin-up simulation has already been performed, which gives good agreement with the present-day ice sheet. These results have been saved in a restart file, from which your simulations will begin (see below). # Define restart file path as a bash variable file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc # Define output folder as a bash variable fldr=tmp/ismip6/abumip_32km_68 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 5' debugopt=\"abumip_proj.time_end=20 abumip_proj.dt2D_out=1\" # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" # Call the Yelmo commands...
# ABUC - control experiment ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuc -p abumip.scenario=\"abuc\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUK - Ocean-kill experiment ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuk -p abumip.scenario=\"abuk\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUM - High shelf melt (400 m/yr) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abum -p abumip.scenario=\"abum\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUK - Ocean-kill experiment (MARINE BOUNDARY CONDITIONS) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuk-marine -p abumip.scenario=\"abuk\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} ydyn.ssa_lat_bc=\"marine\" # ABUM - High shelf melt (400 m/yr) (MARINE BOUNDARY CONDITIONS) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abum-marine -p abumip.scenario=\"abum\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} ydyn.ssa_lat_bc=\"marine\"","title":"ABUMIP"},{"location":"running-antarctica/#simulations-with-hyster","text":"# Define restart file path as a bash variable, for example, on snowball: file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc # Define output folder as a bash variable fldr=tmp/ismip6/ramp_32km_68 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 5' # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" # Now run simulation ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/r1 -p ctrl.run_step=\"hysteresis_proj\" yelmo.restart=${file_restart} tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 hysteresis_proj.time_end=30e3 hyster.method='ramp-time' hyster.df_sign=-1 hyster.dt_init=0 hyster.dt_ramp=10e3 hyster.f_min=-10 hyster.f_max=5 ${paropt} # Try a periodic simulation ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/r2 -p ctrl.run_step=\"hysteresis_proj\" yelmo.restart=${file_restart} tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 hysteresis_proj.time_end=100e3 hyster.method='sin' hyster.df_sign=1 hyster.dt_init=0 hyster.dt_ramp=20e3 hyster.f_min=-10 hyster.f_max=5 ${paropt} That's it!","title":"Simulations with hyster"},{"location":"running-greenland-ismip6/","text":"Testing without the ice sheet jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" spinup_ismip6.with_ice_sheet=False spinup_ismip6.time_end=10 spinup_ismip6.dt2D_out=10 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"transient_proj\" transient_proj.with_ice_sheet=False Specify run choices, to run locally in the background runopt='-r' or, to submit job to a cluster, eg 
runopt='-rs -q priority -w 05:00:00' Define the output folder fldr=tmp/ismip6/spinup-grl_16km_2 Run the Yelmo ensemble fldr=tmp/ismip6/spinup-grl_16km_1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ymat.enh_shear=1,3 fldr=tmp/ismip6/spinup-grl_16km_2 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.opt_tf=True opt_L21.tf_min=-1 opt_L21.tf_max=1 fldr=tmp/ismip6/spinup-grl_16km_3 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\" Spinup is not great, but functions. Try a transient simulation now file_restart=/p/tmp/robinson/ismip6/spinup-grl_16km_3/0/yelmo_restart.nc fldr=tmp/ismip6/ismip-grl_16km_3-1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p yelmo.restart=${file_restart} ctrl.run_step=\"transient_proj\" transient_proj.scenario=\"ctrl\",\"rcp26\",\"rcp85\" ismip6.gcm=\"miroc5\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\" ytill.method=-1","title":"Testing without the ice sheet"},{"location":"running-greenland-ismip6/#testing-without-the-ice-sheet","text":"jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" spinup_ismip6.with_ice_sheet=False spinup_ismip6.time_end=10 spinup_ismip6.dt2D_out=10 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"transient_proj\" transient_proj.with_ice_sheet=False","title":"Testing without the ice sheet"},{"location":"running-greenland-ismip6/#specify-run-choices-to-run-locally-in-the-background","text":"runopt='-r'","title":"Specify run choices, to run locally in the background"},{"location":"running-greenland-ismip6/#or-to-submit-job-to-a-cluster-eg","text":"runopt='-rs -q priority -w 05:00:00'","title":"or, to submit job to a cluster, eg"},{"location":"running-greenland-ismip6/#define-the-output-folder","text":"fldr=tmp/ismip6/spinup-grl_16km_2","title":"Define the output folder"},{"location":"running-greenland-ismip6/#run-the-yelmo-ensemble","text":"fldr=tmp/ismip6/spinup-grl_16km_1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ymat.enh_shear=1,3 fldr=tmp/ismip6/spinup-grl_16km_2 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.opt_tf=True opt_L21.tf_min=-1 opt_L21.tf_max=1 fldr=tmp/ismip6/spinup-grl_16km_3 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\"","title":"Run the Yelmo ensemble"},{"location":"running-greenland-ismip6/#spinup-is-not-great-but-functions-try-a-transient-simulation-now","text":"file_restart=/p/tmp/robinson/ismip6/spinup-grl_16km_3/0/yelmo_restart.nc fldr=tmp/ismip6/ismip-grl_16km_3-1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p yelmo.restart=${file_restart} ctrl.run_step=\"transient_proj\" transient_proj.scenario=\"ctrl\",\"rcp26\",\"rcp85\" ismip6.gcm=\"miroc5\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\" ytill.method=-1","title":"Spinup is not great, but functions. 
Try a transient simulation now"},{"location":"running-greenland-paleo/","text":"Running with YelmoX: Greenland paleo simulations This document describes how to get a transient paleo simulation running. It will assume that you already have cloned yelmo and yelmox and have checked out the right version and/or branch, and that you have access to the boundary data needed (e.g., in an ice_data folder). This setup will use the standard yelmox.f90 program. To run a present-day spinup simulation that will also optimize the basal friction coefficient cb_ref using the default parameter values, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o output/spinup-grl-1 Once a spinup is available, you can run a transient simulation. # Define output folder and restart file path fldr=output/paleo-grl-1 restart=output/spinup-grl-1/yelmo_restart.nc # Call run command ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o ${fldr} -p ctrl.time_init=-158e3 ctrl.time_end=2000 ctrl.transient_clim=True ctrl.equil_method=\"none\" yelmo.restart=${restart} Note that transient simulations have the time defined in [years CE]. In other words, time=2000 corresponds to the time 2000 CE. There is another variable available in the code time_bp which is used to represent time before present, where the present day is year 0, assumed to occur at the year time=1950 . To run without a restart file, then it is possible to run the above command but leave out the option yelmo.restart=${restart} . That's it!","title":"Running with YelmoX: Greenland paleo simulations"},{"location":"running-greenland-paleo/#running-with-yelmox-greenland-paleo-simulations","text":"This document describes how to get a transient paleo simulation running. It will assume that you already have cloned yelmo and yelmox and have checked out the right version and/or branch, and that you have access to the boundary data needed (e.g., in an ice_data folder). This setup will use the standard yelmox.f90 program. To run a present-day spinup simulation that will also optimize the basal friction coefficient cb_ref using the default parameter values, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o output/spinup-grl-1 Once a spinup is available, you can run a transient simulation. # Define output folder and restart file path fldr=output/paleo-grl-1 restart=output/spinup-grl-1/yelmo_restart.nc # Call run command ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o ${fldr} -p ctrl.time_init=-158e3 ctrl.time_end=2000 ctrl.transient_clim=True ctrl.equil_method=\"none\" yelmo.restart=${restart} Note that transient simulations have the time defined in [years CE]. In other words, time=2000 corresponds to the time 2000 CE. There is another variable available in the code time_bp which is used to represent time before present, where the present day is year 0, assumed to occur at the year time=1950 . To run without a restart file, then it is possible to run the above command but leave out the option yelmo.restart=${restart} . That's it!","title":"Running with YelmoX: Greenland paleo simulations"},{"location":"running-hysteresis/","text":"Running hysteresis experiments for Antarctica For now, we will: Use optimized basal friction. Spinup with a constant present-day climate based on the ISMIP6 protocol. 
To run yelmox with this setup, we need the hyst-2021 branches: cd yelmox git checkout hyst-2021 cd yelmo git checkout hyst-2021 # Reconfigure python3 config.py config/snowball_gfortran cd .. python3 config.py config/snowball_gfortran # If not done already, link to ice_data ln -s /media/Data/ice_data ice_data We will run with the ISMIP6 standard YelmoX program, so compile: make clean make yelmox_ismip6 The hysteresis runs can be done in two steps: First, generate a spun-up simulation with optimized basal friction. Restart from the optimized, spun-up state and continue with transient forcing from the hysteresis module. Step 1: spinup The spinup simulation runs for 30,000 years, by default. For the first opt_L21.rel_time1=5e3 years, the shelves and grounding line are relaxed (tightly) to the present-day reference state, while the optimization of the basal friction field cf_ref is active. Next, between opt_L21.rel_time1=5e3 years and opt_L21.rel_time2=10e3 years, the relaxation timescale is slowly increased from opt_L21.rel_tau1=10 yrs to opt_L21.rel_tau2=1000 yrs, to gradually allow the ice sheet more freedom to adjust its state. The basal friction optimization continues during this time period. After opt_L21.rel_time2=10e3 years, the relaxation is disabled and the ice sheet is fully prognostic. The simulation then runs until the end with continual optimization adjustments to cf_ref , although these are usually minor after the initial spinup period. To run a spinup simulation as above, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/spinup01 -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2 tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 Or an ensemble to test different parameters too: fldr=output/hyst/spinup02 jobrun ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2,0.20e-2,0.30e-2,0.40e-2 tf_corr_ant.ronne=0.0,0.25 tf_corr_ant.ross=0.0,0.2 tf_corr_ant.pine=-0.5,0.0 To make the spinup run for a longer time, like 50 kyr, set ctrl.time_end=50e3 . If you already have a spinup simulation available, you can skip that step. Alternatively, you can specify one that is ready on snowball : yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc Step 2: transient simulations To run transient simulations, the run_step should be specified as ctrl.run_step=\"hysteresis_proj\" . Typically, model parameters should be defined to be equivalent to those used by the restart simulation. The time control parameters of the simulation are defined in the parameter section &hysteresis_proj . Parameters associated with the hysteresis module can be changed in the &hyster section.
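For orientation, these two groups in par/yelmo_ismip6_Antarctica.nml might look roughly like the following minimal sketch, built only from the parameter names that appear in the examples in this document (the actual file contains additional entries and its defaults may differ): &hysteresis_proj time_end=500 dt2D_out=5e3 / &hyster method=\"ramp\" dt_init=100 dt_ramp=250 f_min=0 f_max=5 df_sign=1 sigma=0.0 /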
To be consistent with the restart file above, the following reference parameter values should be set in the parameter file (or at the command line, if used as part of the ensemble): ctrl.run_step=\"hysteresis_proj\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 # For setting output frequency hysteresis_proj.dt2D_out=5e3 hysteresis.dt2D_small_out=100 Example transient simulation of hysteresis_proj.time_end=500 years, with ramp forcing via the hyster module: 100 years of constant forcing, followed by a ramp over 250 years from an anomaly of 0 degC to 5 degC: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test1 -p ytopo.kt=0.001 \\ hyster.method=\"ramp\" hyster.dt_init=100 hyster.dt_ramp=250 hyster.f_min=0 hyster.f_max=5 Example transient simulation using Adaptive Quasi-Equilibrium Forcing (AQEF) with no lead-in time: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test2 -p ytopo.kt=0.001 \\ hyster.method=\"PI42\" hyster.dt_init=0 hyster.f_min=0 hyster.f_max=5 Or simply apply further constant forcing to a control run by setting hyster.method=\"const\" . You can add white noise (Normal distribution) to your forcing by setting the standard deviation greater than zero: hyster.sigma=1.0 .","title":"Running hysteresis experiments for Antarctica"},{"location":"running-hysteresis/#running-hysteresis-experiments-for-antarctica","text":"For now, we will: Use optimized basal friction. Spinup with a constant present-day climate based on the ISMIP6 protocol. To run yelmox with this setup, we need the hyst-2021 branches: cd yelmox git checkout hyst-2021 cd yelmo git checkout hyst-2021 # Reconfigure python3 config.py config/snowball_gfortran cd .. python3 config.py config/snowball_gfortran # If not done already, link to ice_data ln -s /media/Data/ice_data ice_data We will run with the ISMIP6 standard YelmoX program, so compile: make clean make yelmox_ismip6 The hysteresis runs can be done in two steps: First, generate a spun-up simulation with optimized basal friction. Restart from the optimized, spun-up state and continue with transient forcing from the hysteresis module.","title":"Running hysteresis experiments for Antarctica"},{"location":"running-hysteresis/#step-1-spinup","text":"The spinup simulation runs for 30,000 years, by default. For the first opt_L21.rel_time1=5e3 years, the shelves and grounding line are relaxed (tightly) to the present-day reference state, while the optimization of the basal friction field cf_ref is active. Next, between opt_L21.rel_time1=5e3 years and opt_L21.rel_time2=10e3 years, the relaxation timescale is slowly increased from opt_L21.rel_tau1=10 yrs to opt_L21.rel_tau2=1000 yrs, to gradually allow the ice sheet more freedom to adjust its state. The basal friction optimization continues during this time period. After opt_L21.rel_time2=10e3 years, the relaxation is disabled and the ice sheet is fully prognostic. The simulation then runs until the end with continual optimization adjustments to cf_ref , although these are usually minor after the initial spinup period.
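In short, the relaxation timescale follows a schedule of roughly: tau_rel(t) = rel_tau1 (10 yrs) for t < rel_time1; tau_rel increasing from rel_tau1 to rel_tau2 (1000 yrs) for rel_time1 <= t <= rel_time2 (the exact shape of this increase depends on the implementation); and relaxation off (fully prognostic) for t > rel_time2, with only the cf_ref optimization still adjusting the basal state.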
To run a spinup simulation as above, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/spinup01 -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2 tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 Or an ensemble to test different parameters too: fldr=output/hyst/spinup02 jobrun ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2,0.20e-2,0.30e-2,0.40e-2 tf_corr_ant.ronne=0.0,0.25 tf_corr_ant.ross=0.0,0.2 tf_corr_ant.pine=-0.5,0.0 To make the spinup run for a longer time, like 50 kyr, set ctrl.time_end=50e3 . If you already have a spinup simulation available, you can skip that step. Alternatively, you can specify one that is ready on snowball : yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc","title":"Step 1: spinup"},{"location":"running-hysteresis/#step-2-transient-simulations","text":"To run transient simulations the run_step should be specified as ctrl.run_step=\"hysteresis_proj\" . Typically model parameters should be defined to be equivalent to those used by the restart simulation. The time control parameters of the simulation are defined in the parameter section &hysteresis_proj . Parameters associated with the hysteresis module can be changed in the &hyster section. To be consistent with the restart file above, the following reference parameter values should be set in the parameter file (or at the command line, if used as part of the ensemble): ctrl.run_step=\"hysteresis_proj\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 # For setting output frequency hysteresis_proj.dt2D_out=5e3 hysteresis.dt2D_small_out=100 Example transient simulation of hysteresis_proj.time_end=500 years, with ramp forcing via the hyster module with 100 years of constant forcing, followed by a ramp over 250 years from an anomaly of 0 degC to 5 degC: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test1 -p ytopo.kt=0.001 \\ hyster.method=\"ramp\" hyster.dt_init=100 hyster.dt_ramp=250 hyster.f_min=0 hyster.f_max=5 Example transient simulation using Adaptive Quasi-Equilibrium Forcing (AQEF) with no lead-in time: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test2 -p ytopo.kt=0.001 \\ hyster.method=\"PI42\" hyster.dt_init=0 hyster.f_min=0 hyster.f_max=5 Or simply further constant forcing of a control run by setting hyster.method=\"const\" . You can add white noise (Normal distribution) to your forcing by setting the standard deviation greater than zero: hyster.sigma=1.0 .","title":"Step 2: transient simulations"},{"location":"running-with-yelmox/","text":"Running with YelmoX YelmoX is a separate repository that is designed to provide supplementary libraries and programs that allow running ice-sheet simulations with realistic boundary (e.g., climate and ocean) forcing and interactions (e.g., isostatic rebound). Here you can find the basic information and steps needed to get YelmoX running. Super-quick start A summary of commands to get started is given below. Make sure all Dependencies are installed and that you follow the HPC notes ! Also note, below it is assumed that you are setting up on the pik_hpc2024 system. 
If not, make sure to specify the config file for your own system, as well as the locations of ice_data and isostasy_data (see HPC notes ). # yelmox git clone git@github.com:palma-ice/yelmox.git cd yelmox python3 config.py config/pik_hpc2024_ifx # yelmo git clone git@github.com:palma-ice/yelmo.git cd yelmo python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ cd .. # FastIsostasy git clone git@github.com:palma-ice/FastIsostasy.git cd FastIsostasy python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./ cd .. # coordinates git clone git@github.com:cxesmc/coordinates.git cd coordinates COORDSRC=$PWD python3 config.py config/pik_hpc2024_ifx cd .. # REMBOv1 git clone git@github.com:alex-robinson/rembo1.git cd rembo1 python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ ln -s $COORDSRC ./ cd .. # Now, compile the default program make clean make yelmox # Link to `ice_data` and `isostasy_data` repositories wherever you have them saved on your system datapath=/p/projects/megarun ln -s $datapath/ice_data ln -s $datapath/isostasy_data # Copy the runme config file to the main directory and modify for your system cp .runme/runme_config .runme_config # Run a test simulation of Antarctica for 1000 yrs ./runme -r -e yelmox -n par/yelmo_Antarctica.nml -o output/ant-test -p ctrl.time_end=1e3 That's it!","title":"Running with YelmoX"},{"location":"running-with-yelmox/#running-with-yelmox","text":"YelmoX is a separate repository that is designed to provide supplementary libraries and programs that allow running ice-sheet simulations with realistic boundary (e.g., climate and ocean) forcing and interactions (e.g., isostatic rebound). Here you can find the basic information and steps needed to get YelmoX running.","title":"Running with YelmoX"},{"location":"running-with-yelmox/#super-quick-start","text":"A summary of commands to get started is given below. Make sure all Dependencies are installed and that you follow the HPC notes ! Also note, below it is assumed that you are setting up on the pik_hpc2024 system. If not, make sure to specify the config file for your own system, as well as the locations of ice_data and isostasy_data (see HPC notes ). # yelmox git clone git@github.com:palma-ice/yelmox.git cd yelmox python3 config.py config/pik_hpc2024_ifx # yelmo git clone git@github.com:palma-ice/yelmo.git cd yelmo python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ cd .. # FastIsostasy git clone git@github.com:palma-ice/FastIsostasy.git cd FastIsostasy python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./ cd .. # coordinates git clone git@github.com:cxesmc/coordinates.git cd coordinates COORDSRC=$PWD python3 config.py config/pik_hpc2024_ifx cd .. # REMBOv1 git clone git@github.com:alex-robinson/rembo1.git cd rembo1 python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ ln -s $COORDSRC ./ cd .. 
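# Optional sanity check (not part of the original recipe): after the clones and config steps above, the yelmox directory should contain these four sub-directories before compiling ls -d yelmo FastIsostasy coordinates rembo1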
# Now, compile the default program make clean make yelmox # Link to `ice_data` and `isostasy_data` repositories wherever you have them saved on your system datapath=/p/projects/megarun ln -s $datapath/ice_data ln -s $datapath/isostasy_data # Copy the runme config file to the main directory and modify for your system cp .runme/runme_config .runme_config # Run a test simulation of Antarctica for 1000 yrs ./runme -r -e yelmox -n par/yelmo_Antarctica.nml -o output/ant-test -p ctrl.time_end=1e3 That's it!","title":"Super-quick start"},{"location":"running-yelmox-rembo/","text":"Running with YelmoX-REMBO Before doing anything, make sure dependencies are installed (Lis, NetCDF, Python:runner) Step 1: Clone the repositories # Clone repository: YelmoX git clone https://github.com/palma-ice/yelmox.git # Clone yelmo into a sub-directory too cd yelmox git clone https://github.com/palma-ice/yelmo.git # Clone isostasy into a sub-directory too git clone https://github.com/palma-ice/isostasy.git # Clone rembo into a sub-directory too git clone https://github.com/alex-robinson/rembo1 At this point all the code is downloaded onto the machine. Now we need to configure it for compiling properly. Step 2: Run configuration scripts for each code base # Enter Yelmo directory and configure it for compiling cd yelmo python config.py config/snowball_gfortran make clean cd .. # Enter isostasy directory and configure it for compiling cd isostasy python config.py config/snowball_gfortran make clean cd .. # Enter rembo1 directory and configure it for compiling cd rembo1 python config.py config/snowball_gfortran make clean cd .. # From YelmoX directory, configure it for compiling too python config.py config/snowball_gfortran make clean Note that the example assumes we are using the machine called snowball and we will use the compiler gfortran . If this is not correct, you will need to use the right configuration file available in the config/ directory, or make your own using the others as a template. Step 3: Compile the default program make clean make yelmox_rembo Step 4: Make a link to ice_data # Link to `ice_data` repository wherever you have it saved on your system ln -s /media/Data/ice_data # Take a look at some data to make sure it worked well ncview ice_data/Greenland/GRL-16KM/GRL-16KM_TOPO-M17-v5.nc Step 5: Run a test simulation # Run a test simulation of Greenland for 100 yrs ./runme -r -e rembo -n par/yelmo_Greenland_rembo.nml -o output/test1 -p ctrl.time_end=1e2 That's it, YelmoX-REMBO is working.","title":"Running with YelmoX-REMBO"},{"location":"running-yelmox-rembo/#running-with-yelmox-rembo","text":"Before doing anything, make sure dependencies are installed (Lis, NetCDF, Python:runner)","title":"Running with YelmoX-REMBO"},{"location":"running-yelmox-rembo/#step-1-clone-the-repositories","text":"# Clone repository: YelmoX git clone https://github.com/palma-ice/yelmox.git # Clone yelmo into a sub-directory too cd yelmox git clone https://github.com/palma-ice/yelmo.git # Clone isostasy into a sub-directory too git clone https://github.com/palma-ice/isostasy.git # Clone rembo into a sub-directory too git clone https://github.com/alex-robinson/rembo1 At this point all the code is downloaded onto the machine. 
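(If the clone commands above succeeded, the yelmox directory should now contain yelmo , isostasy and rembo1 as sub-directories; a quick ls yelmo isostasy rembo1 from within yelmox will confirm this.)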
Now we need to configure it for compiling properly.","title":"Step 1: Clone the repositories"},{"location":"running-yelmox-rembo/#step-2-run-configuration-scripts-for-each-code-base","text":"# Enter Yelmo directory and configure it for compiling cd yelmo python config.py config/snowball_gfortran make clean cd .. # Enter isostasy directory and configure it for compiling cd isostasy python config.py config/snowball_gfortran make clean cd .. # Enter rembo1 directory and configure it for compiling cd rembo1 python config.py config/snowball_gfortran make clean cd .. # From YelmoX directory, configure it for compiling too python config.py config/snowball_gfortran make clean Note that the example assumes we are using the machine called snowball and we will use the compiler gfortran . If this is not correct, you will need to use the right configuration file available in the config/ directory, or make your own using the others as a template.","title":"Step 2: Run configuration scripts for each code base"},{"location":"running-yelmox-rembo/#step-3-compile-the-default-program","text":"make clean make yelmox_rembo","title":"Step 3: Compile the default program"},{"location":"running-yelmox-rembo/#step-4-make-a-link-to-ice_data","text":"# Link to `ice_data` repository wherever you have it saved on your system ln -s /media/Data/ice_data # Take a look at some data to make sure it worked well ncview ice_data/Greenland/GRL-16KM/GRL-16KM_TOPO-M17-v5.nc","title":"Step 4: Make a link to ice_data"},{"location":"running-yelmox-rembo/#step-5-run-a-test-simulation","text":"# Run a test simulation of Greenland for 100 yrs ./runme -r -e rembo -n par/yelmo_Greenland_rembo.nml -o output/test1 -p ctrl.time_end=1e2 That's it, YelmoX-REMBO is working.","title":"Step 5: Run a test simulation"},{"location":"snapclim/","text":"Snapshot climate (snapclim) The snapclim module is designed to determine climatic forcing, i.e., monthly temperature and precipitation, for a given point in time. This can be achieved by applying a temperature anomaly, or by interpolating snapshots of climate states available for different times. The \"hybrid\" method This is my preferred method and is set up to be rather flexible, and I think it is a good place to start for these simulations. It comprises an annual mean temperature anomaly time series from 300 kyr ago to today, obtained from several spliced paleo reconstructions, plus a monthly seasonal cycle over the 300 kyr obtained from a climber2 paleo run. So with the monthly values and the annual mean, you can get monthly temp anomalies over 300 kyr. There are more details in the attached manuscript that has never been submitted... To activate this method, in the parameter file, set the following parameters in the group \"snapclim\": atm_type = \"hybrid\" ocn_type = \"hybrid\" Then in the group \"snapclim_hybrid\", you can specify: f_eem = 0.4 # Controls the maximum temp anomaly during the Eemian f_glac = 1.0 # Controls the minimum temp anomaly during the glacial period f_hol = 0.5 # Controls the maximum temp anomaly during the Holocene f_seas = 1.0 # Controls the magnitude of the seasonal cycle f_to = 0.2 # Defines the oceanic temperature anomaly relative # to the annual mean atmospheric temp anomaly","title":"Snapshot climate (snapclim)"},{"location":"snapclim/#snapshot-climate-snapclim","text":"The snapclim module is designed to determine climatic forcing, i.e., monthly temperature and precipitation, for a given point in time.
This can be achieved by applying a temperature anomaly, or by interpolating snapshots of climate states available for different times.","title":"Snapshot climate (snapclim)"},{"location":"snapclim/#the-hybrid-method","text":"This is my preferred method and is set up to be rather flexible, and I think it is a good place to start for these simulations. It comprises an annual mean temperature anomaly time series from 300 kyr ago to today, obtained from several spliced paleo reconstructions, plus a monthly seasonal cycle over the 300 kyr obtained from a climber2 paleo run. So with the monthly values and the annual mean, you can get monthly temp anomalies over 300 kyr. There are more details in the attached manuscript that has never been submitted... To activate this method, in the parameter file, set the following parameters in the group \"snapclim\": atm_type = \"hybrid\" ocn_type = \"hybrid\" Then in the group \"snapclim_hybrid\", you can specify: f_eem = 0.4 # Controls the maximum temp anomaly during the Eemian f_glac = 1.0 # Controls the minimum temp anomaly during the glacial period f_hol = 0.5 # Controls the maximum temp anomaly during the Holocene f_seas = 1.0 # Controls the magnitude of the seasonal cycle f_to = 0.2 # Defines the oceanic temperature anomaly relative # to the annual mean atmospheric temp anomaly","title":"The \"hybrid\" method"},{"location":"yelmo-io/","text":"Yelmo IO Writing output Multiple generalized routines are available for writing the variables of a Yelmo instance (yelmo_class) to a NetCDF file. The main public-facing routines are the following: yelmo_write_init yelmo_write_var yelmo_write_step yelmo_restart_write These routines will be described briefly below. yelmo_write_init subroutine yelmo_write_init(ylmo,filename,time_init,units,irange,jrange) This routine can be used to initialize any file that will make use of one or more dimension axes of Yelmo variables. The dimension variables that will be written to the file are the following: xc, yc, month, zeta, zeta_ac, zeta_rock, age_iso, pd_age_iso, pc_steps, time [unlimited] Some of the dimension variables above are typically only needed for restart files ( age_iso, pd_age_iso, pc_steps ), but are written as well to maintain generality. Importantly, yelmo_write_init can be used to initialize a regional output file by specifying the indices of the bounding box for the region of interest via the arguments irange=[i1,i2], jrange=[j1,j2] . yelmo_write_var subroutine yelmo_write_var(filename,varname,ylmo,n,ncid,irange,jrange) This routine will write a variable to a given filename of an already existing NetCDF file, most likely but not necessarily initialized using yelmo_write_init . This routine will accept any variable varname that is listed in the Yelmo variable tables , which will be written with the attributes specified in the table. This routine can also be used to write regional output using the arguments irange, jrange . yelmo_write_step subroutine yelmo_write_step(ylmo,filename,time,nms,compare_pd,irange,jrange) This routine will write several variables to a file for a given timestep. The variable names can be provided as a vector of strings via the nms argument (e.g., nms=[\"H_ice\",\"z_srf\"] ). The routine will write relevant model performance information and then individually call yelmo_write_var for each variable listed. Optionally, it is possible to write comparison fields with present-day data ( compare_pd=.TRUE. ), assuming the data have been loaded into the ylmo%dta fields.
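For example, a minimal output sequence might look like the following sketch (the file name and variable choices here are illustrative, and note that the two names in nms happen to have the same length, which matters as described below): call yelmo_write_init(yelmo1,\"yelmo2D.nc\",time_init=time_init,units=\"years\") call yelmo_write_step(yelmo1,\"yelmo2D.nc\",time=time,nms=[\"H_ice\",\"z_srf\"],compare_pd=.TRUE.) with the yelmo_write_step call repeated at each output time.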
This routine can also be used to write regional output using the arguments irange, jrange . Note that this routine can be challenging to use in Fortran when custom variable names (the nms argument) are used. This is because of the Fortran limitation on defining string arrays as inline arguments - namely, all strings in the array are required to have the same length. Passing this argument would give an error: nms=[\"H_ice\",\"z_srf\",\"mask_bed\"] while this would be ok: nms=[\"H_ice \",\"z_srf \",\"mask_bed\"] For three variables this is not so cumbersome, but it can be when many variables are listed. If no argument is used, then a subset of useful variables is written: names(1) = \"H_ice\" names(2) = \"z_srf\" names(3) = \"z_bed\" names(4) = \"mask_bed\" names(5) = \"uxy_b\" names(6) = \"uxy_s\" names(7) = \"uxy_bar\" names(8) = \"beta\" names(9) = \"visc_bar\" names(10) = \"T_prime_b\" names(11) = \"H_w\" names(12) = \"mb_net\" names(13) = \"smb\" names(14) = \"bmb\" names(15) = \"cmb\" names(16) = \"z_sl\" yelmo_write_restart subroutine yelmo_restart_write(ylmo,filename,time,init,irange,jrange) This routine will save a snapshot of the Yelmo instance. Essentially, the routine will loop over every field found in the Yelmo variable tables and write them to a NetCDF file. Optionally, init=.FALSE. will allow writing of multiple timesteps to the same file (largely useful for diagnostic purposes, since the files can get very large). This routine can also be used to write regional output using the arguments irange, jrange . Reading input By setting the parameter yelmo.restart to a restart file path, Yelmo will read the NetCDF file with a saved snapshot. The routines yelmo_restart_read_topo_bnd and yelmo_restart_read are generally used internally during yelmo_init and yelmo_init_state , respectively. So these routines will not typically be needed by a user externally.","title":"Input/output"},{"location":"yelmo-io/#yelmo-io","text":"","title":"Yelmo IO"},{"location":"yelmo-io/#writing-output","text":"Multiple generalized routines are available for writing the variables of a Yelmo instance (yelmo_class) to a NetCDF file. The main public-facing routines are the following: yelmo_write_init yelmo_write_var yelmo_write_step yelmo_restart_write These routines will be described briefly below.","title":"Writing output"},{"location":"yelmo-io/#yelmo_write_init","text":"subroutine yelmo_write_init(ylmo,filename,time_init,units,irange,jrange) This routine can be used to initialize any file that will make use of one or more dimension axes of Yelmo variables. The dimension variables that will be written to the file are the following: xc, yc, month, zeta, zeta_ac, zeta_rock, age_iso, pd_age_iso, pc_steps, time [unlimited] Some of the dimension variables above are typically only needed for restart files ( age_iso, pd_age_iso, pc_steps ), but are written as well to maintain generality. Importantly, yelmo_write_init can be used to initialize a regional output file by specifying the indices of the bounding box for the region of interest via the arguments irange=[i1,i2], jrange=[j1,j2] .","title":"yelmo_write_init"},{"location":"yelmo-io/#yelmo_write_var","text":"subroutine yelmo_write_var(filename,varname,ylmo,n,ncid,irange,jrange) This routine will write a variable to a given filename of an already existing NetCDF file, most likely but not necessarily initialized using yelmo_write_init .
This routine will accept any variable varname that is listed in the Yelmo variable tables , which will be written with the attributes specified in the table. This routine can also be used to write regional output using the arguments irange, jrange .","title":"yelmo_write_var"},{"location":"yelmo-io/#yelmo_write_step","text":"subroutine yelmo_write_step(ylmo,filename,time,nms,compare_pd,irange,jrange) This routine will write several variables to a file for a given timestep. The variable names can be provided as a vector of strings via the nms argument (e.g., nms=[\"H_ice\",\"z_srf\"] ). The routine will write relevant model performance information and then individually call yelmo_write_var for each variable listed. Optionally, it is possible to write comparison fields with present-day data ( compare_pd=.TRUE. ), assuming the data have been loaded into the ylmo%dta fields. This routine can also be used to write regional output using the arguments irange, jrange . Note that this routine can be challenging to use in Fortran when custom variable names (the nms argument) are used. This is because of the Fortran limitation on defining string arrays as inline arguments - namely, all strings in the array are required to have the same length. Passing this argument would give an error: nms=[\"H_ice\",\"z_srf\",\"mask_bed\"] while this would be ok: nms=[\"H_ice \",\"z_srf \",\"mask_bed\"] For three variables this is not so cumbersome, but it can be when many variables are listed. If no argument is used, then a subset of useful variables is written: names(1) = \"H_ice\" names(2) = \"z_srf\" names(3) = \"z_bed\" names(4) = \"mask_bed\" names(5) = \"uxy_b\" names(6) = \"uxy_s\" names(7) = \"uxy_bar\" names(8) = \"beta\" names(9) = \"visc_bar\" names(10) = \"T_prime_b\" names(11) = \"H_w\" names(12) = \"mb_net\" names(13) = \"smb\" names(14) = \"bmb\" names(15) = \"cmb\" names(16) = \"z_sl\"","title":"yelmo_write_step"},{"location":"yelmo-io/#yelmo_write_restart","text":"subroutine yelmo_restart_write(ylmo,filename,time,init,irange,jrange) This routine will save a snapshot of the Yelmo instance. Essentially, the routine will loop over every field found in the Yelmo variable tables and write them to a NetCDF file. Optionally, init=.FALSE. will allow writing of multiple timesteps to the same file (largely useful for diagnostic purposes, since the files can get very large). This routine can also be used to write regional output using the arguments irange, jrange .","title":"yelmo_write_restart"},{"location":"yelmo-io/#reading-input","text":"By setting the parameter yelmo.restart to a restart file path, Yelmo will read the NetCDF file with a saved snapshot. The routines yelmo_restart_read_topo_bnd and yelmo_restart_read are generally used internally during yelmo_init and yelmo_init_state , respectively.
So these routines will not typically be needed by a user externally.","title":"Reading input"},{"location":"yelmo-variables-ybound/","text":"ybound id variable dimensions units long_name 1 z_bed xc, yc m Bedrock elevation 2 z_bed_sd xc, yc m Standard deviation of bedrock elevation 3 z_sl xc, yc m Sea level elevation 4 H_sed xc, yc m Sediment thickness 5 smb_ref xc, yc m/yr Surface mass balance 6 T_srf xc, yc K Surface temperature 7 bmb_shlf xc, yc m/yr Basal mass balance for ice shelf 8 fmb_shlf xc, yc m/yr Frontal mass balance for ice shelf 9 T_shlf xc, yc K Ice shelf temperature 10 Q_geo xc, yc mW m^-2 Geothermal heat flow at depth 11 enh_srf xc, yc - Enhancement factor at the surface 12 basins xc, yc - Basin identification numbers 13 basin_mask xc, yc - Mask for basins 14 regions xc, yc - Region identification numbers 15 region_mask xc, yc - Mask for regions 16 ice_allowed xc, yc - Locations where ice thickness can be greater than zero 17 calv_mask xc, yc - Locations where calving is not allowed 18 H_ice_ref xc, yc m Reference ice thickness for relaxation routines 19 z_bed_ref xc, yc m Reference bedrock elevation for relaxation routines","title":"ybound"},{"location":"yelmo-variables-ybound/#ybound","text":"id variable dimensions units long_name 1 z_bed xc, yc m Bedrock elevation 2 z_bed_sd xc, yc m Standard deviation of bedrock elevation 3 z_sl xc, yc m Sea level elevation 4 H_sed xc, yc m Sediment thickness 5 smb_ref xc, yc m/yr Surface mass balance 6 T_srf xc, yc K Surface temperature 7 bmb_shlf xc, yc m/yr Basal mass balance for ice shelf 8 fmb_shlf xc, yc m/yr Frontal mass balance for ice shelf 9 T_shlf xc, yc K Ice shelf temperature 10 Q_geo xc, yc mW m^-2 Geothermal heat flow at depth 11 enh_srf xc, yc - Enhancement factor at the surface 12 basins xc, yc - Basin identification numbers 13 basin_mask xc, yc - Mask for basins 14 regions xc, yc - Region identification numbers 15 region_mask xc, yc - Mask for regions 16 ice_allowed xc, yc - Locations where ice thickness can be greater than zero 17 calv_mask xc, yc - Locations where calving is not allowed 18 H_ice_ref xc, yc m Reference ice thickness for relaxation routines 19 z_bed_ref xc, yc m Reference bedrock elevation for relaxation routines","title":"ybound"},{"location":"yelmo-variables-ydata/","text":"ydata id variable dimensions units long_name 1 pd_H_ice xc, yc m PD ice thickness 2 pd_z_srf xc, yc m PD surface elevation 3 pd_z_bed xc, yc m PD bedrock elevation 4 pd_H_grnd xc, yc m PD overburden ice thickness 5 pd_mask_bed xc, yc - PD mask 6 pd_ux_s xc, yc m/yr PD surface velocity in the x-direction 7 pd_uy_s xc, yc m/yr PD surface velocity in the y-direction 8 pd_uxy_s xc, yc m/yr PD surface velocity magnitude 9 pd_T_srf xc, yc K PD surface temperature 10 pd_smb_ref xc, yc m/yr PD surface mass balance 11 pd_depth_iso xc, yc, pd_age_iso m PD depth of specific isochrones 12 pd_err_H_ice xc, yc m PD error in ice thickness 13 pd_err_z_srf xc, yc m PD error in surface elevation 14 pd_err_z_bed xc, yc m PD error in bedrock elevation 15 pd_err_smb_ref xc, yc m/yr PD error in surface mass balance 16 pd_err_uxy_s xc, yc m/yr PD error in surface velocity magnitude 17 pd_err_depth_iso xc, yc, pd_age_iso m PD error in isochrone depth","title":"ydata"},{"location":"yelmo-variables-ydata/#ydata","text":"id variable dimensions units long_name 1 pd_H_ice xc, yc m PD ice thickness 2 pd_z_srf xc, yc m PD surface elevation 3 pd_z_bed xc, yc m PD bedrock elevation 4 pd_H_grnd xc, yc m PD overburden ice thickness 5 pd_mask_bed xc, yc - PD mask 6 
pd_ux_s xc, yc m/yr PD surface velocity in the x-direction 7 pd_uy_s xc, yc m/yr PD surface velocity in the y-direction 8 pd_uxy_s xc, yc m/yr PD surface velocity magnitude 9 pd_T_srf xc, yc K PD surface temperature 10 pd_smb_ref xc, yc m/yr PD surface mass balance 11 pd_depth_iso xc, yc, pd_age_iso m PD depth of specific isochrones 12 pd_err_H_ice xc, yc m PD error in ice thickness 13 pd_err_z_srf xc, yc m PD error in surface elevation 14 pd_err_z_bed xc, yc m PD error in bedrock elevation 15 pd_err_smb_ref xc, yc m/yr PD error in surface mass balance 16 pd_err_uxy_s xc, yc m/yr PD error in surface velocity magnitude 17 pd_err_depth_iso xc, yc, pd_age_iso m PD error in isochrone depth","title":"ydata"},{"location":"yelmo-variables-ydyn/","text":"ydyn id variable dimensions units long_name 1 ux xc, yc, zeta m/yr x-velocity 2 uy xc, yc, zeta m/yr y-velocity 3 uxy xc, yc, zeta m/yr Horizonal velocity magnitude 4 uz xc, yc, zeta_ac m/yr z-component velocity 5 uz_star xc, yc, zeta_ac m/yr z-velocity with corr. for thermal advection 6 ux_bar xc, yc m/yr Depth-averaged x-velocity 7 uy_bar xc, yc m/yr Depth-averaged y-velocity 8 uxy_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 9 ux_bar_prev xc, yc m/yr Previous depth-averaged x-velocity 10 uy_bar_prev xc, yc m/yr Previous depth-averaged y-velocity 11 ux_b xc, yc m/yr Basal x-velocity 12 uy_b xc, yc m/yr Basal y-velocity 13 uz_b xc, yc m/yr Basal z-velocity 14 uxy_b xc, yc m/yr Basal horizontal velocity magnitude 15 ux_s xc, yc m/yr Surface x-velocity 16 uy_s xc, yc m/yr Surface y-velocity 17 uz_s xc, yc m/yr Surface z-velocity 18 uxy_s xc, yc m/yr Surface horizontal velocity magnitude 19 ux_i xc, yc, zeta m/yr Shearing x-velocity 20 uy_i xc, yc, zeta m/yr Shearing y-velocity 21 ux_i_bar xc, yc m/yr Depth-averaged shearing x-velocity 22 uy_i_bar xc, yc m/yr Depth-averaged shearing y-velocity 23 uxy_i_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 24 duxydt xc, yc m/yr^2 Time derivative of uxy 25 duxdz xc, yc, zeta 1/yr x-velocity vertical gradient 26 duydz xc, yc, zeta 1/yr y-velocity vertical gradient 27 duxdz_bar xc, yc 1/yr Depth-averaged x-velocity vertical gradient 28 duydz_bar xc, yc 1/yr Depth-averaged y-velocity vertical gradient 29 taud_acx xc, yc Pa Driving stress (x-dir) 30 taud_acy xc, yc Pa Driving stress (y-dir) 31 taud xc, yc Pa Driving stress magnitude 32 taub_acx xc, yc Pa Basal stress (x-dir) 33 taub_acy xc, yc Pa Basal stress (y-dir) 34 taub xc, yc Pa Basal stress magnitude 35 taul_int_acx xc, yc Pa Depth-integrated lateral stress (x-dir) 36 taul_int_acy xc, yc Pa Depth-integrated lateral stress (y-dir) 37 qq_gl_acx xc, yc m^3/yr Flux across grounding line 38 qq_gl_acy xc, yc m^3/yr Flux across grounding line 39 qq_acx xc, yc m^3/yr Flux (x-dir) 40 qq_acy xc, yc m^3/yr Flux (y-dir) 41 qq xc, yc m^3/yr Flux magnitude 42 de_eff xc, yc, zeta 1/yr Effective strain rate 43 visc_eff xc, yc, zeta Pa yr Effective viscosity 44 visc_eff_int xc, yc Pa yr m Depth-integrated viscosity 45 N_eff xc, yc Pa Effective pressure 46 cb_tgt xc, yc Pa Target basal parameter 47 cb_ref xc, yc -- Reference basal parameter 48 c_bed xc, yc Pa Basal drag coefficient 49 beta_acx xc, yc Pa yr m^-1 Basal stress factor (x) 50 beta_acy xc, yc Pa yr m^-1 Basal stress factor (y) 51 beta xc, yc Pa yr m^-1 Basal stress factor mag. 
52 beta_eff xc, yc Pa yr m^-1 Effective basal factor 53 f_vbvs xc, yc - Vertical basal stress 54 ssa_mask_acx xc, yc - SSA mask (x-dir) 55 ssa_mask_acy xc, yc - SSA mask (y-dir) 56 ssa_err_acx xc, yc m/yr SSA error (x-dir) 57 ssa_err_acy xc, yc m/yr SSA error (y-dir) 58 jvel_dxx xc, yc, zeta 1/yr Velocity Jacobian component duxdx 59 jvel_dxy xc, yc, zeta 1/yr Velocity Jacobian component duxdy 60 jvel_dxz xc, yc, zeta 1/yr Velocity Jacobian component duxdz 61 jvel_dyx xc, yc, zeta 1/yr Velocity Jacobian component duydx 62 jvel_dyy xc, yc, zeta 1/yr Velocity Jacobian component duydy 63 jvel_dyz xc, yc, zeta 1/yr Velocity Jacobian component duydz 64 jvel_dzx xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdx 65 jvel_dzy xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdy 66 jvel_dzz xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdz","title":"ydyn"},{"location":"yelmo-variables-ydyn/#ydyn","text":"id variable dimensions units long_name 1 ux xc, yc, zeta m/yr x-velocity 2 uy xc, yc, zeta m/yr y-velocity 3 uxy xc, yc, zeta m/yr Horizonal velocity magnitude 4 uz xc, yc, zeta_ac m/yr z-component velocity 5 uz_star xc, yc, zeta_ac m/yr z-velocity with corr. for thermal advection 6 ux_bar xc, yc m/yr Depth-averaged x-velocity 7 uy_bar xc, yc m/yr Depth-averaged y-velocity 8 uxy_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 9 ux_bar_prev xc, yc m/yr Previous depth-averaged x-velocity 10 uy_bar_prev xc, yc m/yr Previous depth-averaged y-velocity 11 ux_b xc, yc m/yr Basal x-velocity 12 uy_b xc, yc m/yr Basal y-velocity 13 uz_b xc, yc m/yr Basal z-velocity 14 uxy_b xc, yc m/yr Basal horizontal velocity magnitude 15 ux_s xc, yc m/yr Surface x-velocity 16 uy_s xc, yc m/yr Surface y-velocity 17 uz_s xc, yc m/yr Surface z-velocity 18 uxy_s xc, yc m/yr Surface horizontal velocity magnitude 19 ux_i xc, yc, zeta m/yr Shearing x-velocity 20 uy_i xc, yc, zeta m/yr Shearing y-velocity 21 ux_i_bar xc, yc m/yr Depth-averaged shearing x-velocity 22 uy_i_bar xc, yc m/yr Depth-averaged shearing y-velocity 23 uxy_i_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 24 duxydt xc, yc m/yr^2 Time derivative of uxy 25 duxdz xc, yc, zeta 1/yr x-velocity vertical gradient 26 duydz xc, yc, zeta 1/yr y-velocity vertical gradient 27 duxdz_bar xc, yc 1/yr Depth-averaged x-velocity vertical gradient 28 duydz_bar xc, yc 1/yr Depth-averaged y-velocity vertical gradient 29 taud_acx xc, yc Pa Driving stress (x-dir) 30 taud_acy xc, yc Pa Driving stress (y-dir) 31 taud xc, yc Pa Driving stress magnitude 32 taub_acx xc, yc Pa Basal stress (x-dir) 33 taub_acy xc, yc Pa Basal stress (y-dir) 34 taub xc, yc Pa Basal stress magnitude 35 taul_int_acx xc, yc Pa Depth-integrated lateral stress (x-dir) 36 taul_int_acy xc, yc Pa Depth-integrated lateral stress (y-dir) 37 qq_gl_acx xc, yc m^3/yr Flux across grounding line 38 qq_gl_acy xc, yc m^3/yr Flux across grounding line 39 qq_acx xc, yc m^3/yr Flux (x-dir) 40 qq_acy xc, yc m^3/yr Flux (y-dir) 41 qq xc, yc m^3/yr Flux magnitude 42 de_eff xc, yc, zeta 1/yr Effective strain rate 43 visc_eff xc, yc, zeta Pa yr Effective viscosity 44 visc_eff_int xc, yc Pa yr m Depth-integrated viscosity 45 N_eff xc, yc Pa Effective pressure 46 cb_tgt xc, yc Pa Target basal parameter 47 cb_ref xc, yc -- Reference basal parameter 48 c_bed xc, yc Pa Basal drag coefficient 49 beta_acx xc, yc Pa yr m^-1 Basal stress factor (x) 50 beta_acy xc, yc Pa yr m^-1 Basal stress factor (y) 51 beta xc, yc Pa yr m^-1 Basal stress factor mag. 
52 beta_eff xc, yc Pa yr m^-1 Effective basal factor 53 f_vbvs xc, yc - Vertical basal stress 54 ssa_mask_acx xc, yc - SSA mask (x-dir) 55 ssa_mask_acy xc, yc - SSA mask (y-dir) 56 ssa_err_acx xc, yc m/yr SSA error (x-dir) 57 ssa_err_acy xc, yc m/yr SSA error (y-dir) 58 jvel_dxx xc, yc, zeta 1/yr Velocity Jacobian component duxdx 59 jvel_dxy xc, yc, zeta 1/yr Velocity Jacobian component duxdy 60 jvel_dxz xc, yc, zeta 1/yr Velocity Jacobian component duxdz 61 jvel_dyx xc, yc, zeta 1/yr Velocity Jacobian component duydx 62 jvel_dyy xc, yc, zeta 1/yr Velocity Jacobian component duydy 63 jvel_dyz xc, yc, zeta 1/yr Velocity Jacobian component duydz 64 jvel_dzx xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdx 65 jvel_dzy xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdy 66 jvel_dzz xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdz","title":"ydyn"},{"location":"yelmo-variables-ymat/","text":"ymat id variable dimensions units long_name 1 enh xc, yc, zeta - Enhancement factor 2 enh_bnd xc, yc, zeta - Imposed enhancement factor 3 enh_bar xc, yc - Depth-averaged enhancement 4 ATT xc, yc, zeta - Rate factor 5 ATT_bar xc, yc - Depth-averaged rate factor 6 visc xc, yc, zeta Pa yr Ice viscosity 7 visc_bar xc, yc Pa yr Depth-averaged ice viscosity 8 visc_int xc, yc Pa yr m Ice viscosity interpolated at interfaces 9 f_shear_bar xc, yc - Depth-averaged shear fraction 10 dep_time xc, yc, zeta yr Ice deposition time (for online age tracing) 11 depth_iso xc, yc, age_iso m Depth of specific isochronal layers 12 strn2D_dxx xc, yc 1/yr 2D strain rate tensor component dxx 13 strn2D_dyy xc, yc 1/yr 2D strain rate tensor component dyy 14 strn2D_dxy xc, yc 1/yr 2D strain rate tensor component dxy 15 strn2D_dxz xc, yc 1/yr 2D strain rate tensor component dxz 16 strn2D_dyz xc, yc 1/yr 2D strain rate tensor component dyz 17 strn2D_de xc, yc 1/yr 2D effective strain rate 18 strn2D_div xc, yc 1/yr 2D horizontal divergence 19 strn2D_f_shear xc, yc 2D strain rate shear fraction 20 strn_dxx xc, yc, zeta 1/yr Strain rate tensor component dxx 21 strn_dyy xc, yc, zeta 1/yr Strain rate tensor component dyy 22 strn_dxy xc, yc, zeta 1/yr Strain rate tensor component dxy 23 strn_dxz xc, yc, zeta 1/yr Strain rate tensor component dxz 24 strn_dyz xc, yc, zeta 1/yr Strain rate tensor component dyz 25 strn_de xc, yc, zeta 1/yr Effective strain rate 26 strn_div xc, yc, zeta 1/yr Horizontal divergence 27 strn_f_shear xc, yc, zeta Strain rate shear fraction 28 strs2D_txx xc, yc Pa 2D stress tensor component txx 29 strs2D_tyy xc, yc Pa 2D stress tensor component tyy 30 strs2D_txy xc, yc Pa 2D stress tensor component txy 31 strs2D_txz xc, yc Pa 2D stress tensor component txz 32 strs2D_tyz xc, yc Pa 2D stress tensor component tyz 33 strs2D_te xc, yc Pa 2D effective stress 34 strs2D_tau_eig_1 xc, yc Pa 2D stress first principal eigenvalue 35 strs2D_tau_eig_2 xc, yc Pa 2D stress second principal eigenvalue 36 strs_txx xc, yc, zeta Pa Stress tensor component txx 37 strs_tyy xc, yc, zeta Pa Stress tensor component tyy 38 strs_txy xc, yc, zeta Pa Stress tensor component txy 39 strs_txz xc, yc, zeta Pa Stress tensor component txz 40 strs_tyz xc, yc, zeta Pa Stress tensor component tyz 41 strs_te xc, yc, zeta Pa Effective stress","title":"ymat"},{"location":"yelmo-variables-ymat/#ymat","text":"id variable dimensions units long_name 1 enh xc, yc, zeta - Enhancement factor 2 enh_bnd xc, yc, zeta - Imposed enhancement factor 3 enh_bar xc, yc - Depth-averaged enhancement 4 ATT xc, yc, zeta - Rate factor 5 ATT_bar xc, yc - Depth-averaged 
rate factor 6 visc xc, yc, zeta Pa yr Ice viscosity 7 visc_bar xc, yc Pa yr Depth-averaged ice viscosity 8 visc_int xc, yc Pa yr m Ice viscosity interpolated at interfaces 9 f_shear_bar xc, yc - Depth-averaged shear fraction 10 dep_time xc, yc, zeta yr Ice deposition time (for online age tracing) 11 depth_iso xc, yc, age_iso m Depth of specific isochronal layers 12 strn2D_dxx xc, yc 1/yr 2D strain rate tensor component dxx 13 strn2D_dyy xc, yc 1/yr 2D strain rate tensor component dyy 14 strn2D_dxy xc, yc 1/yr 2D strain rate tensor component dxy 15 strn2D_dxz xc, yc 1/yr 2D strain rate tensor component dxz 16 strn2D_dyz xc, yc 1/yr 2D strain rate tensor component dyz 17 strn2D_de xc, yc 1/yr 2D effective strain rate 18 strn2D_div xc, yc 1/yr 2D horizontal divergence 19 strn2D_f_shear xc, yc 2D strain rate shear fraction 20 strn_dxx xc, yc, zeta 1/yr Strain rate tensor component dxx 21 strn_dyy xc, yc, zeta 1/yr Strain rate tensor component dyy 22 strn_dxy xc, yc, zeta 1/yr Strain rate tensor component dxy 23 strn_dxz xc, yc, zeta 1/yr Strain rate tensor component dxz 24 strn_dyz xc, yc, zeta 1/yr Strain rate tensor component dyz 25 strn_de xc, yc, zeta 1/yr Effective strain rate 26 strn_div xc, yc, zeta 1/yr Horizontal divergence 27 strn_f_shear xc, yc, zeta Strain rate shear fraction 28 strs2D_txx xc, yc Pa 2D stress tensor component txx 29 strs2D_tyy xc, yc Pa 2D stress tensor component tyy 30 strs2D_txy xc, yc Pa 2D stress tensor component txy 31 strs2D_txz xc, yc Pa 2D stress tensor component txz 32 strs2D_tyz xc, yc Pa 2D stress tensor component tyz 33 strs2D_te xc, yc Pa 2D effective stress 34 strs2D_tau_eig_1 xc, yc Pa 2D stress first principal eigenvalue 35 strs2D_tau_eig_2 xc, yc Pa 2D stress second principal eigenvalue 36 strs_txx xc, yc, zeta Pa Stress tensor component txx 37 strs_tyy xc, yc, zeta Pa Stress tensor component tyy 38 strs_txy xc, yc, zeta Pa Stress tensor component txy 39 strs_txz xc, yc, zeta Pa Stress tensor component txz 40 strs_tyz xc, yc, zeta Pa Stress tensor component tyz 41 strs_te xc, yc, zeta Pa Effective stress","title":"ymat"},{"location":"yelmo-variables-ytherm/","text":"ytherm id variable dimensions units long_name 1 enth xc, yc, zeta J m^-3 Ice enthalpy 2 T_ice xc, yc, zeta K Ice temperature 3 omega xc, yc, zeta - Ice water content 4 T_pmp xc, yc, zeta K Pressure-corrected melting point 5 T_prime xc, yc, zeta deg C Homologous ice temperature 6 f_pmp xc, yc - Fraction of cell at pressure melting point 7 bmb_grnd xc, yc m/yr Grounded basal mass balance 8 Q_strn xc, yc, zeta W m^-3 Internal strain heat production 9 dQsdt xc, yc, zeta W m^-3 yr^-1 Rate of change of internal heat production 10 Q_b xc, yc mW m^-2 Basal friction heat production 11 Q_ice_b xc, yc mW m^-2 Basal ice heat flux 12 T_prime_b xc, yc K Homologous temperature at the base 13 H_w xc, yc m Basal water layer thickness 14 dHwdt xc, yc m/yr Rate of change of basal water layer thickness 15 cp xc, yc, zeta J kg^-1 K^-1 Specific heat capacity 16 kt xc, yc, zeta W m^-1 K^-1 Heat conductivity 17 H_cts xc, yc m Height of the CTS (cold-temperate surface) 18 advecxy xc, yc, zeta - Horizontal advection 19 Q_rock xc, yc W m^-2 Heat flux from bedrock 20 enth_rock xc, yc, zeta_rock J m^-3 Bedrock enthalpy 21 T_rock xc, yc, zeta_rock K Bedrock temperature","title":"ytherm"},{"location":"yelmo-variables-ytherm/#ytherm","text":"id variable dimensions units long_name 1 enth xc, yc, zeta J m^-3 Ice enthalpy 2 T_ice xc, yc, zeta K Ice temperature 3 omega xc, yc, zeta - Ice water content 4 T_pmp xc, yc, zeta 
| id | variable | dimensions | units | long_name |
| --- | --- | --- | --- | --- |
| 5 | T_prime | xc, yc, zeta | deg C | Homologous ice temperature |
| 6 | f_pmp | xc, yc | - | Fraction of cell at pressure melting point |
| 7 | bmb_grnd | xc, yc | m/yr | Grounded basal mass balance |
| 8 | Q_strn | xc, yc, zeta | W m^-3 | Internal strain heat production |
| 9 | dQsdt | xc, yc, zeta | W m^-3 yr^-1 | Rate of change of internal heat production |
| 10 | Q_b | xc, yc | mW m^-2 | Basal friction heat production |
| 11 | Q_ice_b | xc, yc | mW m^-2 | Basal ice heat flux |
| 12 | T_prime_b | xc, yc | K | Homologous temperature at the base |
| 13 | H_w | xc, yc | m | Basal water layer thickness |
| 14 | dHwdt | xc, yc | m/yr | Rate of change of basal water layer thickness |
| 15 | cp | xc, yc, zeta | J kg^-1 K^-1 | Specific heat capacity |
| 16 | kt | xc, yc, zeta | W m^-1 K^-1 | Heat conductivity |
| 17 | H_cts | xc, yc | m | Height of the CTS (cold-temperate surface) |
| 18 | advecxy | xc, yc, zeta | - | Horizontal advection |
| 19 | Q_rock | xc, yc | W m^-2 | Heat flux from bedrock |
| 20 | enth_rock | xc, yc, zeta_rock | J m^-3 | Bedrock enthalpy |
| 21 | T_rock | xc, yc, zeta_rock | K | Bedrock temperature |

## ytopo

| id | variable | dimensions | units | long_name |
| --- | --- | --- | --- | --- |
| 1 | H_ice | xc, yc | m | Ice thickness |
| 2 | dHidt | xc, yc | m/yr | Ice thickness rate of change |
| 3 | dHidt_dyn | xc, yc | m/yr | Ice thickness change due to dynamics |
| 4 | mb_net | xc, yc | m/yr | Actual mass balance applied |
| 5 | mb_relax | xc, yc | m/yr | Change in mass balance due to relaxation |
| 6 | mb_resid | xc, yc | m/yr | Residual mass balance |
| 7 | mb_err | xc, yc | m/yr | Residual error in mass balance accounting |
| 8 | smb | xc, yc | m/yr | Surface mass balance |
| 9 | bmb | xc, yc | m/yr | Combined basal mass balance |
| 10 | fmb | xc, yc | m/yr | Combined frontal mass balance |
| 11 | dmb | xc, yc | m/yr | Subgrid discharge mass balance |
| 12 | cmb | xc, yc | m/yr | Calving mass balance |
| 13 | bmb_ref | xc, yc | m/yr | Reference basal mass balance |
| 14 | fmb_ref | xc, yc | m/yr | Reference frontal mass balance |
| 15 | dmb_ref | xc, yc | m/yr | Reference subgrid discharge mass balance |
| 16 | cmb_flt | xc, yc | m/yr | Floating calving rate |
| 17 | cmb_grnd | xc, yc | m/yr | Grounded calving rate |
| 18 | z_srf | xc, yc | m | Surface elevation |
| 19 | dzsdt | xc, yc | m/yr | Surface elevation rate of change |
| 20 | mask_adv | xc, yc | | Advection mask |
| 21 | eps_eff | xc, yc | 1/yr | Effective strain rate |
| 22 | tau_eff | xc, yc | Pa | Effective stress |
| 23 | z_base | xc, yc | m | Ice-base elevation |
| 24 | dzsdx | xc, yc | m/m | Surface elevation slope, acx nodes |
| 25 | dzsdy | xc, yc | m/m | Surface elevation slope, acy nodes |
| 26 | dHidx | xc, yc | m/m | Ice thickness gradient, acx nodes |
| 27 | dHidy | xc, yc | m/m | Ice thickness gradient, acy nodes |
| 28 | dzbdx | xc, yc | m/m | Bedrock slope, acx nodes |
| 29 | dzbdy | xc, yc | m/m | Bedrock slope, acy nodes |
| 30 | H_eff | xc, yc | m | Effective ice thickness (margin-corrected) |
| 31 | H_grnd | xc, yc | m | Grounded ice thickness |
| 32 | H_calv | xc, yc | m | Calving parameter field, ice thickness limit |
| 33 | kt_calv | xc, yc | | Calving parameter field, vm-l19 |
| 34 | z_bed_filt | xc, yc | m | Filtered bedrock elevation |
| 35 | f_grnd | xc, yc | | Grounded fraction |
| 36 | f_grnd_acx | xc, yc | | Grounded fraction (acx nodes) |
| 37 | f_grnd_acy | xc, yc | | Grounded fraction (acy nodes) |
| 38 | f_grnd_ab | xc, yc | | Grounded fraction (ab nodes) |
| 39 | f_ice | xc, yc | | Ice-covered fraction |
| 40 | f_grnd_bmb | xc, yc | | Grounded fraction for basal mass balance |
| 41 | f_grnd_pin | xc, yc | | Grounded fraction from subgrid pinning points |
| 42 | dist_margin | xc, yc | m | Distance to nearest margin point |
| 43 | dist_grline | xc, yc | m | Distance to nearest grounding-line point |
| 44 | mask_bed | xc, yc | | Multi-valued bed mask |
| 45 | mask_grz | xc, yc | | Multi-valued grounding-line zone mask |
| 46 | mask_frnt | xc, yc | | Multi-valued ice front mask |
| 47 | dHidt_dyn_n | xc, yc | m/yr | Ice thickness change due to advection (previous) |
| 48 | H_ice_n | xc, yc | m | Ice thickness from previous timestep |
| 49 | z_srf_n | xc, yc | m | Surface elevation from previous timestep |
| 50 | H_ice_dyn | xc, yc | m | Dynamic ice thickness |
| 51 | f_ice_dyn | xc, yc | | Dynamic ice-covered fraction |
| 52 | pc_pred_H_ice | xc, yc | m | Predicted ice thickness |
| 53 | pc_pred_dHidt_dyn | xc, yc | m/yr | Predicted dynamic ice thickness rate of change |
| 54 | pc_pred_mb_net | xc, yc | m/yr | Predicted net mass balance |
| 55 | pc_pred_smb | xc, yc | m/yr | Predicted surface mass balance |
| 56 | pc_pred_bmb | xc, yc | m/yr | Predicted basal mass balance |
| 57 | pc_pred_fmb | xc, yc | m/yr | Predicted frontal mass balance |
| 58 | pc_pred_dmb | xc, yc | m/yr | Predicted discharge mass balance |
| 59 | pc_pred_cmb | xc, yc | m/yr | Predicted calving mass balance |
| 60 | pc_corr_H_ice | xc, yc | m | Corrected ice thickness |
| 61 | pc_corr_dHidt_dyn | xc, yc | m/yr | Corrected dynamic ice thickness rate of change |
| 62 | pc_corr_mb_net | xc, yc | m/yr | Corrected net mass balance |
| 63 | pc_corr_smb | xc, yc | m/yr | Corrected surface mass balance |
| 64 | pc_corr_bmb | xc, yc | m/yr | Corrected basal mass balance |
| 65 | pc_corr_fmb | xc, yc | m/yr | Corrected frontal mass balance |
| 66 | pc_corr_dmb | xc, yc | m/yr | Corrected discharge mass balance |
| 67 | pc_corr_cmb | xc, yc | m/yr | Corrected calving mass balance |

## Yelmo variable tables

Here are tables containing all variables available within Yelmo, with dimensions and units. These tables are used directly in the code by the output writing routines to select which variables to write (a sketch of how such a table can drive output selection follows this list):

- Yelmo topography
- Yelmo dynamics
- Yelmo material
- Yelmo thermodynamics
- Yelmo boundaries
- Yelmo data
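As a rough illustration of the idea (not the actual Yelmo routines; the type `var_entry`, the helper logic and the table contents here are hypothetical), a variable table can drive output selection like this:

```fortran
! Hypothetical sketch: select variables to write by looking them up
! in a table of name/dimensions/units/long_name entries.
program var_table_demo
    implicit none

    type :: var_entry
        character(len=32) :: name
        character(len=32) :: dims
        character(len=16) :: units
        character(len=64) :: long_name
    end type

    type(var_entry), parameter :: tbl(3) = [ &
        var_entry("H_ice","xc, yc","m","Ice thickness"), &
        var_entry("z_srf","xc, yc","m","Surface elevation"), &
        var_entry("smb","xc, yc","m/yr","Surface mass balance") ]

    character(len=32) :: wanted(2) = [character(len=32) :: "H_ice","smb"]
    integer :: i, j

    ! Loop over the requested names and "write" any variable found in the table
    do i = 1, size(wanted)
        do j = 1, size(tbl)
            if (trim(tbl(j)%name) == trim(wanted(i))) then
                print *, "writing ", trim(tbl(j)%name), " [", &
                         trim(tbl(j)%units), "] : ", trim(tbl(j)%long_name)
            end if
        end do
    end do

end program var_table_demo
```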
## Example model domain initialization

The snippet below initializes a Yelmo instance inside a program, runs the model forward in time and then terminates the instance. Yelmo does not care about the source of the boundary fields; it only needs all variables in the `bnd` class to be populated.

```fortran
! === Initialize ice sheet model =====
! `yelmo1` is the Yelmo object to initialize and `path_par` is the path to
! the parameter file with the configuration information. This call also
! initializes the domain grid and loads the initial topographic variables.
call yelmo_init(yelmo1,filename=path_par,grid_def="file",time=time_init)

! === Load initial boundary conditions for current time and yelmo state =====
! ybound: z_bed, z_sl, H_sed, H_w, smb, T_srf, bmb_shlf, T_shlf, Q_geo
yelmo1%bnd%z_bed    = [2D array]
yelmo1%bnd%z_sl     = [2D array]
yelmo1%bnd%H_sed    = [2D array]
yelmo1%bnd%H_w      = [2D array]
yelmo1%bnd%smb      = [2D array]
yelmo1%bnd%T_srf    = [2D array]
yelmo1%bnd%bmb_shlf = [2D array]
yelmo1%bnd%T_shlf   = [2D array]
yelmo1%bnd%Q_geo    = [2D array]

! Print summary of initial boundary conditions
call yelmo_print_bound(yelmo1%bnd)

! Next, initialize the state variables (dyn,therm,mat)
! (in this case, initialize temps with robin method)
call yelmo_init_state(yelmo1,time=time_init,thrm_method="robin")

! Run yelmo for e.g. 100.0 years with constant boundary conditions and topo
! to equilibrate thermodynamics and dynamics
! (impose a constant, small dt=1yr to reduce the possibility of instabilities)
call yelmo_update_equil(yelmo1,time,time_tot=100.0,topo_fixed=.FALSE.,dt=1.0)

! == YELMO INITIALIZATION COMPLETE ==
! Note: the routines `yelmo_init_state` and `yelmo_update_equil` are optional,
! if the user prefers another way to initialize the state variables.

! == Start time looping and run the model ==
do n = 1, ntot

    ! Get current time
    time = time_init + n*dt

    ! Update the Yelmo ice sheet
    call yelmo_update(yelmo1,time)

    ! Here you may be updating `yelmo1%bnd` variables to drive the model transiently.

end do

! == Finalize Yelmo instance ==
call yelmo_end(yelmo1,time=time)
```

That's it! See Getting started to see how to get the code, compile a test program and run simulations.
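For instance, a transient experiment might update boundary fields inside the loop before each `yelmo_update` call. A minimal sketch, where `smb_ref`, `T_srf_ref`, `smb_anom` and `T_anom` are hypothetical arrays supplied by the user or another model component:

```fortran
! Hypothetical transient forcing: apply time-dependent anomalies to
! reference boundary fields before updating the ice sheet. All *_ref and
! *_anom arrays are illustrative placeholders, not Yelmo variables.
do n = 1, ntot
    time = time_init + n*dt

    ! Impose time-dependent boundary conditions for this step
    yelmo1%bnd%smb   = smb_ref   + smb_anom(n)
    yelmo1%bnd%T_srf = T_srf_ref + T_anom(n)

    call yelmo_update(yelmo1,time)
end do
```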
## Dependencies

Yelmo depends on the following libraries:

- NetCDF
- Library of Iterative Solvers for Linear Systems (LIS)
- 'runner' Python library (cxesmc fork)

YelmoX additionally depends on the following library:

- FFTW (ver. 3.9+)

Installation tips for each dependency can be found below.

### Installing NetCDF (preferably version 4.0 or higher)

The NetCDF library is typically available with different distributions (Linux, Mac, etc.). Along with installing `libnetcdf`, it is necessary to install the package `libnetcdf-dev`. Installing the NetCDF viewing program `ncview` is also recommended. If you want to install NetCDF from source, you must install the `netcdf-c` library and subsequently the `netcdf-fortran` library. The source code and installation instructions are available from the Unidata website: https://www.unidata.ucar.edu/software/netcdf/docs/getting_and_building_netcdf.html
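For example, on Debian or Ubuntu systems the packaged libraries are usually sufficient (package names are distribution-specific and may differ elsewhere):

```bash
# Install the NetCDF C and Fortran development packages plus ncview
# (Debian/Ubuntu package names; adjust for your distribution)
sudo apt-get update
sudo apt-get install libnetcdf-dev libnetcdff-dev ncview
```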
### Install LIS and FFTW

These packages could be installed individually and linked into the `libs` directory of YelmoX and Yelmo. However, to ensure that the right versions are used, a separate repository now manages the installation of LIS and FFTW from the versions available there. This repository is managed as part of the Fast Earth System Model Community (FESMC). Please download the code from this repository and see the README for installation instructions: https://github.com/fesmc-utils

### Installing runner

Install `runner` into your system's Python installation via `pip`, along with the dependency `tabulate`:

```bash
pip install https://github.com/cxesmc/runner/archive/refs/heads/master.zip
```

That's it! Now check that the system command `job` is available by running `job -h`. If the command is not found, the Python bin directory is not available in your PATH. To add it, typically something like this is needed in your `.profile` or `.bashrc` file:

```bash
PATH=${PATH}:${HOME}/.local/bin
export PATH
```
## Example programs

The Yelmo base code provides a static library interface that can be used in other programs, as well as a couple of stand-alone programs for running certain benchmarks. Here we provide more examples of how to use Yelmo:

- Program template to connect with other models/components.
- Stand-alone ice sheet with full boundary forcing.

In both cases, it is necessary to download the Yelmo repository separately, as well as compile the Yelmo static library (see Getting started).

### Program template

This is a minimalistic setup that allows you to run Yelmo with no dependencies and a straightforward Makefile. This template can be used to design a new stand-alone Yelmo experiment, or to provide guidance when adding Yelmo to another program. Clone the repository from https://github.com/palma-ice/yelmot

### Stand-alone ice sheet with full boundary forcing (yelmox)

This setup is suitable for glacial-cycle simulations, future simulations or any other typical (realistic) ice-sheet model simulation. Clone the repository from https://github.com/palma-ice/yelmox

## Getting started

Here you can find the basic information and steps needed to get Yelmo running.

### Dependencies

- Yelmo dependencies: LIS
- YelmoX dependencies: FFTW (for FastIsostasy), FastIsostasy, REMBO1
- Job submission: Python 3.x, runner

See Dependencies for more details.

### Directory structure

- `config/` Configuration files for compilation on different systems.
- `input/` Location of any input data needed by the model.
- `libs/` Auxiliary libraries necessary for running the model.
- `libyelmo/` All compiled files in a standard layout with `lib/`, `include/` and `bin/` folders.
- `output/` Default location for model output.
- `par/` Default parameter files that manage the model configuration.
- `src/` Source code for Yelmo.
- `tests/` Source code and analysis scripts for specific model benchmarks and tests.

### Usage

Follow the steps below to (1) obtain the code, (2) configure the Makefile for your system, (3) compile the Yelmo static library and an executable program and (4) run a test simulation.

#### 1. Get the code

Clone the repository from https://github.com/palma-ice/yelmo :

```bash
# Clone repository
git clone https://github.com/palma-ice/yelmo.git $YELMOROOT
git clone git@github.com:palma-ice/yelmo.git $YELMOROOT    # via ssh
cd $YELMOROOT
```

where `$YELMOROOT` is the installation directory. If you plan to make changes to the code, it is wise to check out a new branch:

```bash
git checkout -b user-dev
```

You should now be working on the branch `user-dev`.

#### 2. Create the system-specific Makefile

To compile Yelmo, you need to generate a Makefile that is appropriate for your system. In the folder `config`, you need to specify a configuration file that defines the compiler and flags, including the paths to the NetCDF and LIS libraries. You can use another file in the `config` folder as a template, e.g.,

```bash
cd config
cp pik_ifort myhost_mycompiler
```

then modify the file `myhost_mycompiler` to match your paths. Back in `$YELMOROOT`, you can then generate your Makefile with the provided Python configuration script:

```bash
cd $YELMOROOT
python3 config.py config/myhost_mycompiler
```

The result should be a Makefile in `$YELMOROOT` that is ready for use.
#### 3. Prepare the system-specific .runme_config file

To use the `runme` script for submitting jobs, first you need to configure a few options to match the system you are using (so that the script knows which queues are available, etc.). To do so, first copy the template config file to your directory:

```bash
cp .runme/runme_config .runme_config
```

Next, edit the file. If you are running on an HPC with job submission via SLURM, specify the right HPC. So far the available HPCs are defined in the file `.runme/queues_info.json`. If you have a new HPC, you should add the information here and inform the `runme` developers so it can be added to the main repository. You should also specify the account associated with your jobs on the HPC (which usually indicates the resources available to you on the system). Finally, if you have not already, make sure to install the Python `runner` module via:

```bash
pip install https://github.com/cxesmc/runner/archive/refs/heads/master.zip
```

See Dependencies for more details if you have trouble.

#### 3. Link to external libraries

The external libraries held in the fesm-utils repository need to be linked here for use with Yelmo:

```bash
ln -s $FESMUSRC ./libs/
```

Note that `$FESMUSRC` should be the root directory where fesm-utils was downloaded, and it should be an absolute path.

#### 4. Compile the code

Now you are ready to compile Yelmo as a static library:

```bash
make clean                  # This step is very important to avoid errors!!
make yelmo-static [debug=1]
```

This will compile all of the Yelmo modules and libraries (as defined in `config/Makefile_yelmo.mk`) and link them into a static library. All compiled files can be found in the folder `libyelmo/`. Once the static library has been compiled, it can be used inside external Fortran programs and modules via the statement `use yelmo`. To include/link yelmo-static during compilation of another program, its location must be defined:

```makefile
INC_YELMO = -I${YELMOROOT}/include
LIB_YELMO = -L${YELMOROOT}/include -lyelmo
```

Alternatively, several test programs exist in the folder `tests/` to run Yelmo as a stand-alone ice sheet. For example, it is possible to compile programs for the EISMINT benchmarks, the MISMIP benchmarks and the ISIMIP6 INITMIP simulation for Greenland, respectively:

```bash
make benchmarks   # compiles the program `libyelmo/bin/yelmo_benchmarks.x`
make mismip       # compiles the program `libyelmo/bin/yelmo_mismip.x`
make initmip      # compiles the program `libyelmo/bin/yelmo_initmip.x`
```

The Makefile additionally allows you to specify debugging compiler flags with the option `debug=1`, in case you need to debug the code (e.g., `make benchmarks debug=1`). With this option the code runs much slower, so it is not recommended unless necessary.

#### 5. Run the model

Once an executable has been created, you can run the model via the included Python job-submission script `runme`. The script carries out the following steps (see the sketch after this list):

1. The output directory is created.
2. The executable is copied to the output directory.
3. The relevant parameter files are copied to the output directory.
4. Links to the input data paths (`input` and `ice_data`) are created in the output directory. Note that many simulations, such as benchmark experiments, do not depend on these external data sources, but the links are made anyway.
5. The executable is run from the output directory, either as a background process or submitted to the queue via `sbatch` (the SLURM workload manager).
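Roughly, this automates something like the following manual sequence (a simplified sketch only; the real `runme` also handles inline parameter changes and queue submission, and how the executable locates its parameter file may differ in practice):

```bash
# Approximate manual equivalent of what runme automates (illustrative only)
mkdir -p output/test
cp libyelmo/bin/yelmo_benchmarks.x output/test/
cp par/yelmo_EISMINT.nml output/test/
ln -sfn "$(pwd)/input" output/test/input      # link input data path
cd output/test && ./yelmo_benchmarks.x &      # run as a background process
```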
To run a benchmark simulation, for example, use the following command:

```bash
./runme -r -e benchmarks -o output/test -n par/yelmo_EISMINT.nml
```

The option `-r` means the model is run as a background process. If it is omitted, the output directory is populated but no executable is run; `-s` instead submits the simulation to the cluster queue system rather than running it in the background. The option `-e` lets you specify the executable. For some standard cases, shortcuts have been created:

- `benchmarks` = `libyelmo/bin/yelmo_benchmarks.x`
- `mismip` = `libyelmo/bin/yelmo_mismip.x`
- `initmip` = `libyelmo/bin/yelmo_initmip.x`

The last two mandatory arguments, `-o OUTDIR` and `-n PAR_PATH`, are the output/run directory and the parameter file to be used for the simulation, respectively. In the above example, the output directory is `output/test`, where all model parameters (loaded from the file `par/yelmo_EISMINT.nml`) and model output can be found. It is also possible to modify parameters inline via the option `-p KEY=VAL [KEY=VAL ...]`. Each parameter should be specified with its namelist group and name. E.g., to change the resolution of the EISMINT benchmark experiment to 10 km, use:

```bash
./runme -r -e benchmarks -o output/test -n par/yelmo_EISMINT.nml -p ctrl.dx=10
```

See `runme -h` for more details on the run script.

### Test cases

The published model description includes several test simulations for validation of the model's performance. The following sections describe how to perform these tests using the same model version documented in the article. From this point, it is assumed that the user has already configured the model for their system (see https://palma-ice.github.io/yelmo-docs ) and is ready to compile the model.

#### 1. EISMINT1 moving margin experiment

To perform the moving-margin experiment, compile the benchmarks executable and call it with the EISMINT parameter file:

```bash
make benchmarks
./runme -r -e benchmarks -o output/eismint-moving -n par-gmd/yelmo_EISMINT_moving.nml
```

#### 2. EISMINT2 EXPA

To perform Experiment A from the EISMINT2 benchmarks, compile the benchmarks executable and call it with the EXPA parameter file:

```bash
make benchmarks
./runme -r -e benchmarks -o output/eismint-expa -n par-gmd/yelmo_EISMINT_expa.nml
```

#### 3. EISMINT2 EXPF

To perform Experiment F from the EISMINT2 benchmarks, compile the benchmarks executable and call it with the EXPF parameter file:

```bash
make benchmarks
./runme -r -e benchmarks -o output/eismint-expf -n par-gmd/yelmo_EISMINT_expf.nml
```

#### 4. MISMIP RF

To perform the MISMIP rate-factor experiment, compile the mismip executable and call it with the MISMIP parameter file for the three parameter permutations of interest (default, subgrid and subgrid+gl-scaling):

```bash
make mismip
./runme -r -e mismip -o output/mismip-rf-0 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=0 ydyn.beta_gl_scale=0
./runme -r -e mismip -o output/mismip-rf-1 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=3 ydyn.beta_gl_scale=0
./runme -r -e mismip -o output/mismip-rf-2 -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=3 ydyn.beta_gl_scale=2
```

To additionally change the resolution of the simulations, change the parameter `mismip.dx`; e.g., for the default simulation at 10 km resolution, call:

```bash
./runme -r -e mismip -o output/mismip-rf-0-10km -n par-gmd/yelmo_MISMIP3D.nml -p ydyn.beta_gl_stag=0 ydyn.beta_gl_scale=0 mismip.dx=10
```
#### 5. Age profile experiments

To perform the age-profile experiments, compile the Fortran program `tests/test_icetemp.f90` and run it:

```bash
make icetemp
./libyelmo/bin/test_icetemp.x
```

To perform the different permutations, it is necessary to recompile for single or double precision after changing the precision parameter `prec` in the file `src/yelmo_defs.f90`. The number of vertical grid points can be specified in the main program file, as well as the output filename.

#### 6. Antarctica present-day and glacial simulations

To perform the Antarctica simulations as presented in the paper, compile the initmip executable and run it with the present-day (pd) and glacial (lgm) parameter values:

```bash
make initmip
./runme -r -e initmip -o output/ant-pd -n par-gmd/yelmo_Antarctica.nml -p ctrl.clim_nm="clim_pd"
./runme -r -e initmip -o output/ant-lgm -n par-gmd/yelmo_Antarctica.nml -p ctrl.clim_nm="clim_lgm"
```
## HPC Notes

### Running at PIK on HPC2024 (foote)

The following modules have to be loaded in order to compile and run the model. For convenience, you can also add these commands to the `.profile` file in your home directory.

```bash
module purge
module use /p/system/modulefiles/compiler \
    /p/system/modulefiles/gpu \
    /p/system/modulefiles/libraries \
    /p/system/modulefiles/parallel \
    /p/system/modulefiles/tools

module load intel/oneAPI/2024.0.0
module load netcdf-c/4.9.2
module load netcdf-fortran-intel/4.6.1
module load udunits/2.2.28
module load ncview/2.1.10
module load cdo/2.4.2
```

When installing fesm-utils (see Dependencies) use the pik script:

```bash
./install_pik.sh ifx
```

To link to data sources, use the following path:

```bash
datapath=/p/projects/megarun
```

### Running at AWI on albedo

Load the following modules in your `.bashrc` or `.bash_profile` file in your home directory.

```bash
module load intel-oneapi-compilers/2024.0.0
module load netcdf-c/4.8.1-openmpi4.1.3-oneapi2022.1.0
module load netcdf-fortran/4.5.4-oneapi2022.1.0
module load udunits/2.2.28
module load ncview/2.1.8
module load cdo/2.2.0
module load python/3.10.4
```

When installing fesm-utils (see Dependencies) use the awi script (which is a link to the dkrz script):

```bash
./install_awi.sh ifx
```

To link to data sources, use the following path:

```bash
datapath=/albedo/work/projects/p_forclima
```
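How the `datapath` is then used depends on your setup; one possible pattern (assuming, hypothetically, that the shared directory contains an `ice_data` folder to be linked into the working copy) is:

```bash
# Illustrative only: link the shared data directory into the working copy
# (the ice_data subdirectory name is an assumption)
datapath=/albedo/work/projects/p_forclima
ln -s ${datapath}/ice_data ./ice_data
```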
### Running at DKRZ on levante

Load the following modules in your `.bashrc` file in your home directory.

```bash
# Tools
module load cdo/2.4.0-gcc-11.2.0
module load esmvaltool/2.5.0
module load ncview/2.1.8-gcc-11.2.0
module load git/2.43.3-gcc-11.2.0
module load python3/2023.01-gcc-11.2.0

# Compilers and libs
module load intel-oneapi-compilers/2023.2.1-gcc-11.2.0
module load netcdf-c/4.8.1-openmpi-4.1.2-intel-2021.5.0
module load netcdf-fortran/4.5.3-openmpi-4.1.2-intel-2021.5.0
```

When installing fesm-utils (see Dependencies) use the dkrz script:

```bash
./install_dkrz.sh ifx
```

To link to data sources, use the following path:

```bash
datapath=/work/ba1442
```

## How to use Jupyter Notebook over ssh

### Step 1

On the remote machine, open a Jupyter Notebook instance by running:

```bash
jupyter notebook --no-browser --port 1235
```

Here port 1235 is chosen, but another port could be used too. In the remote terminal, a message like this should appear:

```
http://localhost:1235/?token=LARGERANDOMNUMBER
```

### Step 2

Open another terminal on the local machine and run:

```bash
ssh -L 1235:localhost:1235 user@snowball.fis.ucm.es
```

IMPORTANT: use the same port as chosen in Step 1.

### Step 3

Go back to the remote terminal and copy the link shown into a browser on the local machine. The Jupyter Notebook running on the remote machine should now be open in the local browser. Enjoy!
## Notes

### timeout module

Example parameters for using the timeout module. The name of the namelist section should be specified when calling `timeout_init`.

```
&tm_1D
    method = "file"    ! "const", "file", "times"
    dt     = 1.0
    file   = "input/timeout_ramp_100kyr.txt"
    times  = -10, -5, 0, 1, 2, 3, 4, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55
/
```

```fortran
! Get output times
call timeout_init(tm_1D,path_par,"tm_1D","small",time_init,time_end)
```

If the desired output times are loaded from a file, the format is one time per line, or a range of times using the format `t0:dt:t1`:

```
0:10:200
200:20:300
300:50:500
500:100:1000
1000:200:5000
5000:500:10000
10e3:1e3:20e3
20e3:2e3:200e3
200e3:5e3:1e6
```

Duplicate times will be removed, as well as times outside the range of `time_init` and `time_end`. In this way, once `timeout_init` is called, we know how many timesteps of output will be generated. This can help confirm that the experiment is well designed and indicates how much data to expect. Then, during the time loop, simply use the function `timeout_check` to determine whether the current time should be written to output:

```fortran
if (timeout_check(tm_1D,time)) then
    !call write_step_1D_combined(yelmo1,hyst1,file1D_hyst,time=time)
end if
```
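A range such as `0:10:200` simply expands to 0, 10, 20, ..., 200. A minimal stand-alone sketch of that expansion (illustrative only, not the actual timeout code):

```fortran
! Expand a t0:dt:t1 range into explicit output times (illustrative sketch)
program expand_range
    implicit none
    real :: t0, dt, t1, t

    t0 = 0.0; dt = 10.0; t1 = 200.0

    t = t0
    do while (t <= t1 + 1e-6)   ! small tolerance for floating-point accumulation
        print *, t
        t = t + dt
    end do

end program expand_range
```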
### timing module

All calls to the intrinsic routine `cpu_time()` have been replaced by timing calculations performed in the new timing module. This ensures that timing works properly for both parallel and serial programs, and allows multiple timing objectives to be tracked with one simple object and subroutine. A timing object is controlled via `timer_step`. First, initialize and reset the timer object:

```fortran
call timer_step(tmrs,comp=-1)
```

Then, e.g., within the time loop, get the timing for the isostasy calls and the Yelmo calls:

```fortran
call timer_step(tmrs,comp=0)

! == ISOSTASY ==========================================================
call isos_update(isos1,yelmo1%tpo%now%H_ice,yelmo1%bnd%z_sl,time,yelmo1%bnd%dzbdt_corr)
yelmo1%bnd%z_bed = isos1%now%z_bed

call timer_step(tmrs,comp=1,time_mod=[time-dtt_now,time]*1e-3,label="isostasy")

! Update ice sheet to current time
call yelmo_update(yelmo1,time)

call timer_step(tmrs,comp=2,time_mod=[time-dtt_now,time]*1e-3,label="yelmo")
```

The option `comp` indicates which component the timing is being calculated for, and a `label` can additionally be associated with the component, which is useful for printing a table later. After all components have been calculated, we can print to a summary file:

```fortran
if (mod(time_elapsed,10.0)==0) then
    ! Print timestep timing info and write log table
    call timer_write_table(tmrs,[time,dtt_now]*1e-3,"m",tmr_file,init=time_elapsed .eq. 0.0)
end if
```

The resulting file will look something like this, here for four components measured during the time loop:

```
time   dt     yelmo  isostasy  climate  io     total  rate
0.000  0.010  0.000  0.000     0.016    0.051  0.067  6.694
0.010  0.010  0.000  0.000     0.025    0.000  0.025  2.533
0.020  0.010  0.000  0.000     0.024    0.000  0.025  2.458
```

Based on the options supplied, the time units are in `[m]` (minutes) and the model time is in `[kyr]`. The rate is then calculated in `[m/kyr]`, which is the inverse of the `[kyr/hr]` measure used previously. The rate as defined now is easier to manage when summing the contributions of different components, and so is preferred moving forward. To recover `[kyr/hr]`, simply take 60/rate (e.g., a rate of 2.5 m/kyr corresponds to 60/2.5 = 24 kyr/hr).

### master to main

Following updated conventions, the default branch is now called `main` and the branch `master` has been deleted. To update a local working copy that still contains a `master` branch and points to it as the default, apply the following steps: get the branch `main`, delete the local branch `master`, and make sure your local repository sees `main` as the default branch.

```bash
# Get all branch information from the origin (github):
git fetch --all

# Get onto the new default branch:
git checkout main

# Delete the branch master:
git branch -d master

# Clean up any branches that no longer exist at origin:
git fetch --prune origin

# Set the local 'head' to whatever is specified at the origin (which will be main):
git remote set-head origin -a
```

Done! Now your local copy should work like normal, with `main` instead of `master`.

### Thermodynamics equations

#### Ice column

Prognostic equation:

$$
\frac{\partial T}{\partial t} = \frac{k}{\rho c} \frac{\partial^2 T}{\partial z^2} - u \frac{\partial T}{\partial x} - v \frac{\partial T}{\partial y} - w \frac{\partial T}{\partial z} + \frac{\Phi}{\rho c}
$$

Ice surface boundary condition:

$$
T(z=z_{\rm srf}) = {\rm min}(T_{\rm 2m},T_0)
$$

Ice base (temperate) boundary condition:

$$
T(z=z_{\rm bed}) = T_{\rm pmp}
$$

Ice base (frozen) boundary condition:

$$
k \frac{\partial T}{\partial z} = k_r \frac{\partial T_r}{\partial z}
$$

Note, the following internal Yelmo variables are defined for convenience:

$$
Q_{\rm ice,b} = -k \frac{\partial T}{\partial z}; \quad Q_{\rm rock} = -k_r \frac{\partial T_r}{\partial z}
$$

#### Bedrock column

Prognostic equation:

$$
\frac{\partial T_r}{\partial t} = \frac{k_r}{\rho_r c_r} \frac{\partial^2 T_r}{\partial z^2}
$$

Bedrock surface boundary condition:

$$
T_r(z=z_{\rm bed}) = T(z=z_{\rm bed})
$$

Bedrock base boundary condition:

$$
\frac{\partial T_r}{\partial z} = -\frac{Q_{\rm geo}}{k_r}
$$

#### Equilibrium bedrock

In this case, the bedrock temperature profile is prescribed to the equilibrium linear temperature profile. The slope follows:

$$
\frac{\partial T_r}{\partial z} = -\frac{Q_{\rm geo}}{k_r}
$$

and the bedrock surface temperature is given by the ice temperature at its base:

$$
T_r(z=z_{\rm bed}) = T(z=z_{\rm bed})
$$
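For reference, integrating the steady-state bedrock equation subject to these two conditions gives the linear profile explicitly (a standard one-line derivation, included here for clarity):

$$
\frac{\partial^2 T_r}{\partial z^2} = 0
\;\Rightarrow\;
T_r(z) = T(z=z_{\rm bed}) - \frac{Q_{\rm geo}}{k_r}\left(z - z_{\rm bed}\right),
\qquad z \le z_{\rm bed}.
$$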
#### Active bedrock

Yelmo can calculate the temperature in the lithosphere along with the ice temperature. Either equilibrium conditions are assumed in the bedrock, i.e., the temperature profile in the bedrock is always linear with `T_lith_s = T_ice_b` and a slope equal to `dT/dz = -Q_geo / k_lith`, or the temperature equation is solved in the lithosphere together with the temperature in the ice column. The parameter block `ytherm_lith` controls how the lithosphere is calculated, with `ytherm_lith.method=['equil','active']` selecting between these two cases.

#### Density of the upper lithosphere

#### Heat capacity of the upper lithosphere

In both SICOPOLIS and GRISLI, a value of `cp = 1000.0 [J kg-1 K-1]` is used (referenced in Rogozhina et al., 2012; Greve, 2005; Greve, 1997). This value is adopted in Yelmo as well.

```fortran
cp = 1000.0    ! [J kg-1 K-1]
```

#### Heat conductivity of the upper lithosphere

Note, Yelmo expects input parameter values in units of `[J a-1 m-1 K-1]`, while much of the literature uses `[W m-1 K-1]`. Given the number of seconds in a year, `sec_year = 31536000.0`, the conversion is `kt [W m-1 K-1] * sec_year = kt [J a-1 m-1 K-1]`. Rogozhina et al. (2012) use `kt = 2 [W m-1 K-1]` for Greenland:

```fortran
kt = 6.3e7     ! [J a-1 m-1 K-1]
```

This value is supported by Lösing et al. (2020), who perform a Bayesian inversion for GHF in Antarctica. Assuming exponentially decreasing heat production with depth, lower values of `kt` are supported (see their Fig. 7b). In a study on the global thermal characteristics of the lithosphere, Cammarano and Guerri (2017) adopt an upper-crust thermal conductivity of `kt = 2.5 [W m-1 K-1]`. To do: this study is also potentially relevant: https://link.springer.com/article/10.1186/s40517-020-0159-y . It shows values on the order of `kt = 2-3 [W m-1 K-1]` for the Canadian shield.

The above value of `kt = 2 [W m-1 K-1] = 6.3e7 [J a-1 m-1 K-1]` is adopted as the default thermal conductivity of the upper crust in Yelmo. For historical context, see other estimates below. From Greve (1997) and Greve (2005):

```fortran
kt = 9.46e7    ! [J a-1 m-1 K-1]
```

which is equivalent to `kt = 3 [W m-1 K-1]`. The source of this value is not known. From GRISLI:

```fortran
kt = 1.04e8    ! [J a-1 m-1 K-1]
```

which is equivalent to `kt = 3.3 [W m-1 K-1]`. The source of this value is not known.

### How to read yelmo_check_kill output

The subroutine `yelmo_check_kill` is used to see whether any instability is arising in the model. If so, a restart file is written at that moment (the earlier in the instability, the better), and the model is stopped with diagnostic output to the log file. Note that `pc_eps` is the parameter that defines the target error tolerance in the time stepping of the ice-thickness evolution. At each time step, the diagnosed model error `pc_eta` is compared with `pc_eps`. If `pc_eta >> pc_eps`, this is interpreted as instability and the model is stopped.

### Margin-front mass balance

Following Pollard and DeConto (2012, 2016), an ice-margin front melting scheme has been implemented that accounts for the melt rate along the vertical face of ice submerged in seawater. The frontal mass balance ($\dot{f}$, m yr$^{-1}$) is calculated as:

$$
\dot{f} = \dot{b}_{\rm eff} \frac{A_f}{A_{\rm tot}} \theta_f
$$

where $\dot{b}_{\rm eff}$ is the effective basal mass balance (the mean of the basal mass balance calculated for the ice-free neighbors), $A_{\rm tot}=\Delta x \Delta x$ is the horizontal grid-cell area and $A_f$ is the area of the submerged faces (i.e., the sum of the depth of submerged ice for each face of the grid cell adjacent to an ice-free cell, potentially four faces in total). $\theta_f=10$ is a scaling coefficient implying that the face mass balance should be roughly 10 times higher than the basal mass balance (Pollard and DeConto, 2016, appendix).
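As an illustration with made-up numbers: for $\Delta x = 10$ km and a single exposed face with 200 m of submerged ice,

$$
A_f = 200\,{\rm m} \times 10^4\,{\rm m} = 2\times10^6\,{\rm m^2}, \quad
A_{\rm tot} = 10^8\,{\rm m^2}, \quad
\dot{f} = \dot{b}_{\rm eff}\,\frac{2\times10^6}{10^8}\times 10 = 0.2\,\dot{b}_{\rm eff}.
$$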
### Calving schemes

Here is a summary of calving schemes.

#### Lipscomb et al. (2019)

$$
c = k_\tau \tau_{\rm ec}
$$

where $k_\tau$ (m yr$^{-1}$ Pa$^{-1}$) is an empirical constant and $\tau_{\rm ec}$ (Pa) is the effective calving stress, defined by:

$$
\tau_{\rm ec}^2 = \max(\tau_1,0)^2 + \omega_2 \max(\tau_2,0)^2
$$

$\tau_1$ and $\tau_2$ are the eigenvalues of the 2D horizontal deviatoric stress tensor and $\omega_2$ is an empirical weighting constant. For partially ice-covered grid cells (with $f_{\rm ice} < 1$), these stresses are taken from the upstream neighbor. The eigenvalues $\tau_1$ and $\tau_2$ are calculated from the depth-averaged (2D) stress tensor $\tau_{\rm ij}$ as follows. Given the stress-tensor components $\tau_{\rm xx}$, $\tau_{\rm yy}$ and $\tau_{\rm xy}$, the eigenvalues are the real roots $\lambda$ of the quadratic equation:

$$
a \lambda^2 + b \lambda + c = 0
$$

where

$$
a = 1.0, \qquad
b = -(\tau_{\rm xx} + \tau_{\rm yy}), \qquad
c = \tau_{\rm xx}\tau_{\rm yy} - \tau_{\rm xy}^2
$$
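Solving this quadratic gives the eigenvalues in closed form (the standard quadratic-formula result, stated here for convenience):

$$
\tau_{1,2} = \frac{(\tau_{\rm xx}+\tau_{\rm yy}) \pm \sqrt{(\tau_{\rm xx}-\tau_{\rm yy})^2 + 4\,\tau_{\rm xy}^2}}{2}.
$$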
Basal friction optimization A simple optimization program has been developed that attempts to optimize the basal friction field applied in Yelmo so that the errors between simulated and observed ice thickness are minimized. Program: tests/yelmo_opt.f90 To compile: make opt To run: ./runme -rs -e opt -o output/test -n par/yelmo_Antarctica_opt.nml The program consists of the following steps: 1. Spin-up a steady-state ice sheet with constant forcing and fixed topography For this step, the restart parameter should be set to yelmo.restart='none' , to ensure that the spin-up is performed with the current parameters. Currently, the program is hard-coded to spin up the ice sheet for 20 kyr using SIA only, followed by another 10 kyr using the solver of choice, as seen in the following lines of code: call yelmo_update_equil_external(yelmo1,hyd1,cf_ref,time_init,time_tot=20e3,topo_fixed=.TRUE.,dt=5.0,ssa_vel_max=0.0) call yelmo_update_equil_external(yelmo1,hyd1,cf_ref,time_init,time_tot=10e3, topo_fixed=.TRUE.,dt=1.0,ssa_vel_max=5000.0) Note that this spin-up is obtained with a fixed topography set to the present-day observed fields ( H_ice , z_bed ). After the spin-up finishes, a restart file is written in the output directory with the name yelmo_restart.nc . The simulation will terminate at this point. 2. Optimization The restart file from Step 1 should be saved somewhere convenient for the model (like in the input folder). Then the restart parameter should be set to that location yelmo.restart='PATH_TO_RESTART.nc' . This will ensure that the spin-up step is skipped, and instead the program will start directly with the optimization iterations. The optimization method follows Pollard and DeConto (2012), in that the basal friction coefficient is scaled as a function of the error in elevation. Here we do not modify beta directly; instead, we assume that beta = cf_ref * lambda_bed * N_eff * f(u) . lambda_bed , N_eff and f(u) are all controlled by parameter choices in the .nml file like normal. Thus we are left with a unitless field cf_ref , which for any given friction law varies within the range of about [0:1]. When cf_ref=1.0 , sliding will diminish to near zero, and cf_ref~0.0 (near, but not zero) will give fast sliding. This gives a convenient range for optimization. Parameters that control the total run time are hard coded: qmax : number of total iterations to run, where qmax-1 is the number of optimization steps, during which cf_ref is updated, and the last step is a steady-state run with cf_ref held constant. time_iter : time to run the model for each iteration before updating cf_ref . time_steady : time to run the model to steady state with cf_ref held constant (last iteration step).
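The function update_cf_ref_thickness_simple itself is not reproduced in these notes. As a rough sketch only, a Pollard and DeConto (2012)-style multiplicative update based on the thickness error might look like the following (all names and the per-iteration exponent cap are illustrative assumptions, using the H_scale scaling described below):

! Sketch of a thickness-error-based update of cf_ref, in the spirit of
! Pollard and DeConto (2012). Illustrative only; not Yelmo's actual code.
subroutine update_cf_ref_sketch(cf_ref, H_ice, H_obs, H_scale, cf_min, cf_max)
    implicit none
    real(8), intent(inout) :: cf_ref           ! unitless friction field [~0:1]
    real(8), intent(in)    :: H_ice, H_obs     ! simulated and observed thickness [m]
    real(8), intent(in)    :: H_scale          ! thickness-error scaling [m]
    real(8), intent(in)    :: cf_min, cf_max   ! allowed range of cf_ref
    real(8) :: dH

    ! Scaled thickness error: positive where the simulated ice is too thick.
    ! A larger H_scale gives smaller updates (changes applied more slowly).
    dH = (H_ice - H_obs) / H_scale
    dH = max(min(dH, 1.5d0), -1.5d0)   ! cap the update per iteration

    ! Too thick => reduce friction to enhance sliding and thin the ice;
    ! too thin => increase friction to slow the ice down.
    cf_ref = cf_ref * 10.0d0**(-dH)
    cf_ref = max(min(cf_ref, cf_max), cf_min)

    return
end subroutine update_cf_ref_sketch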
So, the program runs for, e.g., time_iter=500 years with a given initial field of cf_ref (with C_bed and beta updating every time step to follow changes in u/v and N_eff ). At the end of time_iter , the error in ice thickness is determined and used to update cf_ref via the function update_cf_ref_thickness_simple . The model is again run for time_iter years and the process is repeated. Two important parameters control the optimization process: tau and H_scale . The optimization works best when the ice shelves are relaxed to the reference (observed) ice thickness at the beginning of the simulation, and then gradually allowed to freely evolve. tau is the time scale of relaxation, which is applied in Yelmo as yelmo1%tpo%par%topo_rel_tau . A lower value of tau means that the ice shelves are more tightly held to the observed thickness. Likewise, H_scale controls the scaling of the ice thickness error, which determines how to modify cf_ref at each iteration. A higher value of H_scale means that changes to cf_ref will be applied more slowly. These parameters are designed to change over time with the simulation. tau is set to rel_tau1 from the start of the simulation until rel_time1 . Between rel_time1 and rel_time2 , tau is linearly scaled from the value of rel_tau1 to rel_tau2 . Or, if rel_q > 1 , the scaling is non-linear with an exponent of rel_q (this helps maintain small values of tau longer, which seems to help keep errors low). Once rel_time2 is reached, relaxation in the model is disabled, and the ice shelves are allowed to freely evolve. Analogously, H_scale is modified the same way: it is constant at the value of scale_H1 until scale_time1 , linearly scaled between scale_time1 and scale_time2 , and then constant thereafter at the value of scale_H2 . Increasing the value of H_scale over time helps to avoid oscillations in the optimization procedure as cf_ref approaches the best fit. Finally, after qmax-1 iterations or time=(qmax-1)*time_iter , cf_ref is held constant, and the simulation runs for time_steady years to equilibrate the model with the current conditions. This step minimizes drift in the final result and confirms that the optimized cf_ref field works well.
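As a sketch of this schedule (illustrative names; Yelmo's optimization code may implement it differently), both tau and H_scale can be generated by the same ramp function:

! Sketch: time-dependent parameter ramp used for tau (rel_tau1 -> rel_tau2
! between rel_time1 and rel_time2, exponent rel_q) and analogously for
! H_scale (scale_H1 -> scale_H2 between scale_time1 and scale_time2).
! Illustrative only; not Yelmo's actual code.
function opt_param_ramp(time, time1, time2, p1, p2, q) result(p)
    implicit none
    real(8), intent(in) :: time         ! current model time [yr]
    real(8), intent(in) :: time1, time2 ! start and end of the transition
    real(8), intent(in) :: p1, p2       ! initial and final parameter values
    real(8), intent(in) :: q            ! exponent (q=1: linear; q>1: hold p1 longer)
    real(8) :: p, f

    if (time .le. time1) then
        p = p1
    else if (time .ge. time2) then
        p = p2
    else
        f = (time - time1) / (time2 - time1)
        p = p1 + (p2 - p1)*f**q
    end if

    return
end function opt_param_ramp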
Parameters Here important parameter choices pertinent to running Yelmo will be documented. Each section will outline a specific parameter or set of related parameters. The author of each section and the date last updated will appear in the heading, to maintain traceability in the documentation (since code usually changes over time). This is a work in progress! Basal friction Yelmo includes the representation of several friction laws that all take the form: \\tau_b = -\\beta u_b where $\\beta$ is composed of a coefficient $c_b$ and potentially another contribution that depends on $u_b$ too: \\beta = c_b f(u_b) In Yelmo, the field $c_b$ is defined by the variable c_bed and has units of Pa. The term $f(u_b)$ is not output in the model, but it contributes with units of yr m$^{-1}$, so $\\beta$ finally has units of Pa yr m$^{-1}$. When multiplied with $u_b$, we arrive at $\\tau_b$ with units of Pa. Yelmo calculates $c_b$ ( c_bed ) internally as either: c_b = c_{\\rm b,ref} * N_{\\rm eff} or c_b = {\\rm tan}(c_{\\rm b,ref}) * N_{\\rm eff} This is controlled by the user option ytill.is_angle . If ytill.is_angle=True , then $c_{\\rm b,ref}$ (variable cb_ref in the code) is considered as an angle and the latter formulation above is used, following e.g., Bueler and van Pelt (2015). If ytill.is_angle=False , then cb_ref is used as a scalar field directly. In both cases, this field represents the till or basal properties (roughness, etc.) that are rather independent from how the effective pressure $N_{\\rm eff}$ (variable N_eff ) may be defined. With the variables formulated as above, it is possible to consider cb_ref as a tunable field that can be adjusted to improve model performance on a given domain. This can be achieved, for example, by performing optimization via the ice_optimization module, which adjusts cb_ref as a function of the mismatch of the simulated ice thickness with a target field. Also, cb_ref can either be optimized as a scalar field itself, or as an angle that is input to ${\\rm tan}(c_{\\rm b,ref})$ above. Another possibility is to tune cb_ref as a function of other model or boundary variables. The most common approach is to tune it as a function of the bedrock elevation relative to present-day sea level (e.g., Winkelmann et al., 2011). In Yelmo, this is controlled by the parameter choices in the ytill section, and in particular the parameter ytill.scale=['none','lin','exp'] . When ytill.scale='none' , no scaling function is applied and then cb_ref=ytill.cf_ref everywhere. When ytill.scale='lin' , a linear scaling is applied so that cb_ref goes from ytill.cf_min to ytill.cb_ref for bedrock elevations between ytill.z0 and ytill.z1 (saturating otherwise). Finally, if ytill.scale='exp' , an exponential decay function is applied, such that cb_ref=ytill.cf_ref for z_bed >= ytill.z1 , and decays following a curve that reaches ~30% of its value at z_bed=ytill.z0 . In all cases, values are limited to a minimum value of ytill.cf_min . Effective pressure Effective pressure ( N_eff , $N_{\\rm eff}$) in Yelmo is currently only used in the basal friction formulation as shown above. It provides a mechanism to alter the basal friction as a function of the state of the ice sheet, which is separate from $c_{\\rm b,ref}$ ( cb_ref ), which represents the properties of the bed beneath the ice sheet. The calculation of N_eff can be done with several methods: yneff.method = [-1,0,1,2,3,4] -1: Set N_eff external to Yelmo, do not modify it internally. 0: Impose a constant value, N_eff = yneff.const 1: Impose the overburden pressure, N_eff = rho_ice*g*H_ice 2: Calculate N_eff following the Leguy formulation 3: Calculate N_eff as till pressure following Bueler and van Pelt (2015). 4: Calculate N_eff as a 'two-valued' function scaled by f_pmp using yneff.delta.
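Structurally, this method selection amounts to a simple case distinction. Below is a sketch only; the formulas for methods 2-4 are not reproduced here, and the variable names are illustrative rather than Yelmo's actual code:

! Sketch of the yneff.method selection described above.
! Only the cases whose formulas are given in the text are filled in.
select case(yneff_method)
    case(-1)
        ! N_eff is set external to Yelmo; do not modify it internally
    case(0)
        N_eff = yneff_const          ! constant value (yneff.const)
    case(1)
        N_eff = rho_ice*g*H_ice      ! overburden pressure
    case(2)
        ! N_eff following the Leguy formulation (not shown here)
    case(3)
        ! N_eff as till pressure following Bueler and van Pelt (2015) (not shown)
    case(4)
        ! 'two-valued' function scaled by f_pmp using yneff.delta (not shown)
end select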
Remapping Yelmo runs on a Cartesian (x/y) grid. Often input data comes in many formats: global lat/lon grids, projections, and sets of points. It is important to have robust remapping tools.
Typically for a given domain, we define a Polar Stereographic projection to be able to convert lat/lon data points onto a Cartesian plane. For Antarctica, for example, the standard projection has the following parameters: int polar_stereographic ; polar_stereographic:grid_mapping_name = \"polar_stereographic\" ; polar_stereographic:straight_vertical_longitude_from_pole = 0. ; polar_stereographic:latitude_of_projection_origin = -71. ; polar_stereographic:angle_of_oblique_tangent = 19. ; polar_stereographic:scale_factor_at_projection_origin = 1. ; polar_stereographic:false_easting = 0. ; polar_stereographic:false_northing = 0. ; Naming files For grids used by Yelmo, we generally use an abbreviation for the domain name followed by the resolution. So for Antarctica, we could have the grids ANT-32KM or ANT-16KM for a 32km or 16km grid, respectively. Data that have been projected onto these grids are saved with the grid name as a prefix followed by a general name that specifies the type of data, e.g., CLIM or TOPO , finally followed by more descriptive information about the specific dataset IPSL-14Ma or IPSL-PD-CTRL . For example, the latest topography dataset we use is called the RTopo2.0.1 dataset, so this is processed into a file called ANT-32KM_TOPO-RTOPO-2.0.1.nc . Fields Yelmo needs To drive Yelmo with boundary conditions derived from a climate model, it needs the following fields to be defined on the Polar Stereographic grid: Climatological mean near-surface air temperature [monthly] Climatological mean precipitation [monthly] Surface elevation Sea level Ice thickness Climatological mean 3D ocean temperature [annual] Climatological mean 3D ocean salinity [annual] Oceanic bathymetry Likely these would be processed into two or more separate files, e.g., one for climate CLIM variables and another for ocean OCN variables. Preprocessing data using cdo As a first step, the Climate Data Operators cdo package is great for most preprocessing steps. It can handle averaging data over time and space, merging data files, extracting individual variables etc. See the extensive documentation and examples online. For example, it is possible to use the command cdo selvar to extract specific variables from a file: cdo selvar,t2m,precip diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp1.nc If you have several variables in individual files, you can then conveniently merge them into one file using merge (it's better if they have the same shape): # Extract t2m to a temporary file cdo selvar,t2m diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp1.nc # Extract precip to a temporary file cdo selvar,precip diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ipsl_tmp2.nc # Merge the two individual variable files into one convenient file cdo merge ipsl_tmp1.nc ipsl_tmp2.nc ipsl_tmp3.nc There are many other useful commands, particularly for getting monthly means cdo monmean ... and other statistics. Resources: CDO Documentation page: https://code.mpimet.mpg.de/projects/cdo/wiki/Cdo#Documentation CDO User guide: https://code.mpimet.mpg.de/projects/cdo/embedded/cdo.pdf CDO Reference card: https://code.mpimet.mpg.de/projects/cdo/embedded/cdo_refcard.pdf Using cdo for remapping To remap a data file from lat/lon coordinates to our projection, cdo needs a grid description file that describes the target Polar Stereographic projection grid.
For example, for a 32km resolution domain, we would use the following file named grid_ANT-32KM.txt : gridtype = projection gridsize = 36481 xsize = 191 ysize = 191 xname = xc xunits = km yname = yc yunits = km xfirst = -3040.000000 xinc = 32.000000 yfirst = -3040.000000 yinc = 32.000000 grid_mapping = crs grid_mapping_name = polar_stereographic straight_vertical_longitude_from_pole = 0.000 latitude_of_projection_origin = -90.000 standard_parallel = -71.000 false_easting = 0.000 false_northing = 0.000 semi_major_axis = 6378137.000 inverse_flattening = 298.25722356 (Note that 191 points spanning -3040 to +3040 km in 32 km steps gives xsize = ysize = 191, and gridsize = 191*191 = 36481.) With this file defined, it's easy to perform projections using the cdo remap* commands. To perform a bicubic interpolation, call: cdo remapbic,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-bic.nc Here, remapbic specifies bicubic interpolation and grid_ANT-32KM.txt defines the target grid as above. Then the source dataset is specified and the desired output file ANT-32KM_test-bic.nc . To perform conservative interpolation, replace remapbic with remapcon : cdo remapcon,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-con.nc Conservative interpolation is generally preferred, especially when going from a high resolution to a lower resolution, as it avoids unwanted interpolation artifacts and conserves the quantity being remapped. However, from low resolution to high resolution, conservative interpolation can result in more \"blocky\" fields with abrupt changes in values. Thus, in this case, bicubic interpolation, or conservative interpolation with additional Gaussian smoothing, is better. The latter is not supported by cdo , but can be achieved with other tools. One option for processing may be a conservative remapping, followed by a smoothing step: cdo remapcon,grid_ANT-32KM.txt diane_C14Ma_1_5PAL_SE_4750_4849_1M_histmth.nc ANT-32KM_test-con.nc cdo smooth,radius=128km ANT-32KM_test-con.nc ANT-32KM_test-con-smooth.nc The smoothing radius should be chosen such that it is the smallest value possible that removes blocky artifacts from the field. Summary It can be tedious to process data from a climate model into the right format to drive Yelmo. Tools like cdo help to reduce this burden. Other tools like the NetCDF Operator NCO and today numerous Python-based libraries and tools can also be used. It is best to define a script or program with all the processing steps clearly defined. That way, when new data becomes available from the same model, it is easy to process it systematically (and reproducibly) in the same way without any trouble. Remapping restart file Sometimes we may want to restart a simulation at a new resolution - i.e., perform a spinup simulation at relatively low resolution and then continue the simulation at higher resolution. Use cdo to remap the restart file based on the grid definition files. # Define env variables as shortcuts to locations of grid files grid_src=/Users/robinson/models/EURICE/gridding/maps/grid_GRL-32KM.txt grid_tgt=/Users/robinson/models/EURICE/gridding/maps/grid_GRL-16KM.txt # Call remapping cdo remapcon,${grid_tgt} -setgrid,${grid_src} yelmo_restart.nc yelmo_restart_16km.nc Let's do a test. First, run a short 32km Greenland simulation and generate a restart file: ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim0-32km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-32KM\" That simulation should have produced a nice restart file.
Let's test a normal 32km simulation that continues from this restart file. ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim1-32km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-32KM\" yelmo.restart=\"../sim0-32km/yelmo_restart.nc\" OK, now generate a SCRIP map file to interpolate from 32km down to 16km. domain=Greenland grid_name_src=GRL-32KM grid_name_tgt=GRL-16KM nc_src=../ice_data/${domain}/${grid_name_src}/${grid_name_src}_REGIONS.nc cdo gencon,grid_${grid_name_tgt}.txt -setgrid,grid_${grid_name_src}.txt ${nc_src} scrip-con_${grid_name_src}_${grid_name_tgt}.nc Now let's try to run a simulation at 16km, loading the restart file from 32km: ./runme -r -e initmip -n par/yelmo_initmip.nml -o output/restarts/sim2-16km -p ctrl.time_end=100 ctrl.time_equil=0 ctrl.clim_nm=\"clim_pd_grl\" yelmo.domain=\"Greenland\" yelmo.grid_name=\"GRL-16KM\" yelmo.restart=\"../sim0-32km/yelmo_restart.nc\" The simulation is successful! (as of branch alex-dev-2 , revision 1d9783fb ).
Standard YelmoX simulations To run YelmoX, by default we use the program yelmox.f90 . This program currently makes use of snapclim for the climatic forcing and smbpal for the snowpack and surface mass balance calculations. Spin-up simulation for ISMIP6-based runs (ISMIP6, ABUMIP) First make sure your distribution of yelmox and yelmo are up to date. cd yelmo git pull cd ..
# In the main yelmox directory, change to the branch 'tfm2021' or 'abumip-2021': git pull git checkout tfm2021 # From main directory of yelmox, also reconfigure to adopt all changes: python3 config.py config/snowball_gfortran # Link to ice_data path as needed: ln -s /media/Data/ice_data ice_data Now compile as normal, but with the yelmox_ismip6 program: make clean make yelmox_ismip6 You are now ready to run some ISMIP6 simulations. If you have a spinup simulation available you can skip the rest of this section. The next step is to get a spin-up simulation ready. To do so, we will run a small ensemble of simulations that apply different calving coefficients ( ytopo.kt ) and shear-regime enhancement factors ( ymat.enh_shear ). Each simulation will run with these parameter values set, while optimizing the basal friction coefficient field cb_ref and the temperature anomalies imposed in different basins tf_corr . To run this ensemble, use the following commands: # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 10' # Define the output folder fldr=tmp/ismip6/spinup_32km_68 # Run the Yelmo ensemble jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3,1.5e-3,2.0e-3,2.5e-3 ymat.enh_shear=1,3 An alternative spinup procedure # First run for 30kyr with topo relax on to spinup thermodynamics... runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_69 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 spinup_ismip6.equil_method=\"relax\" spinup_ismip6.time_end=30e3 spinup_ismip6.time_equil=30e3 ytopo.kt=1.0e-3,1.5e-3,2.0e-3,2.5e-3 ymat.enh_shear=1 # Testing opt spinup but with 'robin' initial temp profile runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_70 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 ymat.enh_shear=1 opt_L21.cf_max=1.0 # robin-cold but only by -2deg instead of -10deg runopt='-rs -q priority -w 10' fldr=tmp/ismip6/spinup_32km_71 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 ymat.enh_shear=1 opt_L21.cf_max=0.2 opt_L21.cf_init=0.2 runopt='-rs -q short -w 10' fldr=tmp/ismip6/spinup_32km_72 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 opt_L21.cf_max=10,20,40,45 ydyn.beta_u0=100,300 ISMIP6 simulations Make sure you already have a spinup simulation available, and that the parameters of the spinup will match those supplied here. The next step is to run different experiments of interest that restart from the spinup experiment. 
Some commands for running diagnostic short runs ### Diagnostic short runs ### # Run a 16km spinup run with relaxation runopt='-rs -q priority -w 1' fldr=tmp/ismip6/spinup_16km_72_diag jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ytopo.kt=1.0e-3 spinup_ismip6.equil_method=\"relax\" yelmo.grid_name=\"ANT-16KM\" spinup_ismip6.time_end=20 spinup_ismip6.dt2D_out=1 # Run with 16km restarting from 32km file # (currently crashes probably because tf_corr cannot be interpolated) runopt='-rs -q priority -w 1' fldr=tmp/ismip6/ismip_32km_68_diag file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} yelmo.grid_name=\"ANT-16KM\" transient_proj.time_end=1920 transient_proj.dt2D_out=1 ytill.is_angle=False ### Actual ISMIP6 commands # Define output folder as a bash variable fldr=tmp/ismip6/ismip_32km_71 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 10' # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" ## First run a steady-state simulation to stabilize everything # Define restart file path as a bash variable, for example, on snowball: file_restart=/p/tmp/robinson/ismip6/spinup_32km_71/0/yelmo_restart.nc # ctrl-0 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl-0 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 transient_proj.time_end=11900 transient_proj.dt1D_out=10 transient_proj.dt2D_out=200 ${paropt} ## Next, call the Yelmo commands for the individual cases... 
# Define restart file path as a bash variable, for example, on snowball: #file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc file_restart=/p/tmp/robinson/ismip6/ismip_32km_68/ctrl-0/yelmo_restart.nc # ctrl ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/ctrl -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} # exp05 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp05 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 ${paropt} # exp09 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp09 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_95\" marine_shelf.gamma_quad_nl=21000 ${paropt} # exp10 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp10 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_5\" marine_shelf.gamma_quad_nl=9620 ${paropt} # exp13 ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/exp13 -p ctrl.run_step=\"transient_proj\" yelmo.restart=${file_restart} transient_proj.scenario=\"rcp85\" tf_cor.name=\"dT_nl_pigl\" marine_shelf.gamma_quad_nl=159000 ${paropt} ABUMIP Make sure you already have a spinup simulation available. Use the following commands to run the three main experiments of interest. Note that abuk and abum may run much more slowly than abuc . The parameter values applied in the commands below ensure that the model parameters correspond to those used in the restart simulation, although many of them like ocean temp. anomalies in different basins or calving parameters, are no longer relevant in the ABUMIP context. It is important, however, to specify ydyn.ssa_lat_bc='marine' , as it is relevant for this experiment to apply marine boundary conditions. This is generally not used currently, as it makes the model much less stable. Note that an equilibrium spin-up simulation has already been performed, which gives good agreement with the present-day ice sheet. These results have been saved in a restart file, from which your simulations will begin (see below). # Define restart file path as a bash variable file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc # Define output folder as a bash variable fldr=tmp/ismip6/abumip_32km_68 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 5' debugopt=\"abumip_proj.time_end=20 abumip_proj.dt2D_out=1\" # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" # Call the Yelmo commands... 
# ABUC - control experiment ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuc -p abumip.scenario=\"abuc\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUK - Ocean-kill experiment ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuk -p abumip.scenario=\"abuk\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUM - High shelf melt (400 m/yr) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abum -p abumip.scenario=\"abum\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} # ABUK - Ocean-kill experiment (MARINE BOUNDARY CONDITIONS) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abuk-marine -p abumip.scenario=\"abuk\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} ydyn.ssa_lat_bc=\"marine\" # ABUM - High shelf melt (400 m/yr) (MARINE BOUNDARY CONDITIONS) ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/abum-marine -p abumip.scenario=\"abum\" ctrl.run_step=\"abumip_proj\" yelmo.restart=${file_restart} abumip_proj.scenario=\"ctrl\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 isostasy.method=0 ${paropt} ydyn.ssa_lat_bc=\"marine\" Simulations with hyster # Define restart file path as a bash variable, for example, on snowball: file_restart=/p/tmp/robinson/ismip6/spinup_32km_68/0/yelmo_restart.nc # Define output folder as a bash variable fldr=tmp/ismip6/ramp_32km_68 # Specify run choices, to run locally in the background: runopt='-r' # or, to submit job to a cluster, eg: runopt='-rs -q priority -w 5' # Set parameter choices to match those of spinup simulation paropt=\"ytopo.kt=1.0e-3\" # Now run simulation ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/r1 -p ctrl.run_step=\"hysteresis_proj\" yelmo.restart=${file_restart} tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 hysteresis_proj.time_end=30e3 hyster.method='ramp-time' hyster.df_sign=-1 hyster.dt_init=0 hyster.dt_ramp=10e3 hyster.f_min=-10 hyster.f_max=5 ${paropt} # Try a periodic simulation ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o ${fldr}/r2 -p ctrl.run_step=\"hysteresis_proj\" yelmo.restart=${file_restart} tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 hysteresis_proj.time_end=100e3 hyster.method='sin' hyster.df_sign=1 hyster.dt_init=0 hyster.dt_ramp=20e3 hyster.f_min=-10 hyster.f_max=5 ${paropt} That's it!
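For orientation, the 'ramp-time' forcing used in the first run above can be pictured as a linear ramp of the forcing anomaly. The following Fortran sketch is only a reading of the parameter names ( hyster.f_min , hyster.f_max , hyster.dt_init , hyster.dt_ramp , hyster.df_sign ), not the hyster module's actual code:

! Sketch: linear ramp of a forcing anomaly f between f_min and f_max over
! dt_ramp years, starting after dt_init; df_sign selects the direction.
! Illustrative assumption only; not the hyster module's actual code.
function hyster_ramp_sketch(time, dt_init, dt_ramp, f_min, f_max, df_sign) result(f)
    implicit none
    real(8), intent(in) :: time, dt_init, dt_ramp   ! [yr]
    real(8), intent(in) :: f_min, f_max             ! forcing bounds
    integer, intent(in) :: df_sign                  ! +1: ramp up, -1: ramp down
    real(8) :: f, frac

    frac = max(0.0d0, min(1.0d0, (time - dt_init)/dt_ramp))
    if (df_sign .ge. 0) then
        f = f_min + (f_max - f_min)*frac
    else
        f = f_max - (f_max - f_min)*frac
    end if

    return
end function hyster_ramp_sketch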
Testing without the ice sheet jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" spinup_ismip6.with_ice_sheet=False spinup_ismip6.time_end=10 spinup_ismip6.dt2D_out=10 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"transient_proj\" transient_proj.with_ice_sheet=False Specify run choices, to run locally in the background runopt='-r' or, to submit job to a cluster, eg
runopt='-rs -q priority -w 05:00:00' Define the output folder fldr=tmp/ismip6/spinup-grl_16km_2 Run the Yelmo ensemble fldr=tmp/ismip6/spinup-grl_16km_1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ymat.enh_shear=1,3 fldr=tmp/ismip6/spinup-grl_16km_2 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.opt_tf=True opt_L21.tf_min=-1 opt_L21.tf_max=1 fldr=tmp/ismip6/spinup-grl_16km_3 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\" Spinup is not great, but functions. Try a transient simulation now file_restart=/p/tmp/robinson/ismip6/spinup-grl_16km_3/0/yelmo_restart.nc fldr=tmp/ismip6/ismip-grl_16km_3-1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p yelmo.restart=${file_restart} ctrl.run_step=\"transient_proj\" transient_proj.scenario=\"ctrl\",\"rcp26\",\"rcp85\" ismip6.gcm=\"miroc5\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\" ytill.method=-1","title":"Testing without the ice sheet"},{"location":"running-greenland-ismip6/#testing-without-the-ice-sheet","text":"jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" spinup_ismip6.with_ice_sheet=False spinup_ismip6.time_end=10 spinup_ismip6.dt2D_out=10 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"transient_proj\" transient_proj.with_ice_sheet=False","title":"Testing without the ice sheet"},{"location":"running-greenland-ismip6/#specify-run-choices-to-run-locally-in-the-background","text":"runopt='-r'","title":"Specify run choices, to run locally in the background"},{"location":"running-greenland-ismip6/#or-to-submit-job-to-a-cluster-eg","text":"runopt='-rs -q priority -w 05:00:00'","title":"or, to submit job to a cluster, eg"},{"location":"running-greenland-ismip6/#define-the-output-folder","text":"fldr=tmp/ismip6/spinup-grl_16km_2","title":"Define the output folder"},{"location":"running-greenland-ismip6/#run-the-yelmo-ensemble","text":"fldr=tmp/ismip6/spinup-grl_16km_1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ymat.enh_shear=1,3 fldr=tmp/ismip6/spinup-grl_16km_2 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.opt_tf=True opt_L21.tf_min=-1 opt_L21.tf_max=1 fldr=tmp/ismip6/spinup-grl_16km_3 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\"","title":"Run the Yelmo ensemble"},{"location":"running-greenland-ismip6/#spinup-is-not-great-but-functions-try-a-transient-simulation-now","text":"file_restart=/p/tmp/robinson/ismip6/spinup-grl_16km_3/0/yelmo_restart.nc fldr=tmp/ismip6/ismip-grl_16km_3-1 jobrun ./runme ${runopt} -e ismip6 -n par/yelmo_ismip6_Greenland.nml -- -o ${fldr} -p yelmo.restart=${file_restart} ctrl.run_step=\"transient_proj\" transient_proj.scenario=\"ctrl\",\"rcp26\",\"rcp85\" ismip6.gcm=\"miroc5\" ytopo.calv_flt_method=\"kill\" ytopo.calv_grnd_method=\"zero\" ytill.method=-1","title":"Spinup is not great, but functions. 
Try a transient simulation now"},{"location":"running-greenland-paleo/","text":"Running with YelmoX: Greenland paleo simulations This document describes how to get a transient paleo simulation running. It assumes that you have already cloned yelmo and yelmox and have checked out the right version and/or branch, and that you have access to the boundary data needed (e.g., in an ice_data folder). This setup will use the standard yelmox.f90 program. To run a present-day spinup simulation that will also optimize the basal friction coefficient cb_ref using the default parameter values, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o output/spinup-grl-1 Once a spinup is available, you can run a transient simulation. # Define output folder and restart file path fldr=output/paleo-grl-1 restart=output/spinup-grl-1/yelmo_restart.nc # Call run command ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o ${fldr} -p ctrl.time_init=-158e3 ctrl.time_end=2000 ctrl.transient_clim=True ctrl.equil_method=\"none\" yelmo.restart=${restart} Note that transient simulations have the time defined in [years CE]. In other words, time=2000 corresponds to the time 2000 CE. There is another variable available in the code time_bp which is used to represent time before present, where the present day is year 0, assumed to occur at the year time=1950 (i.e., time_bp = time - 1950 ). To run without a restart file, run the above command but leave out the option yelmo.restart=${restart} . That's it!","title":"Running with YelmoX: Greenland paleo simulations"},{"location":"running-greenland-paleo/#running-with-yelmox-greenland-paleo-simulations","text":"This document describes how to get a transient paleo simulation running. It assumes that you have already cloned yelmo and yelmox and have checked out the right version and/or branch, and that you have access to the boundary data needed (e.g., in an ice_data folder). This setup will use the standard yelmox.f90 program. To run a present-day spinup simulation that will also optimize the basal friction coefficient cb_ref using the default parameter values, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o output/spinup-grl-1 Once a spinup is available, you can run a transient simulation. # Define output folder and restart file path fldr=output/paleo-grl-1 restart=output/spinup-grl-1/yelmo_restart.nc # Call run command ./runme -r -e yelmox -n par/yelmo_Greenland.nml -o ${fldr} -p ctrl.time_init=-158e3 ctrl.time_end=2000 ctrl.transient_clim=True ctrl.equil_method=\"none\" yelmo.restart=${restart} Note that transient simulations have the time defined in [years CE]. In other words, time=2000 corresponds to the time 2000 CE. There is another variable available in the code time_bp which is used to represent time before present, where the present day is year 0, assumed to occur at the year time=1950 (i.e., time_bp = time - 1950 ). To run without a restart file, run the above command but leave out the option yelmo.restart=${restart} . That's it!","title":"Running with YelmoX: Greenland paleo simulations"},{"location":"running-hysteresis/","text":"Running hysteresis experiments for Antarctica For now, we will: Use optimized basal friction. Spinup with a constant present-day climate based on the ISMIP6 protocol.
To run yelmox with this setup, we need the hyst-2021 branches: cd yelmox git checkout hyst-2021 cd yelmo git checkout hyst-2021 # Reconfigure python3 config.py config/snowball_gfortran cd .. python3 config.py config/snowball_gfortran # If not done already, link to ice_data ln -s /media/Data/ice_data ice_data We will run with the ISMIP6 standard YelmoX program, so compile: make clean make yelmox_ismip6 The hysteresis runs can be done in two steps: First, generate a spun-up simulation with optimized basal friction. Restart from the optimized, spun-up state and continue with transient forcing from the hysteresis module. Step 1: spinup The spinup simulation runs for 30,000 years, by default. For the first opt_L21.rel_time1=5e3 years, the shelves and grounding line are relaxed (tightly) to the present-day reference state, while the optimization of the basal friction field cf_ref is active. Next, between opt_L21.rel_time1=5e3 years and opt_L21.rel_time2=10e3 years, the relaxation timescale is slowly increased from opt_L21.rel_tau1=10 yrs to opt_L21.rel_tau2=1000 yrs, to slowly allow the ice sheet more freedom to adjust its state. Optimization of the basal friction field continues during this time period. After opt_L21.rel_time2=10e3 years, the relaxation is disabled and the ice sheet is fully prognostic. The simulation then runs until the end with continual optimization adjustments to cf_ref , although these are usually minor after the initial spinup period. To run a spinup simulation as above, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/spinup01 -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2 tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 Or an ensemble to test different parameters too: fldr=output/hyst/spinup02 jobrun ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2,0.20e-2,0.30e-2,0.40e-2 tf_corr_ant.ronne=0.0,0.25 tf_corr_ant.ross=0.0,0.2 tf_corr_ant.pine=-0.5,0.0 To make the spinup run for a longer time, like 50 kyr, set ctrl.time_end=50e3 . If you already have a spinup simulation available, you can skip that step. Alternatively, you can specify one that is ready on snowball : yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc Step 2: transient simulations To run transient simulations, the run_step should be specified as ctrl.run_step=\"hysteresis_proj\" . Typically, model parameters should be defined to be equivalent to those used by the restart simulation. The time control parameters of the simulation are defined in the parameter section &hysteresis_proj . Parameters associated with the hysteresis module can be changed in the &hyster section.
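For orientation, here is a hypothetical sketch of what such a &hyster block might look like. The parameter names follow the commands used in these notes, but the grouping, values, and comments are illustrative assumptions to be checked against the actual parameter file:

&hyster
    method  = \"ramp\"   ! methods seen in these notes: \"ramp\", \"ramp-time\", \"sin\", \"PI42\", \"const\"
    df_sign = 1         ! sign of the forcing change
    dt_init = 100.0     ! [yr] duration of constant forcing before the ramp
    dt_ramp = 250.0     ! [yr] duration of the ramp itself
    f_min   = 0.0       ! [degC] forcing anomaly at the start of the ramp
    f_max   = 5.0       ! [degC] forcing anomaly at the end of the ramp
    sigma   = 0.0       ! standard deviation of optional white noise (0 = disabled)
/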
To be consistent with the restart file above, the following reference parameter values should be set in the parameter file (or at the command line, if used as part of the ensemble): ctrl.run_step=\"hysteresis_proj\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 # For setting output frequency hysteresis_proj.dt2D_out=5e3 hysteresis.dt2D_small_out=100 Example transient simulation of hysteresis_proj.time_end=500 years, with ramp forcing via the hyster module: 100 years of constant forcing, followed by a ramp over 250 years from an anomaly of 0 degC to 5 degC: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test1 -p ytopo.kt=0.001 \\ hyster.method=\"ramp\" hyster.dt_init=100 hyster.dt_ramp=250 hyster.f_min=0 hyster.f_max=5 Example transient simulation using Adaptive Quasi-Equilibrium Forcing (AQEF) with no lead-in time: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test2 -p ytopo.kt=0.001 \\ hyster.method=\"PI42\" hyster.dt_init=0 hyster.f_min=0 hyster.f_max=5 Or simply apply further constant forcing for a control run by setting hyster.method=\"const\" . You can add white noise (Normal distribution) to your forcing by setting the standard deviation to a value greater than zero: hyster.sigma=1.0 .","title":"Running hysteresis experiments for Antarctica"},{"location":"running-hysteresis/#running-hysteresis-experiments-for-antarctica","text":"For now, we will: Use optimized basal friction. Spinup with a constant present-day climate based on the ISMIP6 protocol. To run yelmox with this setup, we need the hyst-2021 branches: cd yelmox git checkout hyst-2021 cd yelmo git checkout hyst-2021 # Reconfigure python3 config.py config/snowball_gfortran cd .. python3 config.py config/snowball_gfortran # If not done already, link to ice_data ln -s /media/Data/ice_data ice_data We will run with the ISMIP6 standard YelmoX program, so compile: make clean make yelmox_ismip6 The hysteresis runs can be done in two steps: First, generate a spun-up simulation with optimized basal friction. Restart from the optimized, spun-up state and continue with transient forcing from the hysteresis module.","title":"Running hysteresis experiments for Antarctica"},{"location":"running-hysteresis/#step-1-spinup","text":"The spinup simulation runs for 30,000 years, by default. For the first opt_L21.rel_time1=5e3 years, the shelves and grounding line are relaxed (tightly) to the present-day reference state, while the optimization of the basal friction field cf_ref is active. Next, between opt_L21.rel_time1=5e3 years and opt_L21.rel_time2=10e3 years, the relaxation timescale is slowly increased from opt_L21.rel_tau1=10 yrs to opt_L21.rel_tau2=1000 yrs, to slowly allow the ice sheet more freedom to adjust its state. Optimization of the basal friction field continues during this time period. After opt_L21.rel_time2=10e3 years, the relaxation is disabled and the ice sheet is fully prognostic. The simulation then runs until the end with continual optimization adjustments to cf_ref , although these are usually minor after the initial spinup period.
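As a rough illustration of this schedule, the sketch below assumes a simple linear increase of the relaxation timescale between rel_time1 and rel_time2; the actual interpolation used by the optimization code may differ:

! Sketch of the spinup relaxation schedule described above (assumed linear ramp).
function rel_tau_schedule(time) result(tau)
    real(8), intent(in) :: time                ! [yr] time since start of spinup
    real(8) :: tau                             ! [yr] relaxation timescale
    real(8), parameter :: rel_time1 = 5e3      ! end of tight relaxation phase
    real(8), parameter :: rel_time2 = 10e3     ! relaxation disabled after this time
    real(8), parameter :: rel_tau1  = 10.0     ! [yr] tight relaxation timescale
    real(8), parameter :: rel_tau2  = 1000.0   ! [yr] weak relaxation timescale
    if (time .le. rel_time1) then
        tau = rel_tau1
    else if (time .lt. rel_time2) then
        tau = rel_tau1 + (rel_tau2-rel_tau1)*(time-rel_time1)/(rel_time2-rel_time1)
    else
        tau = rel_tau2   ! in practice, relaxation is switched off entirely here
    end if
end function rel_tau_schedule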
To run a spinup simulation as above, use the following command: # First run spinup simulation # (steady-state present day boundary conditions) ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/spinup01 -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2 tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 Or an ensemble to test different parameters too: fldr=output/hyst/spinup02 jobrun ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -- -o ${fldr} -p ctrl.run_step=\"spinup_ismip6\" opt_L21.cf_min=1e-3 ytopo.kt=0.10e-2,0.20e-2,0.30e-2,0.40e-2 tf_corr_ant.ronne=0.0,0.25 tf_corr_ant.ross=0.0,0.2 tf_corr_ant.pine=-0.5,0.0 To make the spinup run for a longer time, like 50 kyr, set ctrl.time_end=50e3 . If you already have a spinup simulation available, you can skip that step. Alternatively, you can specify one that is ready on snowball : yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc","title":"Step 1: spinup"},{"location":"running-hysteresis/#step-2-transient-simulations","text":"To run transient simulations, the run_step should be specified as ctrl.run_step=\"hysteresis_proj\" . Typically, model parameters should be defined to be equivalent to those used by the restart simulation. The time control parameters of the simulation are defined in the parameter section &hysteresis_proj . Parameters associated with the hysteresis module can be changed in the &hyster section. To be consistent with the restart file above, the following reference parameter values should be set in the parameter file (or at the command line, if used as part of the ensemble): ctrl.run_step=\"hysteresis_proj\" tf_cor.name=\"dT_nl\" marine_shelf.gamma_quad_nl=14500 yelmo.restart=/home/robinson/abumip-2021/yelmox/output/ismip6/spinup11/1/yelmo_restart.nc tf_corr_ant.ronne=0.25 tf_corr_ant.ross=0.2 tf_corr_ant.pine=-0.5 # For setting output frequency hysteresis_proj.dt2D_out=5e3 hysteresis.dt2D_small_out=100 Example transient simulation of hysteresis_proj.time_end=500 years, with ramp forcing via the hyster module: 100 years of constant forcing, followed by a ramp over 250 years from an anomaly of 0 degC to 5 degC: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test1 -p ytopo.kt=0.001 \\ hyster.method=\"ramp\" hyster.dt_init=100 hyster.dt_ramp=250 hyster.f_min=0 hyster.f_max=5 Example transient simulation using Adaptive Quasi-Equilibrium Forcing (AQEF) with no lead-in time: ./runme -r -e ismip6 -n par/yelmo_ismip6_Antarctica.nml -o output/hyst/test2 -p ytopo.kt=0.001 \\ hyster.method=\"PI42\" hyster.dt_init=0 hyster.f_min=0 hyster.f_max=5 Or simply apply further constant forcing for a control run by setting hyster.method=\"const\" . You can add white noise (Normal distribution) to your forcing by setting the standard deviation to a value greater than zero: hyster.sigma=1.0 .","title":"Step 2: transient simulations"},{"location":"running-with-yelmox/","text":"Running with YelmoX YelmoX is a separate repository that is designed to provide supplementary libraries and programs that allow running ice-sheet simulations with realistic boundary (e.g., climate and ocean) forcing and interactions (e.g., isostatic rebound). Here you can find the basic information and steps needed to get YelmoX running. Super-quick start A summary of commands to get started is given below. Make sure all Dependencies are installed and that you follow the HPC notes ! Also note: below, it is assumed that you are setting up on the pik_hpc2024 system.
If not, make sure to specify the config file for your own system, as well as the locations of ice_data and isostasy_data (see HPC notes ). # yelmox git clone git@github.com:palma-ice/yelmox.git cd yelmox python3 config.py config/pik_hpc2024_ifx # yelmo git clone git@github.com:palma-ice/yelmo.git cd yelmo python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ cd .. # FastIsostasy git clone git@github.com:palma-ice/FastIsostasy.git cd FastIsostasy python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./ cd .. # coordinates git clone git@github.com:cxesmc/coordinates.git cd coordinates COORDSRC=$PWD python3 config.py config/pik_hpc2024_ifx cd .. # REMBOv1 git clone git@github.com:alex-robinson/rembo1.git cd rembo1 python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ ln -s $COORDSRC ./ cd .. # Now, compile the default program make clean make yelmox # Link to `ice_data` and `isostasy_data` repositories wherever you have them saved on your system datapath=/p/projects/megarun ln -s $datapath/ice_data ln -s $datapath/isostasy_data # Copy the runme config file to the main directory and modify for your system cp .runme/runme_config .runme_config # Run a test simulation of Antarctica for 1000 yrs ./runme -r -e yelmox -n par/yelmo_Antarctica.nml -o output/ant-test -p ctrl.time_end=1e3 That's it!","title":"Running with YelmoX"},{"location":"running-with-yelmox/#running-with-yelmox","text":"YelmoX is a separate repository that is designed to provide supplementary libraries and programs that allow running ice-sheet simulations with realistic boundary (e.g., climate and ocean) forcing and interactions (e.g., isostatic rebound). Here you can find the basic information and steps needed to get YelmoX running.","title":"Running with YelmoX"},{"location":"running-with-yelmox/#super-quick-start","text":"A summary of commands to get started is given below. Make sure all Dependencies are installed and that you follow the HPC notes ! Also note, below it is assumed that you are setting up on the pik_hpc2024 system. If not, make sure to specify the config file for your own system, as well as the locations of ice_data and isostasy_data (see HPC notes ). # yelmox git clone git@github.com:palma-ice/yelmox.git cd yelmox python3 config.py config/pik_hpc2024_ifx # yelmo git clone git@github.com:palma-ice/yelmo.git cd yelmo python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ cd .. # FastIsostasy git clone git@github.com:palma-ice/FastIsostasy.git cd FastIsostasy python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./ cd .. # coordinates git clone git@github.com:cxesmc/coordinates.git cd coordinates COORDSRC=$PWD python3 config.py config/pik_hpc2024_ifx cd .. # REMBOv1 git clone git@github.com:alex-robinson/rembo1.git cd rembo1 python3 config.py config/pik_hpc2024_ifx ln -s $FESMUSRC ./libs/ ln -s $COORDSRC ./ cd .. 
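# Note: the symlink commands above assume that the environment variable
# $FESMUSRC is already defined in your shell and points to the shared
# utility sources, while COORDSRC is captured from the coordinates
# checkout earlier in this sequence.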
# Now, compile the default program make clean make yelmox # Link to `ice_data` and `isostasy_data` repositories wherever you have them saved on your system datapath=/p/projects/megarun ln -s $datapath/ice_data ln -s $datapath/isostasy_data # Copy the runme config file to the main directory and modify for your system cp .runme/runme_config .runme_config # Run a test simulation of Antarctica for 1000 yrs ./runme -r -e yelmox -n par/yelmo_Antarctica.nml -o output/ant-test -p ctrl.time_end=1e3 That's it!","title":"Super-quick start"},{"location":"running-yelmox-rembo/","text":"Running with YelmoX-REMBO Before doing anything, make sure dependencies are installed (Lis, NetCDF, Python:runner) Step 1: Clone the repositories # Clone repository: YelmoX git clone https://github.com/palma-ice/yelmox.git # Clone yelmo into a sub-directory too cd yelmox git clone https://github.com/palma-ice/yelmo.git # Clone isostasy into a sub-directory too git clone https://github.com/palma-ice/isostasy.git # Clone rembo into a sub-directory too git clone https://github.com/alex-robinson/rembo1 At this point all the code is downloaded onto the machine. Now we need to configure it for compiling properly. Step 2: Run configuration scripts for each code base # Enter Yelmo directory and configure it for compiling cd yelmo python config.py config/snowball_gfortran make clean cd .. # Enter isostasy directory and configure it for compiling cd isostasy python config.py config/snowball_gfortran make clean cd .. # Enter rembo1 directory and configure it for compiling cd rembo1 python config.py config/snowball_gfortran make clean cd .. # From YelmoX directory, configure it for compiling too python config.py config/snowball_gfortran make clean Note that the example assumes we are using the machine called snowball and we will use the compiler gfortran . If this is not correct, you will need to use the right configuration file available in the config/ directory, or make your own using the others as a template. Step 3: Compile the default program make clean make yelmox_rembo Step 4: Make a link to ice_data # Link to `ice_data` repository wherever you have it saved on your system ln -s /media/Data/ice_data # Take a look at some data to make sure it worked well ncview ice_data/Greenland/GRL-16KM/GRL-16KM_TOPO-M17-v5.nc Step 5: Run a test simulation # Run a test simulation of Greenland for 100 yrs ./runme -r -e rembo -n par/yelmo_Greenland_rembo.nml -o output/test1 -p ctrl.time_end=1e2 That's it, YelmoX-REMBO is working.","title":"Running with YelmoX-REMBO"},{"location":"running-yelmox-rembo/#running-with-yelmox-rembo","text":"Before doing anything, make sure dependencies are installed (Lis, NetCDF, Python:runner)","title":"Running with YelmoX-REMBO"},{"location":"running-yelmox-rembo/#step-1-clone-the-repositories","text":"# Clone repository: YelmoX git clone https://github.com/palma-ice/yelmox.git # Clone yelmo into a sub-directory too cd yelmox git clone https://github.com/palma-ice/yelmo.git # Clone isostasy into a sub-directory too git clone https://github.com/palma-ice/isostasy.git # Clone rembo into a sub-directory too git clone https://github.com/alex-robinson/rembo1 At this point all the code is downloaded onto the machine. 
Now we need to configure it for compiling properly.","title":"Step 1: Clone the repositories"},{"location":"running-yelmox-rembo/#step-2-run-configuration-scripts-for-each-code-base","text":"# Enter Yelmo directory and configure it for compiling cd yelmo python config.py config/snowball_gfortran make clean cd .. # Enter isostasy directory and configure it for compiling cd isostasy python config.py config/snowball_gfortran make clean cd .. # Enter rembo1 directory and configure it for compiling cd rembo1 python config.py config/snowball_gfortran make clean cd .. # From YelmoX directory, configure it for compiling too python config.py config/snowball_gfortran make clean Note that the example assumes we are using the machine called snowball and we will use the compiler gfortran . If this is not correct, you will need to use the right configuration file available in the config/ directory, or make your own using the others as a template.","title":"Step 2: Run configuration scripts for each code base"},{"location":"running-yelmox-rembo/#step-3-compile-the-default-program","text":"make clean make yelmox_rembo","title":"Step 3: Compile the default program"},{"location":"running-yelmox-rembo/#step-4-make-a-link-to-ice_data","text":"# Link to `ice_data` repository wherever you have it saved on your system ln -s /media/Data/ice_data # Take a look at some data to make sure it worked well ncview ice_data/Greenland/GRL-16KM/GRL-16KM_TOPO-M17-v5.nc","title":"Step 4: Make a link to ice_data"},{"location":"running-yelmox-rembo/#step-5-run-a-test-simulation","text":"# Run a test simulation of Greenland for 100 yrs ./runme -r -e rembo -n par/yelmo_Greenland_rembo.nml -o output/test1 -p ctrl.time_end=1e2 That's it, YelmoX-REMBO is working.","title":"Step 5: Run a test simulation"},{"location":"snapclim/","text":"Snapshot climate (snapclim) The snapclim module is designed to determine climatic forcing, i.e., monthly temperature and precipitation, for a given point in time. This can be achieved by applying a temperature anomaly, or by interpolating snapshots of climate states available for different times. The \"hybrid\" method This is my preferred method and is set up to be rather flexible, and I think it is a good place to start for these simulations. It is composed of an annual mean temperature anomaly time series from 300 kyr ago to today obtained from several spliced paleo reconstructions plus a monthly seasonal cycle over the 300 kyr obtained from a climber2 paleo run. So with the monthly values and the annual mean, you can get monthly temperature anomalies over the 300 kyr. There are more details in the attached manuscript that was never submitted... To activate this method, in the parameter file, set the following parameters in the group \"snapclim\": atm_type = \"hybrid\" ocn_type = \"hybrid\" Then in the group \"snapclim_hybrid\", you can specify: f_eem = 0.4 # Controls the maximum temp anomaly during the Eemian f_glac = 1.0 # Controls the minimum temp anomaly during the glacial period f_hol = 0.5 # Controls the maximum temp anomaly during the Holocene f_seas = 1.0 # Controls the magnitude of the seasonal cycle f_to = 0.2 # Defines the oceanic temperature anomaly relative # to the annual mean atmospheric temp anomaly","title":"Snapshot climate (snapclim)"},{"location":"snapclim/#snapshot-climate-snapclim","text":"The snapclim module is designed to determine climatic forcing, i.e., monthly temperature and precipitation, for a given point in time.
This can be achieved by applying a temperature anomaly, or by interpolating snapshots of climate states available for different times.","title":"Snapshot climate (snapclim)"},{"location":"snapclim/#the-hybrid-method","text":"This is my preferred method and is set up to be rather flexible, and I think it is a good place to start for these simulations. It is composed of an annual mean temperature anomaly time series from 300 kyr ago to today obtained from several spliced paleo reconstructions plus a monthly seasonal cycle over the 300 kyr obtained from a climber2 paleo run. So with the monthly values and the annual mean, you can get monthly temperature anomalies over the 300 kyr. There are more details in the attached manuscript that was never submitted... To activate this method, in the parameter file, set the following parameters in the group \"snapclim\": atm_type = \"hybrid\" ocn_type = \"hybrid\" Then in the group \"snapclim_hybrid\", you can specify: f_eem = 0.4 # Controls the maximum temp anomaly during the Eemian f_glac = 1.0 # Controls the minimum temp anomaly during the glacial period f_hol = 0.5 # Controls the maximum temp anomaly during the Holocene f_seas = 1.0 # Controls the magnitude of the seasonal cycle f_to = 0.2 # Defines the oceanic temperature anomaly relative # to the annual mean atmospheric temp anomaly","title":"The \"hybrid\" method"},{"location":"yelmo-io/","text":"Yelmo IO Writing output Multiple generalized routines are available for writing the variables of a Yelmo instance (yelmo_class) to a NetCDF file. The main public-facing routines are the following: yelmo_write_init yelmo_write_var yelmo_write_step yelmo_restart_write These routines will be described briefly below. yelmo_write_init subroutine yelmo_write_init(ylmo,filename,time_init,units,irange,jrange) This routine can be used to initialize any file that will make use of one or more dimension axes of Yelmo variables. The dimension variables that will be written to the file are the following: xc, yc, month, zeta, zeta_ac, zeta_rock, age_iso, pd_age_iso, pc_steps, time [unlimited] Some of the dimension variables above are typically only needed for restart files ( age_iso, pd_age_iso, pc_steps ), but are written as well to maintain generality. Importantly, yelmo_write_init can be used to initialize a regional output file by specifying the indices of the bounding box for the region of interest via the arguments irange=[i1,i2], jrange=[j1,j2] . yelmo_write_var subroutine yelmo_write_var(filename,varname,ylmo,n,ncid,irange,jrange) This routine will write a variable to an already existing NetCDF file given by filename , most likely but not necessarily initialized using yelmo_write_init . This routine will accept any variable varname that is listed in the Yelmo variable tables , which will be written with the attributes specified in the table. This routine can also be used to write regional output using the arguments irange, jrange . yelmo_write_step subroutine yelmo_write_step(ylmo,filename,time,nms,compare_pd,irange,jrange) This routine will write several variables to a file for a given timestep. The variable names can be provided as a vector of strings via the nms argument (e.g., nms=[\"H_ice\",\"z_srf\"] ). The routine will write relevant model performance information and then individually call yelmo_write_var for each variable listed. Optionally, it is possible to write comparison fields with present-day data ( compare_pd=.TRUE. ), assuming it has been loaded into the ylmo%dta fields.
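Taken together, a typical output workflow might look like the following hedged sketch. The call signatures follow those given in this section, but the file names, the units string, the time loop, and the fixed string length are illustrative assumptions:

! Illustrative sketch of a typical output workflow (assumes yelmo1 is an
! initialized yelmo_class instance and that time/n/nt are managed by the driver).
character(len=16) :: nms(3)
integer  :: n, nt
real(8)  :: time

! The type-spec array constructor pads all names to a common length,
! which sidesteps the equal-length requirement discussed below.
nms = [character(len=16) :: \"H_ice\", \"z_srf\", \"uxy_s\"]

! Initialize the output file with the Yelmo dimension axes
call yelmo_write_init(yelmo1, filename=\"yelmo2D.nc\", time_init=time, units=\"years\")

do n = 1, nt
    ! ... advance the model to the new time here ...
    call yelmo_write_step(yelmo1, \"yelmo2D.nc\", time=time, nms=nms)
end do

! Finally, save a restart snapshot of the full model state
call yelmo_restart_write(yelmo1, \"yelmo_restart.nc\", time=time)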
The yelmo_write_step routine can also be used to write regional output using the arguments irange, jrange . Note that this routine can be challenging to use in Fortran when custom variable names ( nms argument) are used. This is because of the Fortran limitation on defining string arrays as inline arguments - namely, all strings in the array are required to have the same length. Passing this argument would give an error: nms=[\"H_ice\",\"z_srf\",\"mask_bed\"] while this would be ok: nms=[\"H_ice \",\"z_srf \",\"mask_bed\"] For three variables this is not so cumbersome, but it can be when many variables are listed. If no argument is used, then a subset of useful variables is written: names(1) = \"H_ice\" names(2) = \"z_srf\" names(3) = \"z_bed\" names(4) = \"mask_bed\" names(5) = \"uxy_b\" names(6) = \"uxy_s\" names(7) = \"uxy_bar\" names(8) = \"beta\" names(9) = \"visc_bar\" names(10) = \"T_prime_b\" names(11) = \"H_w\" names(12) = \"mb_net\" names(13) = \"smb\" names(14) = \"bmb\" names(15) = \"cmb\" names(16) = \"z_sl\" yelmo_write_restart subroutine yelmo_restart_write(ylmo,filename,time,init,irange,jrange) This routine will save a snapshot of the Yelmo instance. Essentially the routine will loop over every field found in the Yelmo variable tables and write them to a NetCDF file. Optionally, init=.FALSE. will allow writing of multiple timesteps to the same file (largely useful for diagnostic purposes, since the files can get very large). This routine can also be used to write regional output using the arguments irange, jrange . Reading input By setting the parameter yelmo.restart to a restart file path, Yelmo will read the NetCDF file with a saved snapshot. The routines yelmo_restart_read_topo_bnd and yelmo_restart_read are generally used internally during yelmo_init and yelmo_init_state , respectively. So these routines will not typically be needed by a user externally.","title":"Input/output"},{"location":"yelmo-io/#yelmo-io","text":"","title":"Yelmo IO"},{"location":"yelmo-io/#writing-output","text":"Multiple generalized routines are available for writing the variables of a Yelmo instance (yelmo_class) to a NetCDF file. The main public-facing routines are the following: yelmo_write_init yelmo_write_var yelmo_write_step yelmo_restart_write These routines will be described briefly below.","title":"Writing output"},{"location":"yelmo-io/#yelmo_write_init","text":"subroutine yelmo_write_init(ylmo,filename,time_init,units,irange,jrange) This routine can be used to initialize any file that will make use of one or more dimension axes of Yelmo variables. The dimension variables that will be written to the file are the following: xc, yc, month, zeta, zeta_ac, zeta_rock, age_iso, pd_age_iso, pc_steps, time [unlimited] Some of the dimension variables above are typically only needed for restart files ( age_iso, pd_age_iso, pc_steps ), but are written as well to maintain generality. Importantly, yelmo_write_init can be used to initialize a regional output file by specifying the indices of the bounding box for the region of interest via the arguments irange=[i1,i2], jrange=[j1,j2] .","title":"yelmo_write_init"},{"location":"yelmo-io/#yelmo_write_var","text":"subroutine yelmo_write_var(filename,varname,ylmo,n,ncid,irange,jrange) This routine will write a variable to an already existing NetCDF file given by filename , most likely but not necessarily initialized using yelmo_write_init .
This routine will accept any variable varname that is listed in the Yelmo variable tables , which will be written with the attributes specified in the table. This routine can also be used to write regional output using the arguments irange, jrange .","title":"yelmo_write_var"},{"location":"yelmo-io/#yelmo_write_step","text":"subroutine yelmo_write_step(ylmo,filename,time,nms,compare_pd,irange,jrange) This routine will write several variables to a file for a given timestep. The variable names can be provided as a vector of strings via the nms argument (e.g., nms=[\"H_ice\",\"z_srf\"] ). The routine will write relevant model performance information and then individually call yelmo_write_var for each variable listed. Optionally, it is possible to write comparison fields with present-day data ( compare_pd=.TRUE. ), assuming it has been loaded into the ylmo%dta fields. This routine can also be used to write regional output using the arguments irange, jrange . Note that this routine can be challenging to use in Fortran when custom variable names ( nms argument) are used. This is because of the Fortran limitation on defining string arrays as inline arguments - namely, all strings in the array are required to have the same length. Passing this argument would give an error: nms=[\"H_ice\",\"z_srf\",\"mask_bed\"] while this would be ok: nms=[\"H_ice \",\"z_srf \",\"mask_bed\"] For three variables this is not so cumbersome, but it can be when many variables are listed. If no argument is used, then a subset of useful variables is written: names(1) = \"H_ice\" names(2) = \"z_srf\" names(3) = \"z_bed\" names(4) = \"mask_bed\" names(5) = \"uxy_b\" names(6) = \"uxy_s\" names(7) = \"uxy_bar\" names(8) = \"beta\" names(9) = \"visc_bar\" names(10) = \"T_prime_b\" names(11) = \"H_w\" names(12) = \"mb_net\" names(13) = \"smb\" names(14) = \"bmb\" names(15) = \"cmb\" names(16) = \"z_sl\"","title":"yelmo_write_step"},{"location":"yelmo-io/#yelmo_write_restart","text":"subroutine yelmo_restart_write(ylmo,filename,time,init,irange,jrange) This routine will save a snapshot of the Yelmo instance. Essentially the routine will loop over every field found in the Yelmo variable tables and write them to a NetCDF file. Optionally, init=.FALSE. will allow writing of multiple timesteps to the same file (largely useful for diagnostic purposes, since the files can get very large). This routine can also be used to write regional output using the arguments irange, jrange .","title":"yelmo_write_restart"},{"location":"yelmo-io/#reading-input","text":"By setting the parameter yelmo.restart to a restart file path, Yelmo will read the NetCDF file with a saved snapshot. The routines yelmo_restart_read_topo_bnd and yelmo_restart_read are generally used internally during yelmo_init and yelmo_init_state , respectively.
So these routines will not typically be needed by a user externally.","title":"Reading input"},{"location":"yelmo-variables-ybound/","text":"ybound id variable dimensions units long_name 1 z_bed xc, yc m Bedrock elevation 2 z_bed_sd xc, yc m Standard deviation of bedrock elevation 3 z_sl xc, yc m Sea level elevation 4 H_sed xc, yc m Sediment thickness 5 smb_ref xc, yc m/yr Surface mass balance 6 T_srf xc, yc K Surface temperature 7 bmb_shlf xc, yc m/yr Basal mass balance for ice shelf 8 fmb_shlf xc, yc m/yr Frontal mass balance for ice shelf 9 T_shlf xc, yc K Ice shelf temperature 10 Q_geo xc, yc mW m^-2 Geothermal heat flow at depth 11 enh_srf xc, yc - Enhancement factor at the surface 12 basins xc, yc - Basin identification numbers 13 basin_mask xc, yc - Mask for basins 14 regions xc, yc - Region identification numbers 15 region_mask xc, yc - Mask for regions 16 ice_allowed xc, yc - Locations where ice thickness can be greater than zero 17 calv_mask xc, yc - Locations where calving is not allowed 18 H_ice_ref xc, yc m Reference ice thickness for relaxation routines 19 z_bed_ref xc, yc m Reference bedrock elevation for relaxation routines","title":"ybound"},{"location":"yelmo-variables-ybound/#ybound","text":"id variable dimensions units long_name 1 z_bed xc, yc m Bedrock elevation 2 z_bed_sd xc, yc m Standard deviation of bedrock elevation 3 z_sl xc, yc m Sea level elevation 4 H_sed xc, yc m Sediment thickness 5 smb_ref xc, yc m/yr Surface mass balance 6 T_srf xc, yc K Surface temperature 7 bmb_shlf xc, yc m/yr Basal mass balance for ice shelf 8 fmb_shlf xc, yc m/yr Frontal mass balance for ice shelf 9 T_shlf xc, yc K Ice shelf temperature 10 Q_geo xc, yc mW m^-2 Geothermal heat flow at depth 11 enh_srf xc, yc - Enhancement factor at the surface 12 basins xc, yc - Basin identification numbers 13 basin_mask xc, yc - Mask for basins 14 regions xc, yc - Region identification numbers 15 region_mask xc, yc - Mask for regions 16 ice_allowed xc, yc - Locations where ice thickness can be greater than zero 17 calv_mask xc, yc - Locations where calving is not allowed 18 H_ice_ref xc, yc m Reference ice thickness for relaxation routines 19 z_bed_ref xc, yc m Reference bedrock elevation for relaxation routines","title":"ybound"},{"location":"yelmo-variables-ydata/","text":"ydata id variable dimensions units long_name 1 pd_H_ice xc, yc m PD ice thickness 2 pd_z_srf xc, yc m PD surface elevation 3 pd_z_bed xc, yc m PD bedrock elevation 4 pd_H_grnd xc, yc m PD overburden ice thickness 5 pd_mask_bed xc, yc - PD mask 6 pd_ux_s xc, yc m/yr PD surface velocity in the x-direction 7 pd_uy_s xc, yc m/yr PD surface velocity in the y-direction 8 pd_uxy_s xc, yc m/yr PD surface velocity magnitude 9 pd_T_srf xc, yc K PD surface temperature 10 pd_smb_ref xc, yc m/yr PD surface mass balance 11 pd_depth_iso xc, yc, pd_age_iso m PD depth of specific isochrones 12 pd_err_H_ice xc, yc m PD error in ice thickness 13 pd_err_z_srf xc, yc m PD error in surface elevation 14 pd_err_z_bed xc, yc m PD error in bedrock elevation 15 pd_err_smb_ref xc, yc m/yr PD error in surface mass balance 16 pd_err_uxy_s xc, yc m/yr PD error in surface velocity magnitude 17 pd_err_depth_iso xc, yc, pd_age_iso m PD error in isochrone depth","title":"ydata"},{"location":"yelmo-variables-ydata/#ydata","text":"id variable dimensions units long_name 1 pd_H_ice xc, yc m PD ice thickness 2 pd_z_srf xc, yc m PD surface elevation 3 pd_z_bed xc, yc m PD bedrock elevation 4 pd_H_grnd xc, yc m PD overburden ice thickness 5 pd_mask_bed xc, yc - PD mask 6 
pd_ux_s xc, yc m/yr PD surface velocity in the x-direction 7 pd_uy_s xc, yc m/yr PD surface velocity in the y-direction 8 pd_uxy_s xc, yc m/yr PD surface velocity magnitude 9 pd_T_srf xc, yc K PD surface temperature 10 pd_smb_ref xc, yc m/yr PD surface mass balance 11 pd_depth_iso xc, yc, pd_age_iso m PD depth of specific isochrones 12 pd_err_H_ice xc, yc m PD error in ice thickness 13 pd_err_z_srf xc, yc m PD error in surface elevation 14 pd_err_z_bed xc, yc m PD error in bedrock elevation 15 pd_err_smb_ref xc, yc m/yr PD error in surface mass balance 16 pd_err_uxy_s xc, yc m/yr PD error in surface velocity magnitude 17 pd_err_depth_iso xc, yc, pd_age_iso m PD error in isochrone depth","title":"ydata"},{"location":"yelmo-variables-ydyn/","text":"ydyn id variable dimensions units long_name 1 ux xc, yc, zeta m/yr x-velocity 2 uy xc, yc, zeta m/yr y-velocity 3 uxy xc, yc, zeta m/yr Horizontal velocity magnitude 4 uz xc, yc, zeta_ac m/yr z-component velocity 5 uz_star xc, yc, zeta_ac m/yr z-velocity with corr. for thermal advection 6 ux_bar xc, yc m/yr Depth-averaged x-velocity 7 uy_bar xc, yc m/yr Depth-averaged y-velocity 8 uxy_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 9 ux_bar_prev xc, yc m/yr Previous depth-averaged x-velocity 10 uy_bar_prev xc, yc m/yr Previous depth-averaged y-velocity 11 ux_b xc, yc m/yr Basal x-velocity 12 uy_b xc, yc m/yr Basal y-velocity 13 uz_b xc, yc m/yr Basal z-velocity 14 uxy_b xc, yc m/yr Basal horizontal velocity magnitude 15 ux_s xc, yc m/yr Surface x-velocity 16 uy_s xc, yc m/yr Surface y-velocity 17 uz_s xc, yc m/yr Surface z-velocity 18 uxy_s xc, yc m/yr Surface horizontal velocity magnitude 19 ux_i xc, yc, zeta m/yr Shearing x-velocity 20 uy_i xc, yc, zeta m/yr Shearing y-velocity 21 ux_i_bar xc, yc m/yr Depth-averaged shearing x-velocity 22 uy_i_bar xc, yc m/yr Depth-averaged shearing y-velocity 23 uxy_i_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 24 duxydt xc, yc m/yr^2 Time derivative of uxy 25 duxdz xc, yc, zeta 1/yr x-velocity vertical gradient 26 duydz xc, yc, zeta 1/yr y-velocity vertical gradient 27 duxdz_bar xc, yc 1/yr Depth-averaged x-velocity vertical gradient 28 duydz_bar xc, yc 1/yr Depth-averaged y-velocity vertical gradient 29 taud_acx xc, yc Pa Driving stress (x-dir) 30 taud_acy xc, yc Pa Driving stress (y-dir) 31 taud xc, yc Pa Driving stress magnitude 32 taub_acx xc, yc Pa Basal stress (x-dir) 33 taub_acy xc, yc Pa Basal stress (y-dir) 34 taub xc, yc Pa Basal stress magnitude 35 taul_int_acx xc, yc Pa Depth-integrated lateral stress (x-dir) 36 taul_int_acy xc, yc Pa Depth-integrated lateral stress (y-dir) 37 qq_gl_acx xc, yc m^3/yr Flux across grounding line 38 qq_gl_acy xc, yc m^3/yr Flux across grounding line 39 qq_acx xc, yc m^3/yr Flux (x-dir) 40 qq_acy xc, yc m^3/yr Flux (y-dir) 41 qq xc, yc m^3/yr Flux magnitude 42 de_eff xc, yc, zeta 1/yr Effective strain rate 43 visc_eff xc, yc, zeta Pa yr Effective viscosity 44 visc_eff_int xc, yc Pa yr m Depth-integrated viscosity 45 N_eff xc, yc Pa Effective pressure 46 cb_tgt xc, yc Pa Target basal parameter 47 cb_ref xc, yc -- Reference basal parameter 48 c_bed xc, yc Pa Basal drag coefficient 49 beta_acx xc, yc Pa yr m^-1 Basal stress factor (x) 50 beta_acy xc, yc Pa yr m^-1 Basal stress factor (y) 51 beta xc, yc Pa yr m^-1 Basal stress factor mag.
52 beta_eff xc, yc Pa yr m^-1 Effective basal factor 53 f_vbvs xc, yc - Vertical basal stress 54 ssa_mask_acx xc, yc - SSA mask (x-dir) 55 ssa_mask_acy xc, yc - SSA mask (y-dir) 56 ssa_err_acx xc, yc m/yr SSA error (x-dir) 57 ssa_err_acy xc, yc m/yr SSA error (y-dir) 58 jvel_dxx xc, yc, zeta 1/yr Velocity Jacobian component duxdx 59 jvel_dxy xc, yc, zeta 1/yr Velocity Jacobian component duxdy 60 jvel_dxz xc, yc, zeta 1/yr Velocity Jacobian component duxdz 61 jvel_dyx xc, yc, zeta 1/yr Velocity Jacobian component duydx 62 jvel_dyy xc, yc, zeta 1/yr Velocity Jacobian component duydy 63 jvel_dyz xc, yc, zeta 1/yr Velocity Jacobian component duydz 64 jvel_dzx xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdx 65 jvel_dzy xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdy 66 jvel_dzz xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdz","title":"ydyn"},{"location":"yelmo-variables-ydyn/#ydyn","text":"id variable dimensions units long_name 1 ux xc, yc, zeta m/yr x-velocity 2 uy xc, yc, zeta m/yr y-velocity 3 uxy xc, yc, zeta m/yr Horizontal velocity magnitude 4 uz xc, yc, zeta_ac m/yr z-component velocity 5 uz_star xc, yc, zeta_ac m/yr z-velocity with corr. for thermal advection 6 ux_bar xc, yc m/yr Depth-averaged x-velocity 7 uy_bar xc, yc m/yr Depth-averaged y-velocity 8 uxy_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 9 ux_bar_prev xc, yc m/yr Previous depth-averaged x-velocity 10 uy_bar_prev xc, yc m/yr Previous depth-averaged y-velocity 11 ux_b xc, yc m/yr Basal x-velocity 12 uy_b xc, yc m/yr Basal y-velocity 13 uz_b xc, yc m/yr Basal z-velocity 14 uxy_b xc, yc m/yr Basal horizontal velocity magnitude 15 ux_s xc, yc m/yr Surface x-velocity 16 uy_s xc, yc m/yr Surface y-velocity 17 uz_s xc, yc m/yr Surface z-velocity 18 uxy_s xc, yc m/yr Surface horizontal velocity magnitude 19 ux_i xc, yc, zeta m/yr Shearing x-velocity 20 uy_i xc, yc, zeta m/yr Shearing y-velocity 21 ux_i_bar xc, yc m/yr Depth-averaged shearing x-velocity 22 uy_i_bar xc, yc m/yr Depth-averaged shearing y-velocity 23 uxy_i_bar xc, yc m/yr Depth-averaged horizontal velocity magnitude 24 duxydt xc, yc m/yr^2 Time derivative of uxy 25 duxdz xc, yc, zeta 1/yr x-velocity vertical gradient 26 duydz xc, yc, zeta 1/yr y-velocity vertical gradient 27 duxdz_bar xc, yc 1/yr Depth-averaged x-velocity vertical gradient 28 duydz_bar xc, yc 1/yr Depth-averaged y-velocity vertical gradient 29 taud_acx xc, yc Pa Driving stress (x-dir) 30 taud_acy xc, yc Pa Driving stress (y-dir) 31 taud xc, yc Pa Driving stress magnitude 32 taub_acx xc, yc Pa Basal stress (x-dir) 33 taub_acy xc, yc Pa Basal stress (y-dir) 34 taub xc, yc Pa Basal stress magnitude 35 taul_int_acx xc, yc Pa Depth-integrated lateral stress (x-dir) 36 taul_int_acy xc, yc Pa Depth-integrated lateral stress (y-dir) 37 qq_gl_acx xc, yc m^3/yr Flux across grounding line 38 qq_gl_acy xc, yc m^3/yr Flux across grounding line 39 qq_acx xc, yc m^3/yr Flux (x-dir) 40 qq_acy xc, yc m^3/yr Flux (y-dir) 41 qq xc, yc m^3/yr Flux magnitude 42 de_eff xc, yc, zeta 1/yr Effective strain rate 43 visc_eff xc, yc, zeta Pa yr Effective viscosity 44 visc_eff_int xc, yc Pa yr m Depth-integrated viscosity 45 N_eff xc, yc Pa Effective pressure 46 cb_tgt xc, yc Pa Target basal parameter 47 cb_ref xc, yc -- Reference basal parameter 48 c_bed xc, yc Pa Basal drag coefficient 49 beta_acx xc, yc Pa yr m^-1 Basal stress factor (x) 50 beta_acy xc, yc Pa yr m^-1 Basal stress factor (y) 51 beta xc, yc Pa yr m^-1 Basal stress factor mag.
52 beta_eff xc, yc Pa yr m^-1 Effective basal factor 53 f_vbvs xc, yc - Vertical basal stress 54 ssa_mask_acx xc, yc - SSA mask (x-dir) 55 ssa_mask_acy xc, yc - SSA mask (y-dir) 56 ssa_err_acx xc, yc m/yr SSA error (x-dir) 57 ssa_err_acy xc, yc m/yr SSA error (y-dir) 58 jvel_dxx xc, yc, zeta 1/yr Velocity Jacobian component duxdx 59 jvel_dxy xc, yc, zeta 1/yr Velocity Jacobian component duxdy 60 jvel_dxz xc, yc, zeta 1/yr Velocity Jacobian component duxdz 61 jvel_dyx xc, yc, zeta 1/yr Velocity Jacobian component duydx 62 jvel_dyy xc, yc, zeta 1/yr Velocity Jacobian component duydy 63 jvel_dyz xc, yc, zeta 1/yr Velocity Jacobian component duydz 64 jvel_dzx xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdx 65 jvel_dzy xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdy 66 jvel_dzz xc, yc, zeta_ac 1/yr Velocity Jacobian component duzdz","title":"ydyn"},{"location":"yelmo-variables-ymat/","text":"ymat id variable dimensions units long_name 1 enh xc, yc, zeta - Enhancement factor 2 enh_bnd xc, yc, zeta - Imposed enhancement factor 3 enh_bar xc, yc - Depth-averaged enhancement 4 ATT xc, yc, zeta - Rate factor 5 ATT_bar xc, yc - Depth-averaged rate factor 6 visc xc, yc, zeta Pa yr Ice viscosity 7 visc_bar xc, yc Pa yr Depth-averaged ice viscosity 8 visc_int xc, yc Pa yr m Ice viscosity interpolated at interfaces 9 f_shear_bar xc, yc - Depth-averaged shear fraction 10 dep_time xc, yc, zeta yr Ice deposition time (for online age tracing) 11 depth_iso xc, yc, age_iso m Depth of specific isochronal layers 12 strn2D_dxx xc, yc 1/yr 2D strain rate tensor component dxx 13 strn2D_dyy xc, yc 1/yr 2D strain rate tensor component dyy 14 strn2D_dxy xc, yc 1/yr 2D strain rate tensor component dxy 15 strn2D_dxz xc, yc 1/yr 2D strain rate tensor component dxz 16 strn2D_dyz xc, yc 1/yr 2D strain rate tensor component dyz 17 strn2D_de xc, yc 1/yr 2D effective strain rate 18 strn2D_div xc, yc 1/yr 2D horizontal divergence 19 strn2D_f_shear xc, yc 2D strain rate shear fraction 20 strn_dxx xc, yc, zeta 1/yr Strain rate tensor component dxx 21 strn_dyy xc, yc, zeta 1/yr Strain rate tensor component dyy 22 strn_dxy xc, yc, zeta 1/yr Strain rate tensor component dxy 23 strn_dxz xc, yc, zeta 1/yr Strain rate tensor component dxz 24 strn_dyz xc, yc, zeta 1/yr Strain rate tensor component dyz 25 strn_de xc, yc, zeta 1/yr Effective strain rate 26 strn_div xc, yc, zeta 1/yr Horizontal divergence 27 strn_f_shear xc, yc, zeta Strain rate shear fraction 28 strs2D_txx xc, yc Pa 2D stress tensor component txx 29 strs2D_tyy xc, yc Pa 2D stress tensor component tyy 30 strs2D_txy xc, yc Pa 2D stress tensor component txy 31 strs2D_txz xc, yc Pa 2D stress tensor component txz 32 strs2D_tyz xc, yc Pa 2D stress tensor component tyz 33 strs2D_te xc, yc Pa 2D effective stress 34 strs2D_tau_eig_1 xc, yc Pa 2D stress first principal eigenvalue 35 strs2D_tau_eig_2 xc, yc Pa 2D stress second principal eigenvalue 36 strs_txx xc, yc, zeta Pa Stress tensor component txx 37 strs_tyy xc, yc, zeta Pa Stress tensor component tyy 38 strs_txy xc, yc, zeta Pa Stress tensor component txy 39 strs_txz xc, yc, zeta Pa Stress tensor component txz 40 strs_tyz xc, yc, zeta Pa Stress tensor component tyz 41 strs_te xc, yc, zeta Pa Effective stress","title":"ymat"},{"location":"yelmo-variables-ymat/#ymat","text":"id variable dimensions units long_name 1 enh xc, yc, zeta - Enhancement factor 2 enh_bnd xc, yc, zeta - Imposed enhancement factor 3 enh_bar xc, yc - Depth-averaged enhancement 4 ATT xc, yc, zeta - Rate factor 5 ATT_bar xc, yc - Depth-averaged 
rate factor 6 visc xc, yc, zeta Pa yr Ice viscosity 7 visc_bar xc, yc Pa yr Depth-averaged ice viscosity 8 visc_int xc, yc Pa yr m Ice viscosity interpolated at interfaces 9 f_shear_bar xc, yc - Depth-averaged shear fraction 10 dep_time xc, yc, zeta yr Ice deposition time (for online age tracing) 11 depth_iso xc, yc, age_iso m Depth of specific isochronal layers 12 strn2D_dxx xc, yc 1/yr 2D strain rate tensor component dxx 13 strn2D_dyy xc, yc 1/yr 2D strain rate tensor component dyy 14 strn2D_dxy xc, yc 1/yr 2D strain rate tensor component dxy 15 strn2D_dxz xc, yc 1/yr 2D strain rate tensor component dxz 16 strn2D_dyz xc, yc 1/yr 2D strain rate tensor component dyz 17 strn2D_de xc, yc 1/yr 2D effective strain rate 18 strn2D_div xc, yc 1/yr 2D horizontal divergence 19 strn2D_f_shear xc, yc 2D strain rate shear fraction 20 strn_dxx xc, yc, zeta 1/yr Strain rate tensor component dxx 21 strn_dyy xc, yc, zeta 1/yr Strain rate tensor component dyy 22 strn_dxy xc, yc, zeta 1/yr Strain rate tensor component dxy 23 strn_dxz xc, yc, zeta 1/yr Strain rate tensor component dxz 24 strn_dyz xc, yc, zeta 1/yr Strain rate tensor component dyz 25 strn_de xc, yc, zeta 1/yr Effective strain rate 26 strn_div xc, yc, zeta 1/yr Horizontal divergence 27 strn_f_shear xc, yc, zeta Strain rate shear fraction 28 strs2D_txx xc, yc Pa 2D stress tensor component txx 29 strs2D_tyy xc, yc Pa 2D stress tensor component tyy 30 strs2D_txy xc, yc Pa 2D stress tensor component txy 31 strs2D_txz xc, yc Pa 2D stress tensor component txz 32 strs2D_tyz xc, yc Pa 2D stress tensor component tyz 33 strs2D_te xc, yc Pa 2D effective stress 34 strs2D_tau_eig_1 xc, yc Pa 2D stress first principal eigenvalue 35 strs2D_tau_eig_2 xc, yc Pa 2D stress second principal eigenvalue 36 strs_txx xc, yc, zeta Pa Stress tensor component txx 37 strs_tyy xc, yc, zeta Pa Stress tensor component tyy 38 strs_txy xc, yc, zeta Pa Stress tensor component txy 39 strs_txz xc, yc, zeta Pa Stress tensor component txz 40 strs_tyz xc, yc, zeta Pa Stress tensor component tyz 41 strs_te xc, yc, zeta Pa Effective stress","title":"ymat"},{"location":"yelmo-variables-ytherm/","text":"ytherm id variable dimensions units long_name 1 enth xc, yc, zeta J m^-3 Ice enthalpy 2 T_ice xc, yc, zeta K Ice temperature 3 omega xc, yc, zeta - Ice water content 4 T_pmp xc, yc, zeta K Pressure-corrected melting point 5 T_prime xc, yc, zeta deg C Homologous ice temperature 6 f_pmp xc, yc - Fraction of cell at pressure melting point 7 bmb_grnd xc, yc m/yr Grounded basal mass balance 8 Q_strn xc, yc, zeta W m^-3 Internal strain heat production 9 dQsdt xc, yc, zeta W m^-3 yr^-1 Rate of change of internal heat production 10 Q_b xc, yc mW m^-2 Basal friction heat production 11 Q_ice_b xc, yc mW m^-2 Basal ice heat flux 12 T_prime_b xc, yc K Homologous temperature at the base 13 H_w xc, yc m Basal water layer thickness 14 dHwdt xc, yc m/yr Rate of change of basal water layer thickness 15 cp xc, yc, zeta J kg^-1 K^-1 Specific heat capacity 16 kt xc, yc, zeta W m^-1 K^-1 Heat conductivity 17 H_cts xc, yc m Height of the CTS (cold-temperate surface) 18 advecxy xc, yc, zeta - Horizontal advection 19 Q_rock xc, yc W m^-2 Heat flux from bedrock 20 enth_rock xc, yc, zeta_rock J m^-3 Bedrock enthalpy 21 T_rock xc, yc, zeta_rock K Bedrock temperature","title":"ytherm"},{"location":"yelmo-variables-ytherm/#ytherm","text":"id variable dimensions units long_name 1 enth xc, yc, zeta J m^-3 Ice enthalpy 2 T_ice xc, yc, zeta K Ice temperature 3 omega xc, yc, zeta - Ice water content 4 T_pmp xc, yc, zeta 
K Pressure-corrected melting point 5 T_prime xc, yc, zeta deg C Homologous ice temperature 6 f_pmp xc, yc - Fraction of cell at pressure melting point 7 bmb_grnd xc, yc m/yr Grounded basal mass balance 8 Q_strn xc, yc, zeta W m^-3 Internal strain heat production 9 dQsdt xc, yc, zeta W m^-3 yr^-1 Rate of change of internal heat production 10 Q_b xc, yc mW m^-2 Basal friction heat production 11 Q_ice_b xc, yc mW m^-2 Basal ice heat flux 12 T_prime_b xc, yc K Homologous temperature at the base 13 H_w xc, yc m Basal water layer thickness 14 dHwdt xc, yc m/yr Rate of change of basal water layer thickness 15 cp xc, yc, zeta J kg^-1 K^-1 Specific heat capacity 16 kt xc, yc, zeta W m^-1 K^-1 Heat conductivity 17 H_cts xc, yc m Height of the CTS (cold-temperate surface) 18 advecxy xc, yc, zeta - Horizontal advection 19 Q_rock xc, yc W m^-2 Heat flux from bedrock 20 enth_rock xc, yc, zeta_rock J m^-3 Bedrock enthalpy 21 T_rock xc, yc, zeta_rock K Bedrock temperature","title":"ytherm"},{"location":"yelmo-variables-ytopo/","text":"ytopo id variable dimensions units long_name 1 H_ice xc, yc m Ice thickness 2 dHidt xc, yc m/yr Ice thickness rate of change 3 dHidt_dyn xc, yc m/yr Ice thickness change due to dynamics 4 mb_net xc, yc m/yr Actual mass balance applied 5 mb_relax xc, yc m/yr Change in mass balance due to relaxation 6 mb_resid xc, yc m/yr Residual mass balance 7 mb_err xc, yc m/yr Residual error in mass balance accounting 8 smb xc, yc m/yr Surface mass balance 9 bmb xc, yc m/yr Combined basal mass balance 10 fmb xc, yc m/yr Combined frontal mass balance 11 dmb xc, yc m/yr Subgrid discharge mass balance 12 cmb xc, yc m/yr Calving mass balance 13 bmb_ref xc, yc m/yr Reference basal mass balance 14 fmb_ref xc, yc m/yr Reference frontal mass balance 15 dmb_ref xc, yc m/yr Reference subgrid discharge mass balance 16 cmb_flt xc, yc m/yr Floating calving rate 17 cmb_grnd xc, yc m/yr Grounded calving rate 18 z_srf xc, yc m Surface elevation 19 dzsdt xc, yc m/yr Surface elevation rate of change 20 mask_adv xc, yc Advection mask 21 eps_eff xc, yc 1/yr Effective strain rate 22 tau_eff xc, yc Pa Effective stress 23 z_base xc, yc m Ice-base elevation 24 dzsdx xc, yc m/m Surface elevation slope, acx nodes 25 dzsdy xc, yc m/m Surface elevation slope, acy nodes 26 dHidx xc, yc m/m Ice thickness gradient, acx nodes 27 dHidy xc, yc m/m Ice thickness gradient, acy nodes 28 dzbdx xc, yc m/m Bedrock slope, acx nodes 29 dzbdy xc, yc m/m Bedrock slope, acy nodes 30 H_eff xc, yc m Effective ice thickness (margin-corrected) 31 H_grnd xc, yc m Grounded ice thickness 32 H_calv xc, yc m Calving parameter field, ice thickness limit 33 kt_calv xc, yc Calving parameter field, vm-l19 34 z_bed_filt xc, yc m Filtered bedrock elevation 35 f_grnd xc, yc Grounded fraction 36 f_grnd_acx xc, yc Grounded fraction (acx nodes) 37 f_grnd_acy xc, yc Grounded fraction (acy nodes) 38 f_grnd_ab xc, yc Grounded fraction (ab nodes) 39 f_ice xc, yc Ice-covered fraction 40 f_grnd_bmb xc, yc Grounded fraction for basal mass balance 41 f_grnd_pin xc, yc Grounded fraction from subgrid pinning points 42 dist_margin xc, yc m Distance to nearest margin point 43 dist_grline xc, yc m Distance to nearest grounding-line point 44 mask_bed xc, yc Multi-valued bed mask 45 mask_grz xc, yc Multi-valued grounding-line zone mask 46 mask_frnt xc, yc Multi-valued ice front mask 47 dHidt_dyn_n xc, yc m/yr Ice thickness change due to advection (previous) 48 H_ice_n xc, yc m Ice thickness from previous timestep 49 z_srf_n xc, yc m Surface elevation from previous 
| 49 | z_srf_n | xc, yc | m | Surface elevation from previous timestep |
| 50 | H_ice_dyn | xc, yc | m | Dynamic ice thickness |
| 51 | f_ice_dyn | xc, yc | - | Dynamic ice-covered fraction |
| 52 | pc_pred_H_ice | xc, yc | m | Predicted ice thickness |
| 53 | pc_pred_dHidt_dyn | xc, yc | m/yr | Predicted dynamic ice thickness rate of change |
| 54 | pc_pred_mb_net | xc, yc | m/yr | Predicted net mass balance |
| 55 | pc_pred_smb | xc, yc | m/yr | Predicted surface mass balance |
| 56 | pc_pred_bmb | xc, yc | m/yr | Predicted basal mass balance |
| 57 | pc_pred_fmb | xc, yc | m/yr | Predicted frontal mass balance |
| 58 | pc_pred_dmb | xc, yc | m/yr | Predicted discharge mass balance |
| 59 | pc_pred_cmb | xc, yc | m/yr | Predicted calving mass balance |
| 60 | pc_corr_H_ice | xc, yc | m | Corrected ice thickness |
| 61 | pc_corr_dHidt_dyn | xc, yc | m/yr | Corrected dynamic ice thickness rate of change |
| 62 | pc_corr_mb_net | xc, yc | m/yr | Corrected net mass balance |
| 63 | pc_corr_smb | xc, yc | m/yr | Corrected surface mass balance |
| 64 | pc_corr_bmb | xc, yc | m/yr | Corrected basal mass balance |
| 65 | pc_corr_fmb | xc, yc | m/yr | Corrected frontal mass balance |
| 66 | pc_corr_dmb | xc, yc | m/yr | Corrected discharge mass balance |
| 67 | pc_corr_cmb | xc, yc | m/yr | Corrected calving mass balance |
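To make the table concrete, the sketch below (hypothetical code, not from the Yelmo repository) shows how two typical scalar diagnostics could be aggregated from the ytopo fields `H_ice` and `f_grnd`; the routine name, the regular grid spacing `dx`, and the masking convention are assumptions for the example.

```fortran
! Hypothetical helper (not from the Yelmo source): aggregate two scalar
! diagnostics from the ytopo fields above, assuming a regular grid with
! spacing dx [m].

subroutine calc_ice_diagnostics(V_ice, A_grnd, H_ice, f_grnd, dx)

    implicit none

    real(8), intent(OUT) :: V_ice        ! [m^3] Total ice volume
    real(8), intent(OUT) :: A_grnd       ! [m^2] Grounded ice area
    real(8), intent(IN)  :: H_ice(:,:)   ! [m] Ice thickness
    real(8), intent(IN)  :: f_grnd(:,:)  ! [-] Grounded fraction (0-1)
    real(8), intent(IN)  :: dx           ! [m] Grid resolution

    ! Total volume: ice thickness summed over all cells, times cell area
    V_ice  = sum(H_ice) * dx*dx

    ! Grounded area: subgrid grounded fraction summed over ice-covered cells
    A_grnd = sum(f_grnd, mask = H_ice .gt. 0.0d0) * dx*dx

    return

end subroutine calc_ice_diagnostics
```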
## Yelmo variable tables

These tables contain all variables available within Yelmo, together with their dimensions and units. The same tables are used directly in the code by the output-writing routines to select which variables to write.

- Yelmo topography
- Yelmo dynamics
- Yelmo material
- Yelmo thermodynamics
- Yelmo boundaries
- Yelmo data
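The selection mechanism described above can be pictured with a small self-contained sketch, in which a table of entries (name, units, long name) is scanned to decide what to write. All names here are hypothetical, and a real output routine would pass the matching metadata on to the file writer rather than printing to screen.

```fortran
! Minimal sketch of the idea described above (hypothetical, not the
! actual Yelmo I/O code): a variable table pairing names with units
! and long names, used to select which fields to write.

program demo_var_table

    implicit none

    type var_entry
        character(len=32) :: varname
        character(len=32) :: units
        character(len=64) :: long_name
    end type

    type(var_entry)   :: table(3)
    character(len=32) :: requested(2)
    integer :: i, j

    ! A few entries mirroring the ytopo table
    table(1) = var_entry("H_ice", "m",    "Ice thickness")
    table(2) = var_entry("z_srf", "m",    "Surface elevation")
    table(3) = var_entry("dHidt", "m/yr", "Ice thickness rate of change")

    ! Variables the user asked to write
    requested = [character(len=32) :: "H_ice", "dHidt"]

    ! Look up each requested variable in the table; a real routine
    ! would hand the matching metadata to the file writer here.
    do i = 1, size(requested)
        do j = 1, size(table)
            if (trim(requested(i)) .eq. trim(table(j)%varname)) then
                write(*,*) trim(table(j)%varname), " [", &
                           trim(table(j)%units), "] ", &
                           trim(table(j)%long_name)
            end if
        end do
    end do

end program demo_var_table
```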