202305 Release of GFDL_atmos_cubed_sphere #273
Conversation
Introducing idealized TC test case with SHiELD physics See merge request fv3team/atmos_cubed_sphere!91
remove the abandoned file fv_diagnostics.F90.epv See merge request fv3team/atmos_cubed_sphere!97
bug fix to prevent model crash when using domain_deg = 0. in debug mode See merge request fv3team/atmos_cubed_sphere!99
Fix to avoid reproducibility issue. See merge request fv3team/atmos_cubed_sphere!100
A number of updates See merge request fv3team/atmos_cubed_sphere!101
New test cases See merge request fv3team/atmos_cubed_sphere!102
Improved Rayleigh Damping on w See merge request fv3team/atmos_cubed_sphere!103
Update namelist reading code to avoid model crash because of the absence of namelist. See merge request fv3team/atmos_cubed_sphere!104
Experimental 2D Smagorinsky damping and tau_w See merge request fv3team/atmos_cubed_sphere!107
Add the option to disable intermediate physics. See merge request fv3team/atmos_cubed_sphere!108
Add the options to sub-cycle condensation and evaporation, control the time scale of evaporation, and delay condensation and evaporation. See merge request fv3team/atmos_cubed_sphere!109
Remove grid size in energy and mass calculation See merge request fv3team/atmos_cubed_sphere!110
2023/03 Jan-Huey Chen See merge request fv3team/atmos_cubed_sphere!112
FV3 Solver updates 202305 See merge request fv3team/atmos_cubed_sphere!114
Pass the namelist variables from the dycore to the physics during the initialization See merge request fv3team/atmos_cubed_sphere!117
Rolling back smag damping See merge request fv3team/atmos_cubed_sphere!118
Revised vertical remapping operators See merge request fv3team/atmos_cubed_sphere!116
Removed dudz and dvdz arrays that are not currently used. See merge request fv3team/atmos_cubed_sphere!119
Add nest to DP cartesian config See merge request fv3team/atmos_cubed_sphere!121
fix nesting in solo mode and add a new idealized test case for multiple nests See merge request fv3team/atmos_cubed_sphere!120
fix square_domain logic for one tile grids See merge request fv3team/atmos_cubed_sphere!122
The CI is expected to fail until I merge in NOAA-GFDL/SHiELD_build#22, because idealized test cases now require a new namelist flag.
Hi, Lauren. This is really great. Thank you for working so hard to get this done.
id_dynam = mpp_clock_id ('FV dy-core', flags = clock_flag_default, grain=CLOCK_SUBCOMPONENT )
id_subgridz = mpp_clock_id ('FV subgrid_z', flags = clock_flag_default, grain=CLOCK_SUBCOMPONENT )
id_fv_diag = mpp_clock_id ('FV Diag', flags = clock_flag_default, grain=CLOCK_SUBCOMPONENT )
id_dycore = mpp_clock_id ('---FV Dycore', flags = clock_flag_default, grain=CLOCK_SUBCOMPONENT )
Glad to see that the timers are nicely cleaned up. We have been needing to do this for a long time.
@@ -684,6 +702,10 @@ subroutine remap_restart(Atm)
fname = 'INPUT/fv_core.res'//trim(stile_name)//'.nc'
if (open_file(Fv_tile_restart_r, fname, "read", fv_domain, is_restart=.true.)) then
if (Atm(1)%flagstruct%is_ideal_case) then
Good! This should only be necessary for the ideal case.
Nothing I've noted should stop this PR from being merged, but we should address the issues noted sooner rather than later.
@@ -57,6 +58,8 @@ module dyn_core_mod
 #ifdef SW_DYNAMICS
 use test_cases_mod, only: test_case, case9_forcing1, case9_forcing2
 #endif
+use test_cases_mod, only: w_forcing
I'm not a big fan of using the parameter from test_cases_mod. If it is something needed by the simulations, it might be better included in flagstruct and defined in fv_control/fv_arrays.
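Below is a minimal, self-contained sketch of the pattern being suggested: the switch lives in the flag structure and is read from a namelist, so dyn_core would reference flagstruct%w_forcing instead of importing it from test_cases_mod. The module name, namelist name, and the logical type of w_forcing here are illustrative assumptions, not the actual FV3 declarations.

! Illustrative sketch only; names and types are assumptions, not FV3 source.
module flag_demo_mod
  implicit none
  private
  public :: flag_demo_type, read_flag_namelist

  type flag_demo_type
    logical :: w_forcing = .false.  !< enable idealized w forcing (default off)
  end type flag_demo_type

contains

  subroutine read_flag_namelist(unit, flags)
    integer,              intent(in)    :: unit    ! open namelist file unit
    type(flag_demo_type), intent(inout) :: flags
    logical :: w_forcing
    namelist /flag_demo_nml/ w_forcing

    w_forcing = flags%w_forcing      ! seed the namelist variable with the default
    read(unit, nml=flag_demo_nml)
    flags%w_forcing = w_forcing      ! copy the value back into the flag structure
  end subroutine read_flag_namelist

end module flag_demo_mod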
@@ -895,6 +905,8 @@ module fv_arrays_mod
real(kind=R_GRID) :: deglat=15. !< Latitude (in degrees) used to compute the uniform f-plane
                                !< Coriolis parameter for doubly-periodic simulations
                                !< (grid_type = 4). The default value is 15.
real(kind=R_GRID) :: domain_deg = 0.
We should get a description for this variable.
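One way to close this out would be a Doxygen-style comment matching the deglat entry just above. The wording below is only a guess at the variable's meaning, inferred from its use with the doubly-periodic grid; the author would need to confirm or correct it before committing.

! Placeholder description only -- the meaning stated here is assumed, not confirmed:
real(kind=R_GRID) :: domain_deg = 0.  !< Assumed: angular extent (in degrees) of the
                                      !< doubly-periodic domain (grid_type = 4); the
                                      !< default of 0. keeps the dx_const/dy_const sizing.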
@@ -194,7 +194,7 @@ subroutine grid_utils_init(Atm, npx, npy, npz, non_ortho, grid_type, c2l_order)
 if (.not. Atm%flagstruct%external_eta) then
   call set_eta(npz, Atm%ks, Atm%ptop, Atm%ak, Atm%bk, Atm%flagstruct%npz_type, Atm%flagstruct%fv_eta_file)
   if ( is_master() ) then
-    write(*,*) 'Grid_init', npz, Atm%ks, Atm%ptop
+    !write(*,*) 'Grid_init', npz, Atm%ks, Atm%ptop
Should slate this for removal.
@@ -586,7 +585,7 @@ subroutine init_grid(Atm, grid_name, grid_file, npx, npy, npz, ndims, nregions,
 if (Atm%flagstruct%grid_type>3) then
   if (Atm%flagstruct%grid_type == 4) then
     call setup_cartesian(npx, npy, Atm%flagstruct%dx_const, Atm%flagstruct%dy_const, &
-                         Atm%flagstruct%deglat, Atm%bd)
+                         Atm%flagstruct%deglat, Atm%flagstruct%domain_deg, Atm%bd, Atm)
Because this call passes both individual elements of the Atm structure and the whole Atm, the interface should be simplified to remove the individual Atm elements; unexpected things can happen with overlapping INOUT argument scope.
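A sketch of the simplification being asked for, assuming the FV3 types fv_atmos_type, fv_grid_bounds_type, and kind R_GRID from fv_arrays_mod are in scope; the body is illustrative, not the actual setup_cartesian implementation. The call site would then pass only the containers, e.g. call setup_cartesian(npx, npy, Atm%bd, Atm).

! Illustrative sketch only; not the actual fv_grid_tools implementation.
subroutine setup_cartesian(npx, npy, bd, Atm)
  use fv_arrays_mod, only: fv_atmos_type, fv_grid_bounds_type, R_GRID
  implicit none
  integer,                   intent(in)    :: npx, npy
  type(fv_grid_bounds_type), intent(in)    :: bd
  type(fv_atmos_type),       intent(inout) :: Atm
  real(kind=R_GRID) :: dx_const, dy_const, deglat, domain_deg

  ! Pull the scalars out of Atm locally instead of receiving them as separate
  ! arguments alongside the whole structure, avoiding overlapping INOUT scope.
  dx_const   = Atm%flagstruct%dx_const
  dy_const   = Atm%flagstruct%dy_const
  deglat     = Atm%flagstruct%deglat
  domain_deg = Atm%flagstruct%domain_deg

  ! ... grid construction would use npx, npy, bd, and the locals above ...
end subroutine setup_cartesian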
if (gid == sending_proc) then !crazy logic but what we have for now
  do p=1,size(Atm(1)%pelist)
    call mpp_send(g_dat,size(g_dat),Atm(1)%pelist(p))
  enddo
endif
endif
if (ANY(Atm(1)%pelist == gid)) then
  call mpp_recv(g_dat, size(g_dat), sending_proc)
endif
Should be able to use mpp_broadcast with the pelist
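A hedged sketch of what this block (and the identical blocks commented on below) could look like with a collective call in place of the send/receive loop. It assumes the FMS interface mpp_broadcast(data, length, from_pe, pelist) and that sending_proc is itself a member of Atm(1)%pelist; every PE in the pelist makes the same call.

! Sketch only: replaces the mpp_send loop plus mpp_recv with one collective.
! Assumes 'use mpp_mod, only: mpp_broadcast' in the module header and that
! sending_proc belongs to Atm(1)%pelist.
if (ANY(Atm(1)%pelist == gid)) then
   call mpp_broadcast(g_dat, size(g_dat), sending_proc, pelist=Atm(1)%pelist)
endif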
if (gid == sending_proc) then
  do p=1,size(Atm(1)%pelist)
    call mpp_send(g_dat,size(g_dat),Atm(1)%pelist(p))
  enddo
endif
endif
if (ANY(Atm(1)%pelist == gid)) then
  call mpp_recv(g_dat, size(g_dat), sending_proc)
endif
Should be able to use mpp_broadcast here with the given pelist
if (gid == sending_proc) then
  do p=1,size(Atm(1)%pelist)
    call mpp_send(g_dat,size(g_dat),Atm(1)%pelist(p))
  enddo
endif
endif
if (ANY(Atm(1)%pelist == gid)) then
  call mpp_recv(g_dat, size(g_dat), sending_proc)
endif
Should be able to use mpp_broadcast here with the given pelist
if (gid == sending_proc) then
  do p=1,size(Atm(1)%pelist)
    call mpp_send(g_dat,size(g_dat),Atm(1)%pelist(p))
  enddo
endif
endif
if (ANY(Atm(1)%pelist == gid)) then
  call mpp_recv(g_dat, size(g_dat), sending_proc)
endif
Should be able to use mpp_broadcast here with the given pelist
if (gid == sending_proc) then
  do p=1,size(Atm(1)%pelist)
    call mpp_send(g_dat,size(g_dat),Atm(1)%pelist(p))
  enddo
endif
endif
if (ANY(Atm(1)%pelist == gid)) then
  call mpp_recv(g_dat, size(g_dat), sending_proc)
endif
Should be able to use mpp_broadcast here with the given pelist
if (gid == sending_proc) then
  do p=1,size(Atm(1)%pelist)
    call mpp_send(g_dat,size(g_dat),Atm(1)%pelist(p))
  enddo
endif
endif
if (ANY(Atm(1)%pelist == gid)) then
  call mpp_recv(g_dat, size(g_dat), sending_proc)
endif
Should be able to use mpp_broadcast here with the given pelist
Hi, Rusty. These are all excellent ideas. I am hoping we can work on them soon.
Lucas
This is the 202305 public release. This release is the work of the GFDL FV3 development team.
* Merge branch 'TC20_test_case' into 'main': Introducing idealized TC test case with SHiELD physics (see merge request fv3team/atmos_cubed_sphere!91)
* Merge branch 'user/lnz/shield2022_gfdlmp' into 'main': remove the abandoned file fv_diagnostics.F90.epv (see merge request fv3team/atmos_cubed_sphere!97)
* Merge branch 'domain_deg_fix' into 'main': bug fix to prevent model crash when using domain_deg = 0. in debug mode (see merge request fv3team/atmos_cubed_sphere!99)
* Merge branch 'user/lnz/shield2022_gfdlmp' into 'main': Fix to avoid reproducibility issue (see merge request fv3team/atmos_cubed_sphere!100)
* Merge branch 'user/lnz/shield2022_gfdlmp' into 'main': A number of updates (see merge request fv3team/atmos_cubed_sphere!101)
* Merge branch 'main' into 'main': New test cases (see merge request fv3team/atmos_cubed_sphere!102)
* Merge branch 'tau_w_202210' into 'main': Improved Rayleigh Damping on w (see merge request fv3team/atmos_cubed_sphere!103)
* Merge branch 'user/lnz/shield2022_gfdlmp' into 'main': Update namelist reading code to avoid model crash because of the absence of namelist (see merge request fv3team/atmos_cubed_sphere!104)
* Merge branch 'smag2d_202211' into 'main': Experimental 2D Smagorinsky damping and tau_w (see merge request fv3team/atmos_cubed_sphere!107)
* Merge branch 'user/lnz/shield2022_gfdlmp' into 'main': Add the option to disable intermediate physics (see merge request fv3team/atmos_cubed_sphere!108)
* Merge branch 'user/lnz/shield2022_gfdlmp' into 'main': Add the options to sub-cycle condensation and evaporation, control the time scale of evaporation, and delay condensation and evaporation (see merge request fv3team/atmos_cubed_sphere!109)
* Merge branch 'user/lnz/shield2023' into 'main': Remove grid size in energy and mass calculation (see merge request fv3team/atmos_cubed_sphere!110)
* Merge branch 'user/lnz/shield2023' into 'main': 2023/03 Jan-Huey Chen (see merge request fv3team/atmos_cubed_sphere!112)
* Merge branch 'lmh_public_release_202205' into 'main': FV3 Solver updates 202305 (see merge request fv3team/atmos_cubed_sphere!114)
* gnu updates
* Merge branch 'user/lnz/shield2023' into 'main': Pass the namelist variables from the dycore to the physics during the initialization (see merge request fv3team/atmos_cubed_sphere!117)
* Merge branch 'main_mayrelease_smag_rollback' into 'main': Rolling back smag damping (see merge request fv3team/atmos_cubed_sphere!118)
* Merge branch 'lmh_revised_mapz' into 'main': Revised vertical remapping operators (see merge request fv3team/atmos_cubed_sphere!116)
* Merge branch 'main_upstream_mayrelease_dudz' into 'main': Removed dudz and dvdz arrays that are not currently used (see merge request fv3team/atmos_cubed_sphere!119)
* Merge branch 'dp+nest' into 'main': Add nest to DP cartesian config (see merge request fv3team/atmos_cubed_sphere!121)
* Merge branch 'solonest' into 'main': fix nesting in solo mode and add a new idealized test case for multiple nests (see merge request fv3team/atmos_cubed_sphere!120)
* Merge branch 'fixsquare' into 'main': fix square_domain logic for one tile grids (see merge request fv3team/atmos_cubed_sphere!122)
* updating release notes.
---------
Co-authored-by: Lucas Harris <[email protected]>
Co-authored-by: Linjiong Zhou <[email protected]>
Co-authored-by: Joseph Mouallem <[email protected]>
Description
This PR publishes the GFDL_atmos_cubed_sphere 202305 release. This release coincides with the release of SHiELD_physics 202305.
The changes included in this PR are from the GFDL FV3 Team. A full description of the changes can be found in RELEASE.md.
Related PRs that should be merged together with this one:
NOAA-GFDL/SHiELD_physics#22
NOAA-GFDL/SHiELD_build#22
NOAA-GFDL/atmos_drivers#22
How Has This Been Tested?
Tested with the regression tests in SHiELD_build
Checklist:
Please check all items, whether or not they apply.