diff --git a/CHANGELOG.md b/CHANGELOG.md
index b60c3625a..2bf7b460f 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -16,6 +16,7 @@ Code freeze date: YYYY-MM-DD
### Changed
+- Improved scaling factors implemented in `climada.hazard.trop_cyclone.apply_climate_scenario_knu` to model the impact of climate change on tropical cyclones [#734](https://github.com/CLIMADA-project/climada_python/pull/734)
- In `climada.util.plot.geo_im_from_array`, NaNs are plotted in gray while cells with no centroid are not plotted [#929](https://github.com/CLIMADA-project/climada_python/pull/929)
- Renamed `climada.util.plot.subplots_from_gdf` to `climada.util.plot.plot_from_gdf` [#929](https://github.com/CLIMADA-project/climada_python/pull/929)
diff --git a/climada/hazard/tc_clim_change.py b/climada/hazard/tc_clim_change.py
index 5230ac247..c10fcecc0 100644
--- a/climada/hazard/tc_clim_change.py
+++ b/climada/hazard/tc_clim_change.py
@@ -7,168 +7,383 @@
terms of the GNU General Public License as published by the Free
Software Foundation, version 3.
-CLIMADA is distributed in the hope that it will be useful, but WITHOUT ANY
-WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
-PARTICULAR PURPOSE. See the GNU General Public License for more details.
+Most of this module consists of modifications of work originally published under the following license:
-You should have received a copy of the GNU General Public License along
-with CLIMADA. If not, see .
+MIT License
+
+Copyright (c) 2021 Lynne Jewson, Stephen Jewson
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
---
-Define climate change scenarios for tropical cycones.
+Define scaling factors to model the impact of climate change on tropical cyclones.
"""
-import numpy as np
+from math import log
+import logging
import pandas as pd
+import numpy as np
+
+LOGGER = logging.getLogger(__name__)
+
+MAP_BASINS_NAMES = {'NA': 0, 'WP': 1, 'EP': 2, 'NI': 3, 'SI': 4, 'SP': 5}
+
+MAP_VARS_NAMES = {'cat05': 0, 'cat45': 1, 'intensity': 2}
+
+MAP_PERC_NAMES = {'5/10': 0, '25': 1, '50': 2, '75': 3, '90/95': 4}
+
+# first and last projection years as well as the largest smoothing window (in years)
+YEAR_WINDOWS_PROPS = {'start': 2000, 'end': 2100, 'smoothing': 5}
+
+def get_knutson_scaling_factor(
+ variable: str='cat05',
+ percentile: str='50',
+ basin: str='NA',
+ baseline: tuple=(1982, 2022),
+ yearly_steps: int=5
+ ):
+ """
+ This code combines data from Knutson et al. (2020) and global mean surface
+ temperature (GMST) data (historical and CMIP5 simulated) to produce TC
+ projections for 4 RCPs and any historical baseline. The code uses the GMST
+ data to interpolate and extrapolate the Knutson data
+ relative to the specified baseline time period for various RCPs
+ with a log-linear model. The methodology was developed and explained
+ in Jewson (2021).
+
+ Related publications:
+ - Knutson et al., (2020): Tropical cyclones and climate
+ change assessment. Part II: Projected response to anthropogenic warming.
+ Bull. Amer. Meteor. Soc., 101 (3), E303–E322,
+ https://doi.org/10.1175/BAMS-D-18-0194.1.
+
+ - Jewson (2021): Conversion of the Knutson et al. (2020) Tropical Cyclone
+ Climate Change Projections to Risk Model Baselines,
+ https://doi.org/10.1175/JAMC-D-21-0102.1
+
+ Parameters
+ ----------
+ variable: str
+ variable of interest, possible choices are 'cat05' (frequencies of all
+ tropical cyclones), 'cat45' (frequencies of category 4 and 5 tropical cyclones)
+ and 'intensity' (mean intensity of all tropical cyclones). Default: 'cat05'
+ percentile: str
+ percentiles of Knutson et al. 2020 estimates, representing the model uncertainty
+ in future changes in TC activity. These estimates come from a review of state-of-the-art
+ literature and models. For the 'cat05' variable (i.e. frequency of all tropical cyclones)
+ the 5th, 25th, 50th, 75th and 95th percentiles are provided. For 'cat45' and 'intensity',
+ the provided percentiles are the 10th, 25th, 50th, 75th and 90th. Please refer to the
+ mentioned publications for more details.
+ possible percentiles:
+ '5/10' either the 5th or 10th percentile depending on variable (see text above)
+ '25' for the 25th percentile
+ '50' for the 50th percentile
+ '75' for the 75th percentile
+ '90/95' either the 90th or 95th percentile depending on variable (see text above)
+ Default: '50'
+ basin : str
+ region of interest, possible choices are:
+ 'NA', 'WP', 'EP', 'NI', 'SI', 'SP'
+ baseline : tuple of int
+ the starting and ending years that define the historical
+ baseline. The historical baseline period must fall within
+ the GMST data period, i.e., 1880-2100. Default is (1982, 2022).
+ yearly_steps : int
+ yearly resolution at which projections are provided. Default is 5 years.
+
+ Returns
+ -------
+ future_change_variable : pd.DataFrame
+ data frame with future projections of the selected variables at different
+ times (indexes) and for RCPs 2.6, 4.5, 6.0 and 8.5 (columns).
+ """
+
+ base_start_year, base_end_year = baseline
+ gmst_info = get_gmst_info()
+
+ knutson_data = get_knutson_data()
+
+ num_of_rcps, gmst_years = gmst_info['gmst_data'].shape
-from climada.util.constants import SYSTEM_DIR
+ if ((base_start_year < gmst_info['gmst_start_year']) or
+ (base_start_year > gmst_info['gmst_end_year']) or
+ (base_end_year < gmst_info['gmst_start_year']) or
+ (base_end_year > gmst_info['gmst_end_year'])):
-TOT_RADIATIVE_FORCE = SYSTEM_DIR.joinpath('rcp_db.xls')
-"""© RCP Database (Version 2.0.5) http://www.iiasa.ac.at/web-apps/tnt/RcpDb.
-generated: 2018-07-04 10:47:59."""
+ raise ValueError("The selected historical baseline falls outside "
+ f"the GMST data period {gmst_info['gmst_start_year']}"
+ f"-{gmst_info['gmst_end_year']}")
-def get_knutson_criterion():
+ var_id = MAP_VARS_NAMES[variable]
+ perc_id = MAP_PERC_NAMES[percentile]
+
+ # Steps:
+ # 1. transform annual GMST values using e^βT
+ # 2. calculate the average of these transformed values over the two time periods
+ # 3. calculate the fractional change in the averages
+ # Please refer to Section 4 (Methods) of Jewson (2021) for more details.
+
+ mid_years = np.arange(YEAR_WINDOWS_PROPS['start'],
+ YEAR_WINDOWS_PROPS['end']+1,
+ yearly_steps)
+ predicted_change = np.ones((mid_years.shape[0], num_of_rcps))
+
+ try:
+ basin_id = MAP_BASINS_NAMES[basin]
+ knutson_value = knutson_data[var_id, basin_id, perc_id]
+
+ except KeyError:
+ LOGGER.warning(f"No scaling factors are defined for basin {basin}, therefore "
+ "no change will be projected for tracks in this basin")
+ return pd.DataFrame(predicted_change,
+ index=mid_years,
+ columns=gmst_info['rcps'])
+
+ base_start_pos = base_start_year - gmst_info['gmst_start_year']
+ base_end_pos = base_end_year - gmst_info['gmst_start_year']
+
+ # Step 1.
+ beta = 0.5 * log(0.01 * knutson_value + 1) # equation 6 in Jewson (2021)
+ tc_properties = np.exp(beta * gmst_info['gmst_data']) # equation 3 in Jewson (2021)
+
+ # Step 2.
+ baseline = np.mean(tc_properties[:, base_start_pos:base_end_pos + 1], 1)
+
+ # Step 3.
+ for i, mid_year in enumerate(mid_years):
+ mid_year_in_gmst_ind = mid_year - gmst_info['gmst_start_year']
+ actual_smoothing = min(
+ YEAR_WINDOWS_PROPS['smoothing'],
+ gmst_years - mid_year_in_gmst_ind - 1,
+ mid_year_in_gmst_ind
+ )
+ fut_start_pos = mid_year_in_gmst_ind - actual_smoothing
+ fut_end_pos = mid_year_in_gmst_ind + actual_smoothing + 1
+
+ prediction = np.mean(tc_properties[:, fut_start_pos:fut_end_pos], 1)
+
+ # assess fractional changes
+ predicted_change[i] = ((prediction - baseline) /
+ baseline) * 100
+
+ return pd.DataFrame(predicted_change,
+ index=mid_years,
+ columns=gmst_info['rcps'])
+
+def get_gmst_info():
"""
- Fill changes in TCs according to Knutson et al. 2015 Global projections
- of intense tropical cyclone activity for the late twenty-first century from
- dynamical downscaling of CMIP5/RCP4.5 scenarios.
+ Get Global Mean Surface Temperature (GMST) data from 1880 to 2100 for
+ RCPs 2.6, 4.5, 6.0 and 8.5. Data are provided in:
+
+ Jewson (2021): Conversion of the Knutson et al. (2020) Tropical Cyclone
+ Climate Change Projections to Risk Model Baselines,
+ https://doi.org/10.1175/JAMC-D-21-0102.1
+
+ and in supporting documentation and code.
Returns
-------
- criterion : list(dict)
- list of the criterion dictionary for frequency and intensity change
- per basin, per category taken from the Table 3 in Knutson et al. 2015.
- with items 'basin' (str), 'category' (list(int)), 'year' (int),
- 'change' (float), 'variable' ('intensity' or 'frequency')
+ gmst_info : dict
+ dictionary with four keys, which are:
+ - rcps: list of strings referring to RCPs 2.6, 4.5, 6.0 and 8.5
+ - gmst_start_year: integer with the GMST data starting year, 1880
+ - gmst_end_year: integer with the GMST data ending year, 2100
+ - gmst_data: array with GMST data across RCPs (first dim) and years (second dim)
"""
- # NA
- na = [
- {'basin': 'NA', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.045, 'variable': 'intensity'}
- ]
-
- # EP
- ep = [
- {'basin': 'EP', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.163, 'variable': 'frequency'},
- {'basin': 'EP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.193, 'variable': 'frequency'},
- {'basin': 'EP', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1.837, 'variable': 'frequency'},
- {'basin': 'EP', 'category': [4, 5],
- 'year': 2100, 'change': 3.375, 'variable': 'frequency'},
- {'basin': 'EP', 'category': [0],
- 'year': 2100, 'change': 1.082, 'variable': 'intensity'},
- {'basin': 'EP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.078, 'variable': 'intensity'}
- ]
-
- # WP
- wp = [
- {'basin': 'WP', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1 - 0.345, 'variable': 'frequency'},
- {'basin': 'WP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1 - 0.316, 'variable': 'frequency'},
- {'basin': 'WP', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1 - 0.169, 'variable': 'frequency'},
- {'basin': 'WP', 'category': [4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'WP', 'category': [0],
- 'year': 2100, 'change': 1.074, 'variable': 'intensity'},
- {'basin': 'WP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.055, 'variable': 'intensity'},
- ]
-
- # SP
- sp = [
- {'basin': 'SP', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1 - 0.366, 'variable': 'frequency'},
- {'basin': 'SP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1 - 0.406, 'variable': 'frequency'},
- {'basin': 'SP', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1 - 0.506, 'variable': 'frequency'},
- {'basin': 'SP', 'category': [4, 5],
- 'year': 2100, 'change': 1 - 0.583, 'variable': 'frequency'}
- ]
-
- # NI
- ni = [
- {'basin': 'NI', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NI', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.256, 'variable': 'frequency'},
- {'basin': 'NI', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NI', 'category': [4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'}
- ]
-
- # SI
- si = [
- {'basin': 'SI', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1 - 0.261, 'variable': 'frequency'},
- {'basin': 'SI', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1 - 0.284, 'variable': 'frequency'},
- {'basin': 'SI', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'SI', 'category': [4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'SI', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.033, 'variable': 'intensity'}
- ]
-
- return na + ep + wp + sp + ni + si
-
-
-def calc_scale_knutson(ref_year=2050, rcp_scenario=45):
+
+ gmst_data = np.array([
+ [-0.16,-0.08,-0.1,-0.16,-0.28,-0.32,-0.3,-0.35,-0.16,-0.1,
+ -0.35,-0.22,-0.27,-0.31,-0.3,-0.22,-0.11,-0.11,-0.26,-0.17,
+ -0.08,-0.15,-0.28,-0.37, -0.47,-0.26,-0.22,-0.39,-0.43,-0.48,
+ -0.43,-0.44,-0.36,-0.34,-0.15,-0.14,-0.36,-0.46,-0.29,-0.27,
+ -0.27,-0.19,-0.28,-0.26,-0.27,-0.22,-0.1,-0.22,-0.2,-0.36,
+ -0.16,-0.1,-0.16,-0.29,-0.13,-0.2,-0.15,-0.03,-0.01,-0.02,
+ 0.13,0.19,0.07,0.09,0.2,0.09,-0.07,-0.03,-0.11,-0.11,-0.17,
+ -0.07,0.01,0.08,-0.13,-0.14,-0.19,0.05,0.06,0.03,-0.02,0.06,
+ 0.04,0.05,-0.2,-0.11,-0.06,-0.02,-0.08,0.05,0.02,-0.08,0.01,
+ 0.16,-0.07,-0.01,-0.1,0.18,0.07,0.16,0.26,0.32,0.14,0.31,0.15,
+ 0.11,0.18,0.32,0.38,0.27,0.45,0.4,0.22,0.23,0.32,0.45,0.33,0.47,
+ 0.61,0.39,0.39,0.54,0.63,0.62,0.54, 0.68,0.64,0.66,0.54,0.66,
+ 0.72,0.61,0.64,0.68,0.75,0.9,1.02,0.92,0.85,0.98,0.909014286,
+ 0.938814286,0.999714286,1.034314286,1.009714286,1.020014286,
+ 1.040914286,1.068614286,1.072114286,1.095114286,1.100414286,
+ 1.099014286,1.118514286,1.133414286,1.135314286,1.168814286,
+ 1.200414286,1.205414286,1.227214286,1.212614286,1.243014286,
+ 1.270114286,1.250114286,1.254514286,1.265814286,1.263314286,
+ 1.294714286,1.289814286,1.314214286,1.322514286,1.315614286,
+ 1.276314286,1.302414286,1.318414286,1.312014286,1.317914286,
+ 1.341214286,1.297414286,1.308514286,1.314614286,1.327814286,
+ 1.335814286,1.331214286,1.318014286,1.289714286,1.334414286,
+ 1.323914286,1.316614286,1.300214286,1.302414286,1.303114286,
+ 1.311014286,1.283914286,1.293814286,1.296914286,1.316614286,
+ 1.306314286,1.290614286,1.288814286,1.272114286,1.264614286,
+ 1.262514286,1.290514286,1.285114286,1.267214286,1.267414286,
+ 1.294314286,1.315614286,1.310314286,1.283914286,1.296614286,
+ 1.281214286,1.301014286,1.300114286,1.303114286,1.286714286,
+ 1.297514286,1.312114286,1.276714286,1.281414286,1.276414286],
+ [-0.16,-0.08,-0.1,-0.16,-0.28,-0.32,-0.3,-0.35,-0.16,-0.1,
+ -0.35, -0.22,-0.27,-0.31,-0.3,-0.22,-0.11,-0.11,-0.26,-0.17,
+ -0.08,-0.15,-0.28,-0.37,-0.47,-0.26,-0.22,-0.39,-0.43,-0.48,
+ -0.43,-0.44,-0.36,-0.34,-0.15,-0.14,-0.36,-0.46, -0.29,-0.27,
+ -0.27,-0.19,-0.28,-0.26,-0.27,-0.22,-0.1,-0.22,-0.2,-0.36,
+ -0.16,-0.1,-0.16,-0.29,-0.13,-0.2,-0.15,-0.03,-0.01,-0.02,0.13,
+ 0.19,0.07,0.09,0.2,0.09,-0.07,-0.03,-0.11,-0.11,-0.17,-0.07,0.01,
+ 0.08,-0.13,-0.14,-0.19,0.05,0.06,0.03,-0.02,0.06,0.04,0.05,-0.2,
+ -0.11,-0.06,-0.02,-0.08,0.05,0.02,-0.08,0.01,0.16,-0.07,-0.01,
+ -0.1,0.18,0.07,0.16,0.26,0.32,0.14,0.31,0.15,0.11,0.18,0.32,0.38,
+ 0.27,0.45,0.4,0.22,0.23,0.32,0.45,0.33,0.47,0.61,0.39,0.39,0.54,
+ 0.63,0.62,0.54,0.68,0.64,0.66,0.54,0.66,0.72,0.61,0.64,0.68,0.75,
+ 0.9,1.02,0.92,0.85,0.98,0.903592857,0.949092857,0.955792857,
+ 0.997892857,1.048392857,1.068092857,1.104792857,1.122192857,
+ 1.125792857,1.156292857,1.160992857,1.201692857,1.234692857,
+ 1.255392857,1.274392857,1.283792857,1.319992857,1.369992857,
+ 1.385592857,1.380892857,1.415092857,1.439892857,1.457092857,
+ 1.493592857,1.520292857,1.517692857,1.538092857,1.577192857,
+ 1.575492857,1.620392857,1.657092857,1.673492857,1.669992857,
+ 1.706292857,1.707892857,1.758592857,1.739492857,1.740192857,
+ 1.797792857,1.839292857,1.865392857,1.857692857,1.864092857,
+ 1.881192857,1.907592857,1.918492857,1.933992857,1.929392857,
+ 1.931192857,1.942492857,1.985592857,1.997392857,2.000992857,
+ 2.028692857,2.016192857,2.020792857,2.032892857,2.057492857,
+ 2.092092857,2.106292857,2.117492857,2.123492857,2.121092857,
+ 2.096892857,2.126892857,2.131292857,2.144892857,2.124092857,
+ 2.134492857,2.171392857,2.163692857,2.144092857,2.145092857,
+ 2.128992857,2.129992857,2.169192857,2.186492857,2.181092857,
+ 2.217592857,2.210492857,2.223692857],
+ [-0.16,-0.08,-0.1,-0.16,-0.28,-0.32,-0.3,-0.35,-0.16,-0.1,
+ -0.35,-0.22,-0.27, -0.31,-0.3,-0.22,-0.11,-0.11,-0.26,-0.17,
+ -0.08,-0.15,-0.28,-0.37,-0.47,-0.26,-0.22,-0.39,-0.43,-0.48,
+ -0.43,-0.44,-0.36,-0.34,-0.15,-0.14,-0.36,-0.46,-0.29,-0.27,
+ -0.27,-0.19,-0.28,-0.26,-0.27,-0.22,-0.1,-0.22,-0.2,-0.36,
+ -0.16,-0.1,-0.16,-0.29,-0.13,-0.2,-0.15,-0.03,-0.01,-0.02,0.13,
+ 0.19,0.07,0.09,0.2,0.09,-0.07,-0.03,-0.11,-0.11,-0.17,-0.07,0.01,
+ 0.08,-0.13,-0.14,-0.19,0.05,0.06,0.03,-0.02,0.06,0.04,0.05,-0.2,
+ -0.11,-0.06,-0.02,-0.08,0.05,0.02,-0.08,0.01,0.16,-0.07,-0.01,-0.1,
+ 0.18,0.07,0.16,0.26,0.32,0.14,0.31,0.15,0.11,0.18,0.32,0.38,0.27,0.45,
+ 0.4,0.22,0.23,0.32,0.45,0.33,0.47,0.61,0.39,0.39,0.54,0.63,0.62,0.54,
+ 0.68,0.64,0.66,0.54,0.66,0.72,0.61,0.64,0.68,0.75,0.9,1.02,0.92,0.85,
+ 0.98,0.885114286,0.899814286,0.919314286,0.942414286,0.957814286,
+ 1.000414286,1.023114286,1.053414286,1.090814286,1.073014286,1.058114286,
+ 1.117514286,1.123714286,1.123814286,1.177514286,1.190814286,1.187514286,
+ 1.223514286,1.261714286,1.289014286,1.276414286,1.339114286,1.365714286,
+ 1.375314286,1.402214286,1.399914286,1.437314286,1.464914286,1.479114286,
+ 1.505514286,1.509614286,1.539814286,1.558214286,1.595014286,1.637114286,
+ 1.653414286,1.636714286,1.652214286,1.701014286,1.731114286,1.759214286,
+ 1.782114286,1.811014286,1.801714286,1.823014286,1.842914286,1.913014286,
+ 1.943114286,1.977514286,1.982014286,2.007114286,2.066314286,2.079214286,
+ 2.126014286,2.147314286,2.174914286,2.184414286,2.218514286,2.261514286,
+ 2.309614286,2.328014286,2.347014286,2.369414286,2.396614286,2.452014286,
+ 2.473314286,2.486514286,2.497914286,2.518014286,2.561814286,2.613014286,
+ 2.626814286,2.585914286,2.614614286,2.644714286,2.688414286,2.688514286,
+ 2.685314286,2.724614286,2.746214286,2.773814286],
+ [-0.16,-0.08,-0.1,-0.16,-0.28,-0.32,-0.3,-0.35,-0.16,-0.1,-0.35,-0.22,
+ -0.27,-0.31,-0.3,-0.22,-0.11,-0.11,-0.26,-0.17,-0.08,-0.15,-0.28,-0.37,
+ -0.47,-0.26,-0.22,-0.39,-0.43,-0.48,-0.43,-0.44,-0.36,-0.34,-0.15,-0.14,
+ -0.36,-0.46,-0.29,-0.27,-0.27,-0.19,-0.28,-0.26,-0.27,-0.22,-0.1,-0.22,
+ -0.2,-0.36,-0.16,-0.1,-0.16,-0.29,-0.13,-0.2,-0.15,-0.03,-0.01,-0.02,0.13,
+ 0.19,0.07,0.09,0.2,0.09,-0.07,-0.03,-0.11,-0.11,-0.17,-0.07,0.01,0.08,-0.13,
+ -0.14,-0.19,0.05,0.06,0.03,-0.02,0.06,0.04,0.05,-0.2,-0.11,-0.06,-0.02,-0.08,
+ 0.05,0.02,-0.08,0.01,0.16,-0.07,-0.01,-0.1,0.18,0.07,0.16,0.26,0.32,0.14,0.31,
+ 0.15,0.11,0.18,0.32,0.38,0.27,0.45,0.4,0.22,0.23,0.32,0.45,0.33,0.47,0.61,0.39,
+ 0.39,0.54,0.63,0.62,0.54,0.68,0.64,0.66,0.54,0.66,0.72,0.61,0.64, 0.68,0.75,0.9,
+ 1.02,0.92,0.85,0.98,0.945764286,1.011064286,1.048564286,1.049564286,1.070264286,
+ 1.126564286,1.195464286,1.215064286,1.246964286,1.272564286,1.262464286,
+ 1.293464286,1.340864286,1.391164286,1.428764286,1.452564286,1.494164286,
+ 1.520664286,1.557164286,1.633664286,1.654264286,1.693264286,1.730264286,
+ 1.795264286,1.824264286,1.823864286,1.880664286,1.952864286,1.991764286,
+ 1.994764286,2.085764286,2.105764286,2.155064286,2.227464286,2.249964286,
+ 2.313664286,2.341464286,2.394064286,2.457364286,2.484664286,2.549564286,
+ 2.605964286,2.656864286,2.707364286,2.742964286,2.789764286,2.847664286,
+ 2.903564286,2.925064286,2.962864286,3.002664286,3.069264286,3.133364286,
+ 3.174764286,3.217764286,3.256564286,3.306864286,3.375464286,3.420264286,
+ 3.476464286,3.493864286,3.552964286,3.592364286,3.630664286,3.672464286,
+ 3.734364286,3.789764286,3.838164286,3.882264286,3.936064286,3.984064286,
+ 4.055764286,4.098964286,4.122364286,4.172064286,4.225264286,4.275064286,
+ 4.339064286,4.375864286,4.408064286,4.477764286]
+])
+
+ gmst_info = {
+ 'rcps' : ['2.6', '4.5', '6.0', '8.5'],
+ 'gmst_start_year' : 1880,
+ 'gmst_end_year' : 2100,
+ 'gmst_data' : gmst_data
+ }
+
+ return gmst_info
+
+def get_knutson_data():
"""
- Comparison 2081-2100 (i.e., late twenty-first century) and 2001-20
- (i.e., present day). Late twenty-first century effects on intensity and
- frequency per Saffir-Simpson-category and ocean basin is scaled to target
- year and target RCP proportional to total radiative forcing of the
- respective RCP and year.
+ Retrieve projection data from Knutson et al. (2020):
- Parameters
- ----------
- ref_year : int, optional
- year between 2000 ad 2100. Default: 2050
- rcp_scenario: int, optional
- 26 for RCP 2.6, 45 for RCP 4.5. The default is 45
- 60 for RCP 6.0 and 85 for RCP 8.5.
+ Tropical cyclones and climate change assessment. Part II: Projected
+ response to anthropogenic warming. Bull. Amer. Meteor. Soc., 101 (3), E303–E322,
+ https://doi.org/10.1175/BAMS-D-18-0194.1.
+
+ for 4 variables (i.e., cat05 frequency, cat45 frequency, intensity and precipitation rate),
+ 6 regions, i.e., N. Atl. (NA), NW Pac. (WP), NE Pac. (EP), N. Ind. (NI), S. Ind. (SI) and
+ SW Pac. (SP), and 5 percentiles, i.e., 5% or 10%, 25%, 50%, 75%, and 90% or 95%.
+
+ The data are available at:
+ S Jewson, T Knutson, S Camargo, J Chan, K Emanuel, C Ho, J Kossin, M Mohapatra, M Satoh,
+ M Sugi, K Walsh, & L Wu. (2021). Knutson et al 2020 Tropical Cyclone Projections Data (v0.2)
+ [Data set]. Zenodo. https://doi.org/10.5281/zenodo.4757343
Returns
-------
- factor : float
- factor to scale Knutson parameters to the give RCP and year
+ knutson_data : np.array of dimension (4x6x5)
+ array containing data used by Knutson et al. (2020) to project changes in cat05
+ frequency, cat45 frequency, intensity and precipitation rate (first array dimension),
+ for the N. Atl. (NA), NW Pac. (WP), NE Pac. (EP), N. Ind. (NI), S. Ind. (SI) and
+ SW Pac. (SP) regions (second array dimension) and for the 5%/10%, 25%, 50%, 75%,
+ 90%/95% percentiles (third array dimension).
"""
- # Parameters used in Knutson et al 2015
- base_knu = np.arange(2001, 2021)
- end_knu = np.arange(2081, 2101)
- rcp_knu = 45
-
- # radiative forcings for each RCP scenario
- rad_force = pd.read_excel(TOT_RADIATIVE_FORCE)
- years = np.array([year for year in rad_force.columns if isinstance(year, int)])
- rad_rcp = np.array([int(float(sce[sce.index('.') - 1:sce.index('.') + 2]) * 10)
- for sce in rad_force['Scenario'] if isinstance(sce, str)])
-
- # mean values for Knutson values
- rf_vals = np.argwhere(rad_rcp == rcp_knu).reshape(-1)[0]
- rf_vals = np.array([rad_force.iloc[rf_vals][year] for year in years])
- rf_base = np.nanmean(np.interp(base_knu, years, rf_vals))
- rf_end = np.nanmean(np.interp(end_knu, years, rf_vals))
-
- # scale factor for ref_year and rcp_scenario
- rf_vals = np.argwhere(rad_rcp == rcp_scenario).reshape(-1)[0]
- rf_vals = np.array([rad_force.iloc[rf_vals][year] for year in years])
- rf_sel = np.interp(ref_year, years, rf_vals)
- return max((rf_sel - rf_base) / (rf_end - rf_base), 0)
+
+ # The knutson_data array has dimension:
+ # 4 (tropical cyclones variables) x 6 (tropical cyclone regions) x 5 (percentiles)
+ knutson_data = np.array([[
+ [-34.49,-24.875,-14.444,3.019,28.737],
+ [-30.444,-20,-10.27,0.377,17.252],
+ [-32.075,-18.491,-3.774,11.606,36.682],
+ [-35.094,-15.115,-4.465,5.785,29.405],
+ [-32.778,-22.522,-17.297,-8.995,7.241],
+ [-40.417,-26.321,-18.113,-8.21,4.689]],
+ [
+ [-38.038,-22.264,11.321,38.302,81.874],
+ [-25.811,-14.34,-4.75,16.146,41.979],
+ [-24.83,-6.792,22.642,57.297,104.315],
+ [-30.566,-16.415,5.283,38.491,79.119],
+ [-23.229,-13.611,4.528,26.645,63.514],
+ [-42.453,-29.434,-14.467,-0.541,19.061]],
+ [
+ [0.543,1.547,2.943,4.734,6.821],
+ [1.939,3.205,5.328,6.549,9.306],
+ [-2.217,0.602,5.472,9.191,10.368],
+ [-0.973,1.944,4.324,6.15,7.808],
+ [1.605,3.455,5.405,7.69,10.884],
+ [-6.318,-0.783,0.938,5.314,12.213]],
+ [
+ [5.848,9.122,15.869,20.352,22.803],
+ [6.273,12.121,16.486,18.323,23.784],
+ [6.014,8.108,21.081,29.324,31.838],
+ [12.703,14.347,17.649,19.182,20.77],
+ [2.2,11.919,19.73,23.115,26.243],
+ [-1.299,5.137,7.297,11.091,15.419]
+ ]])
+
+ return knutson_data
\ No newline at end of file
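The log-linear conversion that `get_knutson_scaling_factor` implements can be sketched in isolation. The following is a minimal, self-contained illustration of Equations 3 and 6 of Jewson (2021), using a made-up GMST trajectory and an assumed +10% Knutson-style change — not CLIMADA's actual data:

```python
from math import log
import numpy as np

# Assumed inputs (hypothetical values, for illustration only):
# a +10% projected change for one variable/basin/percentile, and a
# toy linear GMST warming trajectory covering 1880-2100 for one RCP.
knutson_value = 10.0
years = np.arange(1880, 2101)
gmst = np.linspace(-0.3, 2.0, years.size)  # degrees C above reference

# Equation 6 in Jewson (2021): convert the percentage change to a sensitivity beta
beta = 0.5 * log(0.01 * knutson_value + 1)

# Equation 3: transform annual GMST values with e^(beta * T)
tc_property = np.exp(beta * gmst)

# Average the transformed values over the baseline and a future window,
# then express the future mean as a fractional change from the baseline mean
baseline_mask = (years >= 1982) & (years <= 2022)
future_mask = (years >= 2045) & (years <= 2055)
baseline_mean = tc_property[baseline_mask].mean()
future_mean = tc_property[future_mask].mean()
predicted_change = (future_mean - baseline_mean) / baseline_mean * 100
print(round(predicted_change, 2))
```

With this toy trajectory the projected change stays well below the end-of-century Knutson value, since only part of the assumed warming separates the two windows.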
diff --git a/climada/hazard/test/test_tc_cc.py b/climada/hazard/test/test_tc_cc.py
index 00264e5f1..4014ac2cb 100644
--- a/climada/hazard/test/test_tc_cc.py
+++ b/climada/hazard/test/test_tc_cc.py
@@ -21,45 +21,131 @@
import unittest
+from math import log
+import pandas as pd
+import numpy as np
import climada.hazard.tc_clim_change as tc_cc
class TestKnutson(unittest.TestCase):
- """Test loading funcions from the TropCyclone class"""
- def test_get_pass(self):
+ def test_get_knutson_scaling_calculations(self):
+
+ basin = 'NA'
+ variable = 'cat05'
+ percentile = '5/10'
+ base_start, base_end = 1950, 2018
+ yearly_steps = 5
+
+ target_predicted_changes = tc_cc.get_knutson_scaling_factor(
+ percentile=percentile,
+ variable=variable,
+ basin=basin,
+ baseline=(base_start, base_end),
+ yearly_steps=yearly_steps
+ )
+
+ ## Test computations of future changes
+ # Load data
+ gmst_info = tc_cc.get_gmst_info()
+
+ var_id, basin_id, perc_id = (tc_cc.MAP_VARS_NAMES[variable],
+ tc_cc.MAP_BASINS_NAMES[basin],
+ tc_cc.MAP_PERC_NAMES[percentile])
+
+ knutson_data = tc_cc.get_knutson_data()
+ knutson_value = knutson_data[var_id, basin_id, perc_id]
+
+ start_ind = base_start - gmst_info['gmst_start_year']
+ end_ind = base_end - gmst_info['gmst_start_year']
+
+ # Apply model
+ beta = 0.5 * log(0.01 * knutson_value + 1)
+ tc_properties = np.exp(beta * gmst_info['gmst_data'])
+
+ # Assess baseline value
+ baseline = np.mean(tc_properties[:, start_ind:end_ind + 1], 1)
+
+ # Assess future value and test predicted change from baseline is
+ # the same as given by function
+ smoothing = 5
+
+ for target_year in [2030, 2050, 2070, 2090]:
+ target_year_ind = target_year - gmst_info['gmst_start_year']
+ ind1 = target_year_ind - smoothing
+ ind2 = target_year_ind + smoothing + 1
+
+ prediction = np.mean(tc_properties[:, ind1:ind2], 1)
+ calculated_predicted_change = ((prediction - baseline) / baseline) * 100
+
+ np.testing.assert_array_almost_equal(target_predicted_changes.loc[target_year, '2.6'],
+ calculated_predicted_change[0])
+ np.testing.assert_array_almost_equal(target_predicted_changes.loc[target_year, '4.5'],
+ calculated_predicted_change[1])
+ np.testing.assert_array_almost_equal(target_predicted_changes.loc[target_year, '6.0'],
+ calculated_predicted_change[2])
+ np.testing.assert_array_almost_equal(target_predicted_changes.loc[target_year, '8.5'],
+ calculated_predicted_change[3])
+
+ def test_get_knutson_scaling_structure(self):
-        """Test get_knutson_criterion function."""
+        """Test structure of the get_knutson_scaling_factor output."""
- criterion = tc_cc.get_knutson_criterion()
- self.assertTrue(len(criterion), 20)
- for crit_val in criterion:
- self.assertTrue('year' in crit_val)
- self.assertTrue('change' in crit_val)
- self.assertTrue('variable' in crit_val)
- self.assertEqual(criterion[0]['variable'], "frequency")
- self.assertEqual(criterion[0]['change'], 1)
- self.assertEqual(criterion[4]['variable'], "intensity")
- self.assertEqual(criterion[4]['change'], 1.045)
- self.assertEqual(criterion[-10]['basin'], "SP")
- self.assertEqual(criterion[-10]['variable'], "frequency")
- self.assertEqual(criterion[-10]['change'], 1 - 0.583)
-
- def test_scale_pass(self):
- """Test calc_scale_knutson function."""
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2050, rcp_scenario=45),
- 0.759630751756698)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2070, rcp_scenario=45),
- 0.958978483788876)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2060, rcp_scenario=60),
- 0.825572149523299)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2080, rcp_scenario=60),
- 1.309882943406079)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2090, rcp_scenario=85),
- 2.635069196605717)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2100, rcp_scenario=85),
- 2.940055236533517)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2066, rcp_scenario=26),
- 0.341930203294547)
- self.assertAlmostEqual(tc_cc.calc_scale_knutson(ref_year=2078, rcp_scenario=26),
- 0.312383928930456)
+
+ yearly_steps = 8
+ target_predicted_changes = tc_cc.get_knutson_scaling_factor(yearly_steps=yearly_steps)
+
+ np.testing.assert_equal(target_predicted_changes.columns, np.array(['2.6', '4.5', '6.0', '8.5']))
+
+ simulated_years = np.arange(tc_cc.YEAR_WINDOWS_PROPS['start'],
+ tc_cc.YEAR_WINDOWS_PROPS['end']+1,
+ yearly_steps)
+ np.testing.assert_equal(target_predicted_changes.index, simulated_years)
+
+ def test_get_knutson_scaling_valid_inputs(self):
+ df = tc_cc.get_knutson_scaling_factor()
+ self.assertIsInstance(df, pd.DataFrame)
+ np.testing.assert_equal(df.shape, (21, 4))
+
+ def test_get_knutson_scaling_invalid_baseline_start_year(self):
+ with self.assertRaises(ValueError):
+ tc_cc.get_knutson_scaling_factor(baseline=(1870, 2022))
+
+ def test_get_knutson_scaling_invalid_baseline_end_year(self):
+ with self.assertRaises(ValueError):
+ tc_cc.get_knutson_scaling_factor(baseline=(1982, 2110))
+
+ def test_get_knutson_scaling_no_scaling_factors_for_unknown_basin(self):
+ df = tc_cc.get_knutson_scaling_factor(basin='ZZZZZ')
+ self.assertIsInstance(df, pd.DataFrame)
+ np.testing.assert_equal(df.values, np.ones_like(df.values))
+
+ def test_get_gmst(self):
+ """Test get_gmst_info function."""
+ gmst_info = tc_cc.get_gmst_info()
+
+ self.assertAlmostEqual(gmst_info['gmst_start_year'], 1880)
+ self.assertAlmostEqual(gmst_info['gmst_end_year'], 2100)
+ self.assertAlmostEqual(len(gmst_info['rcps']), 4)
+
+ self.assertAlmostEqual(gmst_info['gmst_data'].shape,
+ (len(gmst_info['rcps']),
+ gmst_info['gmst_end_year']-gmst_info['gmst_start_year']+1))
+ self.assertAlmostEqual(gmst_info['gmst_data'][0,0], -0.16)
+ self.assertAlmostEqual(gmst_info['gmst_data'][0,-1], 1.27641, 4)
+ self.assertAlmostEqual(gmst_info['gmst_data'][-1,0], -0.16)
+ self.assertAlmostEqual(gmst_info['gmst_data'][-1,-1], 4.477764, 4)
+
+ def test_get_knutson_data_pass(self):
+ """Test get_knutson_data function."""
+
+ data_knutson = tc_cc.get_knutson_data()
+
+ self.assertAlmostEqual(data_knutson.shape, (4,6,5))
+ self.assertAlmostEqual(data_knutson[0,0,0], -34.49)
+ self.assertAlmostEqual(data_knutson[-1,-1,-1], 15.419)
+ self.assertAlmostEqual(data_knutson[0,-1,-1], 4.689)
+ self.assertAlmostEqual(data_knutson[-1,0,0], 5.848)
+ self.assertAlmostEqual(data_knutson[-1,0,-1], 22.803)
+ self.assertAlmostEqual(data_knutson[2,3,2], 4.324)
if __name__ == "__main__":
TESTS = unittest.TestLoader().loadTestsFromTestCase(TestKnutson)
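The DataFrame layout asserted by `test_get_knutson_scaling_structure` and `test_get_knutson_scaling_valid_inputs` can be sketched standalone. The constants below mirror the module's `YEAR_WINDOWS_PROPS` and RCP labels, and the frame of ones reproduces the no-change default returned for an unknown basin:

```python
import numpy as np
import pandas as pd

# Mirror the module constants: projection years 2000-2100, four RCP columns
YEAR_WINDOWS_PROPS = {'start': 2000, 'end': 2100, 'smoothing': 5}
rcps = ['2.6', '4.5', '6.0', '8.5']

yearly_steps = 5
mid_years = np.arange(YEAR_WINDOWS_PROPS['start'],
                      YEAR_WINDOWS_PROPS['end'] + 1,
                      yearly_steps)

# A frame of ones means "no projected change" for every year and RCP
factors = pd.DataFrame(np.ones((mid_years.size, len(rcps))),
                       index=mid_years, columns=rcps)
print(factors.shape)
```

At the default 5-year resolution this yields the 21-row, 4-column frame the valid-inputs test checks.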
diff --git a/climada/hazard/test/test_trop_cyclone.py b/climada/hazard/test/test_trop_cyclone.py
index 3fd4f07eb..9141778ae 100644
--- a/climada/hazard/test/test_trop_cyclone.py
+++ b/climada/hazard/test/test_trop_cyclone.py
@@ -31,6 +31,7 @@
from climada.util import ureg
from climada.test import get_test_file
from climada.hazard.tc_tracks import TCTracks
+from climada.hazard.tc_clim_change import get_knutson_scaling_factor
from climada.hazard.trop_cyclone.trop_cyclone import (
TropCyclone, )
from climada.hazard.centroids.centr import Centroids
@@ -298,140 +299,75 @@ def test_two_files_pass(self):
class TestClimateSce(unittest.TestCase):
- def test_apply_criterion_track(self):
- """Test _apply_criterion function."""
+ def create_tc(self):
+ """Create mock TropCyclone object."""
+ # Setup data directly
intensity = np.zeros((4, 10))
intensity[0, :] = np.arange(10)
intensity[1, 5] = 10
intensity[2, :] = np.arange(10, 20)
intensity[3, 3] = 3
- tc = TropCyclone(
+
+ self.tc = TropCyclone(
intensity=sparse.csr_matrix(intensity),
- basin=['NA', 'NA', 'NA', 'NO'],
+ basin=['NA', 'NA', 'NA', 'WP'],
category=np.array([2, 0, 4, 1]),
- event_id=np.arange(4),
- frequency=np.ones(4) * 0.5,
+ event_id=np.arange(intensity.shape[0]),
+ frequency=np.repeat(1./intensity.shape[0], intensity.shape[0]),
+ date=np.array([723795, 728395, 738395, 724395])
)
- tc_cc = tc.apply_climate_scenario_knu(ref_year=2050, rcp_scenario=45)
- self.assertTrue(np.allclose(tc.intensity[1, :].toarray(), tc_cc.intensity[1, :].toarray()))
- self.assertTrue(np.allclose(tc.intensity[3, :].toarray(), tc_cc.intensity[3, :].toarray()))
- self.assertFalse(
- np.allclose(tc.intensity[0, :].toarray(), tc_cc.intensity[0, :].toarray()))
- self.assertFalse(
- np.allclose(tc.intensity[2, :].toarray(), tc_cc.intensity[2, :].toarray()))
- self.assertTrue(np.allclose(tc.frequency, tc_cc.frequency))
-
- def test_apply_criterion_track2(self):
+ def test_apply_climate_scenario_knu_calculations(self):
"""Test _apply_criterion function."""
- criterion = [{'basin': 'NA', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.045, 'variable': 'intensity'}
- ]
- scale = 0.75
- # artificially increase the size of the hazard by repeating (tiling) the data:
- ntiles = 8
+ ## Build tc object
+ self.create_tc()
- intensity = np.zeros((4, 10))
- intensity[0, :] = np.arange(10)
- intensity[1, 5] = 10
- intensity[2, :] = np.arange(10, 20)
- intensity[3, 3] = 3
- intensity = np.tile(intensity, (ntiles, 1))
- tc = TropCyclone(
- intensity=sparse.csr_matrix(intensity),
- basin=ntiles * ['NA', 'NA', 'NA', 'WP'],
- category=np.array(ntiles * [2, 0, 4, 1]),
- event_id=np.arange(intensity.shape[0]),
- )
+ cat05_sel = np.repeat(True, self.tc.category.shape[0])
+ cat03_sel = np.array([cat in [0,1,2,3] for cat in self.tc.category])
+ cat45_sel = np.array([cat in [4,5] for cat in self.tc.category])
- tc_cc = tc._apply_knutson_criterion(criterion, scale)
- for i_tile in range(ntiles):
- offset = i_tile * 4
- # no factor applied because of category 0
- np.testing.assert_array_equal(
- tc.intensity[offset + 1, :].toarray(), tc_cc.intensity[offset + 1, :].toarray())
- # no factor applied because of basin "WP"
- np.testing.assert_array_equal(
- tc.intensity[offset + 3, :].toarray(), tc_cc.intensity[offset + 3, :].toarray())
- # factor is applied to the remaining events
- np.testing.assert_array_almost_equal(
- tc.intensity[offset + 0, :].toarray() * 1.03375,
- tc_cc.intensity[offset + 0, :].toarray())
- np.testing.assert_array_almost_equal(
- tc.intensity[offset + 2, :].toarray() * 1.03375,
- tc_cc.intensity[offset + 2, :].toarray())
-
- def test_two_criterion_track(self):
- """Test _apply_criterion function with two criteria"""
- criterion = [
- {'basin': 'NA', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.045, 'variable': 'intensity'},
- {'basin': 'WP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.025, 'variable': 'intensity'},
- {'basin': 'WP', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1.025, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [0, 1, 2, 3, 4, 5],
- 'year': 2100, 'change': 0.7, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [1, 2, 3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [3, 4, 5],
- 'year': 2100, 'change': 1, 'variable': 'frequency'},
- {'basin': 'NA', 'category': [4, 5],
- 'year': 2100, 'change': 2, 'variable': 'frequency'}
- ]
- scale = 0.75
+ ## Retrieve scaling factors for cat 4 to 5 and 0 to 5
+ percentile = '50'
+ target_year = 2035
+ rcp = '8.5'
- intensity = np.zeros((4, 10))
- intensity[0, :] = np.arange(10)
- intensity[1, 5] = 10
- intensity[2, :] = np.arange(10, 20)
- intensity[3, 3] = 3
- tc = TropCyclone(
- intensity=sparse.csr_matrix(intensity),
- frequency=np.ones(4) * 0.5,
- basin=['NA', 'NA', 'NA', 'WP'],
- category=np.array([2, 0, 4, 1]),
- event_id=np.arange(4),
- )
+ future_tc = self.tc.apply_climate_scenario_knu(percentile=percentile,
+ scenario=rcp,
+ target_year=target_year)
- tc_cc = tc._apply_knutson_criterion(criterion, scale)
- self.assertTrue(np.allclose(tc.intensity[1, :].toarray(), tc_cc.intensity[1, :].toarray()))
- self.assertFalse(
- np.allclose(tc.intensity[3, :].toarray(), tc_cc.intensity[3, :].toarray()))
- self.assertFalse(
- np.allclose(tc.intensity[0, :].toarray(), tc_cc.intensity[0, :].toarray()))
- self.assertFalse(
- np.allclose(tc.intensity[2, :].toarray(), tc_cc.intensity[2, :].toarray()))
- self.assertTrue(
- np.allclose(tc.intensity[0, :].toarray() * 1.03375, tc_cc.intensity[0, :].toarray()))
- self.assertTrue(
- np.allclose(tc.intensity[2, :].toarray() * 1.03375, tc_cc.intensity[2, :].toarray()))
- self.assertTrue(
- np.allclose(tc.intensity[3, :].toarray() * 1.01875, tc_cc.intensity[3, :].toarray()))
-
- res_frequency = np.ones(4) * 0.5
- res_frequency[1] = 0.5 * (1 + (0.7 - 1) * scale)
- res_frequency[2] = 0.5 * (1 + (2 - 1) * scale)
- res_frequency[3] = 0.5 * (1 + (1.025 - 1) * scale)
- self.assertTrue(np.allclose(tc_cc.frequency, res_frequency))
-
- def test_negative_freq_error(self):
- """Test _apply_knutson_criterion with infeasible input."""
- criterion = [{'basin': 'SP', 'category': [0, 1],
- 'year': 2100, 'change': 0.5,
- 'variable': 'frequency'}
- ]
-
- tc = TropCyclone(
- frequency=np.ones(2),
- basin=['SP', 'SP'],
- category=np.array([0, 1]),
- )
+ for basin in np.unique(self.tc.basin):
+ basin_sel = np.array(self.tc.basin)==basin
- with self.assertRaises(ValueError):
- tc._apply_knutson_criterion(criterion, 3)
+ scaling_05, scaling_45 = [
+ get_knutson_scaling_factor(percentile=percentile,
+ variable=variable,
+ basin=basin).loc[target_year, rcp]
+ for variable in ['cat05', 'cat45']
+ ]
+
+ ## Calculate scaling factors for cat 0 to 3
+ freq_weighted_scaling_05 = scaling_05 * np.sum(self.tc.frequency[cat05_sel & basin_sel])
+ freq_weighted_scaling_45 = scaling_45 * np.sum(self.tc.frequency[cat45_sel & basin_sel])
+ freq_sum_03 = np.sum(self.tc.frequency[cat03_sel & basin_sel])
+
+ scaling_03 = (freq_weighted_scaling_05 - freq_weighted_scaling_45) / freq_sum_03
+
+ ## Check that frequencies obtained by function are the same as those obtained by scaling
+ ## historic frequencies with retrieved scaling factors
+ np.testing.assert_array_equal(
+ self.tc.frequency[cat03_sel & basin_sel] * (1 + scaling_03/100),
+ future_tc.frequency[cat03_sel & basin_sel]
+ )
+ np.testing.assert_array_equal(
+ self.tc.frequency[cat45_sel & basin_sel] * (1 + scaling_45/100),
+ future_tc.frequency[cat45_sel & basin_sel]
+ )
+
+ def test_apply_climate_scenario_knu_target_year_out_of_range(self):
+ """Test that a target year beyond the available range raises a KeyError."""
+ self.create_tc()
+ with self.assertRaises(KeyError):
+ self.tc.apply_climate_scenario_knu(target_year=2200)
class TestDumpReloadCycle(unittest.TestCase):
def setUp(self):
@@ -453,7 +389,7 @@ def tearDown(self):
if __name__ == "__main__":
- TESTS = unittest.TestLoader().loadTestsFromTestCase(TestReader)
+ TESTS = unittest.TestSuite()
TESTS.addTests(unittest.TestLoader().loadTestsFromTestCase(TestClimateSce))
TESTS.addTests(unittest.TestLoader().loadTestsFromTestCase(TestDumpReloadCycle))
unittest.TextTestRunner(verbosity=2).run(TESTS)
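The cat 0-3 factor asserted in `test_apply_climate_scenario_knu_calculations` falls out of a frequency-weighted bookkeeping identity. A minimal standalone sketch of that identity, using made-up frequencies and scaling percentages (not actual Knutson/Jewson values):

```python
import numpy as np

# Hypothetical single-basin event set (made-up values, NOT Knutson/Jewson data)
category = np.array([0, 2, 4, 5])            # Saffir-Simpson category per event
frequency = np.array([0.4, 0.3, 0.2, 0.1])   # annual frequency per event

scaling_05 = 10.0   # assumed % frequency change for all TCs (cat 0-5)
scaling_45 = 30.0   # assumed % frequency change for major TCs (cat 4-5)

cat03 = category <= 3
cat45 = category >= 4

# Choose the cat 0-3 factor so the frequency-weighted changes add up to the
# cat 0-5 change: s03 * f03 + s45 * f45 = s05 * f_total
scaling_03 = (scaling_05 * frequency.sum()
              - scaling_45 * frequency[cat45].sum()) / frequency[cat03].sum()

new_freq = frequency.copy()
new_freq[cat03] *= 1 + scaling_03 / 100
new_freq[cat45] *= 1 + scaling_45 / 100

# The total frequency change recovers the cat 0-5 scaling
total_change = (new_freq.sum() - frequency.sum()) / frequency.sum() * 100
```

With these numbers the cat 0-3 events get a much smaller relative increase than the cat 4-5 events, while the basin-wide change still matches the cat 0-5 factor, which is exactly what the unit test verifies against the real scaling tables.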
diff --git a/climada/hazard/trop_cyclone/trop_cyclone.py b/climada/hazard/trop_cyclone/trop_cyclone.py
index 43c1ed782..6dacb1b7d 100644
--- a/climada/hazard/trop_cyclone/trop_cyclone.py
+++ b/climada/hazard/trop_cyclone/trop_cyclone.py
@@ -37,7 +37,7 @@
from climada.hazard.base import Hazard
from climada.hazard.tc_tracks import TCTracks
-from climada.hazard.tc_clim_change import get_knutson_criterion, calc_scale_knutson
+from climada.hazard.tc_clim_change import get_knutson_scaling_factor
from climada.hazard.centroids.centr import Centroids
import climada.util.constants as u_const
import climada.util.coordinates as u_coord
@@ -366,46 +366,105 @@ def from_tracks(
def apply_climate_scenario_knu(
self,
- ref_year: int = 2050,
- rcp_scenario: int = 45
+ percentile: str='50',
+ scenario: str='4.5',
+ target_year: int=2050,
+ **kwargs
):
"""
- From current TC hazard instance, return new hazard set with
- future events for a given RCP scenario and year based on the
- parametrized values derived from Table 3 in Knutson et al 2015.
- https://doi.org/10.1175/JCLI-D-15-0129.1 . The scaling for different
- years and RCP scenarios is obtained by linear interpolation.
-
- Note: The parametrized values are derived from the overall changes
- in statistical ensemble of tracks. Hence, this method should only be
- applied to sufficiently large tropical cyclone event sets that
- approximate the reference years 1981 - 2008 used in Knutson et. al.
-
- The frequency and intensity changes are applied independently from
- one another. The mean intensity factors can thus slightly deviate
- from the Knutson value (deviation was found to be less than 1%
- for default IBTrACS event sets 1980-2020 for each basin).
+ From the current TC hazard instance, return a new hazard set with future
+ events for a given RCP scenario and year, based on the parametrized values
+ derived by Jewson 2021 (https://doi.org/10.1175/JAMC-D-21-0102.1) from those
+ published by Knutson 2020 (https://doi.org/10.1175/BAMS-D-18-0194.1). The
+ scaling for different years and RCP scenarios is obtained by linear
+ interpolation.
+
+ Note: Only frequency changes are applied, as suggested by Jewson 2022
+ (https://doi.org/10.1007/s00477-021-02142-6). Scaling frequencies alone
+ already changes mean intensities and, more importantly, avoids possible
+ inconsistencies (including double-counting) that may arise when both
+ frequency and intensity changes are applied, since the relationship
+ between the two is non-trivial to resolve.
Parameters
----------
- ref_year : int
- year between 2000 ad 2100. Default: 2050
- rcp_scenario : int
- 26 for RCP 2.6, 45 for RCP 4.5, 60 for RCP 6.0 and 85 for RCP 8.5.
- The default is 45.
-
+ percentile: str
+ percentiles of the Knutson et al. 2020 estimates, representing the model
+ uncertainty in future changes in TC activity. These estimates come from
+ a review of state-of-the-art literature and models. For the 'cat05' variable
+ (i.e. frequency of all tropical cyclones) the 5th, 25th, 50th, 75th and 95th
+ percentiles are provided. For 'cat45' and 'intensity', the provided percentiles
+ are the 10th, 25th, 50th, 75th and 90th. Please refer to the mentioned publications
+ for more details.
+ possible percentiles:
+ '5/10' either the 5th or 10th percentile depending on variable (see text above)
+ '25' for the 25th percentile
+ '50' for the 50th percentile
+ '75' for the 75th percentile
+ '90/95' either the 90th or 95th percentile depending on variable (see text above)
+ Default: '50'
+ scenario : str
+ possible scenarios:
+ '2.6' for RCP 2.6
+ '4.5' for RCP 4.5
+ '6.0' for RCP 6.0
+ '8.5' for RCP 8.5
+ Default: '4.5'
+ target_year : int
+ future year to be simulated, between 2000 and 2100. Default: 2050.
+ **kwargs :
+ additional keyword arguments passed to
+ climada.hazard.tc_clim_change.get_knutson_scaling_factor.
Returns
-------
haz_cc : climada.hazard.TropCyclone
Tropical cyclone with frequencies and intensity scaled according
- to the Knutson criterion for the given year and RCP. Returns
- a new instance of climada.hazard.TropCyclone, self is not
+ to the Knutson criterion for the given year, RCP and percentile.
+ Returns a new instance of climada.hazard.TropCyclone, self is not
modified.
"""
- chg_int_freq = get_knutson_criterion()
- scale_rcp_year = calc_scale_knutson(ref_year, rcp_scenario)
- haz_cc = self._apply_knutson_criterion(chg_int_freq, scale_rcp_year)
- return haz_cc
+
+ if self.category.size == 0:
+ LOGGER.warning("Tropical cyclone categories are missing and"
+ "no effect of climate change can be modelled."
+ "The original event set is returned")
+ return self
+
+ tc_cc = copy.deepcopy(self)
+
+ sel_cat05 = np.isin(tc_cc.category, [0, 1, 2, 3, 4, 5])
+ sel_cat03 = np.isin(tc_cc.category, [0, 1, 2, 3])
+ sel_cat45 = np.isin(tc_cc.category, [4, 5])
+
+ years = np.array([dt.datetime.fromordinal(date).year for date in self.date])
+
+ for basin in np.unique(tc_cc.basin):
+ scale_year_rcp_05, scale_year_rcp_45 = [
+ get_knutson_scaling_factor(
+ percentile=percentile,
+ variable=variable,
+ basin=basin,
+ baseline=(np.min(years), np.max(years)),
+ **kwargs
+ ).loc[target_year, scenario]
+ for variable in ['cat05', 'cat45']
+ ]
+
+ bas_sel = np.array(tc_cc.basin) == basin
+
+ cat_05_freqs_change = scale_year_rcp_05 * np.sum(tc_cc.frequency[sel_cat05 & bas_sel])
+ cat_45_freqs_change = scale_year_rcp_45 * np.sum(tc_cc.frequency[sel_cat45 & bas_sel])
+ cat_03_freqs = np.sum(tc_cc.frequency[sel_cat03 & bas_sel])
+
+ scale_year_rcp_03 = (cat_05_freqs_change-cat_45_freqs_change) / cat_03_freqs
+
+ tc_cc.frequency[sel_cat03 & bas_sel] *= 1 + scale_year_rcp_03/100
+ tc_cc.frequency[sel_cat45 & bas_sel] *= 1 + scale_year_rcp_45/100
+
+ if np.any(tc_cc.frequency < 0):
+ raise ValueError(
+ "The application of the climate scenario leads to"
+ " negative frequencies. One solution - if appropriate -"
+ " could be to use a less extreme percentile."
+ )
+
+ return tc_cc
def set_climate_scenario_knu(self, *args, **kwargs):
"""This function is deprecated, use TropCyclone.apply_climate_scenario_knu instead."""
diff --git a/doc/tutorial/climada_hazard_TropCyclone.ipynb b/doc/tutorial/climada_hazard_TropCyclone.ipynb
index b82b38184..79b63981a 100644
--- a/doc/tutorial/climada_hazard_TropCyclone.ipynb
+++ b/doc/tutorial/climada_hazard_TropCyclone.ipynb
@@ -1905,7 +1905,7 @@
" \n",
"### b) Implementing climate change\n",
"\n",
- "`apply_climate_scenario_knu` implements the changes on intensity and frequency due to climate change described in *Global projections of intense tropical cyclone activity for the late twenty-first century from dynamical downscaling of CMIP5/RCP4.5 scenarios* of Knutson et al 2015. Other RCP scenarios are approximated from the RCP 4.5 values by interpolating them according to their relative radiative forcing."
+ "`apply_climate_scenario_knu` implements the changes in frequency due to climate change described in Knutson et al (2020) and Jewson et al. (2021). This requires to pass the rcp scenario of interest, the projection's future reference year, the projection's percentile of interest and the historical baseline period. For simplicity we keep these latter two as default values only spefify the rcp (45) and the future reference year (2055)."
]
},
{
@@ -1944,15 +1944,20 @@
"source": [
"# an Irma event-like in 2055 under RCP 4.5:\n",
"tc_irma = TropCyclone.from_tracks(tr_irma, centroids=cent)\n",
- "tc_irma_cc = tc_irma.apply_climate_scenario_knu(ref_year=2055, rcp_scenario=45)\n",
- "tc_irma_cc.plot_intensity('2017242N16333');"
+ "tc_irma_cc = tc_irma.apply_climate_scenario_knu(target_year=2055, scenario='4.5')\n",
+ "\n",
+ "rel_freq_incr = np.round(\n",
+ " (np.mean(tc_irma_cc.frequency) - np.mean(tc_irma.frequency)\n",
+ " ) / np.mean(tc_irma.frequency)*100, 0)\n",
+ "\n",
+ "print(f\"\\nA TC like Irma would undergo a frequency increase of about {rel_freq_incr} % in 2055 under RCP 45\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "**Note:** this method to implement climate change is simplified and does only take into account changes in TC frequency and intensity. However, how hurricane damage changes with climate remains challenging to assess. Records of hurricane damage exhibit widely fluctuating values because they depend on rare, landfalling events which are substantially more volatile than the underlying basin-wide TC characteristics. For more accurate future projections of how a warming climate might shape TC characteristics, there is a two-step process needed. First, the understanding of how climate change affects critical environmental factors (like SST, humidity, etc.) that shape TCs is required. Second, the means of simulating how these changes impact TC characteristics (such as intensity, frequency, etc.) are necessary. Statistical-dynamical models (Emanuel et al., 2006 and Lee et al., 2018) are physics-based and allow for such climate change studies. However, this goes beyond the scope of this tutorial."
+ "**Note:** this method to implement climate change is simplified and does only take into account changes in TC frequency. However, how hurricane damage changes with climate remains challenging to assess. Records of hurricane damage exhibit widely fluctuating values because they depend on rare, landfalling events which are substantially more volatile than the underlying basin-wide TC characteristics. For more accurate future projections of how a warming climate might shape TC characteristics, there is a two-step process needed. First, the understanding of how climate change affects critical environmental factors (like SST, humidity, etc.) that shape TCs is required. Second, the means of simulating how these changes impact TC characteristics (such as intensity, frequency, etc.) are necessary. Statistical-dynamical models (Emanuel et al., 2006 and Lee et al., 2018) are physics-based and allow for such climate change studies. However, this goes beyond the scope of this tutorial."
]
},
{