AK landfall and full pipeline #13
Conversation
I tested this PR by executing the script and the "full pipeline" notebook to produce the dataset of landfalling AR events. This works as expected and is producing some great results!!
I've made some very, very minor suggestions. See those, and then this is ready to merge!
```diff
         File path of output shapefile
-    csv_fp : File path
+    csv_fp : Posix path
```
Smart play to also output as CSV 💯
```diff
     fp_6hr : PosixPath
         File path for the raw 6hr interval landfall AR shapefile output.
     fp_events : PosixPath
```
I like how these are distinguished: `6hr` vs. `events`
```diff
     ak_d = ak_.dissolve()

     # add new datetime column to ars gdf by parsing ISO timestamp
     # reformat time column string for output (datetime fields not supported in ESRI shp files)
```
thanks ESRI 🙅
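The comments in the diff above describe a round trip that is common when writing shapefiles: parse the ISO timestamp strings into real datetimes for processing, then flatten them back to plain strings before export, because the ESRI shapefile format has no datetime field type. A minimal sketch of that idea with pandas (the column name `time` and the output format are assumptions, not necessarily what `ar_detection.py` uses):

```python
import pandas as pd

# Hypothetical AR attribute table; the "time" column name is illustrative.
df = pd.DataFrame({"time": ["2020-01-01T00:00:00", "2020-01-01T06:00:00"]})

# Parse ISO timestamps into a true datetime column for sorting/grouping...
df["dt"] = pd.to_datetime(df["time"])

# ...then reformat back to a plain string before writing, since ESRI
# shapefiles cannot store datetime fields.
df["time"] = df["dt"].dt.strftime("%Y-%m-%d %H:%M")
print(df["time"].tolist())
```

In a geodataframe the same string column can then be written with `to_file()` without a field-type error.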
```diff
@@ -518,17 +520,89 @@ def create_shapefile(all_ars, shp_fp, csv_fp):
     pd.DataFrame.from_dict(data=col_dict, orient='index').to_csv(csv_fp, header=False)


-def detect_all_ars(fp, n_criteria, out_shp, out_csv):
+def landfall_ars_export(shp_fp, ak_shp, fp_6hr, fp_events):
```
I like how you've commented what is happening with this function!
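The landfall filter this function performs can be illustrated with plain shapely predicates. This is only a sketch of the idea, not the implementation: the PR uses a geopandas spatial join against the dissolved AK coastline polygon, and the names `coast` and `ars` here are hypothetical stand-ins.

```python
from shapely.geometry import box

# Stand-ins: "coast" plays the role of the dissolved AK coastline polygon,
# "ars" the detected AR polygons for one timestep.
coast = box(0, 0, 10, 10)
ars = [box(5, 5, 15, 15), box(20, 20, 25, 25)]

# Keep only AR polygons that intersect the coastline -- the same filtering
# that a geopandas sjoin with the coastline layer expresses.
landfalling = [p for p in ars if p.intersects(coast)]
print(len(landfalling))  # only the first polygon touches the coast
```

With geodataframes, the join additionally carries the original AR attribute columns through to the landfall layer, which is why the 6hr output keeps all properties.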
Co-authored-by: Charlie Parr <[email protected]>
Overview: This PR adds a landfall analysis of the ARs detected in previous work. It also completes the `ar_detection.py` processing pipeline, which can now be run from a master function via notebook, or entirely from the terminal using the “main” code block. This PR closes #11, closes #2, and closes #3. With regard to #3, this is by far the simplest way to merge AR time slices into events, and there are surely more sophisticated ways to do this. In the interest of time, I suggest we go with this solution unless we hear a specific need for something else.

Detail: A new `landfall_ars_export()` function was added to the `ar_detection.py` module, and a corresponding AK coastline shapefile was imported from the GVV repo. This new function uses spatial join operations to filter the full AR detection shapefile down to only those polygons intersecting the AK coastline polygon. The landfall geodataframe (with all original AR properties) is exported as its own shapefile (`landfall_ars_6hr.shp`). The new function also groups the landfalling ARs that occur on adjacent dates and combines their geometry into a multipolygon representing the entire AR event. All AR property attributes are disregarded except for the start and end time of the event, and this event layer is also exported as its own shapefile (`landfall_ars_events.shp`). This output could definitely be improved; suggestions are welcome, especially with regard to aggregating the AR properties from the individual 6hr timesteps.

The `config.py` file was revised to include some more file paths for the AK shapefile and the new shapefile outputs. The `ar_detection.py` file was further revised to finish the master processing function and the “main” code block.

Instructions: The README and `config.py` files should provide the basic setup for testing this PR. No changes have been made to `environment.yml`, `download.py`, or `compute.py`, so any previously downloaded/computed inputs from the “normal period” branch can safely be used with the “cp_ar_avalanche” conda environment. To test this PR:

1. Build the environment from `environment.yml` (optional) and activate it
2. Update the `config.py` file
3. Run `python download.py` (optional)
4. Run `python compute_ivt.py` (optional)
5. Execute the `AR_full_pipeline.ipynb` notebook to run the entire detection pipeline from the master function (should take ~45 min), or run `python ar_detection.py` from the terminal with no arguments
6. Use the `AR_full_pipeline.ipynb` notebook to check attributes/visualization if desired
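The "merge AR time slices into events" step described above can be sketched with plain pandas. This is a simplified reading of the grouping rule, assuming consecutive 6hr timesteps belong to one event and any larger gap starts a new one; the PR's actual rule (grouping by adjacent dates) may differ, and all names here are illustrative.

```python
import pandas as pd

# Hypothetical landfall timestamps at 6 hr intervals.
times = pd.Series(pd.to_datetime([
    "2020-01-01 00:00", "2020-01-01 06:00", "2020-01-01 12:00",  # event 1
    "2020-01-05 00:00", "2020-01-05 06:00",                      # event 2
])).sort_values()

# Increment an event id whenever consecutive timesteps are more than
# 6 hours apart, then keep only start/end per event -- mirroring how the
# events layer discards per-timestep attributes.
event_id = (times.diff() > pd.Timedelta(hours=6)).cumsum()
events = times.groupby(event_id).agg(start="min", end="max")
print(events)
```

In the geodataframe version, the same grouping key would drive a `dissolve`, which unions each event's polygons into the multipolygon geometry exported to `landfall_ars_events.shp`.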