Modernization #71

Merged
merged 27 commits on Feb 18, 2025
Commits
9ebd3e3
Convert to pyproject.toml
maresb Feb 14, 2025
a6df497
Switch to hatch-vcs
maresb Feb 14, 2025
b61efe6
Update MDR file creation
maresb Feb 15, 2025
4e95c4b
Rework convert_lon_to180 without explicit perturbation
maresb Feb 15, 2025
d66f1fe
Replace unreadable pkl with JSON
maresb Feb 15, 2025
2f2c894
Remove loaded but unused basins from run_sample.py
maresb Feb 15, 2025
c882bb3
Fix MDR reference in README
maresb Feb 15, 2025
76bceb8
Fix typos and broken links in README.md
maresb Feb 15, 2025
0ec65b9
Add simple .gitignore
maresb Feb 15, 2025
831d9f9
Remove extraneous requirements.txt
maresb Feb 15, 2025
d5e17c9
Add test
maresb Feb 15, 2025
0b63db7
Switch to src/ layout
maresb Feb 15, 2025
8848154
Add pytest and release workflows
maresb Feb 15, 2025
4333615
Configure doctests
maresb Feb 15, 2025
e4e68fb
Add doctests for utilities.py
maresb Feb 15, 2025
7c24f29
Cover exceptional cases in decompose_pi
maresb Feb 15, 2025
ec8ab36
Use ellipses instead of rounding
maresb Feb 15, 2025
ca061c6
Use module-level doctest ellipses
maresb Feb 15, 2025
48cde13
Allow disabling of numba for testing
maresb Feb 15, 2025
c71871c
Convert large comment blocks to docstrings
maresb Feb 18, 2025
e680642
Make pytest verbose by default
maresb Feb 18, 2025
ecfeff0
Remove xarray as project dependency
maresb Feb 18, 2025
60899f7
Test only on min/max Python versions
maresb Feb 18, 2025
d6cfd4f
Fix pytest workflow
maresb Feb 18, 2025
96fb60a
Separate njit no-op decorator from conditional
maresb Feb 18, 2025
bced721
Make DISABLE_NUMBA specific to tcpyPI
maresb Feb 18, 2025
09b4635
Add doctest for pi
maresb Feb 18, 2025
46 changes: 46 additions & 0 deletions .github/workflows/pytest.yaml
@@ -0,0 +1,46 @@
name: Run tests

on:
push:
branches:
- master
- main
pull_request:

jobs:
test:
name: Python ${{ matrix.python-version }} on ${{ matrix.os }}
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false # Don't cancel other jobs if one fails
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
# Python 3.7 is no longer available on the -latest series (except for Windows)
python-version: ["3.8", "3.13"]

steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0 # Needed for git history (tags)

- uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

- name: Install test dependencies
run: |
python -m pip install --upgrade pip
python -m pip install pytest pytest-xdist --editable .
python -m pip freeze --local

- name: Run tests not requiring xarray
run: |
python -m pytest -n auto --ignore run_sample.py --ignore tests/test_run_sample.py

- name: Install xarray
run: |
python -m pip install xarray h5netcdf

- name: Run tests requiring xarray
run: |
python -m pytest -n auto tests/test_run_sample.py
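The test workflow above keeps xarray optional: the first pytest invocation ignores run_sample.py and tests/test_run_sample.py, and only the second run — after installing xarray and h5netcdf — exercises the xarray-dependent test. A roughly equivalent guard can also live inside a test module via pytest's `importorskip`; the sketch below is illustrative and not necessarily how tests/test_run_sample.py is actually written.

```python
# Illustrative sketch of an xarray-optional test module; the repository's
# tests/test_run_sample.py may handle the optional dependency differently.
import pytest

xr = pytest.importorskip("xarray")  # skip the whole module cleanly if xarray is absent


def test_tiny_dataset_builds():
    ds = xr.Dataset({"sst": ("x", [300.0, 301.5])})
    assert ds["sst"].shape == (2,)
```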
50 changes: 50 additions & 0 deletions .github/workflows/release.yaml
@@ -0,0 +1,50 @@
name: release-pipeline

on:
push:
branches:
- master
- main
release:
types:
- published

jobs:
build-package:
runs-on: ubuntu-latest
permissions:
# write attestations and id-token are necessary for attest-build-provenance-github
attestations: write
id-token: write
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- uses: hynek/build-and-inspect-python-package@v2
with:
# Prove that the packages were built in the context of this workflow.
attest-build-provenance-github: true

publish-package:
# Don't publish from forks
if: github.repository_owner == 'dgilford' && github.event_name == 'release' && github.event.action == 'published'
# Use the `release` GitHub environment to protect the Trusted Publishing (OIDC)
# workflow by requiring signoff from a maintainer.
environment: release
needs: build-package
runs-on: ubuntu-latest
permissions:
# write id-token is necessary for trusted publishing (OIDC)
id-token: write
steps:
- name: Download Distribution Artifacts
uses: actions/download-artifact@v4
with:
# The build-and-inspect-python-package action invokes upload-artifact.
# These are the correct arguments from that action.
name: Packages
path: dist
- name: Publish Package to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
# Implicitly attests that the packages were uploaded in the context of this workflow.
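The wheels and sdists built and attested above get their version from Git tags via the hatch-vcs configuration added to pyproject.toml later in this PR. Once a published release is installed, that tag-derived version can be read back with the standard library (Python 3.8+); a minimal sketch, assuming the distribution resolves under the normalized name `tcpypi`:

```python
# Minimal sketch: report which tag-derived version of tcpyPI is installed.
# Assumes Python 3.8+ and that the distribution name resolves as "tcpypi".
from importlib.metadata import version

print(version("tcpypi"))
```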
28 changes: 28 additions & 0 deletions .gitignore
@@ -0,0 +1,28 @@
# Python
__pycache__/
*.py[cod]
*.so
dist/
*.egg-info

# Jupyter
.ipynb_checkpoints

# Environment
.env
venv/
.pixi/

# IDE
.idea/
.vscode/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db

# Data
*.nc
!data/*.nc
18 changes: 9 additions & 9 deletions README.md
@@ -14,7 +14,7 @@ The goals in developing and maintaining pyPI are to:
* carefully document the BE02 algorithm and its Python implementation, and to
* demonstrate and encourage the use of potential intensity theory in tropical cyclone climatology analysis.

If you have any questions, comments, or feedback, please [contact the developer](mailto:[email protected]) or open an [Issue](https://github.com/dgilford/pyPI/issues) in the repository. A paper detailing pyPI is published [at Geoscientific Model Development](https://gmd.copernicus.org/articles/14/2351/2021/gmd-14-2351-2021.pdf).
If you have any questions, comments, or feedback, please [contact the developer](mailto:[email protected]) or open an [Issue](https://github.com/dgilford/pyPI/issues) in the repository. A paper detailing pyPI is published [at Geoscientific Model Development](https://gmd.copernicus.org/articles/14/2351/2021/gmd-14-2351-2021.pdf).

## Citation
pyPI was developed by [Daniel Gilford](https://github.com/dgilford) and has been archived on Zenodo:
@@ -58,13 +58,13 @@ pip install tcpypi
* [Numba](http://numba.pydata.org/)

Not required by tcpyPI---but highly recommended!---is the versatility in calculating PI over large datasets provided by [xarray](http://xarray.pydata.org/en/stable/).
Dependancy versions were originally handled by [Dependabot](https://dependabot.com/), but the code was not resilient to these changes so they are currently defunct (as of 10 August 2022). Please [notify me](mailto:[email protected]) immediately if installation problems persist.
Dependency versions were originally handled by [Dependabot](https://dependabot.com/), but the code was not resilient to these changes so they are currently defunct (as of 10 August 2022). Please [notify me](mailto:[email protected]) immediately if installation problems persist.

### Python Implementation of "pc_min" (BE02 PI Calculator)

[pi.py](pi.py) is the Python function which directly computes PI given atmospheric and ocean state variables (akin to the BE02 algorithm MATLAB implementation [pc_min.m](pc_min.m)). Given input vector columns of environmental atmospheric temperatures (T) and mixing ratios (R) on a pressure grid (P), sea surface temperatures (SST), and mean sea-level pressures (MSL), the algorithm outputs potential intensity, the outflow level, the outflow temperature, and the minimum central pressure, and a flag that shows the status of the completed PI calculation. pyPI is an improvement on pcmin in that it handles missing values depending on user input flags.
[pi.py](src/tcpyPI/pi.py) is the Python function which directly computes PI given atmospheric and ocean state variables (akin to the BE02 algorithm MATLAB implementation [pc_min.m](matlab_scripts/pc_min.m)). Given input vector columns of environmental atmospheric temperatures (T) and mixing ratios (R) on a pressure grid (P), sea surface temperatures (SST), and mean sea-level pressures (MSL), the algorithm outputs potential intensity, the outflow level, the outflow temperature, and the minimum central pressure, and a flag that shows the status of the completed PI calculation. pyPI is an improvement on pcmin in that it handles missing values depending on user input flags.

Users who want to apply the PI calculation to a set of local environmental conditions need only to download [pi.py](./tcpyPI/pi.py), organize their data appropriately, and call the function to return outputs, e.g.:
Users who want to apply the PI calculation to a set of local environmental conditions need only to download [pi.py](./src/tcpyPI/pi.py), organize their data appropriately, and call the function to return outputs, e.g.:
```
(VMAX,PMIN,IFL,TO,LNB)=pi(SST,MSL,P,T,R)
```
@@ -80,12 +80,12 @@ and examine the outputs locally produced in [full_sample_output.nc](./data/full_
## File Descriptions

#### Key files
* **[pi.py](./tcpyPI/pi.py)** - The primary function of pyPI, that computes and outputs PI (and associated variables) given atmospheric and ocean state variables.
* **[pi.py](./src/tcpyPI/pi.py)** - The primary function of pyPI, that computes and outputs PI (and associated variables) given atmospheric and ocean state variables.
* **[run_sample.py](run_sample.py)** - Example script that computes PI and accompanying analyses over the entire sample dataset

#### Data
* [sample_data.nc](./data/sample_data.nc) - Sample atmospheric and ocean state variable data and BE02 MATLAB output data; values are monthly averages over the globe from MERRA2 in 2004.
* [mdr.pk1](./data/mdr.pk1) - Python pickled dictionary containing Main Development Region definitions from [Gilford et al. (2017)](https://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-16-0827.1)
* [mdr.json](./data/mdr.json) - Main Development Region definitions from [Gilford et al. (2017)](https://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-16-0827.1)
* [raw_sample_output.nc](./data/raw_sample_output.nc) - Sample outputs from pi.py *only* created by run_sample.py
* [full_sample_output.nc](./data/full_sample_output.nc) - Full set of sample outputs from pi.py as well as sample analyses such as PI decomposition

@@ -95,9 +95,9 @@ and examine the outputs locally produced in [full_sample_output.nc](./data/full_
* **[sample_output_analyses.ipynb](./notebooks/sample_output_analyses.ipynb)** - Notebook showing examples of pyPI outputs and simple PI analyses

#### Misc.
* **[utilities.py](./tcpyPI/utilities.py)** - Set of functions used in the pyPI codebase
* **[constants.py](./tcpyPI/constants.py)** - Set of meteorological constants used in the pyPI codebase
* **[reference_calculations.m](./matlab_scripts/reference_calculations.m)** - Script used to generate sample BE02 MATLAB outout data from original MERRA2 files monthly mean; included for posterity and transperancy
* **[utilities.py](./src/tcpyPI/utilities.py)** - Set of functions used in the pyPI codebase
* **[constants.py](./src/tcpyPI/constants.py)** - Set of meteorological constants used in the pyPI codebase
* **[reference_calculations.m](./matlab_scripts/reference_calculations.m)** - Script used to generate sample BE02 MATLAB output data from original MERRA2 files monthly mean; included for posterity and transparency
* **[pc_min.m](./matlab_scripts/pc_min.m)** - Original BE02 algorithm from MATLAB, adapted and used to produce analyses of Gilford et al. ([2017](https://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-16-0827.1); [2019](https://journals.ametsoc.org/doi/10.1175/MWR-D-19-0021.1))
* **[clock_pypi.ipynb](./notebooks/clock_pypi.ipynb)** - Notebook estimating the time it takes to run pyPI on a laptop

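As the README notes above, xarray is not a project dependency but is the recommended way to apply `pi` across a gridded dataset. A minimal sketch with `xr.apply_ufunc` follows; the variable and dimension names (`sst`, `msl`, `p`, `t`, `q`) and the import path are assumptions, and the repository's run_sample.py may organize this differently.

```python
# Illustrative sketch only -- not the repository's run_sample.py.
# Assumes sample data with variables sst, msl, t, q on a pressure dimension "p";
# the actual names in data/sample_data.nc may differ.
import xarray as xr
from tcpyPI.pi import pi  # import path assumed from the src/tcpyPI layout

ds = xr.open_dataset("data/sample_data.nc")

vmax, pmin, ifl, to, lnb = xr.apply_ufunc(
    pi,
    ds["sst"], ds["msl"], ds["p"], ds["t"], ds["q"],
    input_core_dims=[[], [], ["p"], ["p"], ["p"]],
    output_core_dims=[[], [], [], [], []],
    vectorize=True,  # loop the scalar-profile pi call over every grid point
)
```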
47 changes: 47 additions & 0 deletions data/mdr.json
@@ -0,0 +1,47 @@
{
"na": {
"color": "red",
"name": "North Atlantic",
"shortname": "NA",
"lat_min": 7.5,
"lat_max": 32.5,
"lon_min": -95.0,
"lon_max": -50.0
},
"enp": {
"color": "darkgreen",
"name": "Eastern North Pacific",
"shortname": "ENP",
"lat_min": 2.5,
"lat_max": 17.5,
"lon_min": -170.0,
"lon_max": -90.0
},
"wnp": {
"color": "blue",
"name": "Western North Pacific",
"shortname": "WNP",
"lat_min": 2.5,
"lat_max": 17.5,
"lon_min": 130.0,
"lon_max": 180.0
},
"ni": {
"color": "gold",
"name": "North Indian",
"shortname": "NI",
"lat_min": 2.5,
"lat_max": 22.5,
"lon_min": 50.0,
"lon_max": 110.0
},
"sh": {
"color": "black",
"name": "Sothern Hemisphere",
"shortname": "SH",
"lat_min": -20.0,
"lat_max": -2.5,
"lon_min": 60.0,
"lon_max": 180.0
}
}
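Unlike the pickle it replaces, the new mdr.json can be read with the standard library alone; a minimal sketch (the loading code in the repository may differ):

```python
# Minimal sketch: read the Main Development Region definitions added above.
# Only the file format is demonstrated; run_sample.py may load it differently.
import json
from pathlib import Path

basins = json.loads(Path("data/mdr.json").read_text())

for basin in basins.values():
    print(f"{basin['shortname']}: lat {basin['lat_min']}..{basin['lat_max']}, "
          f"lon {basin['lon_min']}..{basin['lon_max']}")
```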
Binary file removed data/mdr.pk1
Binary file not shown.
99 changes: 54 additions & 45 deletions notebooks/sample_output_analyses.ipynb
@@ -162,46 +162,56 @@
"outputs": [],
"source": [
"# find grid spacing and set bounds from central points according with it\n",
"xl=np.abs(ds.lon[1]-ds.lon[0])/2\n",
"yl=np.abs(ds.lat[1]-ds.lat[0])/2\n",
"# create the basins and store in a dictionary:\n",
"na=xr.Dataset({\n",
" 'color': 'red', \n",
" 'bounds': xr.DataArray(np.asarray([8.75-yl, 31.25+yl, convert_lon_to180(266.25-xl), convert_lon_to180(308.75+xl)]).reshape(2,2), coords=[[0,1], [0,1]], dims=['lat_range', 'lon_range']),\n",
" 'name': 'North Atlantic',\n",
" 'shortname': 'NA',\n",
" })\n",
"enp=xr.Dataset({\n",
" 'color': 'darkgreen', \n",
" 'bounds': xr.DataArray(np.asarray([3.75-yl, 16.25+yl, convert_lon_to180(191.25-xl), convert_lon_to180(268.75+xl)]).reshape(2,2), coords=[[0,1], [0,1]], dims=['lat_range', 'lon_range']),\n",
" 'name': 'Eastern North Pacific',\n",
" 'shortname': 'ENP',\n",
" })\n",
"wnp=xr.Dataset({\n",
" 'color': 'blue', \n",
" 'bounds': xr.DataArray(np.asarray([3.75-yl, 16.25+yl, convert_lon_to180(131.25-xl), convert_lon_to180(178.75+xl)]).reshape(2,2), coords=[[0,1], [0,1]], dims=['lat_range', 'lon_range']),\n",
" 'name': 'Western North Pacific',\n",
" 'shortname': 'WNP',\n",
" })\n",
"ni=xr.Dataset({\n",
" 'color': 'gold', \n",
" 'bounds': xr.DataArray(np.asarray([3.75-yl, 21.25+yl, convert_lon_to180(51.25-xl), convert_lon_to180(108.75+xl)]).reshape(2,2), coords=[[0,1], [0,1]], dims=['lat_range', 'lon_range']),\n",
" 'name': 'North Indian',\n",
" 'shortname': 'NI',\n",
" })\n",
"sh=xr.Dataset({\n",
" 'color': 'black', \n",
" 'bounds': xr.DataArray(np.asarray([-18.75-yl, -3.75+yl, convert_lon_to180(61.25-xl), convert_lon_to180(178.75+xl)]).reshape(2,2), coords=[[0,1], [0,1]], dims=['lat_range', 'lon_range']),\n",
" 'name': 'Sothern Hemisphere',\n",
" 'shortname': 'SH',\n",
" })\n",
"basins={\n",
" 'na': na,\n",
" 'enp': enp,\n",
" 'wnp': wnp,\n",
" 'ni': ni,\n",
" 'sh': sh, \n",
" }"
"xl=(np.abs(ds.lon[1]-ds.lon[0])/2).item()\n",
"yl=(np.abs(ds.lat[1]-ds.lat[0])/2).item()\n",
"# create the basins as a dictionary:\n",
"basins = {\n",
" \"na\": {\n",
" \"color\": \"red\",\n",
" \"name\": \"North Atlantic\",\n",
" \"shortname\": \"NA\",\n",
" \"lat_min\": 8.75 - yl,\n",
" \"lat_max\": 31.25 + yl,\n",
" \"lon_min\": convert_lon_to180(266.25 - xl),\n",
" \"lon_max\": convert_lon_to180(308.75 + xl),\n",
" },\n",
" \"enp\": {\n",
" \"color\": \"darkgreen\",\n",
" \"name\": \"Eastern North Pacific\",\n",
" \"shortname\": \"ENP\",\n",
" \"lat_min\": 3.75 - yl,\n",
" \"lat_max\": 16.25 + yl,\n",
" \"lon_min\": convert_lon_to180(191.25 - xl),\n",
" \"lon_max\": convert_lon_to180(268.75 + xl),\n",
" },\n",
" \"wnp\": {\n",
" \"color\": \"blue\",\n",
" \"name\": \"Western North Pacific\",\n",
" \"shortname\": \"WNP\",\n",
" \"lat_min\": 3.75 - yl,\n",
" \"lat_max\": 16.25 + yl,\n",
" \"lon_min\": convert_lon_to180(131.25 - xl),\n",
" \"lon_max\": convert_lon_to180(178.75 + xl),\n",
" },\n",
" \"ni\": {\n",
" \"color\": \"gold\",\n",
" \"name\": \"North Indian\",\n",
" \"shortname\": \"NI\",\n",
" \"lat_min\": 3.75 - yl,\n",
" \"lat_max\": 21.25 + yl,\n",
" \"lon_min\": convert_lon_to180(51.25 - xl),\n",
" \"lon_max\": convert_lon_to180(108.75 + xl),\n",
" },\n",
" \"sh\": {\n",
" \"color\": \"black\",\n",
" \"name\": \"Sothern Hemisphere\",\n",
" \"shortname\": \"SH\",\n",
" \"lat_min\": -18.75 - yl,\n",
" \"lat_max\": -3.75 + yl,\n",
" \"lon_min\": convert_lon_to180(61.25 - xl),\n",
" \"lon_max\": convert_lon_to180(178.75 + xl),\n",
" },\n",
"}"
]
},
{
@@ -211,11 +211,10 @@
"outputs": [],
"source": [
"# save out the basins as defined\n",
"import pickle\n",
"f = open(\"../data/mdr.pk1\",\"wb\")\n",
"_mdrF=\"../data/mdr.pk1\"\n",
"pickle.dump(basins,f)\n",
"f.close()"
"import json\n",
"from pathlib import Path\n",
"f = Path(\"../data/mdr.json\")\n",
"f.write_text(json.dumps(basins,indent=2))"
]
},
{
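The basin bounds written above (and into mdr.json) come from `convert_lon_to180`, which commit 4e95c4b reworks to avoid an explicit perturbation; the implementation itself is not part of this diff. A sketch consistent with the values in mdr.json — where 266.25 − 1.25 maps to −95.0 and 178.75 + 1.25 maps to 180.0 rather than −180.0 — could look like this, though the package's actual function may differ:

```python
# Illustrative only -- tcpyPI's actual convert_lon_to180 may be implemented differently.
# Consistent with mdr.json above: 265.0 -> -95.0 and 180.0 -> 180.0 (not -180.0).
def convert_lon_to180(lon):
    """Map a longitude in [0, 360] to (-180, 180], keeping 180 as 180."""
    return lon - 360.0 if lon > 180.0 else lon


assert convert_lon_to180(265.0) == -95.0
assert convert_lon_to180(180.0) == 180.0
```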
50 changes: 50 additions & 0 deletions pyproject.toml
@@ -0,0 +1,50 @@
[build-system]
requires = ["hatchling", "hatch-vcs"]
build-backend = "hatchling.build"

[project]
name = "tcpyPI"
dynamic = ["version"]
description = "tcpyPI: Tropical cyclone potential intensity calculations in Python"
readme = "README.md"
requires-python = ">=3.7"
license = {file = "LICENSE" }
authors = [
{ name = "Daniel M. Gilford, PhD", email = "[email protected]" }
]
classifiers = [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent"
]
dependencies = [
"numba>=0.51.2",
"numpy>=1.19.5"
]

[project.urls]
Homepage = "https://github.com/dgilford/tcpyPI"
Download = "https://github.com/dgilford/tcpyPI"

# Configure hatch-vcs to extract version information from version control (e.g. Git tags)
[tool.hatch.version]
source = "vcs"

[tool.hatch.build.targets.sdist]
exclude = [
"/.github",
"/data",
"/dist",
"/figures",
"/matlab_scripts",
"/notebooks",
"/tests",
"/pyPI_Users_Guide_v1.3.pdf",
"/run_sample.py",
]

[tool.hatch.build.targets.wheel]
packages = ["src/tcpyPI"]

[tool.pytest.ini_options]
addopts = ["--doctest-modules", "-v"]
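The `--doctest-modules` option above is what makes pytest collect the docstring examples added throughout this PR (commits 4333615, e4e68fb, ec8ab36). For reference, a hypothetical function — not part of tcpyPI — showing how an ellipsis-truncated doctest is written and collected under that option:

```python
# Hypothetical module, shown only to illustrate pytest --doctest-modules with
# ellipsis truncation; it is not part of tcpyPI.
def half(x):
    """Return x divided by 2.

    >>> half(1.0)
    0.5
    >>> half(1 / 3)  # doctest: +ELLIPSIS
    0.1666...
    """
    return x / 2
```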
2 changes: 0 additions & 2 deletions requirements.txt

This file was deleted.
