Rename package (namespaced), build for and publish to PyPI #15

Merged 13 commits on Aug 20, 2024
42 changes: 42 additions & 0 deletions .github/workflows/build-and-release-anaconda-nsidc-channel.yml
@@ -0,0 +1,42 @@
# TODO: Remove once we're on conda-forge!
name: Build & publish to Anaconda.org ("nsidc" channel)

on:
  push:
    tags:
      - v*

# Default to bash in login mode; key to activating conda environment
# https://github.com/mamba-org/provision-with-micromamba#IMPORTANT
defaults:
  run:
    shell: "bash -l {0}"

jobs:
  build-and-release:
    name: "Run build and release"
    runs-on: "ubuntu-latest"
    steps:
      - name: "Check out repository"
        uses: "actions/checkout@v4"

      - name: "Install Conda environment"
        uses: "mamba-org/setup-micromamba@v1"
        with:
          environment-file: "conda-lock.yml"
          # When using a lock-file, we have to set an environment name.
          environment-name: "iceflow"
          cache-environment: true
          # Increase this key to trigger cache invalidation
          cache-environment-key: 1

      - name: "Run conda build"
        run: "conda mambabuild recipe/"

      - name: "run anaconda upload"
        env:
          ANACONDA_TOKEN: ${{ secrets.ANACONDA_TOKEN }}
        run: |
          for ARTIFACT in $(ls /home/runner/micromamba/envs/iceflow/conda-bld/noarch/*.tar.bz2) ; do
            anaconda -t $ANACONDA_TOKEN upload -u nsidc -l main $ARTIFACT
          done
74 changes: 43 additions & 31 deletions .github/workflows/build-and-release.yml
@@ -1,41 +1,53 @@
-name: build_and_release
+name: Build & publish to PyPI
 
 on:
+  workflow_dispatch:
+  pull_request:
   push:
     tags:
-      - v*
+      - "v[0-9]+.[0-9]+.[0-9]+*"
 
-# Default to bash in login mode; key to activating conda environment
-# https://github.com/mamba-org/provision-with-micromamba#IMPORTANT
-defaults:
-  run:
-    shell: "bash -l {0}"
+concurrency:
+  group: "${{ github.workflow }}-${{ github.ref }}"
+  cancel-in-progress: true
+
+env:
+  # Many color libraries just need this to be set to any value, but at least
+  # one distinguishes color depth, where "3" -> "256-bit color".
+  FORCE_COLOR: 3
 
 jobs:
-  build-and-release:
-    name: "Run build and release"
-    runs-on: "ubuntu-latest"
-    steps:
-      - name: "Check out repository"
-        uses: "actions/checkout@v4"
-
-      - name: "Install Conda environment"
-        uses: "mamba-org/setup-micromamba@v1"
-        with:
-          environment-file: "conda-lock.yml"
-          # When using a lock-file, we have to set an environment name.
-          environment-name: "iceflow"
-          cache-environment: true
-          # Increase this key to trigger cache invalidation
-          cache-environment-key: 1
-
-      - name: "Run conda build"
-        run: "conda mambabuild recipe/"
-
-      - name: "run anaconda upload"
-        env:
-          ANACONDA_TOKEN: ${{ secrets.ANACONDA_TOKEN }}
-        run: |
-          for ARTIFACT in $(ls /home/runner/micromamba/envs/iceflow/conda-bld/noarch/*.tar.bz2) ; do
-            anaconda -t $ANACONDA_TOKEN upload -u nsidc -l main $ARTIFACT
-          done
+  dist:
+    name: Distribution build
+    runs-on: ubuntu-latest
+
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
+      - uses: hynek/build-and-inspect-python-package@v2
+
+  publish:
+    if: github.ref_type == 'tag'
+    name: Publish to PyPI
+    needs: [dist]
+    environment: pypi
+    permissions:
+      id-token: write
+      attestations: write
+      contents: read
+    runs-on: ubuntu-latest
+
+    steps:
+      - uses: actions/download-artifact@v4
+        with:
+          name: Packages
+          path: dist
+
+      - name: Generate artifact attestation for sdist and wheel
+        uses: actions/[email protected]
+        with:
+          subject-path: "dist/*"
+
+      - uses: pypa/gh-action-pypi-publish@release/v1
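
The `dist` job above delegates building and sanity-checking the sdist and wheel to `hynek/build-and-inspect-python-package`, and `publish` then uploads them via PyPI trusted publishing (`id-token: write`). For debugging the packaging locally before tagging, a rough equivalent of the build step is the standard `build`/`twine` sequence; this is a generic sketch, not part of the workflow itself:

```bash
python -m pip install build twine
python -m build               # writes the sdist and wheel into dist/
python -m twine check dist/*  # checks the metadata PyPI will see
```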
3 changes: 2 additions & 1 deletion .pre-commit-config.yaml
@@ -1,6 +1,7 @@
ci:
autoupdate_schedule: "monthly"
autoupdate_commit_msg: "chore: update pre-commit hooks"
autofix_commit_msg: "style: pre-commit fixes"
autofix_prs: false # Comment "pre-commit.ci autofix" on a PR to trigger

repos:
- repo: https://github.com/adamchainz/blacken-docs
24 changes: 16 additions & 8 deletions CONTRIBUTING.md
@@ -5,10 +5,13 @@ description of best practices for developing scientific packages.

# Setting up a development environment

You can set up a development environment by running:
You can set up a development environment with `conda` or your environment
manager of choice:

```bash
conda env create -f environment.yml
conda create -n iceflow-dev pip
conda activate iceflow-dev
pip install --editable .[dev]
```

# Pre-commit
@@ -17,12 +20,11 @@ You should prepare pre-commit, which will help you by checking that commits pass
required checks:

```bash
pip install pre-commit # or brew install pre-commit on macOS
pre-commit install # Will install a pre-commit hook into the git repo
```

You can also/alternatively run `pre-commit run` (changes only) or
`pre-commit run --all-files` to check even without installing the hook.
`pre-commit run --all-files` to check without installing the hook.
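
For quick reference, the two invocations described above are:

```bash
pre-commit run              # run hooks only against staged/changed files
pre-commit run --all-files  # run hooks against every file in the repository
```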

# Common tasks

@@ -48,7 +50,13 @@ reflect the version you plan to release. Then, bump the version with
$ bump-my-version bump {major|minor|patch}
```

This will update files containing the software version number. Commit these
changes and, once ready, merge them into `main` (through the use of a Pull
Request on a feature branch). Tag the commit you want to release on `main` to
initiate a GitHub Action (GHA) that will release the package to anaconda.org.
This will update files containing the software version number.

> [!WARNING]
>
> Please do not attempt to update version numbers by hand!

Commit these changes and, once ready, merge them into `main` (through the use of
a Pull Request on a feature branch). Tag the commit you want to release on
`main` to initiate a GitHub Action (GHA) that will release the package to
anaconda.org.
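
A minimal sketch of that tagging step, assuming the `vMAJOR.MINOR.PATCH` tag format the release workflows above trigger on:

```bash
git checkout main && git pull
git tag v0.2.0           # the version written by bump-my-version
git push origin v0.2.0   # pushing the tag starts the release workflow
```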
14 changes: 13 additions & 1 deletion README.md
@@ -30,7 +30,19 @@ TODO

## Usage

TODO
### Install

```bash
pip install nsidc-iceflow
```

### Using `iceflow`

```python
from nsidc import iceflow

# TODO
```
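
The snippet above still carries a `TODO`. As an interim smoke test, the imports below are exactly the ones exercised by the test changes later in this diff (`tests/unit/test_import_dep.py` and `tests/integration/test_e2e.py`), so they should resolve against an installed `nsidc-iceflow`; the full `fetch_iceflow_df` signature is not shown in this PR and is deliberately left out:

```python
# Importing these verifies that the namespaced layout resolves.
from nsidc import iceflow
from nsidc.iceflow.api import fetch_iceflow_df
from nsidc.iceflow.data.models import (
    ATM1BDataset,
    DatasetSearchParameters,
    IceflowDataFrame,
)

# __version__ lives in src/nsidc/iceflow/__init__.py (see the bump-my-version config).
print(iceflow.__version__)
```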

## Credit

27 changes: 0 additions & 27 deletions environment.yml

This file was deleted.

40 changes: 37 additions & 3 deletions pyproject.toml
@@ -1,5 +1,6 @@
[project]
name = "iceflow"
name = "nsidc-iceflow"
version = "v0.2.0"
authors = [
{ name = "NSIDC", email = "[email protected]" },
]
@@ -22,14 +23,43 @@ classifiers = [
"Topic :: Scientific/Engineering",
"Typing :: Typed",
]
dynamic = ["version"]
dependencies = [
"numpy >=2.0.1",
"earthaccess >=0.10.0",
"pandas >=2.2",
"h5py >=3.11",
"gps-timemachine >=1.1.4",
"pyproj >=3.6.1",
"shapely >=2.0.5",
"pandera[mypy] >= 0.20.3",
"pydantic >=2.8.2",
]

[project.urls]
Homepage = "https://github.com/NSIDC/iceflow"
"Bug Tracker" = "https://github.com/NSIDC/iceflow/issues"
Discussions = "https://github.com/NSIDC/iceflow/discussions"
Changelog = "https://github.com/NSIDC/iceflow/releases"

[project.optional-dependencies]
dev = [
"bump-my-version",
"conda-lock >=2.5.7",
"invoke",
"mypy >=1.11.1",
"pandas-stubs >=2.2",
"pre-commit",
"pytest",
]


[build-system]
build-backend = "hatchling.build"
requires = ["hatchling"]

[tool.hatch.build.targets.wheel]
packages = ["src/nsidc"]


[tool.pytest.ini_options]
minversion = "6.0"
@@ -61,6 +91,10 @@ disallow_incomplete_defs = false
check_untyped_defs = true
plugins = "pandera.mypy"

# Needed to work with our namespaced package! See: https://mypy.readthedocs.io/en/stable/running_mypy.html#mapping-paths-to-modules
mypy_path = "$MYPY_CONFIG_FILE_DIR/src"
explicit_package_bases = true

[[tool.mypy.overrides]]
module = [
"iceflow.*",
@@ -141,7 +175,7 @@ commit = false
tag = false

[[tool.bumpversion.files]]
filename = "src/iceflow/__init__.py"
filename = "src/nsidc/iceflow/__init__.py"
search = '__version__ = "v{current_version}"'
replace = '__version__ = "v{new_version}"'
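
Taken together with the renamed files below, the `packages = ["src/nsidc"]` hatch setting and the `mypy_path`/`explicit_package_bases` options above reflect that `iceflow` now lives under the `nsidc` namespace. A sketch of the resulting source layout, reconstructed only from paths and imports visible in this PR (whether `src/nsidc/` carries its own `__init__.py` is not shown here):

```
src/
└── nsidc/
    └── iceflow/
        ├── __init__.py        # defines __version__ (managed by bump-my-version)
        ├── api.py             # fetch_iceflow_df
        ├── data/
        │   ├── atm1b.py
        │   ├── fetch.py
        │   ├── models.py
        │   └── read.py
        └── itrf/
            ├── __init__.py    # ITRF_REGEX, check_itrf
            └── converter.py   # transform_itrf
```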

File renamed without changes.
8 changes: 4 additions & 4 deletions src/iceflow/api.py → src/nsidc/iceflow/api.py
@@ -4,13 +4,13 @@

import pandas as pd

from iceflow.data.fetch import search_and_download
from iceflow.data.models import (
from nsidc.iceflow.data.fetch import search_and_download
from nsidc.iceflow.data.models import (
DatasetSearchParameters,
IceflowDataFrame,
)
from iceflow.data.read import read_data
from iceflow.itrf.converter import transform_itrf
from nsidc.iceflow.data.read import read_data
from nsidc.iceflow.itrf.converter import transform_itrf


def fetch_iceflow_df(
File renamed without changes.
@@ -13,7 +13,7 @@
from gps_timemachine.gps import leap_seconds
from numpy.typing import DTypeLike

from iceflow.data.models import ATM1BDataFrame
from nsidc.iceflow.data.models import ATM1BDataFrame

"""
The dtypes used to read any of the input ATM1B input files.
File renamed without changes.
@@ -8,7 +8,7 @@
import pydantic
from pandera.typing import DataFrame, Index, Series

from iceflow.itrf import ITRF_REGEX
from nsidc.iceflow.itrf import ITRF_REGEX


class CommonDataColumnsSchema(pa.DataFrameModel):
4 changes: 2 additions & 2 deletions src/iceflow/data/read.py → src/nsidc/iceflow/data/read.py
@@ -3,8 +3,8 @@
import functools
from pathlib import Path

from iceflow.data.atm1b import atm1b_data
from iceflow.data.models import (
from nsidc.iceflow.data.atm1b import atm1b_data
from nsidc.iceflow.data.models import (
ATM1BDataFrame,
ATM1BDataset,
Dataset,
File renamed without changes.
@@ -7,8 +7,8 @@
import pandera as pa
from pyproj import Transformer

from iceflow.data.models import IceflowDataFrame
from iceflow.itrf import check_itrf
from nsidc.iceflow.data.models import IceflowDataFrame
from nsidc.iceflow.itrf import check_itrf


def _datetime_to_decimal_year(date):
File renamed without changes.
8 changes: 6 additions & 2 deletions tests/integration/test_e2e.py
@@ -14,8 +14,12 @@

import pandas as pd

from iceflow.api import fetch_iceflow_df
from iceflow.data.models import ATM1BDataset, DatasetSearchParameters, IceflowDataFrame
from nsidc.iceflow.api import fetch_iceflow_df
from nsidc.iceflow.data.models import (
ATM1BDataset,
DatasetSearchParameters,
IceflowDataFrame,
)


def test_e2e(tmp_path):
2 changes: 1 addition & 1 deletion tests/unit/test_data_models.py
@@ -4,7 +4,7 @@
import pandera as pa
import pytest

from iceflow.data.models import IceflowDataFrame
from nsidc.iceflow.data.models import IceflowDataFrame

_mock_bad_df = pd.DataFrame(
{
2 changes: 1 addition & 1 deletion tests/unit/test_import_dep.py
@@ -8,6 +8,6 @@ def test_gps_timemachine_import():


def test_import_package():
import iceflow
from nsidc import iceflow

assert iceflow is not None