Add Dask to latest and min dependency checkers (#2658)
* add dask to latest dependency checker

* add dask to latest dependency checker

* update min deps

* update release notes

* add dask min reqs file

* fix min tests with spark
thehomebrewnerd authored Feb 9, 2024
1 parent 2df1beb commit 57d7ab9
Showing 5 changed files with 36 additions and 3 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/latest_dependency_checker.yaml
@@ -23,7 +23,7 @@ jobs:
- name: Update dependencies
run: |
python -m pip install --upgrade pip
-          python -m pip install -e ".[spark,test]"
+          python -m pip install -e ".[dask,spark,test]"
make checkdeps OUTPUT_PATH=featuretools/tests/requirement_files/latest_requirements.txt
- name: Create pull request
uses: peter-evans/create-pull-request@v3
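The `".[dask,spark,test]"` syntax in the step above asks pip to install the package plus the union of three optional-dependency groups from `pyproject.toml`. A minimal sketch of that resolution logic follows; the group contents here are hypothetical placeholders, not featuretools' actual pyproject entries:

```python
# Hypothetical optional-dependencies mapping, standing in for the
# [project.optional-dependencies] table in pyproject.toml.
optional_dependencies = {
    "dask": ["dask[dataframe]>=2022.11.1", "distributed>=2022.11.1"],
    "spark": ["pyspark>=3.2.0"],
    "test": ["pytest>=7.1.2", "pytest-xdist>=2.5.0"],
}

def resolve_extras(extras):
    """Union the requirements pulled in by each requested extras group,
    mirroring what `pip install -e ".[dask,spark,test]"` selects."""
    requirements = []
    for name in extras:
        requirements.extend(optional_dependencies[name])
    return sorted(set(requirements))

print(resolve_extras(["dask", "spark", "test"]))
```

Adding `dask` to the extras list is what lets `make checkdeps` see Dask's pins when it writes `latest_requirements.txt`.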
8 changes: 8 additions & 0 deletions .github/workflows/minimum_dependency_checker.yaml
@@ -30,6 +30,14 @@ jobs:
paths: 'pyproject.toml'
options: 'dependencies'
output_filepath: featuretools/tests/requirement_files/minimum_core_requirements.txt
- name: Run min dep generator - dask
id: min_dep_gen_dask
uses: alteryx/minimum-dependency-generator@v3
with:
paths: 'pyproject.toml'
options: 'dependencies'
extras_require: 'dask'
output_filepath: featuretools/tests/requirement_files/minimum_dask_requirements.txt
- name: Run min dep generator - spark
id: min_dep_gen_spark
uses: alteryx/minimum-dependency-generator@v3
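The `minimum-dependency-generator` steps above turn the version ranges declared in `pyproject.toml` into exact lower-bound pins for a requirements file. A simplified regex-based sketch of that idea (not the actual code of the alteryx/minimum-dependency-generator action):

```python
import re

def minimum_pin(requirement):
    """Convert a range requirement such as 'dask[dataframe] >= 2022.11.1'
    into an exact pin on its lower bound: 'dask[dataframe]==2022.11.1'.
    A simplified sketch only; the real action handles more specifier forms."""
    # Everything before the first comparison operator is the name (+ extras).
    name = re.split(r"[><=!~]", requirement, maxsplit=1)[0].strip()
    lower = re.search(r">=\s*([\w.]+)", requirement)
    if lower is None:
        raise ValueError(f"no lower bound in {requirement!r}")
    return f"{name}=={lower.group(1)}"

print(minimum_pin("dask[dataframe] >= 2022.11.1"))  # dask[dataframe]==2022.11.1
```

Running this over each entry in the `dask` extras group is what produces `minimum_dask_requirements.txt` below.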
17 changes: 15 additions & 2 deletions .github/workflows/tests_with_minimum_deps.yaml
@@ -13,7 +13,7 @@ jobs:
strategy:
fail-fast: false
matrix:
-        libraries: ["core", "spark - misc", "spark - computational", "spark - entityset_1", "spark - entityset_2", "spark - primitives"]
+        libraries: ["core", "dask", "spark - misc", "spark - computational", "spark - entityset_1", "spark - entityset_2", "spark - primitives"]
steps:
- name: Checkout repository
uses: actions/checkout@v3
@@ -45,10 +45,16 @@ jobs:
NUMPY_VERSION=$(cat featuretools/tests/requirement_files/minimum_core_requirements.txt | grep numpy)
python -m pip uninstall numpy -y
python -m pip install $NUMPY_VERSION --no-build-isolation
- if: ${{ matrix.libraries == 'dask' }}
name: Install numpy for dask
run: |
NUMPY_VERSION=$(cat featuretools/tests/requirement_files/minimum_dask_requirements.txt | grep numpy)
python -m pip uninstall numpy -y
python -m pip install $NUMPY_VERSION --no-build-isolation
- name: Install featuretools - minimum tests dependencies
run: |
python -m pip install -r featuretools/tests/requirement_files/minimum_test_requirements.txt
-      - if: ${{ matrix.libraries == 'spark' }}
+      - if: ${{ startsWith(matrix.libraries, 'spark') }}
name: Install featuretools - minimum spark, core dependencies
run: |
sudo apt install -y openjdk-11-jre-headless
@@ -58,9 +64,16 @@
name: Install featuretools - minimum core dependencies
run: |
python -m pip install -r featuretools/tests/requirement_files/minimum_core_requirements.txt
- if: ${{ matrix.libraries == 'dask' }}
name: Install featuretools - minimum dask dependencies
run: |
python -m pip install -r featuretools/tests/requirement_files/minimum_dask_requirements.txt
- if: ${{ matrix.libraries == 'core' }}
name: Run unit tests without code coverage
run: python -m pytest -x -n auto featuretools/tests/
- if: ${{ matrix.libraries == 'dask' }}
name: Run dask unit tests without code coverage
run: python -m pytest -x -n auto featuretools/tests/
- if: ${{ matrix.libraries == 'spark - misc' }}
name: Run unit tests (misc)
run: pytest featuretools/ -n auto --ignore=featuretools/tests/computational_backend --ignore=featuretools/tests/entityset_tests --ignore=featuretools/tests/primitive_tests
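The workflow's `if:` expressions route each matrix entry to a requirements file: an exact match for `dask`, `startsWith` for the five `spark - *` shards, and the core file otherwise. A small sketch of that routing (the spark file name is an assumption, since its install step is truncated above):

```python
def requirements_file(library):
    """Mirror the workflow's if: conditions to pick the minimum
    requirements file each matrix entry installs. Routing sketch only;
    the spark filename here is assumed, not shown in the diff."""
    if library == "dask":
        return "minimum_dask_requirements.txt"
    if library.startswith("spark"):  # covers all "spark - *" shards
        return "minimum_spark_requirements.txt"
    return "minimum_core_requirements.txt"

matrix = ["core", "dask", "spark - misc", "spark - primitives"]
for entry in matrix:
    print(entry, "->", requirements_file(entry))
```

Using `startsWith` instead of the old equality check is what makes the spark install step fire for every sharded spark job, not just a literal `spark` entry.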
1 change: 1 addition & 0 deletions docs/source/release_notes.rst
@@ -14,6 +14,7 @@ Future Release
* Update tests for compatibility with new versions of ``holidays`` (:pr:`2636`)
* Update ruff to 0.1.6 and use ruff linter/formatter (:pr:`2639`)
* Update ``release.yaml`` to use trusted publisher for PyPI releases (:pr:`2646`, :pr:`2653`, :pr:`2654`)
* Update dependency checkers and tests to include Dask (:pr:`2658`)


Thanks to the following people for contributing to this release:
11 changes: 11 additions & 0 deletions featuretools/tests/requirement_files/minimum_dask_requirements.txt
@@ -0,0 +1,11 @@
cloudpickle==1.5.0
holidays==0.17
numpy==1.21.0
packaging==20.0
pandas==1.5.0
psutil==5.6.6
scipy==1.10.0
tqdm==4.32.0
woodwork[dask]==0.23.0
dask[dataframe]==2022.11.1
distributed==2022.11.1
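Each line of the new pins file is `name[extras]==version`, e.g. `woodwork[dask]==0.23.0`. A small sketch of parsing such a line into its parts, handy when checking a CI environment against the file (the helper name is illustrative, not from the repo):

```python
def parse_pin(line):
    """Split a pinned requirement like 'woodwork[dask]==0.23.0' into
    (package, extras, version). Illustrative helper, not repo code."""
    name_part, version = line.strip().split("==")
    if "[" in name_part:
        package, extras = name_part[:-1].split("[")  # drop trailing "]"
        extras = extras.split(",")
    else:
        package, extras = name_part, []
    return package, extras, version

print(parse_pin("woodwork[dask]==0.23.0"))  # ('woodwork', ['dask'], '0.23.0')
print(parse_pin("numpy==1.21.0"))           # ('numpy', [], '1.21.0')
```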
