Commit ce51df3

Drop python 3.8, add 3.11 & 3.12 to testing (#1143)
* Initial python version bump in CI
* Add libprotobuf to GPU CI environments
* Replace mentions of old env files
* Remove strict channel priority to try to unblock env solves?
* Establish minimum version for mlflow
* Revert "Remove strict channel priority to try to unblock env solves?" This reverts commit e454833.
* Try strict channel priority without nodefaults
* Bump mlflow min version to fix windows failures
* Build python 3.11 wheels
* Run wheel builds in PR test
* Try protoc action in wheels build to unblock
* Skip hive testing on 3.11 for now
* Fix workflow syntax errors
* Stop running wheel CI
* Bump pyo3 abi minor version
* Initial run of pyupgrade to py39
* Continue marking test_dask_fsql as flaky
* More places to drop 3.8
* Try running tests on python 3.12
* Add environment file
* Skip sasl installation
* Drop protoc build dep
* Drop mlflow constraint
* Set min version for mlflow
* Drop mlflow from 3.12 tests for now
* Relocate docker/server files to continuous_integration
* Unpin dask/distributed
* unpin 3.9 gpu environment
* add 3.12 to classifiers
* unpin dask in gpuci 3.9
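The "Initial run of pyupgrade to py39" step rewrites pre-3.9 typing idioms across the codebase. A minimal illustration of the kind of change `pyupgrade --py39-plus` makes once the floor is 3.9 (example code, not from this commit):

```python
# Before (3.8-compatible), annotations needed typing.List / typing.Dict:
#     from typing import Dict, List
#     def group(names: List[str]) -> Dict[str, List[str]]: ...

# After pyupgrade --py39-plus: built-in generics work directly (PEP 585).
def group(names: list[str]) -> dict[str, list[str]]:
    """Bucket names by their first letter."""
    out: dict[str, list[str]] = {}
    for name in names:
        out.setdefault(name[0], []).append(name)
    return out

print(group(["dask", "distributed", "sql"]))
# → {'d': ['dask', 'distributed'], 's': ['sql']}
```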
1 parent dbb36a4 commit ce51df3


41 files changed: +172 −159 lines

.github/workflows/conda.yml

Lines changed: 2 additions & 2 deletions
```diff
@@ -31,7 +31,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python: ["3.8", "3.9", "3.10"]
+        python: ["3.9", "3.10", "3.11", "3.12"]
         arch: ["linux-64", "linux-aarch64"]
     steps:
       - name: Manage disk space
@@ -72,7 +72,7 @@ jobs:
         with:
           miniforge-variant: Mambaforge
           use-mamba: true
-          python-version: "3.8"
+          python-version: "3.9"
           channel-priority: strict
       - name: Install dependencies
         run: |
```
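The widened matrix above fans out to `len(python) × len(arch)` conda-build jobs; a quick sketch of how GitHub Actions expands it (illustrative, using Python's itertools):

```python
from itertools import product

# Matrix values from the updated conda.yml above.
python = ["3.9", "3.10", "3.11", "3.12"]
arch = ["linux-64", "linux-aarch64"]

# GitHub Actions runs one job per combination of matrix values.
jobs = [f"{py} / {a}" for py, a in product(python, arch)]
print(len(jobs))  # → 8 (up from 6 with the old ["3.8", "3.9", "3.10"] matrix)
```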

.github/workflows/docker.yml

Lines changed: 3 additions & 3 deletions
```diff
@@ -11,7 +11,7 @@ on:
       - Cargo.toml
       - Cargo.lock
       - pyproject.toml
-      - docker/**
+      - continuous_integration/docker/**
       - .github/workflows/docker.yml

 # When this workflow is queued, automatically cancel any previous running
@@ -47,7 +47,7 @@
         uses: docker/build-push-action@v5
         with:
           context: .
-          file: ./docker/main.dockerfile
+          file: ./continuous_integration/docker/main.dockerfile
           build-args: DOCKER_META_VERSION=${{ steps.docker_meta_main.outputs.version }}
           platforms: ${{ matrix.platform }}
           tags: ${{ steps.docker_meta_main.outputs.tags }}
@@ -68,7 +68,7 @@
         uses: docker/build-push-action@v5
         with:
           context: .
-          file: ./docker/cloud.dockerfile
+          file: ./continuous_integration/docker/cloud.dockerfile
           build-args: DOCKER_META_VERSION=${{ steps.docker_meta_main.outputs.version }}
           platforms: ${{ matrix.platform }}
           tags: ${{ steps.docker_meta_cloud.outputs.tags }}
```

.github/workflows/test-upstream.yml

Lines changed: 4 additions & 7 deletions
```diff
@@ -36,21 +36,21 @@ jobs:
     name: "Test upstream dev (${{ matrix.os }}, python: ${{ matrix.python }}, distributed: ${{ matrix.distributed }})"
     runs-on: ${{ matrix.os }}
     env:
-      CONDA_FILE: continuous_integration/environment-${{ matrix.python }}-dev.yaml
+      CONDA_FILE: continuous_integration/environment-${{ matrix.python }}.yaml
       DASK_SQL_DISTRIBUTED_TESTS: ${{ matrix.distributed }}
     strategy:
       fail-fast: false
       matrix:
         os: [ubuntu-latest, windows-latest, macos-latest]
-        python: ["3.8", "3.9", "3.10"]
+        python: ["3.9", "3.10", "3.11", "3.12"]
         distributed: [false]
         include:
           # run tests on a distributed client
           - os: "ubuntu-latest"
-            python: "3.8"
+            python: "3.9"
             distributed: true
           - os: "ubuntu-latest"
-            python: "3.10"
+            python: "3.11"
             distributed: true
     steps:
       - uses: actions/checkout@v4
@@ -75,7 +75,6 @@
       - name: Install hive testing dependencies
         if: matrix.os == 'ubuntu-latest'
         run: |
-          mamba install -c conda-forge "sasl>=0.3.1"
           docker pull bde2020/hive:2.3.2-postgresql-metastore
           docker pull bde2020/hive-metastore-postgresql:2.3.0
       - name: Install upstream dev Dask
@@ -109,8 +108,6 @@
         with:
           miniforge-variant: Mambaforge
           use-mamba: true
-          # TODO: drop support for python 3.8, add support for python 3.11
-          # https://github.com/dask-contrib/dask-sql/pull/1143
           python-version: "3.9"
           channel-priority: strict
       - name: Optionally update upstream cargo dependencies
```

.github/workflows/test.yml

Lines changed: 5 additions & 8 deletions
```diff
@@ -37,21 +37,21 @@ jobs:
     needs: [detect-ci-trigger]
     runs-on: ${{ matrix.os }}
     env:
-      CONDA_FILE: continuous_integration/environment-${{ matrix.python }}-dev.yaml
+      CONDA_FILE: continuous_integration/environment-${{ matrix.python }}.yaml
       DASK_SQL_DISTRIBUTED_TESTS: ${{ matrix.distributed }}
     strategy:
       fail-fast: false
       matrix:
         os: [ubuntu-latest, windows-latest, macos-latest]
-        python: ["3.8", "3.9", "3.10"]
+        python: ["3.9", "3.10", "3.11", "3.12"]
         distributed: [false]
         include:
           # run tests on a distributed client
           - os: "ubuntu-latest"
-            python: "3.8"
+            python: "3.9"
             distributed: true
           - os: "ubuntu-latest"
-            python: "3.10"
+            python: "3.11"
             distributed: true
     steps:
       - uses: actions/checkout@v4
@@ -76,7 +76,6 @@
       - name: Install hive testing dependencies
         if: matrix.os == 'ubuntu-latest'
         run: |
-          mamba install -c conda-forge "sasl>=0.3.1"
           docker pull bde2020/hive:2.3.2-postgresql-metastore
           docker pull bde2020/hive-metastore-postgresql:2.3.0
       - name: Optionally install upstream dev Dask
@@ -107,9 +106,7 @@
         with:
           miniforge-variant: Mambaforge
           use-mamba: true
-          # TODO: drop support for python 3.8, add support for python 3.11
-          # https://github.com/dask-contrib/dask-sql/pull/1143
-          python-version: ${{ needs.detect-ci-trigger.outputs.triggered == 'true' && '3.9' || '3.8' }}
+          python-version: "3.9"
           channel-priority: strict
       - name: Install dependencies and nothing else
         run: |
```
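The deleted `python-version` line in test.yml used GitHub Actions' `&&`/`||` idiom as a ternary: pick 3.9 when an upstream run was triggered, else fall back to 3.8. With 3.8 dropped, it collapses to a fixed `"3.9"`. Its semantics can be sketched in Python (helper name is hypothetical):

```python
# Mimics ${{ needs.detect-ci-trigger.outputs.triggered == 'true' && '3.9' || '3.8' }}.
# GitHub expressions, like Python, short-circuit: `and` yields the right
# operand when the left is truthy, `or` supplies the fallback otherwise.
def pick_python(triggered: str) -> str:
    return (triggered == "true" and "3.9") or "3.8"

print(pick_python("true"))   # → 3.9
print(pick_python("false"))  # → 3.8
```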

CONTRIBUTING.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -19,7 +19,7 @@ rustup update
 To initialize and activate the conda environment for a given Python version:

 ```
-conda env create -f dask-sql/continuous_integration/environment-{$PYTHON_VER}-dev.yaml
+conda env create -f dask-sql/continuous_integration/environment-{$PYTHON_VER}.yaml
 conda activate dask-sql
 ```
````

Cargo.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -14,7 +14,7 @@ async-trait = "0.1.74"
 datafusion-python = { git = "https://github.com/apache/arrow-datafusion-python.git", ref = "da6c183" }
 env_logger = "0.10"
 log = "^0.4"
-pyo3 = { version = "0.19.2", features = ["extension-module", "abi3", "abi3-py38"] }
+pyo3 = { version = "0.19.2", features = ["extension-module", "abi3", "abi3-py39"] }
 pyo3-log = "0.9.0"

 [build-dependencies]
```
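Bumping pyo3's feature from `abi3-py38` to `abi3-py39` raises the stable-ABI floor: one compiled wheel, built against CPython 3.9's limited API, installs on every later CPython rather than needing a build per interpreter version. A small sketch of how the interpreter/ABI portion of such a wheel's tag follows from that minimum (helper name is illustrative, not part of pyo3 or maturin):

```python
# A stable-ABI wheel carries a tag like cp39-abi3 and works on
# CPython >= 3.9 (so 3.9 through 3.12 here).
def abi3_tag(min_major: int, min_minor: int) -> str:
    """Interpreter/ABI tag for a stable-ABI wheel with this version floor."""
    return f"cp{min_major}{min_minor}-abi3"

print(abi3_tag(3, 9))  # → cp39-abi3
```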

README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -101,7 +101,7 @@ If you want to have the newest (unreleased) `dask-sql` version or if you plan to

 Create a new conda environment and install the development environment:

-    conda env create -f continuous_integration/environment-3.9-dev.yaml
+    conda env create -f continuous_integration/environment-3.9.yaml

 It is not recommended to use `pip` instead of `conda` for the environment setup.
```

File renamed without changes.

docker/conda.txt renamed to continuous_integration/docker/conda.txt

Lines changed: 2 additions & 2 deletions
```diff
@@ -1,5 +1,5 @@
-python>=3.8
-dask>=2022.3.0,<=2023.11.0
+python>=3.9
+dask>=2022.3.0
 pandas>=1.4.0
 jpype1>=1.0.2
 openjdk>=8
```

docker/main.dockerfile renamed to continuous_integration/docker/main.dockerfile

Lines changed: 3 additions & 3 deletions
```diff
@@ -11,12 +11,12 @@ RUN sh /rustup-init.sh -y --default-toolchain=stable --profile=minimal \
 ENV PATH="/root/.cargo/bin:${PATH}"

 # Install conda dependencies for dask-sql
-COPY docker/conda.txt /opt/dask_sql/
+COPY continuous_integration/docker/conda.txt /opt/dask_sql/
 RUN mamba install -y \
     # build requirements
     "maturin>=1.3,<1.4" \
     # core dependencies
-    "dask>=2022.3.0,<=2023.11.0" \
+    "dask>=2022.3.0" \
     "pandas>=1.4.0" \
     "fastapi>=0.92.0" \
     "httpx>=0.24.1" \
@@ -44,7 +44,7 @@ RUN cd /opt/dask_sql/ \
     && CONDA_PREFIX="/opt/conda/" maturin develop

 # Set the script to execute
-COPY scripts/startup_script.py /opt/dask_sql/startup_script.py
+COPY continuous_integration/scripts/startup_script.py /opt/dask_sql/startup_script.py

 EXPOSE 8080
 ENTRYPOINT [ "/usr/bin/prepare.sh", "/opt/conda/bin/python", "/opt/dask_sql/startup_script.py" ]
```
