Merged
Changes from 6 commits
21 changes: 7 additions & 14 deletions .github/workflows/go-integration.yml
@@ -46,14 +46,8 @@ jobs:
cache: true
cache-dependency-path: go.sum

- name: Start docker
run: |
docker compose -f internal/recipe/docker-compose.yml up -d
sleep 10
- name: Provision Tables
run: |
docker compose -f internal/recipe/docker-compose.yml exec -T spark-iceberg ipython ./provision.py
sleep 10
- name: Integration setup
run: make integration-setup

- name: Setup environment variables
run: |
@@ -66,18 +60,17 @@ jobs:
AWS_S3_ENDPOINT: "${{ env.AWS_S3_ENDPOINT }}"
AWS_REGION: "us-east-1"
run: |
go test -tags integration -v -run="^TestScanner" ./table
go test -tags integration -v ./io
go test -tags integration -v -run="^TestRestIntegration$" ./catalog/rest
go test -tags=integration -v ./catalog/hive/...
make integration-io
make integration-rest
make integration-spark
make integration-hive
Member:
use `make integration-test`, which automatically runs all four of these. It allows us to add new integration tests to the Makefile as we create them without needing to update this.

Contributor Author:
done

- name: Run spark integration tests
env:
AWS_S3_ENDPOINT: "${{ env.AWS_S3_ENDPOINT }}"
AWS_REGION: "us-east-1"
SPARK_CONTAINER_ID: "${{ env.SPARK_CONTAINER_ID }}"
DOCKER_API_VER: "${{ env.DOCKER_API_VER }}"
run: |
go test -tags integration -v -run="^TestSparkIntegration" ./table
run: make integration-spark
Member:
Do we still need this as a separate step? We can shift the env vars for `SPARK_CONTAINER_ID` and `DOCKER_API_VER` to the step above and just call `make integration-test`.

Contributor Author:
combined.


- name: Show debug logs
if: ${{ failure() }}
52 changes: 52 additions & 0 deletions Makefile
@@ -0,0 +1,52 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the "License");
# you may not use it except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# golangci-lint version (keep in sync with CI and README)
GOLANGCI_LINT_VERSION := v2.8.0

.PHONY: test lint lint-install integration-setup integration-test integration-scanner integration-io integration-rest integration-spark integration-hive

test:
go test -v ./...

lint:
golangci-lint run --timeout=10m

lint-install:
	go install github.com/golangci/golangci-lint/v2/cmd/golangci-lint@$(GOLANGCI_LINT_VERSION)

integration-setup:
docker compose -f internal/recipe/docker-compose.yml up -d
sleep 10
docker compose -f internal/recipe/docker-compose.yml exec -T spark-iceberg ipython ./provision.py
sleep 10

integration-test: integration-scanner integration-io integration-rest integration-spark
Member:
shouldn't `integration-hive` be part of this too?

Contributor Author:
added


integration-scanner:
go test -tags=integration -v -run="^TestScanner" ./table

integration-io:
go test -tags=integration -v ./io

integration-rest:
go test -tags=integration -v -run="^TestRestIntegration$$" ./catalog/rest

integration-spark:
go test -tags=integration -v -run="^TestSparkIntegration" ./table

integration-hive:
go test -tags=integration -v ./catalog/hive/...
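The `integration-setup` target pauses with fixed `sleep 10` calls, which can flake on slow machines. A hedged alternative (not part of this PR) is to poll until a readiness probe succeeds. `wait_for` below is a hypothetical helper, and the MinIO health URL in the usage comment is an assumption about the compose setup:

```shell
# Hypothetical helper: retry a command once per second until it succeeds
# or the attempt budget is exhausted; could replace the fixed `sleep 10`.
wait_for() {
  cmd=$1
  tries=${2:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if sh -c "$cmd" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Assumed usage (the endpoint path is a guess at the MinIO health check):
# wait_for 'curl -sf "$AWS_S3_ENDPOINT/minio/health/live"' 30
```

Inside a Makefile recipe the literal `$` signs would need to be doubled (`$$cmd`), just as the `$$` in the `integration-rest` regex above escapes the shell `$` from make's own expansion.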
50 changes: 50 additions & 0 deletions README.md
@@ -34,6 +34,56 @@
$ git clone https://github.com/apache/iceberg-go.git
$ cd iceberg-go/cmd/iceberg && go build .
```

## Running Tests

Use the [Makefile](Makefile) targets so the commands you run locally stay in sync with CI (e.g. the golangci-lint version).

### Unit tests

```shell
make test
```

### Linting

```shell
make lint
```

If the linter is not installed yet, install it first:

```shell
make lint-install
# or: go install github.com/golangci/golangci-lint/v2/cmd/golangci-lint@v2.8.0
```

### Integration tests

**Prerequisites:** Docker, Docker Compose

1. Start the Docker containers using docker compose:

```shell
make integration-setup
```

2. Export the required environment variables:

```shell
export AWS_S3_ENDPOINT=http://$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' minio):9000
export AWS_REGION=us-east-1
export SPARK_CONTAINER_ID=$(docker ps -qf 'name=spark-iceberg')
export DOCKER_API_VER=$(docker version -f '{{.Server.APIVersion}}')
```

3. Run the integration tests:

```shell
make integration-test
```

Or run a single suite: `make integration-scanner`, `make integration-io`, `make integration-rest`, `make integration-spark`, or `make integration-hive`.
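Before invoking the suites, a quick sanity check that the exports in step 2 actually took effect can save a confusing mid-test failure. This sketch is not part of the repository; the variable names are simply the four exported above:

```shell
# check_env prints "environment OK" when all four variables from the export
# step are set, otherwise lists the missing names and returns nonzero.
check_env() {
  missing=""
  for v in AWS_S3_ENDPOINT AWS_REGION SPARK_CONTAINER_ID DOCKER_API_VER; do
    [ -n "$(eval echo "\$${v}")" ] || missing="$missing $v"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "environment OK"
}

check_env || true   # report, but don't abort an interactive shell
```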

## Feature Support / Roadmap

### FileSystem Support