(docs) Added steps in README to run unit tests and integration tests #732
zeroshade merged 7 commits into apache:main from
Conversation
### Linting

Run [golangci-lint](https://golangci-lint.run/) (matches CI):

```shell
golangci-lint run --timeout=10m
```
### Integration tests

1. Start the Docker containers using Docker Compose.

```shell
docker compose -f internal/recipe/docker-compose.yml up -d
sleep 10
docker compose -f internal/recipe/docker-compose.yml exec -T spark-iceberg ipython ./provision.py
sleep 10
```
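The fixed `sleep 10` waits can be flaky on slow machines. A minimal sketch of a polling helper that could replace them; the `wait_for` name and the timeout are hypothetical, and the commented usage assumes the `minio` container defines a Docker health check:

```shell
# wait_for: poll a command once per second until it succeeds,
# failing after the given timeout (in seconds).
# Hypothetical helper, not part of the repository.
wait_for() {
  local timeout="$1"; shift
  local elapsed=0
  until "$@"; do
    if [ "$elapsed" -ge "$timeout" ]; then
      echo "timed out waiting for: $*" >&2
      return 1
    fi
    sleep 1
    elapsed=$((elapsed + 1))
  done
}

# Example (assumes a healthcheck in docker-compose.yml):
# wait_for 60 sh -c 'docker inspect -f "{{.State.Health.Status}}" minio | grep -q healthy'
```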
2. Export the required environment variables.

```shell
export AWS_S3_ENDPOINT=http://$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' minio):9000
export AWS_REGION=us-east-1
export SPARK_CONTAINER_ID=$(docker ps -qf 'name=spark-iceberg')
export DOCKER_API_VER=$(docker version -f '{{.Server.APIVersion}}')
```
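Note that `docker inspect` and `docker ps -qf` silently produce empty output if the containers are not running, so a quick sanity check before running the tests can save debugging time. A small sketch; the `check_env` helper is hypothetical:

```shell
# check_env: report any variables from the list that are unset or empty,
# returning nonzero if any are missing. Hypothetical helper.
check_env() {
  missing=0
  for v in "$@"; do
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      echo "missing or empty: $v" >&2
      missing=1
    fi
  done
  return "$missing"
}

# check_env AWS_S3_ENDPOINT AWS_REGION SPARK_CONTAINER_ID DOCKER_API_VER
```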
3. Run the tests.

```shell
go test -tags=integration -v -run="^TestScanner" ./table
go test -tags=integration -v ./io
go test -tags=integration -v -run="^TestRestIntegration$" ./catalog/rest
go test -tags=integration -v -run="^TestSparkIntegration" ./table
```
Shall we add a Makefile to use in both CI and the README?
Otherwise they might easily drift out of sync, or the golangci-lint version might differ, etc.
I don't think we necessarily need a Makefile; a simple shell script would likely be enough.
zeroshade left a comment
If we're going to add a Makefile, we should also update the CI workflows to use it, yes?
`.github/workflows/go-integration.yml` (Outdated)
make integration-io
make integration-rest
make integration-spark
make integration-hive
Use `make integration-test`, which automatically runs all four of these. It allows us to add new integration tests to the Makefile as we create them without needing to update this workflow.
`.github/workflows/go-integration.yml` (Outdated)
- name: Run spark integration tests
  env:
    AWS_S3_ENDPOINT: "${{ env.AWS_S3_ENDPOINT }}"
    AWS_REGION: "us-east-1"
    SPARK_CONTAINER_ID: "${{ env.SPARK_CONTAINER_ID }}"
    DOCKER_API_VER: "${{ env.DOCKER_API_VER }}"
  run: |
    go test -tags integration -v -run="^TestSparkIntegration" ./table
  run: make integration-spark
Do we still need this as a separate step? We can shift the env vars for SPARK_CONTAINER_ID and DOCKER_API_VER to the step above and just call `make integration-test`.
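The consolidated step being suggested might look like this sketch; the step name and placement are assumptions, and the env values mirror the hunk quoted above:

```yaml
- name: Run integration tests
  env:
    AWS_S3_ENDPOINT: "${{ env.AWS_S3_ENDPOINT }}"
    AWS_REGION: "us-east-1"
    SPARK_CONTAINER_ID: "${{ env.SPARK_CONTAINER_ID }}"
    DOCKER_API_VER: "${{ env.DOCKER_API_VER }}"
  run: make integration-test
```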
`Makefile` (Outdated)
docker compose -f internal/recipe/docker-compose.yml exec -T spark-iceberg ipython ./provision.py
sleep 10

integration-test: integration-scanner integration-io integration-rest integration-spark
Shouldn't `integration-hive` be part of this too?
- Added `integration-hive` to `integration-test`
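The resulting aggregate target might look like this minimal Makefile sketch; the target names come from the diff above, but the one example recipe shown (taken from the README's test commands) and the overall layout are assumptions:

```makefile
# Aggregate target: runs every integration suite. New suites only need to be
# added to this prerequisite list, not to the CI workflow.
integration-test: integration-scanner integration-io integration-rest integration-spark integration-hive

integration-spark:
	go test -tags=integration -v -run="^TestSparkIntegration" ./table
```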
Thanks for this!