Changes from 6 commits
2 changes: 1 addition & 1 deletion .gitignore
@@ -1,5 +1,5 @@
.env
venv
*venv
.log
__pycache__
.pytest_cache
12 changes: 11 additions & 1 deletion Makefile
@@ -2,6 +2,11 @@ DOCKER_IMAGE_NAME=test_db_image
DOCKER_CONTAINER_NAME=test_db_container
DB_PORT=5432

# Might be worth using poetry here so you can install packages in the venv and then
Contributor:
Do you have a preference between poetry, uv, hatch, pyenv, or our approach with venv + pip + pip-tools?

I would like to keep external tools to a minimum for now, and stay consistent with https://github.com/cowprotocol/solver-rewards and https://github.com/cowprotocol/ebbo.

We have discussed moving to some other tool but could not agree on one with a good trade-off between ease of use and added robustness.

Author (@pierceroberts, Feb 5, 2025):
So as I mentioned in my other reply, pyenv is just for Python version management. Poetry is for package management; it replaces requirements.txt and creates a pyproject.toml and a poetry.lock file.
I am unsure about uv and hatch, so I'll have to read up on those.

I'd say requirements.txt does the job, but poetry is probably an upgrade.

# set the shell.
setup-venv:
python -m venv .venv

install:
pip install -r requirements.txt

@@ -13,11 +18,16 @@ daemon:

test_db:
docker build -t $(DOCKER_IMAGE_NAME) -f Dockerfile.test_db .
docker run -d --name $(DOCKER_CONTAINER_NAME) -p $(DB_PORT):$(DB_PORT) -v ${PWD}/database/00_legacy_tables.sql:/docker-entrypoint-initdb.d/00_legacy_tables.sql -v ${PWD}/database/01_table_creation.sql:/docker-entrypoint-initdb.d/01_table_creation.sql $(DOCKER_IMAGE_NAME)
docker run -d --name $(DOCKER_CONTAINER_NAME) \
-p $(DB_PORT):$(DB_PORT) -v ${PWD}/database/00_legacy_tables.sql:/docker-entrypoint-initdb.d/00_legacy_tables.sql \
-v ${PWD}/database/01_table_creation.sql:/docker-entrypoint-initdb.d/01_table_creation.sql $(DOCKER_IMAGE_NAME)

stop_test_db:
docker stop $(DOCKER_CONTAINER_NAME) || true
docker rm $(DOCKER_CONTAINER_NAME) || true
docker rmi $(DOCKER_IMAGE_NAME) || true

unittest:
pytest tests/unit

.PHONY: install imbalances daemon test_db run_test_db stop_test_db clean
Contributor:
Those new commands should probably also be added here, right?

Author:

We can, but basically .PHONY is the way make distinguishes make targets from files. If you have a file named install, make would get confused; if your target names are unique (no file collides with them), you can avoid this.
Here is a Stack Overflow post to help understand:
https://stackoverflow.com/questions/2145590/what-is-the-purpose-of-phony-in-a-makefile
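To illustrate the failure mode described above (a sketch with a hypothetical `install` file, not this repo's actual Makefile):

```make
# If a file named "install" exists in the repo root, `make install` would
# report "'install' is up to date" and skip the recipe entirely.
# Declaring the target phony tells make it is a command, not a file.
.PHONY: install

install:
	pip install -r requirements.txt
```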

57 changes: 55 additions & 2 deletions README.md
@@ -1,10 +1,31 @@
# token-imbalances

This script is to calculate the raw token imbalances before and after a settlement.
This repository is to calculate the raw token imbalances before and after a settlement.
The raw token imbalances are stored in the raw_token_imbalances table.
Additionally, coingecko prices for fetchable token addresses at the time of transaction are stored in the coingecko_prices table. These tables are a part of the Solver Slippage Database.
These prices can be used to convert raw imbalances to ETH.
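As a rough sketch of that conversion (hypothetical numbers; this assumes the raw imbalance is denominated in the token's smallest unit and the price is quoted in ETH per whole token):

```python
# Hypothetical example: converting a raw token imbalance to an ETH value.
raw_imbalance = 3_000_000_000_000_000_000  # imbalance in the token's smallest unit
decimals = 18                              # e.g. a standard 18-decimal ERC-20 token
price_in_eth = 0.25                        # ETH per whole token (illustrative price)

tokens = raw_imbalance / 10**decimals      # 3.0 whole tokens
eth_value = tokens * price_in_eth
print(eth_value)  # 0.75
```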

## Env Setup

### Docker
This repo uses Docker, but you could potentially use another container management tool such as Podman; see the docs [here](https://podman.io/docs).
For Docker installation, go to the [Docker website](https://docs.docker.com/get-started/get-docker/).

### Python
Install Python if you don't have it already; the version used here is 3.12+.
[Installation instructions](https://realpython.com/installing-python/)

For managing different versions of python you could look at using [pyenv](https://github.com/pyenv/pyenv).

You will need to set some environment variables, such as `CHAIN_SLEEP_TIME`; you can set these in a `.env` file. See the sample `.env.sample` file to see what you might need to set.
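For example, a minimal `.env` could look like the fragment below (the value is illustrative, and any variable beyond `CHAIN_SLEEP_TIME` should be taken from `.env.sample`, which is authoritative):

```sh
# Illustrative value only — consult .env.sample for the full variable list.
CHAIN_SLEEP_TIME=60
```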

Once Python has been set up and your env file is populated, you can proceed with the next setup instructions:

**Set up virtual environment:**
```sh
python -m venv .venv
source .venv/bin/activate
```

**Install requirements from root directory:**
```bash
@@ -25,9 +46,41 @@ python -m src.daemon
```

## Tests
*Note: Make sure Docker is installed and the Docker daemon is running before executing these commands.*

To build and start a local database for testing, use the following commands:
```sh
docker build -t test_db_image -f Dockerfile.test_db .
docker run -d --name test_db_container -p 5432:5432 test_db_image
docker run -d --name test_db_container -p 5432:5432 -v ${PWD}/database/00_legacy_tables.sql:/docker-entrypoint-initdb.d/00_legacy_tables.sql -v ${PWD}/database/01_table_creation.sql:/docker-entrypoint-initdb.d/01_table_creation.sql test_db_image

```

To run the unit tests you can use the make target `make unittest`. However, you might run into a couple of issues:
- If the binary package for psycopg is not installed, simply run:
```sh
pip install "psycopg[binary,pool]"
```
Comment on lines +56 to +60
Contributor:

Did this fail for you in a fresh environment? Then we should add it explicitly to requirements.in. Good catch!

Do we need only the binary or both binary and pool?

Author:

It did not fail on pip install, but it failed when I went to run the tests.
Personally I have only used psycopg2, so I would advocate for that.
I haven't used psycopg3, but that is the newer version of psycopg2, so maybe we could consider it?
Here are the docs:
https://pypi.org/project/psycopg2/

I also had a question about how you all are handling migrations. Are you using Alembic or something similar?


To shut down the Docker test DB and remove the image/container, run:

```sh
docker stop test_db_container || true
docker rm test_db_container || true
docker rmi test_db_image || true
```
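The `|| true` suffix on each command is what keeps a failing step (e.g. stopping a container that does not exist) from aborting the rest of the sequence; a quick way to see the effect in any POSIX shell:

```sh
# `false` always fails with a non-zero exit status, but `|| true`
# swallows the failure, so execution continues and $? becomes 0.
false || true
echo "still running, exit status was $?"  # still running, exit status was 0
```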

## Using the Makefile

You can also do all of the above by running the make commands:

```sh
make install

make imbalances

make test_db

make stop_test_db

make unittest
```