diff --git a/.gitignore b/.gitignore index 9a6d7ce..fe1f1c0 100644 --- a/.gitignore +++ b/.gitignore @@ -7,3 +7,4 @@ pip-wheel-metadata/ *.ipynb notes.md *.yaml +dist/ diff --git a/CHANGES.md b/CHANGES.md new file mode 100644 index 0000000..d677b31 --- /dev/null +++ b/CHANGES.md @@ -0,0 +1,13 @@ +# Changes + +## 0.0.2 (2020-07-14) + +- FEATURE: New, fully object-oriented base library +- FEATURE: Python 3.8 support added +- FIX: `cleanup` does not delete snapshots on source if they are not present on target. +- FIX: Wait for ZFS's garbage collection after `cleanup` to get a meaningful value for freed space. +- Dropped Python 3.5 support + +## 0.0.1 (2019-08-05) + +- Initial release. diff --git a/README.md b/README.md index 8f5df65..d907f16 100644 --- a/README.md +++ b/README.md @@ -2,48 +2,97 @@ ## SYNOPSIS -Simple ZFS sync tool. Shows local and remote ZFS dataset trees / zpools. Creates meaningful snapshots only if datasets have actually been changed. Compares a local dataset tree to a remote, backup dataset tree. Pushes backups to remote. Cleanes up older snapshot on local system. Runs form the command line and produces nice, user-friendly, readable, colorized output. +`abgleich` is a simple ZFS sync tool. It displays source and target ZFS zpool, dataset and snapshot trees. It creates meaningful snapshots only if datasets have actually been changed. It compares a source zpool tree to a target backup zpool tree. It pushes backups from a source to a target. It cleans up older snapshots on the source side if they are present on the target side. It runs on the command line and produces nice, user-friendly, human-readable, colorized output. + +![demo](https://github.com/pleiszenburg/abgleich/blob/master/docs/demo.png?raw=true "demo") ## INSTALLATION +```bash +pip install -vU abgleich +``` + +or + ```bash pip install -vU git+https://github.com/pleiszenburg/abgleich.git@master ``` -Requires (C)Python 3.5 or later. Tested with ZoL 0.7 and 0.8. 
+Requires [CPython](https://en.wikipedia.org/wiki/CPython) 3.6 or later, a [Unix shell](https://en.wikipedia.org/wiki/Unix_shell) and [ssh](https://en.wikipedia.org/wiki/Secure_Shell). Tested with [OpenZFS](https://en.wikipedia.org/wiki/OpenZFS) 0.8.x on Linux. + +`abgleich`, CPython and the Unix shell need only be installed on one of the involved systems. Any remote system will be contacted via ssh and provided with direct ZFS commands. + +## INITIALIZATION + +All actions involving a remote host assume that `ssh` with public-key authentication (instead of passwords) is correctly configured and working. + +Let's assume that everything in `source_tank/data` and below should be synced with `target_tank/some_backup/data`. `source_tank` and `target_tank` are zpools. `data` is the "prefix" for the source zpool, `some_backup/data` is the corresponding "prefix" for the target zpool. For `abgleich` to work, `source_tank/data` and `target_tank/some_backup` must exist. `target_tank/some_backup/data` must not exist. The latter will be created by `abgleich`. It is highly recommended to set the mountpoint of `target_tank/some_backup` to `none` before running `abgleich` for the first time. + +Rights to run the following commands are required: + +| command | source | target | +|----------------|:------:|:------:| +| `zfs list` | x | x | +| `zfs get` | x | x | +| `zfs snapshot` | x | | +| `zfs send` | x | | +| `zfs receive` | | x | +| `zfs destroy` | x | | + +### `config.yaml` + +Complete example configuration file: + +```yaml +source: + zpool: tank_ssd + prefix: + host: localhost + user: +target: + zpool: tank_hdd + prefix: BACKUP_SOMEMACHINE + host: bigdata + user: zfsadmin +keep_snapshots: 2 +suffix: _backup +digits: 2 +ignore: + - home/user/CACHE + - home/user/CCACHE +ssh: + compression: no + cipher: aes256-gcm@openssh.com +``` + +The prefix can be empty on either side. If a `host` is set to `localhost`, the `user` field can be left empty. 
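The `suffix` and `digits` fields in the example configuration above feed into the names of new snapshots. As a purely hypothetical sketch (the exact name format abgleich uses is not shown in this changeset), a date stamp, a per-day counter zero-padded to `digits` digits, and the `suffix` could combine like this:

```python
import datetime

def snapshot_name(nth: int, digits: int = 2, suffix: str = "_backup") -> str:
    """Hypothetical name builder: date stamp + zero-padded per-day counter + suffix."""
    day = datetime.date(2020, 7, 14)  # fixed date so the example is reproducible
    return f"{day:%Y%m%d}_{nth:0{digits}d}{suffix}"

print(snapshot_name(1))  # 20200714_01_backup
```

With `digits: 2`, up to 99 snapshots per dataset per day get distinct names.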
Both source and target can be remote hosts or localhost at the same time. `keep_snapshots` is an integer and must be greater than or equal to `1`. It specifies the number of snapshots that are kept per dataset on the source side when a cleanup operation is triggered. `suffix` contains the name suffix for new snapshots. `digits` specifies how many digits are used for the decimal counter that numbers each dataset's n-th snapshot of the day within new snapshot names. `ignore` lists datasets underneath the `prefix` which will be ignored by this tool, i.e. no snapshots, backups or cleanups. `ssh` allows fine-tuning the speed of backups. In fast local networks, it is best to set `compression` to `no` because compression usually slows down the transfer. However, for low-bandwidth transmissions, it makes sense to set it to `yes`. For significantly better speed in fast local networks, make sure that both the source and the target system support a common cipher that is accelerated by [AES-NI](https://en.wikipedia.org/wiki/AES_instruction_set) on both ends. ## USAGE -### `abgleich tree [hostname]` +All potentially changing or destructive actions are listed in detail before the user is asked to confirm them. None of the commands listed below create, change or destroy a zpool, dataset or snapshot on their own without the user's explicit consent. -Show zfs tree with snapshots, disk space and compression ratio. Append `hostname` (optional) for remote tree. `ssh` without password (public key) required. +### `abgleich tree config.yaml [source|target]` + +Show ZFS tree with snapshots, disk space and compression ratio. Append `source` or `target` (optional). ### `abgleich snap config.yaml` -Determine which datasets have been changed since last snapshot. Generate snapshots where applicable. Superuser privileges required. +Determine which datasets on the source side have been changed since the last snapshot. Generate snapshots on the source side where applicable. 
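The `ssh` options map directly onto real OpenSSH command-line flags (`-c` for the cipher, `-o Compression=yes|no` for compression). A minimal sketch of how the prefix of each remote command could be assembled, modeled on the `get_ssh_prefix` helper in the old `cmd.py` that this changeset removes:

```python
import typing

def ssh_prefix(compression: bool, cipher: str) -> typing.List[str]:
    # -T: disable pseudo-terminal allocation; -c: cipher; -o Compression=...:
    # toggle driven by the `ssh` section of config.yaml
    return [
        "ssh", "-T",
        "-c", cipher,
        "-o", "Compression=yes" if compression else "Compression=no",
    ]

print(ssh_prefix(False, "aes256-gcm@openssh.com"))
```

The remote `zfs` command is then appended to this argument list before execution.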
### `abgleich compare config.yaml` -Compare local machine with remote host. See what is missing where. `ssh` without password (public key) required. Superuser privileges required. +Compare source ZFS tree with target ZFS tree. See what is missing where. ### `abgleich backup config.yaml` -Send (new) datasets and snapshots to remote host. `ssh` without password (public key) required. Superuser privileges required. +Send (new) datasets and new snapshots from source to target. ### `abgleich cleanup config.yaml` -Cleanup older local snapshots. Keep `keep_snapshots` number of snapshots. Superuser privileges required. +Clean up older snapshots on the source side if they are present on both sides. Of those snapshots present on both sides, keep at least `keep_snapshots` snapshots on the source side. -### `config.yaml` ## SPEED -Example configuration file: +`abgleich` uses Python's [type hints](https://docs.python.org/3/library/typing.html) and enforces them with [typeguard](https://github.com/agronholm/typeguard) at runtime. It furthermore makes countless assertions. -```yaml -prefix_local: tank_ssd -prefix_remote: tank_hdd/BACKUP_SOMEMACHINE -host: bigdata -keep_snapshots: 2 -ignore: - - /ernst/CACHE - - /ernst/CCACHE -``` + +The enforcement of types and assertions can be controlled through the `PYTHONOPTIMIZE` environment variable. If set to `0` (the implicit default value), all checks are activated. `abgleich` will run slowly. For safety, this mode is highly recommended. For significantly higher speed, all type checks and most assertions can be deactivated by setting `PYTHONOPTIMIZE` to `1` or `2`, e.g. `PYTHONOPTIMIZE=1 abgleich tree config.yaml`. This is not recommended. You may want to check whether another tool or configuration has altered this environment variable by running `echo $PYTHONOPTIMIZE`. 
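The `PYTHONOPTIMIZE` behavior described in the SPEED section is standard CPython semantics: setting it to `1` is equivalent to the `-O` interpreter flag and compiles `assert` statements away entirely. A small self-contained demonstration, independent of `abgleich` itself:

```python
import os
import subprocess
import sys

code = "assert False, 'checks active'"

# Run without PYTHONOPTIMIZE: the assert fires and the process fails.
env = {k: v for k, v in os.environ.items() if k != "PYTHONOPTIMIZE"}
checked = subprocess.run([sys.executable, "-c", code], capture_output=True, env=env)

# PYTHONOPTIMIZE=1 (equivalent to -O): asserts are compiled out, so it succeeds.
env["PYTHONOPTIMIZE"] = "1"
optimized = subprocess.run([sys.executable, "-c", code], capture_output=True, env=env)

print(checked.returncode, optimized.returncode)  # 1 0
```

The same mechanism is what typeguard-style runtime checks and abgleich's assertions hang off, which is why `PYTHONOPTIMIZE=1` trades safety for speed.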
diff --git a/docs/demo.png b/docs/demo.png new file mode 100644 index 0000000..225003c Binary files /dev/null and b/docs/demo.png differ diff --git a/makefile b/makefile index 6a4418d..cff9b42 100644 --- a/makefile +++ b/makefile @@ -1,10 +1,10 @@ clean: -rm -r build/* - find src/ -name '*.pyc' -exec rm -f {} + - find src/ -name '*.pyo' -exec rm -f {} + + find src/ -name '*.pyc' -exec sudo rm -f {} + + find src/ -name '*.pyo' -exec sudo rm -f {} + find src/ -name '*~' -exec rm -f {} + - find src/ -name '__pycache__' -exec rm -fr {} + + find src/ -name '__pycache__' -exec sudo rm -fr {} + find src/ -name '*.htm' -exec rm -f {} + find src/ -name '*.html' -exec rm -f {} + find src/ -name '*.so' -exec rm -f {} + @@ -20,4 +20,10 @@ release: gpg --detach-sign -a dist/abgleich*.tar.gz install: - pip install -v -e . + pip install -vU pip setuptools + pip install -v -e .[dev] + +upload: + for filename in $$(ls dist/*.tar.gz dist/*.whl) ; do \ + twine upload $$filename $$filename.asc ; \ + done diff --git a/setup.py b/setup.py index a66cb9f..4904d7e 100644 --- a/setup.py +++ b/setup.py @@ -8,7 +8,7 @@ setup.py: Used for package distribution - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. 
Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -30,9 +30,9 @@ # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ from setuptools import ( - find_packages, - setup, - ) + find_packages, + setup, +) import os # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ @@ -40,85 +40,80 @@ # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # Package version -__version__ = '0.0.1' +__version__ = "0.0.2" # List all versions of Python which are supported +python_minor_min = 6 +python_minor_max = 8 confirmed_python_versions = [ - ('Programming Language :: Python :: %s' % x) - for x in '3.5 3.6 3.7'.split(' ') - ] + "Programming Language :: Python :: 3.{MINOR:d}".format(MINOR=minor) + for minor in range(python_minor_min, python_minor_max + 1) +] # Fetch readme file -with open(os.path.join(os.path.dirname(__file__), 'README.md')) as f: - long_description = f.read() +with open(os.path.join(os.path.dirname(__file__), "README.md")) as f: + long_description = f.read() # Define source directory (path) -SRC_DIR = 'src' +SRC_DIR = "src" # Install package setup( - name = 'abgleich', - packages = find_packages(SRC_DIR), - package_dir = {'': SRC_DIR}, - version = __version__, - description = 'zfs sync tool', - long_description = long_description, - long_description_content_type = 'text/markdown', - author = 'Sebastian M. 
Ernst', - author_email = 'ernst@pleiszenburg.de', - url = 'https://github.com/pleiszenburg/abgleich', - download_url = 'https://github.com/pleiszenburg/abgleich/archive/v%s.tar.gz' % __version__, - license = 'LGPLv2', - keywords = [ - 'zfs', - 'ssh', - ], - scripts = [], - include_package_data = True, - setup_requires = [], - install_requires = [ - 'click', - 'tabulate', - 'pyyaml', - ], - extras_require = {'dev': [ - # 'pytest', - 'python-language-server', - 'setuptools', - # 'Sphinx', - # 'sphinx_rtd_theme', - 'twine', - 'wheel', - ]}, - zip_safe = False, - entry_points = { - 'console_scripts': [ - 'abgleich = abgleich.cli:cli', - ], - }, - classifiers = [ - 'Development Status :: 5 - Production/Stable', - 'Environment :: Console', - 'Intended Audience :: Developers', - 'Intended Audience :: Education', - 'Intended Audience :: Information Technology', - 'Intended Audience :: Science/Research', - 'Intended Audience :: System Administrators', - 'License :: OSI Approved :: GNU Lesser General Public License v2 (LGPLv2)', - 'Operating System :: MacOS', - 'Operating System :: POSIX :: BSD', - 'Operating System :: POSIX :: Linux', - 'Programming Language :: Python :: 3' - ] + confirmed_python_versions + [ - 'Programming Language :: Python :: 3 :: Only', - 'Programming Language :: Python :: Implementation :: CPython', - 'Topic :: Scientific/Engineering', - 'Topic :: System', - 'Topic :: System :: Archiving', - 'Topic :: System :: Archiving :: Backup', - 'Topic :: System :: Archiving :: Mirroring', - 'Topic :: System :: Filesystems', - 'Topic :: System :: Systems Administration', - 'Topic :: Utilities' - ] - ) + name="abgleich", + packages=find_packages(SRC_DIR), + package_dir={"": SRC_DIR}, + version=__version__, + description="zfs sync tool", + long_description=long_description, + long_description_content_type="text/markdown", + author="Sebastian M. 
Ernst", + author_email="ernst@pleiszenburg.de", + url="https://github.com/pleiszenburg/abgleich", + download_url="https://github.com/pleiszenburg/abgleich/archive/v%s.tar.gz" + % __version__, + license="LGPLv2", + keywords=["zfs", "ssh",], + scripts=[], + include_package_data=True, + python_requires=">=3.{MINOR:d}".format(MINOR=python_minor_min), + setup_requires=[], + install_requires=["click", "tabulate", "pyyaml", "typeguard",], + extras_require={ + "dev": [ + "black", + "python-language-server[all]", + "setuptools", + "twine", + "wheel", + ] + }, + zip_safe=False, + entry_points={"console_scripts": ["abgleich = abgleich.cli:cli",],}, + classifiers=[ + "Development Status :: 5 - Production/Stable", + "Environment :: Console", + "Intended Audience :: Developers", + "Intended Audience :: Education", + "Intended Audience :: Information Technology", + "Intended Audience :: Science/Research", + "Intended Audience :: System Administrators", + "License :: OSI Approved :: GNU Lesser General Public License v2 (LGPLv2)", + "Operating System :: MacOS", + "Operating System :: POSIX :: BSD", + "Operating System :: POSIX :: Linux", + "Programming Language :: Python :: 3", + ] + + confirmed_python_versions + + [ + "Programming Language :: Python :: 3 :: Only", + "Programming Language :: Python :: Implementation :: CPython", + "Topic :: Scientific/Engineering", + "Topic :: System", + "Topic :: System :: Archiving", + "Topic :: System :: Archiving :: Backup", + "Topic :: System :: Archiving :: Mirroring", + "Topic :: System :: Filesystems", + "Topic :: System :: Systems Administration", + "Topic :: Utilities", + ], +) diff --git a/src/abgleich/__init__.py b/src/abgleich/__init__.py index 6a06ef7..97275ca 100644 --- a/src/abgleich/__init__.py +++ b/src/abgleich/__init__.py @@ -8,7 +8,7 @@ src/abgleich/__init__.py: Package root - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. 
Ernst The contents of this file are subject to the GNU Lesser General Public License diff --git a/src/abgleich/cli/__init__.py b/src/abgleich/cli/__init__.py index 24c39e0..f96b87e 100644 --- a/src/abgleich/cli/__init__.py +++ b/src/abgleich/cli/__init__.py @@ -8,7 +8,7 @@ src/abgleich/cli/__init__.py: CLI package root - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. Ernst The contents of this file are subject to the GNU Lesser General Public License diff --git a/src/abgleich/cli/_main_.py b/src/abgleich/cli/_main_.py index 373a3af..e423bf3 100644 --- a/src/abgleich/cli/_main_.py +++ b/src/abgleich/cli/_main_.py @@ -8,7 +8,7 @@ src/abgleich/cli/_main_.py: CLI auto-detection - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -38,19 +38,20 @@ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + def _add_commands(ctx): - """auto-detects sub-commands""" - for cmd in ( - item[:-3] if item.lower().endswith('.py') else item[:] - for item in os.listdir(os.path.dirname(__file__)) - if not item.startswith('_') - ): - ctx.add_command(getattr(importlib.import_module( - 'abgleich.cli.%s' % cmd - ), cmd)) + """auto-detects sub-commands""" + for cmd in ( + item[:-3] if item.lower().endswith(".py") else item[:] + for item in os.listdir(os.path.dirname(__file__)) + if not item.startswith("_") + ): + ctx.add_command(getattr(importlib.import_module("abgleich.cli.%s" % cmd), cmd)) + @click.group() def cli(): - """abgleich, zfs sync tool""" + """abgleich, zfs sync tool""" + _add_commands(cli) diff --git a/src/abgleich/cli/backup.py b/src/abgleich/cli/backup.py index 3177ade..543e75f 100644 --- a/src/abgleich/cli/backup.py +++ b/src/abgleich/cli/backup.py @@ -6,9 +6,9 @@ zfs sync tool https://github.com/pleiszenburg/abgleich - src/abgleich/cli/backup.py: backup command entry 
point + src/abgleich/cli/backup.py: backup command entry point - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -30,68 +30,31 @@ # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ import click -from tabulate import tabulate -import yaml -from yaml import CLoader - -from ..io import colorize -from ..zfs import ( - get_backup_ops, - get_tree, - push_snapshot, - push_snapshot_incremental, - ) + +from ..core.config import Config +from ..core.zpool import Zpool # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -@click.command(short_help = 'backup a dataset tree into another') -@click.argument('configfile', type = click.File('r', encoding = 'utf-8')) + +@click.command(short_help="backup a dataset tree into another") +@click.argument("configfile", type=click.File("r", encoding="utf-8")) def backup(configfile): - config = yaml.load(configfile.read(), Loader = CLoader) - - datasets_local = get_tree() - datasets_remote = get_tree(config['host']) - ops = get_backup_ops( - datasets_local, - config['prefix_local'], - datasets_remote, - config['prefix_remote'], - config['ignore'] - ) - - table = [] - for op in ops: - row = op.copy() - row[0] = colorize(row[0], 'green' if 'incremental' in row[0] else 'blue') - table.append(row) - - print(tabulate( - table, - headers = ['OP', 'PARAM'], - tablefmt = 'github' - )) - - click.confirm('Do you want to continue?', abort = True) - - for op, param in ops: - if op == 'push_snapshot': - push_snapshot( - config['host'], - config['prefix_local'] + param[0], - param[1], - config['prefix_remote'] + param[0], - # debug = True - ) - elif op == 'push_snapshot_incremental': - push_snapshot_incremental( - 
config['host'], - config['prefix_local'] + param[0], - param[1], param[2], - config['prefix_remote'] + param[0], - # debug = True - ) - else: - raise ValueError('unknown operation') + config = Config.from_fd(configfile) + + source_zpool = Zpool.from_config("source", config=config) + target_zpool = Zpool.from_config("target", config=config) + + transactions = source_zpool.get_backup_transactions(target_zpool) + + if len(transactions) == 0: + print("nothing to do") + return + transactions.print_table() + + click.confirm("Do you want to continue?", abort=True) + + transactions.run() diff --git a/src/abgleich/cli/cleanup.py b/src/abgleich/cli/cleanup.py index af849fc..dfc63a0 100644 --- a/src/abgleich/cli/cleanup.py +++ b/src/abgleich/cli/cleanup.py @@ -6,9 +6,9 @@ zfs sync tool https://github.com/pleiszenburg/abgleich - src/abgleich/cli/cleanup.py: cleanup command entry point + src/abgleich/cli/cleanup.py: cleanup command entry point - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. 
Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -29,64 +29,44 @@ # IMPORT # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +import time + import click -from tabulate import tabulate -import yaml -from yaml import CLoader -from ..io import colorize, humanize_size -from ..zfs import ( - get_tree, - get_cleanup_tasks, - delete_snapshot, - ) +from ..core.config import Config +from ..core.io import humanize_size +from ..core.zpool import Zpool # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -@click.command(short_help = 'cleanup older snapshots') -@click.argument('configfile', type = click.File('r', encoding = 'utf-8')) + +@click.command(short_help="cleanup older snapshots") +@click.argument("configfile", type=click.File("r", encoding="utf-8")) def cleanup(configfile): - config = yaml.load(configfile.read(), Loader = CLoader) - - cols = ['NAME', 'DELETE SNAPSHOT'] - col_align = ('left', 'left') - datasets = get_tree() - cleanup_tasks = get_cleanup_tasks( - datasets, - config['prefix_local'], - config['ignore'], - config['keep_snapshots'] - ) - space_before = int(datasets[0]['AVAIL']) - - table = [] - for name, snapshot_name in cleanup_tasks: - table.append([ - name, - snapshot_name - ]) - print(datasets[0]) - - print(tabulate( - table, - headers = cols, - tablefmt = 'github', - colalign = col_align - )) - print('%s available' % humanize_size(space_before, add_color = True)) - - click.confirm('Do you want to continue?', abort = True) - - for name, snapshot_name in cleanup_tasks: - delete_snapshot( - config['prefix_local'] + name, - snapshot_name, - # debug = True - ) - - space_after = int(get_tree()[0]['AVAIL']) - print('%s available' % humanize_size(space_before, add_color = True)) - print('%s freed' % 
humanize_size(space_before - space_before, add_color = True)) + config = Config.from_fd(configfile) + + source_zpool = Zpool.from_config("source", config=config) + target_zpool = Zpool.from_config("target", config=config) + available_before = Zpool.available("source", config=config) + + transactions = source_zpool.get_cleanup_transactions(target_zpool) + + if len(transactions) == 0: + print("nothing to do") + return + transactions.print_table() + + click.confirm("Do you want to continue?", abort=True) + + transactions.run() + + WAIT = 10 + print(f"waiting {WAIT:d} seconds ...") + time.sleep(WAIT) + available_after = Zpool.available("source", config=config) + print( + f"{humanize_size(available_after, add_color = True):s} available, {humanize_size(available_after - available_before, add_color = True):s} freed" + ) diff --git a/src/abgleich/cli/compare.py b/src/abgleich/cli/compare.py index 44fe8a8..18afc6a 100644 --- a/src/abgleich/cli/compare.py +++ b/src/abgleich/cli/compare.py @@ -6,9 +6,9 @@ zfs sync tool https://github.com/pleiszenburg/abgleich - src/abgleich/cli/compare.py: compare command entry point + src/abgleich/cli/compare.py: compare command entry point - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. 
Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -30,50 +30,22 @@ # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ import click -from tabulate import tabulate -import yaml -from yaml import CLoader -from ..io import colorize -from ..zfs import ( - compare_trees, - get_tree - ) +from ..core.config import Config +from ..core.zpool import Zpool # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -@click.command(short_help = 'compare dataset trees') -@click.argument('configfile', type = click.File('r', encoding = 'utf-8')) + +@click.command(short_help="compare dataset trees") +@click.argument("configfile", type=click.File("r", encoding="utf-8")) def compare(configfile): - config = yaml.load(configfile.read(), Loader = CLoader) - datasets_local = get_tree() - datasets_remote = get_tree(config['host']) - diff = compare_trees( - datasets_local, - config['prefix_local'], - datasets_remote, - config['prefix_remote'] - ) - table = [] - for element in diff: - element = ['' if item == False else item for item in element] - element = ['X' if item == True else item for item in element] - element = ['- ' + item.split('@')[1] if '@' in item else item for item in element] - if element[1:] == ['X', '']: - element[1] = colorize(element[1], 'red') - elif element[1:] == ['X', 'X']: - element[1], element[2] = colorize(element[1], 'green'), colorize(element[2], 'green') - elif element[1:] == ['', 'X']: - element[2] = colorize(element[2], 'blue') - if not element[0].startswith('- '): - element[0] = colorize(element[0], 'white') - else: - element[0] = colorize(element[0], 'grey') - table.append(element) - print(tabulate( - table, - headers = ['NAME', 'LOCAL', 'REMOTE'], - tablefmt = 'github' - )) + + config = 
Config.from_fd(configfile) + + source_zpool = Zpool.from_config("source", config=config) + target_zpool = Zpool.from_config("target", config=config) + + source_zpool.print_comparison_table(target_zpool) diff --git a/src/abgleich/cli/snap.py b/src/abgleich/cli/snap.py index a983865..e3d1d04 100644 --- a/src/abgleich/cli/snap.py +++ b/src/abgleich/cli/snap.py @@ -6,9 +6,9 @@ zfs sync tool https://github.com/pleiszenburg/abgleich - src/abgleich/cli/snap.py: snap command entry point + src/abgleich/cli/snap.py: snap command entry point - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -30,56 +30,27 @@ # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ import click -from tabulate import tabulate -import yaml -from yaml import CLoader -from ..io import colorize, humanize_size -from ..zfs import ( - create_snapshot, - get_tree, - get_snapshot_tasks, - ) +from ..core.config import Config +from ..core.zpool import Zpool # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -@click.command(short_help = 'create snapshots of changed datasets for backups') -@click.argument('configfile', type = click.File('r', encoding = 'utf-8')) + +@click.command(short_help="create snapshots of changed datasets for backups") +@click.argument("configfile", type=click.File("r", encoding="utf-8")) def snap(configfile): - config = yaml.load(configfile.read(), Loader = CLoader) - - cols = ['NAME', 'written', 'FUTURE SNAPSHOT'] - col_align = ('left', 'right') - datasets = get_tree() - snapshot_tasks = get_snapshot_tasks( - datasets, - config['prefix_local'], - config['ignore'] - ) - - table = [] - for name, written, snapshot_name in snapshot_tasks: - table.append([ 
- name, - humanize_size(written, add_color = True), - snapshot_name - ]) - - print(tabulate( - table, - headers = cols, - tablefmt = 'github', - colalign = col_align - )) - - click.confirm('Do you want to continue?', abort = True) - - for name, _, snapshot_name in snapshot_tasks: - create_snapshot( - config['prefix_local'] + name, - snapshot_name, - # debug = True - ) + zpool = Zpool.from_config("source", config=Config.from_fd(configfile)) + transactions = zpool.get_snapshot_transactions() + + if len(transactions) == 0: + print("nothing to do") + return + transactions.print_table() + + click.confirm("Do you want to continue?", abort=True) + + transactions.run() diff --git a/src/abgleich/cli/tree.py b/src/abgleich/cli/tree.py index db6cbaf..9ff9eba 100644 --- a/src/abgleich/cli/tree.py +++ b/src/abgleich/cli/tree.py @@ -6,9 +6,9 @@ zfs sync tool https://github.com/pleiszenburg/abgleich - src/abgleich/cli/tree.py: tree command entry point + src/abgleich/cli/tree.py: tree command entry point - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M. 
Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -30,37 +30,19 @@ # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ import click -from tabulate import tabulate -from ..io import colorize, humanize_size -from ..zfs import get_tree +from ..core.config import Config +from ..core.zpool import Zpool # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -@click.command(short_help = 'show dataset tree') -@click.argument('host', default = 'localhost', type = str) -def tree(host): - cols = ['NAME', 'USED', 'REFER', 'compressratio'] - col_align = ('left', 'right', 'right', 'decimal') - size_cols = ['USED', 'REFER'] - datasets = get_tree(host if host != 'localhost' else None) - table = [] - for dataset in datasets: - table.append([dataset[col] for col in cols]) - for snapshot in dataset['SNAPSHOTS']: - table.append(['- ' + snapshot['NAME']] + [snapshot[col]for col in cols[1:]]) - for row in table: - for col in [1, 2]: - row[col] = humanize_size(int(row[col]), add_color = True) - if not row[0].startswith('- '): - row[0] = colorize(row[0], 'white') - else: - row[0] = colorize(row[0], 'grey') - print(tabulate( - table, - headers = cols, - tablefmt = 'github', - colalign = col_align - )) + +@click.command(short_help="show dataset tree") +@click.argument("configfile", type=click.File("r", encoding="utf-8")) +@click.argument("side", default="source", type=str) +def tree(configfile, side): + + zpool = Zpool.from_config(side, config=Config.from_fd(configfile)) + zpool.print_table() diff --git a/src/abgleich/cmd.py b/src/abgleich/cmd.py deleted file mode 100644 index 38544f4..0000000 --- a/src/abgleich/cmd.py +++ /dev/null @@ -1,93 +0,0 @@ -# -*- coding: utf-8 -*- - -""" - -ABGLEICH -zfs sync tool 
-https://github.com/pleiszenburg/abgleich - - src/abgleich/cmd.py: Subprocess wrappers - - Copyright (C) 2019 Sebastian M. Ernst - - -The contents of this file are subject to the GNU Lesser General Public License -Version 2.1 ("LGPL" or "License"). You may not use this file except in -compliance with the License. You may obtain a copy of the License at -https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt -https://github.com/pleiszenburg/abgleich/blob/master/LICENSE - -Software distributed under the License is distributed on an "AS IS" basis, -WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the -specific language governing rights and limitations under the License. - - -""" - - -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -# IMPORT -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ - -import subprocess - -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -# ROUTINES -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ - -def run_command(cmd_list, debug = False): - if debug: - print_commands(cmd_list) - return - proc = subprocess.Popen(cmd_list, stdout = subprocess.PIPE, stderr = subprocess.PIPE) - outs, errs = proc.communicate() - status_value = not bool(proc.returncode) - output, errors = outs.decode('utf-8'), errs.decode('utf-8') - if len(errors.strip()) != 0 or not status_value: - print(output) - print(errors) - raise - return output - -def run_chain_command(cmd_list_1, cmd_list_2, debug = False): - if debug: - print_commands(cmd_list_1, cmd_list_2) - return - proc_1 = subprocess.Popen( - cmd_list_1, stdout = subprocess.PIPE, stderr = subprocess.PIPE - ) - proc_2 = subprocess.Popen( - cmd_list_2, stdin = proc_1.stdout, stdout = subprocess.PIPE, stderr = subprocess.PIPE - ) - outs_2, errs_2 = 
proc_2.communicate() - status_value_2 = not bool(proc_2.returncode) - _, errs_1 = proc_1.communicate() - status_value_1 = not bool(proc_1.returncode) - output_2, errors_2 = outs_2.decode('utf-8'), errs_2.decode('utf-8') - errors_1 = errs_1.decode('utf-8') - if any([ - len(errors_1.strip()) != 0, - not status_value_1, - len(errors_2.strip()) != 0, - not status_value_2 - ]): - print(errors_1) - print(output_2) - print(errors_2) - raise - return output_2 - -def print_commands(*args): - commands = [' '.join(cmd_list) for cmd_list in args] - print('#> ' + ' | '.join(commands)) - -def ssh_command(host, cmd_list, compression = False): - return get_ssh_prefix(compression) + [ - host, ' '.join([item.replace(' ', '\\ ') for item in cmd_list]) - ] - -def get_ssh_prefix(compression = False): - return [ - 'ssh', '-T', '-c', 'aes256-gcm@openssh.com', '-o', - 'Compression=yes' if compression else 'Compression=no' - ] diff --git a/src/abgleich/core/__init__.py b/src/abgleich/core/__init__.py new file mode 100644 index 0000000..edf8425 --- /dev/null +++ b/src/abgleich/core/__init__.py @@ -0,0 +1,25 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/__init__.py: Core package root + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. 
+ + +""" diff --git a/src/abgleich/core/abc.py b/src/abgleich/core/abc.py new file mode 100644 index 0000000..7b30159 --- /dev/null +++ b/src/abgleich/core/abc.py @@ -0,0 +1,79 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/abc.py: Abstract base classes + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. + + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import abc + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASSES +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +class CloneABC(abc.ABC): + pass + + +class CommandABC(abc.ABC): + pass + + +class ComparisonABC(abc.ABC): + pass + + +class ComparisonItemABC(abc.ABC): + pass + + +class DatasetABC(abc.ABC): + pass + + +class PropertyABC(abc.ABC): + pass + + +class SnapshotABC(abc.ABC): + pass + + +class TransactionABC(abc.ABC): + pass + + +class TransactionListABC(abc.ABC): + pass + + +class TransactionMetaABC(abc.ABC): + pass + + +class ZpoolABC(abc.ABC): + pass diff --git a/src/abgleich/core/command.py b/src/abgleich/core/command.py new file mode 100644 index 
0000000..ba43f7e --- /dev/null +++ b/src/abgleich/core/command.py @@ -0,0 +1,131 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/command.py: Sub-process wrapper for commands + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. + + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import subprocess +import typing + +import typeguard + +from .abc import CommandABC + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Command(CommandABC): + def __init__(self, cmd: typing.List[str]): + + self._cmd = cmd.copy() + + def __str__(self) -> str: + + return " ".join(self._cmd) + + def run(self): + + proc = subprocess.Popen( + self.cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE + ) + output, errors = proc.communicate() + status = not bool(proc.returncode) + output, errors = output.decode("utf-8"), errors.decode("utf-8") + + if not status or len(errors.strip()) > 0: + raise SystemError("command failed", self.cmd, output, errors) + + return output, errors + 
+ def run_pipe(self, other: CommandABC): + + proc_1 = subprocess.Popen( + self.cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE + ) + proc_2 = subprocess.Popen( + other.cmd, + stdin=proc_1.stdout, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + ) + + output_2, errors_2 = proc_2.communicate() + status_2 = not bool(proc_2.returncode) + _, errors_1 = proc_1.communicate() + status_1 = not bool(proc_1.returncode) + + errors_1 = errors_1.decode("utf-8") + output_2, errors_2 = output_2.decode("utf-8"), errors_2.decode("utf-8") + + if any( + ( + not status_1, + len(errors_1.strip()) > 0, + not status_2, + len(errors_2.strip()) > 0, + ) + ): + raise SystemError( + "command pipe failed", self.cmd, other.cmd, errors_1, output_2, errors_2 + ) + + return errors_1, output_2, errors_2 + + @property + def cmd(self) -> typing.List[str]: + + return self._cmd.copy() + + @classmethod + def on_side( + cls, cmd: typing.List[str], side: str, config: typing.Dict + ) -> CommandABC: + + if config[side]["host"] == "localhost": + return cls(cmd) + return cls.with_ssh(cmd, side_config=config[side], ssh_config=config["ssh"]) + + @classmethod + def with_ssh( + cls, cmd: typing.List[str], side_config: typing.Dict, ssh_config: typing.Dict + ) -> CommandABC: + + cmd_str = " ".join([item.replace(" ", "\\ ") for item in cmd]) + cmd = [ + "ssh", + "-T", # Disable pseudo-terminal allocation + "-o", + "Compression=yes" if ssh_config["compression"] else "Compression=no", + ] + if ssh_config["cipher"] is not None: + cmd.extend(("-c", ssh_config["cipher"])) + cmd.extend([f'{side_config["user"]:s}@{side_config["host"]:s}', cmd_str]) + + return cls(cmd) diff --git a/src/abgleich/core/comparison.py b/src/abgleich/core/comparison.py new file mode 100644 index 0000000..b1b541a --- /dev/null +++ b/src/abgleich/core/comparison.py @@ -0,0 +1,420 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/comparison.py: ZFS comparison + + 
Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. + + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import itertools +import typing + +import typeguard + +from .abc import ComparisonABC, ComparisonItemABC, DatasetABC, SnapshotABC, ZpoolABC + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# TYPING +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +ComparisonParentTypes = typing.Union[ + ZpoolABC, DatasetABC, None, +] +ComparisonMergeTypes = typing.Union[ + typing.Generator[DatasetABC, None, None], typing.Generator[SnapshotABC, None, None], +] +ComparisonItemType = typing.Union[ + DatasetABC, SnapshotABC, None, +] +ComparisonStrictItemType = typing.Union[ + DatasetABC, SnapshotABC, +] + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Comparison(ComparisonABC): + def __init__( + self, + a: ComparisonParentTypes, + b: ComparisonParentTypes, + merged: typing.List[ComparisonItemABC], + ): + + assert a is 
not None or b is not None + if a is not None and b is not None: + assert type(a) == type(b) + + self._a, self._b, self._merged = a, b, merged + + @property + def a(self) -> ComparisonParentTypes: + + return self._a + + @property + def a_head(self) -> typing.List[ComparisonStrictItemType]: + + return self._head( + source=[item.a for item in self._merged], + target=[item.b for item in self._merged], + ) + + @property + def a_overlap_tail(self) -> typing.List[ComparisonStrictItemType]: + + return self._overlap_tail( + source=[item.a for item in self._merged], + target=[item.b for item in self._merged], + ) + + @property + def b(self) -> ComparisonParentTypes: + + return self._b + + @property + def b_head(self) -> typing.List[ComparisonStrictItemType]: + + return self._head( + source=[item.b for item in self._merged], + target=[item.a for item in self._merged], + ) + + @property + def b_overlap_tail(self) -> typing.List[ComparisonStrictItemType]: + + return self._overlap_tail( + source=[item.b for item in self._merged], + target=[item.a for item in self._merged], + ) + + @property + def merged(self) -> typing.Generator[ComparisonItemABC, None, None]: + + return (item for item in self._merged) + + @classmethod + def _head( + cls, + source: typing.List[ComparisonItemType], + target: typing.List[ComparisonItemType], + ) -> typing.List[ComparisonItemType]: + """ + Returns new elements from source. + If target is empty, returns source. + If head of target and head of source are identical, returns empty list. 
+ """ + + source, target = cls._strip_none(source), cls._strip_none(target) + + if any((element is None for element in source)): + raise ValueError("source is not consecutive") + if any((element is None for element in target)): + raise ValueError("target is not consecutive") + + if len(source) == 0: + raise ValueError("source must not be empty") + + if len(set([item.name for item in source])) != len(source): + raise ValueError("source contains doublicate entires") + if len(set([item.name for item in target])) != len(target): + raise ValueError("target contains doublicate entires") + + if len(target) == 0: + return source # all of source, target is empty + + try: + source_index = [item.name for item in source].index(target[-1].name) + except ValueError: + raise ValueError("last target element not in source") + + old_source = source[: source_index + 1] + + if len(old_source) <= len(target): + if target[-len(old_source) :] != old_source: + raise ValueError( + "no clean match between end of target and beginning of source" + ) + else: + if target != source[source_index + 1 - len(target) : source_index + 1]: + raise ValueError( + "no clean match between entire target and beginning of source" + ) + + return source[source_index + 1 :] + + @classmethod + def _overlap_tail( + cls, + source: typing.List[ComparisonItemType], + target: typing.List[ComparisonItemType], + ) -> typing.List[ComparisonItemType]: + """ + Overlap must include first element of source. 
+ """ + + source, target = cls._strip_none(source), cls._strip_none(target) + + if len(source) == 0 or len(target) == 0: + return [] + + if any((element is None for element in source)): + raise ValueError("source is not consecutive") + if any((element is None for element in target)): + raise ValueError("target is not consecutive") + + source_names = {item.name for item in source} + target_names = {item.name for item in target} + + if len(source_names) != len(source): + raise ValueError("source contains doublicate entires") + if len(target_names) != len(target): + raise ValueError("target contains doublicate entires") + + overlap_tail = [] + for item in source: + if item.name not in target_names: + break + overlap_tail.append(item) + + if len(overlap_tail) == 0: + return overlap_tail + + target_index = target.index(overlap_tail[0]) + if overlap_tail != target[target_index : target_index + len(overlap_tail)]: + raise ValueError("no clean match in overlap area") + + return overlap_tail + + @classmethod + def _strip_none( + cls, elements: typing.List[ComparisonItemType] + ) -> typing.List[ComparisonItemType]: + + elements = cls._left_strip_none(elements) # left strip + elements.reverse() # flip into reverse + elements = cls._left_strip_none(elements) # right strip + elements.reverse() # flip back + + return elements + + @staticmethod + def _left_strip_none( + elements: typing.List[ComparisonItemType], + ) -> typing.List[ComparisonItemType]: + + return list(itertools.dropwhile(lambda element: element is None, elements)) + + @staticmethod + def _single_items( + items_a: typing.Union[ComparisonMergeTypes, None], + items_b: typing.Union[ComparisonMergeTypes, None], + ) -> typing.List[ComparisonItemABC]: + + assert items_a is not None or items_b is not None + + if items_a is None: + return [ComparisonItem(None, item) for item in items_b] + return [ComparisonItem(item, None) for item in items_a] + + @staticmethod + def _merge_datasets( + items_a: typing.Generator[DatasetABC, 
None, None], + items_b: typing.Generator[DatasetABC, None, None], + ) -> typing.List[ComparisonItemABC]: + + items_a = {item.subname: item for item in items_a} + items_b = {item.subname: item for item in items_b} + + names = list(items_a.keys() | items_b.keys()) + merged = [ + ComparisonItem(items_a.get(name, None), items_b.get(name, None)) + for name in names + ] + merged.sort(key=lambda item: item.get_item().name) + + return merged + + @classmethod + def from_zpools( + cls, + zpool_a: typing.Union[ZpoolABC, None], + zpool_b: typing.Union[ZpoolABC, None], + ) -> ComparisonABC: + + assert zpool_a is not None or zpool_b is not None + + if zpool_a is None or zpool_b is None: + return cls( + a=zpool_a, + b=zpool_b, + merged=cls._single_items( + getattr(zpool_a, "datasets", None), + getattr(zpool_b, "datasets", None), + ), + ) + + assert zpool_a is not zpool_b + assert zpool_a != zpool_b + + return cls( + a=zpool_a, + b=zpool_b, + merged=cls._merge_datasets(zpool_a.datasets, zpool_b.datasets), + ) + + @staticmethod + def _merge_snapshots( + items_a: typing.Generator[SnapshotABC, None, None], + items_b: typing.Generator[SnapshotABC, None, None], + ) -> typing.List[ComparisonItemABC]: + + items_a = list(items_a) + items_b = list(items_b) + names_a = [item.name for item in items_a] + names_b = [item.name for item in items_b] + + assert len(set(names_a)) == len(items_a) # unique names + assert len(set(names_b)) == len(items_b) # unique names + + if len(items_a) == 0 and len(items_b) == 0: + return [] + if len(items_a) == 0: + return [ComparisonItem(None, item) for item in items_b] + if len(items_b) == 0: + return [ComparisonItem(item, None) for item in items_a] + + try: + start_b = names_a.index(names_b[0]) + except ValueError: + start_b = None + try: + start_a = names_b.index(names_a[0]) + except ValueError: + start_a = None + + assert start_a is not None or start_b is not None # overlap + + prefix_a = [] if start_a is None else [None for _ in range(start_a)] + prefix_b = 
[] if start_b is None else [None for _ in range(start_b)] + items_a = prefix_a + items_a + items_b = prefix_b + items_b + suffix_a = ( + [] + if len(items_a) >= len(items_b) + else [None for _ in range(len(items_b) - len(items_a))] + ) + suffix_b = ( + [] + if len(items_b) >= len(items_a) + else [None for _ in range(len(items_a) - len(items_b))] + ) + items_a = items_a + suffix_a + items_b = items_b + suffix_b + + assert len(items_a) == len(items_b) + + alt_a, alt_b, state_a, state_b = 0, 0, False, False + merged = [] + for item_a, item_b in zip(items_a, items_b): + new_state_a, new_state_b = item_a is not None, item_b is not None + if new_state_a != state_a: + alt_a, state_a = alt_a + 1, new_state_a + if alt_a > 2: + raise ValueError("gap in snapshot series") + if new_state_b != state_b: + alt_b, state_b = alt_b + 1, new_state_b + if alt_b > 2: + raise ValueError("gap in snapshot series") + if state_a and state_b: + if item_a.name != item_b.name: + raise ValueError("inconsistent snapshot names") + merged.append(ComparisonItem(item_a, item_b)) + + return merged + + @classmethod + def from_datasets( + cls, + dataset_a: typing.Union[DatasetABC, None], + dataset_b: typing.Union[DatasetABC, None], + ) -> ComparisonABC: + + assert dataset_a is not None or dataset_b is not None + + if dataset_a is None or dataset_b is None: + return cls( + a=dataset_a, + b=dataset_b, + merged=cls._single_items( + getattr(dataset_a, "snapshots", None), + getattr(dataset_b, "snapshots", None), + ), + ) + + assert dataset_a is not dataset_b + assert dataset_a == dataset_b + + return cls( + a=dataset_a, + b=dataset_b, + merged=cls._merge_snapshots(dataset_a.snapshots, dataset_b.snapshots), + ) + + +@typeguard.typechecked +class ComparisonItem(ComparisonItemABC): + def __init__(self, a: ComparisonItemType, b: ComparisonItemType): + + assert a is not None or b is not None + if a is not None and b is not None: + assert type(a) == type(b) + + self._a, self._b = a, b + + def get_item(self) -> 
ComparisonStrictItemType: + + if self._a is not None: + return self._a + return self._b + + @property + def complete(self) -> bool: + + return self._a is not None and self._b is not None + + @property + def a(self) -> ComparisonItemType: + + return self._a + + @property + def b(self) -> ComparisonItemType: + + return self._b diff --git a/src/abgleich/core/config.py b/src/abgleich/core/config.py new file mode 100644 index 0000000..f207f33 --- /dev/null +++ b/src/abgleich/core/config.py @@ -0,0 +1,86 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/config.py: Handles configuration data + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. 
+ + +""" + + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import typing + +import typeguard +import yaml +from yaml import CLoader + +from .lib import valid_name + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Config(dict): + @classmethod + def from_fd(cls, fd: typing.TextIO): + + ssh_schema = { + "compression": lambda v: isinstance(v, bool), + "cipher": lambda v: isinstance(v, str) or v is None, + } + + side_schema = { + "zpool": lambda v: isinstance(v, str) and len(v) > 0, + "prefix": lambda v: isinstance(v, str) or v is None, + "host": lambda v: isinstance(v, str) and len(v) > 0, + "user": lambda v: isinstance(v, str) or v is None, + } + + root_schema = { + "source": lambda v: cls._validate(data=v, schema=side_schema), + "target": lambda v: cls._validate(data=v, schema=side_schema), + "keep_snapshots": lambda v: isinstance(v, int) and v >= 1, + "suffix": lambda v: v is None or (isinstance(v, str) and valid_name(v)), + "digits": lambda v: isinstance(v, int) and v >= 1, + "ignore": lambda v: isinstance(v, list) + and all((isinstance(item, str) and len(item) > 0 for item in v)), + "ssh": lambda v: cls._validate(data=v, schema=ssh_schema), + } + + config = yaml.load(fd.read(), Loader=CLoader) + cls._validate(data=config, schema=root_schema) + return cls(config) + + @classmethod + def _validate(cls, data: typing.Dict, schema: typing.Dict): + + for field, validator in schema.items(): + if field not in data.keys(): + raise KeyError(f'missing configuration field "{field:s}"') + if not validator(data[field]): + raise ValueError(f'invalid value in field "{field:s}"') + + return True 
diff --git a/src/abgleich/core/dataset.py b/src/abgleich/core/dataset.py new file mode 100644 index 0000000..c1451aa --- /dev/null +++ b/src/abgleich/core/dataset.py @@ -0,0 +1,207 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/dataset.py: ZFS dataset + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. + + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import datetime +import typing + +import typeguard + +from .abc import DatasetABC, PropertyABC, TransactionABC, SnapshotABC +from .command import Command +from .lib import root +from .property import Property +from .transaction import Transaction, TransactionMeta +from .snapshot import Snapshot + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Dataset(DatasetABC): + def __init__( + self, + name: str, + properties: typing.Dict[str, PropertyABC], + snapshots: typing.List[SnapshotABC], + side: str, + config: typing.Dict, + ): + + self._name = name + self._properties = properties + 
self._snapshots = snapshots + self._side = side + self._config = config + + self._root = root(config[side]["zpool"], config[side]["prefix"]) + + assert self._name.startswith(self._root) + self._subname = self._name[len(self._root) :].strip("/") + + def __eq__(self, other: DatasetABC) -> bool: + + return self.subname == other.subname + + def __len__(self) -> int: + + return len(self._snapshots) + + def __getitem__(self, key: typing.Union[str, int, slice]) -> PropertyABC: + + if isinstance(key, str): + return self._properties[key] + return self._snapshots[key] + + @property + def changed(self) -> bool: + + if len(self) == 0: + return True + if self._properties["written"].value == 0: + return False + if self._properties["written"].value > (1024 ** 2): + return True + if self._properties["type"].value == "volume": + return True + + output, _ = Command.on_side( + ["zfs", "diff", f"{self._name:s}@{self._snapshots[-1].name:s}"], + self._side, + self._config, + ).run() + return len(output.strip(" \t\n")) > 0 + + @property + def name(self) -> str: + + return self._name + + @property + def subname(self) -> str: + + return self._subname + + @property + def snapshots(self) -> typing.Generator[SnapshotABC, None, None]: + + return (snapshot for snapshot in self._snapshots) + + @property + def root(self) -> str: + + return self._root + + def get_snapshot_transaction(self) -> TransactionABC: + + snapshot_name = self._new_snapshot_name() + + return Transaction( + TransactionMeta( + type="snapshot", + dataset_subname=self._subname, + snapshot_name=snapshot_name, + written=self._properties["written"].value, + ), + [ + Command.on_side( + ["zfs", "snapshot", f"{self._name:s}@{snapshot_name:s}"], + self._side, + self._config, + ) + ], + ) + + def _new_snapshot_name(self) -> str: + + today = datetime.datetime.now().strftime("%Y%m%d") + max_snapshots = (10 ** self._config["digits"]) - 1 + suffix = self._config["suffix"] if self._config["suffix"] is not None else "" + + todays_names = [ + 
snapshot.name + for snapshot in self._snapshots + if all( + ( + snapshot.name.startswith(today), + snapshot.name.endswith(suffix), + len(snapshot.name) + == len(today) + self._config["digits"] + len(suffix), + ) + ) + ] + todays_numbers = [ + int(name[len(today) : len(today) + self._config["digits"]]) + for name in todays_names + if name[len(today) : len(today) + self._config["digits"]].isnumeric() + ] + if len(todays_numbers) != 0: + todays_numbers.sort() + new_number = todays_numbers[-1] + 1 + if new_number > max_snapshots: + raise ValueError(f"more than {max_snapshots:d} snapshots per day") + else: + new_number = 1 + + return f"{today:s}{new_number:0{self._config['digits']:d}d}{suffix:s}" + + @classmethod + def from_entities( + cls, + name: str, + entities: typing.OrderedDict[str, typing.List[typing.List[str]]], + side: str, + config: typing.Dict, + ) -> DatasetABC: + + properties = { + property.name: property + for property in (Property.from_params(*params) for params in entities[name]) + } + entities.pop(name) + + snapshots = [] + snapshots.extend( + ( + Snapshot.from_entity( + snapshot_name, entities[snapshot_name], snapshots, side, config, + ) + for snapshot_name in entities.keys() + ) + ) + + return cls( + name=name, + properties=properties, + snapshots=snapshots, + side=side, + config=config, + ) diff --git a/src/abgleich/io.py b/src/abgleich/core/io.py similarity index 54% rename from src/abgleich/io.py rename to src/abgleich/core/io.py index cc2db54..425d2eb 100644 --- a/src/abgleich/io.py +++ b/src/abgleich/core/io.py @@ -6,9 +6,9 @@ zfs sync tool https://github.com/pleiszenburg/abgleich - src/abgleich/io.py: Command line IO + src/abgleich/core/io.py: Command line IO - Copyright (C) 2019 Sebastian M. Ernst + Copyright (C) 2019-2020 Sebastian M.
Ernst The contents of this file are subject to the GNU Lesser General Public License @@ -31,47 +31,49 @@ # https://en.wikipedia.org/wiki/ANSI_escape_code c = { - 'RESET': '\033[0;0m', - 'BOLD': '\033[;1m', - 'REVERSE': '\033[;7m', - 'GREY': '\033[1;30m', - 'RED': '\033[1;31m', - 'GREEN': '\033[1;32m', - 'YELLOW': '\033[1;33m', - 'BLUE': '\033[1;34m', - 'MAGENTA': '\033[1;35m', - 'CYAN': '\033[1;36m', - 'WHITE': '\033[1;37m' - } + "RESET": "\033[0;0m", + "BOLD": "\033[;1m", + "REVERSE": "\033[;7m", + "GREY": "\033[1;30m", + "RED": "\033[1;31m", + "GREEN": "\033[1;32m", + "YELLOW": "\033[1;33m", + "BLUE": "\033[1;34m", + "MAGENTA": "\033[1;35m", + "CYAN": "\033[1;36m", + "WHITE": "\033[1;37m", +} # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ # ROUTINES # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + def colorize(text, col): - return c.get(col.upper(), c['GREY']) + text + c['RESET'] - -def humanize_size(size, add_color = False): - - suffix = 'B' - - for unit, color in ( - ('', 'cyan'), - ('Ki', 'green'), - ('Mi', 'yellow'), - ('Gi', 'red'), - ('Ti', 'magenta'), - ('Pi', 'white'), - ('Ei', 'white'), - ('Zi', 'white'), - ('Yi', 'white') - ): - if abs(size) < 1024.0: - text = '%3.1f %s%s' % (size, unit, suffix) - if add_color: - text = colorize(text, color) - return text - size /= 1024.0 - - raise ValueError('"size" too large') + return c.get(col.upper(), c["GREY"]) + text + c["RESET"] + + +def humanize_size(size, add_color=False): + + suffix = "B" + + for unit, color in ( + ("", "cyan"), + ("Ki", "green"), + ("Mi", "yellow"), + ("Gi", "red"), + ("Ti", "magenta"), + ("Pi", "white"), + ("Ei", "white"), + ("Zi", "white"), + ("Yi", "white"), + ): + if abs(size) < 1024.0: + text = "%3.1f %s%s" % (size, unit, suffix) + if add_color: + text = colorize(text, color) + return text + size /= 1024.0 + + raise ValueError('"size" too large') diff --git 
a/src/abgleich/core/lib.py b/src/abgleich/core/lib.py new file mode 100644 index 0000000..167e803 --- /dev/null +++ b/src/abgleich/core/lib.py @@ -0,0 +1,73 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/lib.py: ZFS library + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. + + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import re +import typing + +import typeguard + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# ROUTINES +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +def join(*args: str) -> str: + + if len(args) < 2: + raise ValueError("not enough elements to join") + + args = [arg.strip("/ \t\n") for arg in args] + + if any((len(arg) == 0 for arg in args)): + raise ValueError("can not join empty path elements") + + return "/".join(args) + + +@typeguard.typechecked +def root(zpool: str, prefix: typing.Union[str, None]) -> str: + + if prefix is None: + return zpool + return join(zpool, prefix) + + +_name_re = re.compile("^[A-Za-z0-9_]+$") + + +@typeguard.typechecked +def valid_name(name: str, 
min_len: int = 1) -> bool: + + assert min_len >= 0 + + if len(name) < min_len: + return False + return bool(_name_re.match(name)) diff --git a/src/abgleich/core/property.py b/src/abgleich/core/property.py new file mode 100644 index 0000000..ee12bdf --- /dev/null +++ b/src/abgleich/core/property.py @@ -0,0 +1,91 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/property.py: ZFS property + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License.
+ + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import typing + +import typeguard + +from .abc import PropertyABC + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# TYPING +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +PropertyTypes = typing.Union[str, int, float, None] + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Property(PropertyABC): + def __init__( + self, name: str, value: PropertyTypes, src: PropertyTypes, + ): + + self._name = name + self._value = value + self._src = src + + @property + def name(self) -> str: + return self._name + + @property + def value(self) -> PropertyTypes: + return self._value + + @property + def src(self) -> PropertyTypes: + return self._src + + @classmethod + def _convert(cls, value: str) -> PropertyTypes: + + value = value.strip() + + if value.isnumeric(): + return int(value) + + if value.strip() == "" or value == "-" or value.lower() == "none": + return None + + try: + return float(value) + except ValueError: + pass + + return value + + @classmethod + def from_params(cls, name, value, src) -> PropertyABC: + + return cls(name=name, value=cls._convert(value), src=cls._convert(src),) diff --git a/src/abgleich/core/snapshot.py b/src/abgleich/core/snapshot.py new file mode 100644 index 0000000..8c200dc --- /dev/null +++ b/src/abgleich/core/snapshot.py @@ -0,0 +1,191 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/snapshot.py: ZFS snapshot + + 
Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. + + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import typing + +import typeguard + +from .abc import PropertyABC, SnapshotABC, TransactionABC +from .command import Command +from .lib import root +from .property import Property +from .transaction import Transaction, TransactionMeta + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Snapshot(SnapshotABC): + def __init__( + self, + name: str, + parent: str, + properties: typing.Dict[str, PropertyABC], + context: typing.List[SnapshotABC], + side: str, + config: typing.Dict, + ): + + self._name = name + self._parent = parent + self._properties = properties + self._context = context + self._side = side + self._config = config + + self._root = root(config[side]["zpool"], config[side]["prefix"]) + + assert self._parent.startswith(self._root) + self._subparent = self._parent[len(self._root) :].strip("/") + + def __eq__(self, other: SnapshotABC) -> bool: + + return self.subparent == other.subparent and self.name == other.name 
+
+    def __getitem__(self, name: str) -> PropertyABC:
+
+        return self._properties[name]
+
+    def get_cleanup_transaction(self) -> TransactionABC:
+
+        assert self._side == "source"
+
+        return Transaction(
+            meta=TransactionMeta(
+                type="cleanup_snapshot",
+                snapshot_subparent=self._subparent,
+                snapshot_name=self._name,
+            ),
+            commands=[
+                Command.on_side(
+                    ["zfs", "destroy", f"{self._parent:s}@{self._name:s}"],
+                    self._side,
+                    self._config,
+                )
+            ],
+        )
+
+    def get_backup_transaction(
+        self, source_dataset: str, target_dataset: str,
+    ) -> TransactionABC:
+
+        assert self._side == "source"
+
+        ancestor = self.ancestor
+
+        commands = [
+            Command.on_side(
+                ["zfs", "send", "-c", f"{source_dataset:s}@{self.name:s}",]
+                if ancestor is None
+                else [
+                    "zfs",
+                    "send",
+                    "-c",
+                    "-i",
+                    f"{source_dataset:s}@{ancestor.name:s}",
+                    f"{source_dataset:s}@{self.name:s}",
+                ],
+                "source",
+                self._config,
+            ),
+            Command.on_side(
+                ["zfs", "receive", f"{target_dataset:s}"], "target", self._config
+            ),
+        ]
+
+        return Transaction(
+            meta=TransactionMeta(
+                type="push_snapshot"
+                if ancestor is None
+                else "push_snapshot_incremental",
+                snapshot_subparent=self._subparent,
+                ancestor_name="" if ancestor is None else ancestor.name,
+                snapshot_name=self.name,
+            ),
+            commands=commands,
+        )
+
+    @property
+    def name(self) -> str:
+
+        return self._name
+
+    @property
+    def parent(self) -> str:
+
+        return self._parent
+
+    @property
+    def subparent(self) -> str:
+
+        return self._subparent
+
+    @property
+    def ancestor(self) -> typing.Union[None, SnapshotABC]:
+
+        assert self in self._context
+        self_index = self._context.index(self)
+
+        if self_index == 0:
+            return None
+        return self._context[self_index - 1]
+
+    @property
+    def root(self) -> str:
+
+        return self._root
+
+    @classmethod
+    def from_entity(
+        cls,
+        name: str,
+        entity: typing.List[typing.List[str]],
+        context: typing.List[SnapshotABC],
+        side: str,
+        config: typing.Dict,
+    ) -> SnapshotABC:
+
+        properties = {
+            property.name: property
+            for property in (Property.from_params(*params) for params in entity)
+        }
+
+        parent, name = name.split("@")
+
+        return cls(
+            name=name,
+            parent=parent,
+            properties=properties,
+            context=context,
+            side=side,
+            config=config,
+        )
diff --git a/src/abgleich/core/transaction.py b/src/abgleich/core/transaction.py
new file mode 100644
index 0000000..0228380
--- /dev/null
+++ b/src/abgleich/core/transaction.py
@@ -0,0 +1,237 @@
+# -*- coding: utf-8 -*-
+
+"""
+
+ABGLEICH
+zfs sync tool
+https://github.com/pleiszenburg/abgleich
+
+    src/abgleich/core/transaction.py: ZFS transactions
+
+    Copyright (C) 2019-2020 Sebastian M. Ernst
+
+
+The contents of this file are subject to the GNU Lesser General Public License
+Version 2.1 ("LGPL" or "License"). You may not use this file except in
+compliance with the License. You may obtain a copy of the License at
+https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt
+https://github.com/pleiszenburg/abgleich/blob/master/LICENSE
+
+Software distributed under the License is distributed on an "AS IS" basis,
+WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the
+specific language governing rights and limitations under the License.
+ + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +import typing + +from tabulate import tabulate +import typeguard + +from .abc import CommandABC, TransactionABC, TransactionListABC, TransactionMetaABC +from .io import colorize, humanize_size + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Transaction(TransactionABC): + def __init__( + self, meta: TransactionMetaABC, commands: typing.List[CommandABC], + ): + + assert len(commands) in (1, 2) + + self._meta, self._commands = meta, commands + + self._complete = False + self._running = False + self._error = None + + @property + def complete(self) -> bool: + + return self._complete + + @property + def commands(self) -> typing.Tuple[CommandABC]: + + return self._commands + + @property + def error(self) -> typing.Union[Exception, None]: + + return self._error + + @property + def meta(self) -> TransactionMetaABC: + + return self._meta + + @property + def running(self) -> bool: + + return self._running + + def run(self): + + if self._complete: + return + self._running = True + + try: + if len(self._commands) == 1: + output, errors = self._commands[0].run() + else: + errors_1, output_2, errors_2 = self._commands[0].run_pipe( + self._commands[1] + ) + except SystemError as error: + self._error = error + finally: + self._running = False + self._complete = True + + +MetaTypes = typing.Union[str, int, float] +MetaNoneTypes = typing.Union[str, int, float, None] + + +@typeguard.typechecked +class TransactionMeta(TransactionMetaABC): + def __init__(self, **kwargs: MetaTypes): + + self._meta = kwargs + + def __getitem__(self, key: str) -> 
MetaTypes: + + return self._meta[key] + + def __len__(self) -> int: + + return len(self._meta) + + def get(self, key: str) -> MetaNoneTypes: + + return self._meta.get(key, None) + + def keys(self) -> typing.Generator[str, None, None]: + + return (key for key in self._meta.keys()) + + +TransactionIterableTypes = typing.Union[ + typing.Generator[TransactionABC, None, None], + typing.List[TransactionABC], + typing.Tuple[TransactionABC], +] + + +@typeguard.typechecked +class TransactionList(TransactionListABC): + def __init__(self): + + self._transactions = [] + + def __len__(self) -> int: + + return len(self._transactions) + + def append(self, transaction: TransactionABC): + + self._transactions.append(transaction) + + def extend(self, transactions: TransactionIterableTypes): + + self._transactions.extend(transactions) + + def print_table(self): + + if len(self) == 0: + return + + headers = self._table_headers() + colalign = self._table_colalign(headers) + + table = [ + [ + self._table_format_cell(header, transaction.meta.get(header)) + for header in headers + ] + for transaction in self._transactions + ] + + print(tabulate(table, headers=headers, tablefmt="github", colalign=colalign,)) + + @staticmethod + def _table_format_cell(header: str, value: MetaNoneTypes) -> str: + + FORMAT = { + "written": lambda v: humanize_size(v, add_color=True), + } + + return FORMAT.get(header, str)(value) + + @staticmethod + def _table_colalign(headers: typing.List[str]) -> typing.List[str]: + + RIGHT = ("written",) + DECIMAL = tuple() + + colalign = [] + for header in headers: + if header in RIGHT: + colalign.append("right") + elif header in DECIMAL: + colalign.append("decimal") + else: + colalign.append("left") + + return colalign + + def _table_headers(self) -> typing.List[str]: + + headers = set() + for transaction in self._transactions: + keys = list(transaction.meta.keys()) + assert "type" in keys + headers.update(keys) + headers = list(headers) + headers.sort() + + type_index = 
headers.index("type") + if type_index != 0: + headers.pop(type_index) + headers.insert(0, "type") + + return headers + + def run(self): + + for transaction in self._transactions: + + print( + f'({colorize(transaction.meta["type"], "white"):s}) ' + f'{colorize(" | ".join([str(command) for command in transaction.commands]), "yellow"):s}' + ) + + assert not transaction.running + assert not transaction.complete + + transaction.run() + + assert not transaction.running + assert transaction.complete + + if transaction.error is not None: + print(colorize("FAILED", "red")) + raise transaction.error + else: + print(colorize("OK", "green")) diff --git a/src/abgleich/core/zpool.py b/src/abgleich/core/zpool.py new file mode 100644 index 0000000..728ef83 --- /dev/null +++ b/src/abgleich/core/zpool.py @@ -0,0 +1,306 @@ +# -*- coding: utf-8 -*- + +""" + +ABGLEICH +zfs sync tool +https://github.com/pleiszenburg/abgleich + + src/abgleich/core/zpool.py: ZFS zpool + + Copyright (C) 2019-2020 Sebastian M. Ernst + + +The contents of this file are subject to the GNU Lesser General Public License +Version 2.1 ("LGPL" or "License"). You may not use this file except in +compliance with the License. You may obtain a copy of the License at +https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt +https://github.com/pleiszenburg/abgleich/blob/master/LICENSE + +Software distributed under the License is distributed on an "AS IS" basis, +WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the +specific language governing rights and limitations under the License. 
+ + +""" + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# IMPORT +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +from collections import OrderedDict +import typing + +from tabulate import tabulate +import typeguard + +from .abc import ( + ComparisonItemABC, + DatasetABC, + SnapshotABC, + TransactionListABC, + ZpoolABC, +) +from .command import Command +from .comparison import Comparison +from .dataset import Dataset +from .io import colorize, humanize_size +from .lib import join, root +from .property import Property +from .transaction import TransactionList + +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +# CLASS +# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + + +@typeguard.typechecked +class Zpool(ZpoolABC): + def __init__( + self, datasets: typing.List[DatasetABC], side: str, config: typing.Dict, + ): + + self._datasets = datasets + self._side = side + self._config = config + + self._root = root(config[side]["zpool"], config[side]["prefix"]) + + def __eq__(self, other: ZpoolABC) -> bool: + + return self.side == other.side + + @property + def datasets(self) -> typing.Generator[DatasetABC, None, None]: + + return (dataset for dataset in self._datasets) + + @property + def side(self) -> str: + + return self._side + + @property + def root(self) -> str: + + return self._root + + def get_cleanup_transactions(self, other: ZpoolABC) -> TransactionListABC: + + assert self.side == "source" + assert other.side == "target" + + zpool_comparison = Comparison.from_zpools(self, other) + transactions = TransactionList() + + for dataset_item in zpool_comparison.merged: + + if dataset_item.get_item().subname in self._config["ignore"]: + continue + if dataset_item.a is None or dataset_item.b is None: + continue + + dataset_comparison = 
Comparison.from_datasets( + dataset_item.a, dataset_item.b + ) + snapshots = dataset_comparison.a_overlap_tail[ + : -self._config["keep_snapshots"] + ] + + transactions.extend( + (snapshot.get_cleanup_transaction() for snapshot in snapshots) + ) + + return transactions + + def get_backup_transactions(self, other: ZpoolABC) -> TransactionListABC: + + assert self.side == "source" + assert other.side == "target" + + zpool_comparison = Comparison.from_zpools(self, other) + transactions = TransactionList() + + for dataset_item in zpool_comparison.merged: + + if dataset_item.get_item().subname in self._config["ignore"]: + continue + if dataset_item.a is None: + continue + + if dataset_item.b is None: + snapshots = list(dataset_item.a.snapshots) + else: + dataset_comparison = Comparison.from_datasets( + dataset_item.a, dataset_item.b + ) + snapshots = dataset_comparison.a_head + + if len(snapshots) == 0: + continue + + source_dataset = ( + self.root + if len(dataset_item.a.subname) == 0 + else join(self.root, dataset_item.a.subname) + ) + target_dataset = ( + other.root + if len(dataset_item.a.subname) == 0 + else join(other.root, dataset_item.a.subname) + ) + + transactions.extend( + ( + snapshot.get_backup_transaction(source_dataset, target_dataset,) + for snapshot in snapshots + ) + ) + + return transactions + + def get_snapshot_transactions(self) -> TransactionListABC: + + assert self._side == "source" + + transactions = TransactionList() + for dataset in self._datasets: + if dataset.subname in self._config["ignore"]: + continue + if dataset["mountpoint"].value is None: + continue + if dataset.changed: + transactions.append(dataset.get_snapshot_transaction()) + + return transactions + + def print_table(self): + + table = [] + for dataset in self._datasets: + table.append(self._table_row(dataset)) + for snapshot in dataset.snapshots: + table.append(self._table_row(snapshot)) + + print( + tabulate( + table, + headers=("NAME", "USED", "REFER", "compressratio"), + 
tablefmt="github", + colalign=("left", "right", "right", "decimal"), + ) + ) + + @staticmethod + def _table_row(entity: typing.Union[SnapshotABC, DatasetABC]) -> typing.List[str]: + + return [ + "- " + colorize(entity.name, "grey") + if isinstance(entity, SnapshotABC) + else colorize(entity.name, "white"), + humanize_size(entity["used"].value, add_color=True), + humanize_size(entity["referenced"].value, add_color=True), + f'{entity["compressratio"].value:.02f}', + ] + + def print_comparison_table(self, other: ZpoolABC): + + zpool_comparison = Comparison.from_zpools(self, other) + table = [] + + for dataset_item in zpool_comparison.merged: + table.append(self._comparison_table_row(dataset_item)) + if dataset_item.complete: + dataset_comparison = Comparison.from_datasets( + dataset_item.a, dataset_item.b + ) + elif dataset_item.a is not None: + dataset_comparison = Comparison.from_datasets(dataset_item.a, None) + else: + dataset_comparison = Comparison.from_datasets(None, dataset_item.b) + for snapshot_item in dataset_comparison.merged: + table.append(self._comparison_table_row(snapshot_item)) + + print( + tabulate(table, headers=["NAME", self.side, other.side], tablefmt="github",) + ) + + @staticmethod + def _comparison_table_row(item: ComparisonItemABC) -> typing.List[str]: + + entity = item.get_item() + name = entity.name if isinstance(entity, SnapshotABC) else entity.subname + + if item.a is not None and item.b is not None: + a, b = colorize("X", "green"), colorize("X", "green") + elif item.a is None and item.b is not None: + a, b = "", colorize("X", "blue") + elif item.a is not None and item.b is None: + a, b = colorize("X", "red"), "" + + return [ + "- " + colorize(name, "grey") + if isinstance(entity, SnapshotABC) + else colorize(name, "white"), + a, + b, + ] + + @staticmethod + def available(side: str, config: typing.Dict,) -> int: + + output, _ = Command.on_side( + [ + "zfs", + "get", + "available", + "-H", + "-p", + root(config[side]["zpool"], 
config[side]["prefix"]), + ], + side, + config, + ).run() + + return Property.from_params(*output.strip().split("\t")[1:]).value + + @classmethod + def from_config(cls, side: str, config: typing.Dict,) -> ZpoolABC: + + output, _ = Command.on_side( + [ + "zfs", + "get", + "all", + "-r", + "-H", + "-p", + root(config[side]["zpool"], config[side]["prefix"]), + ], + side, + config, + ).run() + output = [ + line.split("\t") for line in output.split("\n") if len(line.strip()) > 0 + ] + entities = OrderedDict((line[0], []) for line in output) + for line_list in output: + entities[line_list[0]].append(line_list[1:]) + + datasets = [ + Dataset.from_entities( + name, + OrderedDict( + (k, v) + for k, v in entities.items() + if k == name or k.startswith(f"{name:s}@") + ), + side, + config, + ) + for name in entities.keys() + if "@" not in name + ] + datasets.sort(key=lambda dataset: dataset.name) + + return cls(datasets=datasets, side=side, config=config,) diff --git a/src/abgleich/zfs/__init__.py b/src/abgleich/zfs/__init__.py deleted file mode 100644 index 5b035d8..0000000 --- a/src/abgleich/zfs/__init__.py +++ /dev/null @@ -1,357 +0,0 @@ -# -*- coding: utf-8 -*- - -""" - -ABGLEICH -zfs sync tool -https://github.com/pleiszenburg/abgleich - - src/abgleich/zfs/__init__.py: ZFS package root - - Copyright (C) 2019 Sebastian M. Ernst - - -The contents of this file are subject to the GNU Lesser General Public License -Version 2.1 ("LGPL" or "License"). You may not use this file except in -compliance with the License. You may obtain a copy of the License at -https://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt -https://github.com/pleiszenburg/abgleich/blob/master/LICENSE - -Software distributed under the License is distributed on an "AS IS" basis, -WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License for the -specific language governing rights and limitations under the License. 
- - -""" - - -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -# IMPORT -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ - -import datetime - -from ..cmd import ( - run_chain_command, - run_command, - ssh_command, - ) - -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -# ROUTINES -# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ - -def compare_trees(tree_a, prefix_a, tree_b, prefix_b): - assert not prefix_a.endswith('/') - assert not prefix_b.endswith('/') - prefix_a += '/' - prefix_b += '/' - subdict_a = { - '/' + dataset['NAME'][len(prefix_a):]: dataset - for dataset in tree_a - if dataset['NAME'].startswith(prefix_a) # or dataset['NAME'] == prefix_a[:-1] - } - subdict_b = { - '/' + dataset['NAME'][len(prefix_b):]: dataset - for dataset in tree_b - if dataset['NAME'].startswith(prefix_b) # or dataset['NAME'] == prefix_b[:-1] - } - tree_names = list(sorted(subdict_a.keys() | subdict_b.keys())) - res = list() - for name in tree_names: - res.append([name, name in subdict_a.keys(), name in subdict_b.keys()]) - res.extend(__merge_snapshots__( - name, - subdict_a[name]['SNAPSHOTS'] if name in subdict_a else list(), - subdict_b[name]['SNAPSHOTS'] if name in subdict_b else list() - )) - return res - -def __merge_snapshots__(dataset_name, snap_a, snap_b): - if len(snap_a) == 0 and len(snap_b) == 0: - return list() - names_a = [snapshot['NAME'] for snapshot in snap_a] - names_b = [snapshot['NAME'] for snapshot in snap_b] - if len(names_a) == 0 and len(names_b) > 0: - return [[dataset_name + '@' + name, False, True] for name in names_b] - if len(names_b) == 0 and len(names_a) > 0: - return [[dataset_name + '@' + name, True, False] for name in names_a] - creations_a = {snapshot['creation']: snapshot for snapshot in snap_a} - creations_b = 
{snapshot['creation']: snapshot for snapshot in snap_b} - creations = list(sorted(creations_a.keys() | creations_b.keys())) - ret = list() - for creation in creations: - in_a = creation in creations_a.keys() - in_b = creation in creations_b.keys() - if in_a: - name = creations_a[creation]['NAME'] - elif in_b: - name = creations_b[creation]['NAME'] - else: - raise ValueError('this should not happen') - if in_a and in_b: - if creations_a[creation]['NAME'] != creations_b[creation]['NAME']: - raise ValueError('snapshot name mismatch for equal creation times') - ret.append([dataset_name + '@' + name, in_a, in_b]) - return ret - -def get_backup_ops(tree_a, prefix_a, tree_b, prefix_b, ignore): - assert not prefix_a.endswith('/') - assert not prefix_b.endswith('/') - prefix_a += '/' - prefix_b += '/' - subdict_a = { - '/' + dataset['NAME'][len(prefix_a):]: dataset - for dataset in tree_a - if dataset['NAME'].startswith(prefix_a) - } - subdict_b = { - '/' + dataset['NAME'][len(prefix_b):]: dataset - for dataset in tree_b - if dataset['NAME'].startswith(prefix_b) - } - tree_names = list(sorted(subdict_a.keys() | subdict_b.keys())) - res = list() - for name in tree_names: - if name in ignore: - continue - dataset_in_a = name in subdict_a.keys() - dataset_in_b = name in subdict_b.keys() - if not dataset_in_a and dataset_in_b: - raise ValueError('no source dataset "%s" - only remote' % name) - if dataset_in_a and not dataset_in_b and len(subdict_a[name]['SNAPSHOTS']) == 0: - raise ValueError('no snapshots in dataset "%s" - can not send' % name) - if dataset_in_a and not dataset_in_b: - res.append([ - 'push_snapshot', - (name, subdict_a[name]['SNAPSHOTS'][0]['NAME']) - ]) - for snapshot_1, snapshot_2 in zip( - subdict_a[name]['SNAPSHOTS'][:-1], - subdict_a[name]['SNAPSHOTS'][1:] - ): - res.append([ - 'push_snapshot_incremental', - (name, snapshot_1['NAME'], snapshot_2['NAME']) - ]) - continue - last_remote_shapshot = subdict_b[name]['SNAPSHOTS'][-1]['NAME'] - source_index = None 
-        for index, source_snapshot in enumerate(subdict_a[name]['SNAPSHOTS']):
-            if source_snapshot['NAME'] == last_remote_shapshot:
-                source_index = index
-                break
-        if source_index is None:
-            raise ValueError('no common snapshots in dataset "%s" - can not send incremental' % name)
-        for snapshot_1, snapshot_2 in zip(
-            subdict_a[name]['SNAPSHOTS'][source_index:-1],
-            subdict_a[name]['SNAPSHOTS'][(source_index + 1):]
-        ):
-            res.append([
-                'push_snapshot_incremental',
-                (name, snapshot_1['NAME'], snapshot_2['NAME'])
-            ])
-
-    return res
-
-def get_cleanup_tasks(tree, prefix, ignore, keep_snapshots):
-
-    res = list()
-    skip = len(prefix)
-
-    for dataset in tree:
-        name = dataset['NAME'][skip:]
-        if name in ignore or len(name) == 0:
-            continue
-        # if dataset['MOUNTPOINT'] == 'none':
-        #     continue
-        if len(dataset['SNAPSHOTS']) <= keep_snapshots:
-            continue
-        del_snapshots = dataset['SNAPSHOTS'][:(-1 * keep_snapshots)]
-        for snapshot in del_snapshots:
-            res.append([name, snapshot['NAME']])
-
-    return res
-
-def get_snapshot_tasks(tree, prefix, ignore):
-
-    res = list()
-    skip = len(prefix)
-    date = datetime.datetime.now().strftime('%Y%m%d')
-    suffix = '_backup'
-
-    def make_name(snapshots):
-        snapshot_names = [snapshot['NAME'] for snapshot in snapshots]
-        for index in range(1, 100):
-            new_name = '%s%02d%s' % (date, index, suffix)
-            if new_name not in snapshot_names:
-                return new_name
-        raise ValueError('more than 99 snapshots per day')
-
-    for dataset in tree:
-        name = dataset['NAME'][skip:]
-        written = int(dataset['written'])
-        if name in ignore or len(name) == 0:
-            continue
-        if dataset['MOUNTPOINT'] == 'none':
-            continue
-        if len(dataset['SNAPSHOTS']) == 0:
-            res.append([name, written, date + '01' + suffix])
-            continue
-        if written == 0:
-            continue
-        if written > (1024 ** 2):
-            res.append([name, written, make_name(dataset['SNAPSHOTS'])])
-            continue
-        if dataset['type'] == 'volume':
-            res.append([name, written, make_name(dataset['SNAPSHOTS'])])
-            continue
-        diff_out = run_command([
-            'zfs', 'diff', dataset['NAME'] + '@' + dataset['SNAPSHOTS'][-1]['NAME']
-        ])
-        if len(diff_out.strip(' \t\n')) > 0:
-            res.append([name, written, make_name(dataset['SNAPSHOTS'])])
-
-    return res
-
-def get_tree(host = None):
-
-    cmd_list = ['zfs', 'list', '-H', '-p']
-    cmd_list_snapshot = ['zfs', 'list', '-t', 'snapshot', '-H', '-p']
-    cmd_list_property = ['zfs', 'get', 'all', '-H', '-p']
-
-    if host is not None:
-        cmd_list = ssh_command(host, cmd_list, compression = True)
-        cmd_list_snapshot = ssh_command(host, cmd_list_snapshot, compression = True)
-        cmd_list_property = ssh_command(host, cmd_list_property, compression = True)
-
-    datasets = parse_table(
-        run_command(cmd_list),
-        ['NAME', 'USED', 'AVAIL', 'REFER', 'MOUNTPOINT']
-    )
-    snapshots = parse_table(
-        run_command(cmd_list_snapshot),
-        ['NAME', 'USED', 'AVAIL', 'REFER', 'MOUNTPOINT']
-    )
-    properties = parse_table(
-        run_command(cmd_list_property),
-        ['NAME', 'PROPERTY', 'VALUE', 'SOURCE']
-    )
-    merge_properties(datasets, snapshots, properties)
-    merge_snapshots_into_datasets(datasets, snapshots)
-
-    return datasets
-
-def merge_properties(datasets, snapshots, properties):
-
-    elements = {dataset['NAME']: dataset for dataset in datasets}
-    elements.update({snapshot['NAME']: snapshot for snapshot in snapshots})
-    for property in properties:
-        elements[property['NAME']][property['PROPERTY']] = property['VALUE']
-
-def merge_snapshots_into_datasets(datasets, snapshots):
-
-    for dataset in datasets:
-        dataset['SNAPSHOTS'] = []
-    datasets_dict = {dataset['NAME']: dataset for dataset in datasets}
-    for snapshot in snapshots:
-        dataset_name, snapshot['NAME'] = snapshot['NAME'].split('@')
-        datasets_dict[dataset_name]['SNAPSHOTS'].append(snapshot)
-
-def parse_table(raw, head):
-
-    table = [item.split('\t') for item in raw.split('\n') if len(item.strip()) > 0]
-    return [{k: v for k, v in zip(head, line)} for line in table]
-
-# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-# ROUTINES: MODIFY
-# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-def create_snapshot(dataset_name, snapshot_name, debug = False):
-    print('CREATING SNAPSHOT %s@%s ...' % (dataset_name, snapshot_name))
-    cmd = ['zfs', 'snapshot', '%s@%s' % (dataset_name, snapshot_name)]
-    run_command(cmd, debug = debug)
-    print('... CREATING SNAPSHOT DONE.')
-
-def delete_snapshot(dataset_name, snapshot_name, debug = False):
-    print('DELETING SNAPSHOT %s@%s ...' % (dataset_name, snapshot_name))
-    cmd = ['zfs', 'destroy', '%s@%s' % (dataset_name, snapshot_name)]
-    run_command(cmd, debug = debug)
-    print('... DELETING SNAPSHOT DONE.')
-
-# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-# ROUTINES: SEND & RECEIVE
-# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-def pull_snapshot(host, src, src_firstsnapshot, dest, debug = False):
-    print('PULLING FIRST %s@%s to %s ...' % (src, src_firstsnapshot, dest))
-    cmd1 = ssh_command(
-        host,
-        ['zfs', 'send', '-c', '%s@%s' % (src, src_firstsnapshot)],
-        compression = False
-    )
-    cmd2 = ['zfs', 'receive', dest]
-    run_chain_command(cmd1, cmd2, debug = debug)
-    print('... PULLING FIRST DONE.')
-
-def pull_snapshot_incremental(host, src, src_a, src_b, dest, debug = False):
-    print('PULLING FOLLOW-UP %s@[%s - %s] to %s ...' % (src, src_a, src_b, dest))
-    cmd1 = ssh_command(
-        host,
-        ['zfs', 'send', '-c', '-i', '%s@%s' % (src, src_a), '%s@%s' % (src, src_b)],
-        compression = False
-    )
-    cmd2 = ['zfs', 'receive', dest]
-    run_chain_command(cmd1, cmd2, debug = debug)
-    print('... PULLING FOLLOW-UP DONE.')
-
-def pull_new(host, dataset_src, dest, debug = False):
-    print('PULLING NEW %s to %s ...' % (dataset_src['NAME'], dest))
-    src = dataset_src['NAME']
-    src_firstsnapshot = dataset_src['SNAPSHOTS'][0]['NAME']
-    src_snapshotpairs = [
-        (a['NAME'], b['NAME'])
-        for a, b in zip(dataset_src['SNAPSHOTS'][:-1], dataset_src['SNAPSHOTS'][1:])
-    ]
-    pull_snapshot(host, src, src_firstsnapshot, dest, debug = debug)
-    for src_a, src_b in src_snapshotpairs:
-        pull_snapshot_incremental(host, src, src_a, src_b, dest, debug = debug)
-    print('... PULLING NEW DONE.')
-
-def push_snapshot(host, src, src_firstsnapshot, dest, debug = False):
-    print('PUSHING FIRST %s@%s to %s ...' % (src, src_firstsnapshot, dest))
-    cmd1 = ['zfs', 'send', '-c', '%s@%s' % (src, src_firstsnapshot)]
-    cmd2 = ssh_command(
-        host,
-        ['zfs', 'receive', dest],
-        compression = False
-    )
-    run_chain_command(cmd1, cmd2, debug = debug)
-    print('... PUSHING FIRST DONE.')
-
-def push_snapshot_incremental(host, src, src_a, src_b, dest, debug = False):
-    print('PUSHING FOLLOW-UP %s@[%s - %s] to %s ...' % (src, src_a, src_b, dest))
-    cmd1 = [
-        'zfs', 'send', '-c',
-        '-i', '%s@%s' % (src, src_a), '%s@%s' % (src, src_b)
-    ]
-    cmd2 = ssh_command(
-        host,
-        ['zfs', 'receive', dest],
-        compression = False
-    )
-    run_chain_command(cmd1, cmd2, debug = debug)
-    print('... PUSHING FOLLOW-UP DONE.')
-
-def push_new(host, dataset_src, dest, debug = False):
-    print('PUSHING NEW %s to %s ...' % (dataset_src['NAME'], dest))
-    src = dataset_src['NAME']
-    src_firstsnapshot = dataset_src['SNAPSHOTS'][0]['NAME']
-    src_snapshotpairs = [
-        (a['NAME'], b['NAME'])
-        for a, b in zip(dataset_src['SNAPSHOTS'][:-1], dataset_src['SNAPSHOTS'][1:])
-    ]
-    push_snapshot(host, src, src_firstsnapshot, dest, debug = debug)
-    for src_a, src_b in src_snapshotpairs:
-        push_snapshot_incremental(host, src, src_a, src_b, dest, debug = debug)
-    print('... PUSHING NEW DONE.')
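A note on the new `Property._convert` classmethod added in `src/abgleich/core/property.py` above: it normalizes the raw string fields that `zfs get -H -p` prints into `None`, `int`, `float` or `str`. The following is a behaviorally equivalent standalone sketch for illustration only; the function name `convert` and the absence of the `Property` class wrapper are simplifications, not part of the package's API.

```python
from typing import Union

# Mirrors the diff's PropertyTypes alias.
PropertyTypes = Union[str, int, float, None]


def convert(value: str) -> PropertyTypes:
    """Parse one raw `zfs get` field, as `Property._convert` does in the diff."""
    value = value.strip()
    # ZFS prints "-" (and sometimes "none") for unset or inapplicable properties.
    if value == "" or value == "-" or value.lower() == "none":
        return None
    # Plain unsigned digit runs become integers (sizes in bytes with `-p`).
    if value.isnumeric():
        return int(value)
    # Values such as a "1.85" compressratio become floats.
    try:
        return float(value)
    except ValueError:
        pass
    # Everything else (mountpoints, source "local"/"default", ...) stays a string.
    return value


print(convert("1024"), convert("1.85"), convert("-"), convert("local"))
```

Note that `str.isnumeric` rejects a leading minus sign, so a negative numeric property would fall through to the `float` branch; ZFS properties queried here are non-negative, so this does not matter in practice.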