Summary
Add a new dazzle_filekit.data submodule that exposes high-level file
operations (copy, move, sync, verify) which compose existing
filekit primitives with unctools and preservelib when available.
This is the primary user-facing entry point for the v0.3.0 "seamless
file operations" vision. See the epic for context.
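The "when available" composition implies an optional-import pattern for the unctools and preservelib peers (the subject of the optional-peer integration sub-issue). A common shape for this, shown as an assumption about the eventual design rather than a confirmed API:

```python
# Optional-peer detection sketch. The module names are real per the issue
# text, but how dazzle_filekit probes for them is an assumption here.
try:
    import unctools  # optional: UNC path -> mapped-drive translation
except ImportError:
    unctools = None

try:
    import preservelib  # optional: rich manifest writer
except ImportError:
    preservelib = None

def have_unctools() -> bool:
    """True when the optional UNC-translation peer is importable."""
    return unctools is not None

def have_preservelib() -> bool:
    """True when the optional manifest-writing peer is importable."""
    return preservelib is not None
```

Probing once at import time keeps every call site to a cheap `is None` check instead of repeated try/except blocks.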
API sketch
```python
from dazzle_filekit import data

# Single-file copy with rich preservation
data.copy(
    src: Union[str, Path],
    dst: Union[str, Path],
    *,
    preserve: Literal["none", "basic", "all"] = "basic",
    verify: Optional[str] = None,  # None | "md5" | "sha256" | ...
    manifest: bool = False,
    overwrite: bool = False,
    dry_run: bool = False,
) -> CopyResult

data.move(src, dst, *, preserve="basic", verify=None, manifest=False, ...)
data.sync(src_dir, dst_dir, *, mode="mirror", verify=None, ...)
data.verify(path, expected_hash=None, *, algorithm="sha256") -> VerifyResult
```
CopyResult / VerifyResult are dataclasses with .success, .src,
.dst, .bytes_copied, .hash, .metadata, .warnings fields.
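A minimal sketch of what CopyResult could look like as a dataclass. The field names come from the issue text above; the types and defaults are assumptions, and VerifyResult would follow the same shape:

```python
from dataclasses import dataclass, field
from pathlib import Path
from typing import Optional

@dataclass
class CopyResult:
    """Sketch of a copy result; field names per the issue, types assumed."""
    success: bool
    src: Path
    dst: Path
    bytes_copied: int = 0
    hash: Optional[str] = None  # hex digest when verify= was requested
    metadata: dict = field(default_factory=dict)  # preserved-attribute details
    warnings: list = field(default_factory=list)  # e.g. ADS/xattr warnings

result = CopyResult(success=True, src=Path("a.txt"), dst=Path("b.txt"),
                    bytes_copied=1024, hash="deadbeef")
```

Plain dataclasses keep the results cheap to construct in tests and straightforward to serialize into a manifest.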
Behavior
- Input path handling: both src and dst go through the new
  parse_user_input_path() (sub-issue #N+4) so the caller can pass any
  pasted form
- UNC translation: if unctools is installed AND the src/dst is a
  UNC path AND a local drive mapping exists, optionally translate
  (gated by a prefer_local_over_unc=True default)
- Preservation levels:
  - "none" -- content only, no metadata
  - "basic" -- existing copy_file(preserve_attrs=True) behavior
  - "all" -- full rich metadata via the v0.2.4 metadata module
    (SDDL ACLs, ctime, xattrs, ADS warnings)
- Verification: if verify="sha256", compute the hash before and after
  the copy; raise on mismatch
- Manifest: if manifest=True, use preservelib's manifest writer
  (or fall back to filekit's atomic_write_json with a minimal
  manifest schema if preservelib isn't installed)
- Dry run: dry_run=True returns a CopyResult with .success=False
  and a description of what would have happened, without touching disk
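To make the composition concrete, here is a hedged sketch of how data.copy could wire these behaviors together. Everything here is illustrative: Path() stands in for parse_user_input_path(), shutil.copy2 stands in for the "basic" preservation level, the plain-dict return stands in for CopyResult, and the JSON write only gestures at the atomic_write_json fallback -- none of it is the confirmed implementation:

```python
import hashlib
import json
import os
import shutil
from pathlib import Path

def _sha256(path: Path) -> str:
    """Stream the file so large copies are not loaded into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def copy(src, dst, *, preserve="basic", verify=None, manifest=False,
         overwrite=False, dry_run=False):
    # Real code would route both inputs through parse_user_input_path().
    src, dst = Path(src), Path(dst)
    if dst.exists() and not overwrite:
        raise FileExistsError(dst)
    if dry_run:
        # Per the issue: report what would happen without touching disk.
        return {"success": False, "note": f"would copy {src} -> {dst}"}
    before = _sha256(src) if verify == "sha256" else None
    # Only the "basic" level is sketched; copy2 keeps timestamps and mode.
    shutil.copy2(src, dst)
    if before is not None and _sha256(dst) != before:
        raise OSError(f"hash mismatch copying {src} -> {dst}")
    if manifest:
        # Minimal fallback manifest, written via a temp-file-then-rename
        # step; preservelib's writer would replace this when installed.
        tmp = dst.parent / (dst.name + ".manifest.tmp")
        tmp.write_text(json.dumps({"src": str(src), "sha256": before}))
        os.replace(tmp, dst.parent / (dst.name + ".manifest.json"))
    return {"success": True, "bytes_copied": dst.stat().st_size, "hash": before}
```

Hashing the source before the copy and the destination after it is what lets a mid-copy corruption surface as a raised error rather than a silent success.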
Acceptance criteria
- from dazzle_filekit import data works
- data.copy, data.move, data.sync, and data.verify are callable
- data.copy(src, dst, preserve="all", verify="sha256", manifest=True)
  works end-to-end on both Windows and Linux
  (with clear error messages or documented fallbacks)
- Covered by scripts/run-cross-platform-tests.sh
Dependencies
- Blocked by: #N+4 (parse_user_input_path), #N+3 (optional-peer
integration pattern)
- Blocks: #N+7 (documentation), #N+2 (DataRef builds on these
function entry points)
Non-goals
- This issue does NOT define the DataRef / object-style API -- that's
  sub-issue #N+2
- This issue does NOT tackle cross-runtime operation continuity (WSL
  alias detection) -- that's sub-issue #N+5