# `object-store`-based Store implementation #1661

Status: Open. kylebarron wants to merge 52 commits into `zarr-developers:main` from `kylebarron:kyle/object-store`.
Diff: +430 −5

## Commits (52)
- `14be826` Initial object-store implementation (kylebarron)
- `a492bf0` Merge branch 'v3' into kyle/object-store (d-v-b)
- `50b6c47` Merge branch 'v3' into kyle/object-store (jhamman)
- `afa79af` Update src/zarr/v3/store/object_store.py (kylebarron)
- `c466f9f` Merge branch 'main' into kyle/object-store (kylebarron)
- `c3e7296` Merge branch 'main' into kyle/object-store (kylebarron)
- `f5c884b` update (kylebarron)
- `af2a39b` Handle list streams (kylebarron)
- `d7cfbee` Update get (kylebarron)
- `cb40015` wip refactor get_partial_values (kylebarron)
- `619df43` Merge branch 'main' into kyle/object-store (kylebarron)
- `b976450` Fixes to _get_partial_values (kylebarron)
- `cca70d7` Merge branch 'main' into kyle/object-store (kylebarron)
- `f2c827d` Fix constructing prototype from get (kylebarron)
- `5c8903f` lint (kylebarron)
- `50e1dec` Merge branch 'main' into kyle/object-store (kylebarron)
- `8bb252e` Add docstring (kylebarron)
- `559eafd` Make names private (kylebarron)
- `5486e69` Implement eq (kylebarron)
- `9a05c01` Add obstore as a test dep (maxrjones)
- `56b7a0b` Run store tests on ObjectStore (maxrjones)
- `d5d0d4d` Merge pull request #1 from maxrjones/object-store-tests (kylebarron)
- `b38ada1` import or skip (kylebarron)
- `ab00b46` Bump obstore beta version (kylebarron)
- `9c65e4d` bump pre-commit (kylebarron)
- `77d7c12` Add read_only param for __init__ (kylebarron)
- `4418426` Bump obstore (maxrjones)
- `7a71174` Add fixtures for object store tests (maxrjones)
- `e73bcc9` Cast return from __eq__ as bool (maxrjones)
- `c2cd6b8` Avoid recursion error on repr (maxrjones)
- `a95ec59` Check store type at runtime (maxrjones)
- `fb8b16d` Merge pull request #2 from maxrjones/update-tests (kylebarron)
- `ca261b1` Check if store is writable for setting or deleting objects (maxrjones)
- `0eb416a` Add test for object store repr (maxrjones)
- `247432f` Add attribute tests (maxrjones)
- `4b31b3c` Add get and set methods to test class (maxrjones)
- `d49d1ff` Raise an exeption for previously set key (maxrjones)
- `c2ebc8f` Update src/zarr/testing/store.py (maxrjones)
- `eb76698` Update _transform_list_dir to not remove all items (maxrjones)
- `f310260` Return bytes from GetResult (maxrjones)
- `86951b8` Don't raise an exception on set_if_not_exists (maxrjones)
- `f989884` Remove test that stores empty file (maxrjones)
- `40e1b25` Handle None as start or end of byte range request (maxrjones)
- `3aa3578` Merge pull request #3 from maxrjones/check_writable (kylebarron)
- `6da7976` Merge branch 'main' into object-store-update (maxrjones)
- `26fa37e` Use new ByteRequest syntax (maxrjones)
- `315e22e` Raise not implemented error on pickling (maxrjones)
- `264eac6` Merge pull request #4 from maxrjones/object-store-update (kylebarron)
- `fc93029` Bump obstore (maxrjones)
- `1b9f9f2` Catch allowed exceptions (maxrjones)
- `72c9b30` Merge pull request #5 from maxrjones/object-store (kylebarron)
- `0f8820d` Merge branch 'main' into kyle/object-store (kylebarron)
## Changes

The diff adds one new file of 361 lines:

```python
from __future__ import annotations

import asyncio
import contextlib
from collections import defaultdict
from collections.abc import Iterable
from typing import TYPE_CHECKING, Any, TypedDict

import obstore as obs

from zarr.abc.store import (
    ByteRequest,
    OffsetByteRequest,
    RangeByteRequest,
    Store,
    SuffixByteRequest,
)
from zarr.core.buffer import Buffer
from zarr.core.buffer.core import BufferPrototype

if TYPE_CHECKING:
    from collections.abc import AsyncGenerator, Coroutine, Iterable
    from typing import Any

    from obstore import ListStream, ObjectMeta, OffsetRange, SuffixRange
    from obstore.store import ObjectStore as _ObjectStore

    from zarr.core.buffer import Buffer, BufferPrototype
    from zarr.core.common import BytesLike

ALLOWED_EXCEPTIONS: tuple[type[Exception], ...] = (
    FileNotFoundError,
    IsADirectoryError,
    NotADirectoryError,
)


class ObjectStore(Store):
    """A Zarr store that uses obstore for fast read/write from AWS, GCP, and Azure.

    Parameters
    ----------
    store : obstore.store.ObjectStore
        An obstore store instance that is set up with the proper credentials.
    """

    store: _ObjectStore
    """The underlying obstore instance."""

    def __eq__(self, value: object) -> bool:
        if not isinstance(value, ObjectStore):
            return False

        return bool(self.store.__eq__(value.store))

    def __init__(self, store: _ObjectStore, *, read_only: bool = False) -> None:
        if not isinstance(
            store,
            (
                obs.store.AzureStore,
                obs.store.GCSStore,
                obs.store.HTTPStore,
                obs.store.S3Store,
                obs.store.LocalStore,
                obs.store.MemoryStore,
            ),
        ):
            raise TypeError(f"expected ObjectStore class, got {store!r}")
        self.store = store
        super().__init__(read_only=read_only)

    def __str__(self) -> str:
        return f"object://{self.store}"

    def __repr__(self) -> str:
        return f"ObjectStore({self})"

    def __getstate__(self) -> None:
        raise NotImplementedError("Pickling has not been implemented for ObjectStore")

    def __setstate__(self) -> None:
        raise NotImplementedError("Pickling has not been implemented for ObjectStore")

    async def get(
        self, key: str, prototype: BufferPrototype, byte_range: ByteRequest | None = None
    ) -> Buffer | None:
        try:
            if byte_range is None:
                resp = await obs.get_async(self.store, key)
                return prototype.buffer.from_bytes(await resp.bytes_async())
            elif isinstance(byte_range, RangeByteRequest):
                resp = await obs.get_range_async(
                    self.store, key, start=byte_range.start, end=byte_range.end
                )
                return prototype.buffer.from_bytes(memoryview(resp))
            elif isinstance(byte_range, OffsetByteRequest):
                resp = await obs.get_async(
                    self.store, key, options={"range": {"offset": byte_range.offset}}
                )
                return prototype.buffer.from_bytes(await resp.bytes_async())
            elif isinstance(byte_range, SuffixByteRequest):
                resp = await obs.get_async(
                    self.store, key, options={"range": {"suffix": byte_range.suffix}}
                )
                return prototype.buffer.from_bytes(await resp.bytes_async())
            else:
                raise ValueError(f"Unexpected input to `get`: {byte_range}")
        except ALLOWED_EXCEPTIONS:
            return None

    async def get_partial_values(
        self,
        prototype: BufferPrototype,
        key_ranges: Iterable[tuple[str, ByteRequest | None]],
    ) -> list[Buffer | None]:
        return await _get_partial_values(self.store, prototype=prototype, key_ranges=key_ranges)

    async def exists(self, key: str) -> bool:
        try:
            await obs.head_async(self.store, key)
        except FileNotFoundError:
            return False
        else:
            return True

    @property
    def supports_writes(self) -> bool:
        return True

    async def set(self, key: str, value: Buffer) -> None:
        self._check_writable()
        buf = value.to_bytes()
        await obs.put_async(self.store, key, buf)

    async def set_if_not_exists(self, key: str, value: Buffer) -> None:
        self._check_writable()
        buf = value.to_bytes()
        with contextlib.suppress(obs.exceptions.AlreadyExistsError):
            await obs.put_async(self.store, key, buf, mode="create")

    @property
    def supports_deletes(self) -> bool:
        return True

    async def delete(self, key: str) -> None:
        self._check_writable()
        await obs.delete_async(self.store, key)

    @property
    def supports_partial_writes(self) -> bool:
        return False

    async def set_partial_values(
        self, key_start_values: Iterable[tuple[str, int, BytesLike]]
    ) -> None:
        raise NotImplementedError

    @property
    def supports_listing(self) -> bool:
        return True

    def list(self) -> AsyncGenerator[str, None]:
        objects: ListStream[list[ObjectMeta]] = obs.list(self.store)
        return _transform_list(objects)

    def list_prefix(self, prefix: str) -> AsyncGenerator[str, None]:
        objects: ListStream[list[ObjectMeta]] = obs.list(self.store, prefix=prefix)
        return _transform_list(objects)

    def list_dir(self, prefix: str) -> AsyncGenerator[str, None]:
        objects: ListStream[list[ObjectMeta]] = obs.list(self.store, prefix=prefix)
        return _transform_list_dir(objects, prefix)


async def _transform_list(
    list_stream: AsyncGenerator[list[ObjectMeta], None],
) -> AsyncGenerator[str, None]:
    async for batch in list_stream:
        for item in batch:
            yield item["path"]


async def _transform_list_dir(
    list_stream: AsyncGenerator[list[ObjectMeta], None], prefix: str
) -> AsyncGenerator[str, None]:
    # We assume that the underlying object-store implementation correctly handles the
    # prefix, so we don't double-check that the returned results actually start with the
    # given prefix.
    prefix_len = len(prefix) + 1  # If one is not added to the length, all items will contain "/"
    async for batch in list_stream:
        for item in batch:
            # Yield this item if "/" does not exist after the prefix
            item_path = item["path"][prefix_len:]
            if "/" not in item_path:
                yield item_path


class _BoundedRequest(TypedDict):
    """Range request with a known start and end byte.

    These requests can be multiplexed natively on the Rust side with
    `obstore.get_ranges_async`.
    """

    original_request_index: int
    """The positional index in the original key_ranges input"""

    start: int
    """Start byte offset."""

    end: int
    """End byte offset."""


class _OtherRequest(TypedDict):
    """Offset or suffix range requests.

    These requests cannot be concurrent on the Rust side, and each needs its own call
    to `obstore.get_async`, passing in the `range` parameter.
    """

    original_request_index: int
    """The positional index in the original key_ranges input"""

    path: str
    """The path to request from."""

    range: OffsetRange | SuffixRange
    """The range request type."""


class _Response(TypedDict):
    """A response buffer associated with the original index that it should be restored to."""

    original_request_index: int
    """The positional index in the original key_ranges input"""

    buffer: Buffer
    """The buffer returned from obstore's range request."""


async def _make_bounded_requests(
    store: obs.store.ObjectStore,
    path: str,
    requests: list[_BoundedRequest],
    prototype: BufferPrototype,
) -> list[_Response]:
    """Make all bounded requests for a specific file.

    `obstore.get_ranges_async` allows for making concurrent requests for multiple ranges
    within a single file, and will e.g. merge nearby requests. This uses only a single
    Python coroutine.
    """
    starts = [r["start"] for r in requests]
    ends = [r["end"] for r in requests]
    responses = await obs.get_ranges_async(store, path=path, starts=starts, ends=ends)

    buffer_responses: list[_Response] = []
    for request, response in zip(requests, responses, strict=True):
        buffer_responses.append(
            {
                "original_request_index": request["original_request_index"],
                "buffer": prototype.buffer.from_bytes(memoryview(response)),
            }
        )

    return buffer_responses


async def _make_other_request(
    store: obs.store.ObjectStore,
    request: _OtherRequest,
    prototype: BufferPrototype,
) -> list[_Response]:
    """Make suffix or offset requests.

    We return a `list[_Response]` for symmetry with `_make_bounded_requests` so that all
    futures can be gathered together.
    """
    if request["range"] is None:
        resp = await obs.get_async(store, request["path"])
    else:
        resp = await obs.get_async(store, request["path"], options={"range": request["range"]})
    buffer = await resp.bytes_async()
    return [
        {
            "original_request_index": request["original_request_index"],
            "buffer": prototype.buffer.from_bytes(buffer),
        }
    ]


async def _get_partial_values(
    store: obs.store.ObjectStore,
    prototype: BufferPrototype,
    key_ranges: Iterable[tuple[str, ByteRequest | None]],
) -> list[Buffer | None]:
    """Make multiple range requests.

    ObjectStore has a `get_ranges` method that will additionally merge nearby ranges,
    but it's _per_ file. So we need to split these key_ranges into **per-file** key
    ranges, and then reassemble the results in the original order.

    We separate into different requests:

    - One call to `obstore.get_ranges_async` **per target file**
    - One call to `obstore.get_async` for each other request.
    """
    key_ranges = list(key_ranges)
    per_file_bounded_requests: dict[str, list[_BoundedRequest]] = defaultdict(list)
    other_requests: list[_OtherRequest] = []

    for idx, (path, byte_range) in enumerate(key_ranges):
        if byte_range is None:
            other_requests.append(
                {
                    "original_request_index": idx,
                    "path": path,
                    "range": None,
                }
            )
        elif isinstance(byte_range, RangeByteRequest):
            per_file_bounded_requests[path].append(
                {"original_request_index": idx, "start": byte_range.start, "end": byte_range.end}
            )
        elif isinstance(byte_range, OffsetByteRequest):
            other_requests.append(
                {
                    "original_request_index": idx,
                    "path": path,
                    "range": {"offset": byte_range.offset},
                }
            )
        elif isinstance(byte_range, SuffixByteRequest):
            other_requests.append(
                {
                    "original_request_index": idx,
                    "path": path,
                    "range": {"suffix": byte_range.suffix},
                }
            )
        else:
            raise ValueError(f"Unsupported range input: {byte_range}")

    futs: list[Coroutine[Any, Any, list[_Response]]] = []
    for path, bounded_ranges in per_file_bounded_requests.items():
        futs.append(_make_bounded_requests(store, path, bounded_ranges, prototype))

    for request in other_requests:
        futs.append(_make_other_request(store, request, prototype))  # noqa: PERF401

    buffers: list[Buffer | None] = [None] * len(key_ranges)

    # TODO: this gathers a list of lists of _Response; not sure if there's a way to
    # unpack these lists inside of an `asyncio.gather`?
    for responses in await asyncio.gather(*futs):
        for resp in responses:
            buffers[resp["original_request_index"]] = resp["buffer"]

    return buffers
```
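The split-and-reassemble pattern in `_get_partial_values` can be illustrated without obstore. The sketch below is a simplified stand-in, not the PR's code: `fetch_ranges` plays the role of `obs.get_ranges_async` (one coroutine serving all bounded ranges of one file), and the in-memory `data` dict replaces real object storage.

```python
import asyncio
from collections import defaultdict


# Each request is (path, (start, end)); results must come back in request order,
# even though requests for the same file are grouped and fetched together.
key_ranges = [
    ("a.bin", (0, 4)),
    ("b.bin", (2, 6)),
    ("a.bin", (8, 12)),
]


async def fetch_ranges(path: str, requests: list[dict]) -> list[dict]:
    # Hypothetical stand-in for obs.get_ranges_async: serves every bounded
    # range of a single file in one call, tagging each slice with the index
    # of the request that asked for it.
    data = {"a.bin": b"abcdefghijklmnop", "b.bin": b"ABCDEFGH"}[path]
    return [
        {"original_request_index": r["index"], "buffer": data[r["start"] : r["end"]]}
        for r in requests
    ]


async def get_partial_values(key_ranges: list) -> list:
    # Step 1: group bounded requests per file, remembering original positions.
    per_file: dict[str, list[dict]] = defaultdict(list)
    for idx, (path, (start, end)) in enumerate(key_ranges):
        per_file[path].append({"index": idx, "start": start, "end": end})

    # Step 2: one coroutine per file, gathered concurrently.
    futs = [fetch_ranges(path, reqs) for path, reqs in per_file.items()]

    # Step 3: scatter responses back to their original slots.
    buffers: list = [None] * len(key_ranges)
    for responses in await asyncio.gather(*futs):
        for resp in responses:
            buffers[resp["original_request_index"]] = resp["buffer"]
    return buffers


print(asyncio.run(get_partial_values(key_ranges)))
# [b'abcd', b'CDEF', b'ijkl']
```

The two `a.bin` requests share one `fetch_ranges` call, yet the `b.bin` result still lands in position 1, mirroring how the PR preserves the caller's ordering via `original_request_index`.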
## Review comments

**jhamman:** If `store` is not picklable, you'll need to implement `__getstate__` and `__setstate__` below. The basic strategy you'll want to use is:

- In `__getstate__`, extract the config needed to recreate the `_ObjectStore`
- In `__setstate__`, use the config from above to recreate the `_ObjectStore` and set `self.store` directly

**Reply:** Thanks @jhamman. @kylebarron and I discussed xfailing the pickling tests in this PR and implementing `__getstate__` and `__setstate__` later on, since `ObjectStore` could be used for non-distributed reading/writing in the meantime. Would you be open to punting this a bit down the road? IIUC Kyle was supportive of adding pickling eventually and I'd be glad to help, but it doesn't seem crucial to us right now given that this store will be marked as experimental.

**Reply:** Sure, let's add the `__getstate__` methods with a `NotImplementedError` so we can provide a clear error message when things go wrong.
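The pickling strategy suggested above can be sketched with a toy class. Everything here is hypothetical (`SketchStore` and its `url` config are not part of obstore or zarr); it only demonstrates the "serialize the config, rebuild the handle" pattern:

```python
import pickle


class SketchStore:
    """Toy stand-in for a store wrapping a non-picklable native handle."""

    def __init__(self, url: str) -> None:
        self.url = url
        # Imagine this is a non-picklable native object built from `url`.
        self._handle = object()

    def __getstate__(self) -> dict:
        # Extract only the config needed to recreate the handle.
        return {"url": self.url}

    def __setstate__(self, state: dict) -> None:
        # Recreate the handle from the config and set attributes directly
        # (pickle does not call __init__ when restoring).
        self.url = state["url"]
        self._handle = object()


store = SketchStore("s3://bucket/prefix")
restored = pickle.loads(pickle.dumps(store))
print(restored.url)  # s3://bucket/prefix
```

The round trip works because `__getstate__` drops the unpicklable `_handle` and `__setstate__` rebuilds it; an analogous `ObjectStore` implementation would carry whatever config the underlying `_ObjectStore` needs, which is why the PR punts and raises `NotImplementedError` for now.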