streaming ingestion support for PUT operation #643
Open: sreekanth-db wants to merge 7 commits into databricks:main from sreekanth-db:streaming-ops
+376 −37
Commits (7):
b79ca86  streaming ingestion support for PUT operation (sreekanth-db)
4cd2cee  code formatter (sreekanth-db)
8ae220c  type error fix (sreekanth-db)
1563a18  addressing review comments (sreekanth-db)
dd375bb  code formatting (sreekanth-db)
68b0d3f  updated test and using enums (sreekanth-db)
63e5c85  linting fix (sreekanth-db)
@@ -0,0 +1,34 @@
#!/usr/bin/env python3
"""
Simple example of streaming PUT operations.

This demonstrates the basic usage of streaming PUT with the __input_stream__ token.
"""

import io
import os
from databricks import sql

with sql.connect(
    server_hostname=os.getenv("DATABRICKS_SERVER_HOSTNAME"),
    http_path=os.getenv("DATABRICKS_HTTP_PATH"),
    access_token=os.getenv("DATABRICKS_TOKEN"),
) as connection:

    with connection.cursor() as cursor:
        # Create a simple data stream
        data = b"Hello, streaming world!"
        stream = io.BytesIO(data)

        # Get catalog, schema, and volume from environment variables
        catalog = os.getenv("DATABRICKS_CATALOG")
        schema = os.getenv("DATABRICKS_SCHEMA")
        volume = os.getenv("DATABRICKS_VOLUME")

        # Upload to Unity Catalog volume
        cursor.execute(
            f"PUT '__input_stream__' INTO '/Volumes/{catalog}/{schema}/{volume}/hello.txt' OVERWRITE",
            input_stream=stream
        )

        print("File uploaded successfully!")
@@ -1,5 +1,5 @@
 import time
-from typing import Dict, Tuple, List, Optional, Any, Union, Sequence
+from typing import Dict, Tuple, List, Optional, Any, Union, Sequence, BinaryIO
 import pandas

 try:
@@ -67,6 +67,7 @@
 )
 from databricks.sql.telemetry.latency_logger import log_latency
 from databricks.sql.telemetry.models.enums import StatementType
+from databricks.sql.common.http import DatabricksHttpClient, HttpMethod, UploadType

 logger = logging.getLogger(__name__)

@@ -615,8 +616,34 @@ def _check_not_closed(self):
                 session_id_hex=self.connection.get_session_id_hex(),
             )

+    def _validate_staging_http_response(
+        self, response: requests.Response, operation_name: str = "staging operation"
+    ) -> None:
+
+        # Check response codes
+        OK = requests.codes.ok  # 200
+        CREATED = requests.codes.created  # 201
+        ACCEPTED = requests.codes.accepted  # 202
+        NO_CONTENT = requests.codes.no_content  # 204
+
+        if response.status_code not in [OK, CREATED, NO_CONTENT, ACCEPTED]:
+            raise OperationalError(
+                f"{operation_name} over HTTP was unsuccessful: {response.status_code}-{response.text}",
+                session_id_hex=self.connection.get_session_id_hex(),
+            )
+
+        if response.status_code == ACCEPTED:
+            logger.debug(
+                "Response code %s from server indicates %s was accepted "
+                "but not yet applied on the server. It's possible this command may fail later.",
+                ACCEPTED,
+                operation_name,
+            )
+
     def _handle_staging_operation(
-        self, staging_allowed_local_path: Union[None, str, List[str]]
+        self,
+        staging_allowed_local_path: Union[None, str, List[str]],
+        input_stream: Optional[BinaryIO] = None,
     ):
         """Fetch the HTTP request instruction from a staging ingestion command
         and call the designated handler.
@@ -625,6 +652,28 @@ def _handle_staging_operation(
         is not descended from staging_allowed_local_path.
         """

+        assert self.active_result_set is not None
+        row = self.active_result_set.fetchone()
+        assert row is not None
+
+        # Parse headers
+        headers = (
+            json.loads(row.headers) if isinstance(row.headers, str) else row.headers
+        )
+        headers = dict(headers) if headers else {}
+
+        # Handle __input_stream__ token for PUT operations
+        if (
+            row.operation == "PUT"
+            and getattr(row, "localFile", None) == "__input_stream__"
+        ):
+            return self._handle_staging_put_stream(
+                presigned_url=row.presignedUrl,
+                stream=input_stream,
+                headers=headers,
+            )
+
+        # For non-streaming operations, validate staging_allowed_local_path
         if isinstance(staging_allowed_local_path, type(str())):
             _staging_allowed_local_paths = [staging_allowed_local_path]
         elif isinstance(staging_allowed_local_path, type(list())):
@@ -639,10 +688,6 @@ def _handle_staging_operation(
                 os.path.abspath(i) for i in _staging_allowed_local_paths
             ]

-        assert self.active_result_set is not None
-        row = self.active_result_set.fetchone()
-        assert row is not None
-
         # Must set to None in cases where server response does not include localFile
         abs_localFile = None

@@ -665,19 +710,16 @@ def _handle_staging_operation(
                 session_id_hex=self.connection.get_session_id_hex(),
             )

-        # May be real headers, or could be json string
-        headers = (
-            json.loads(row.headers) if isinstance(row.headers, str) else row.headers
-        )
-
         handler_args = {
             "presigned_url": row.presignedUrl,
             "local_file": abs_localFile,
-            "headers": dict(headers) or {},
+            "headers": headers,
         }

         logger.debug(
-            f"Attempting staging operation indicated by server: {row.operation} - {getattr(row, 'localFile', '')}"
+            "Attempting staging operation indicated by server: %s - %s",
+            row.operation,
+            getattr(row, "localFile", ""),
         )

         # TODO: Create a retry loop here to re-attempt if the request times out or fails
@@ -696,6 +738,45 @@ def _handle_staging_operation(
                 session_id_hex=self.connection.get_session_id_hex(),
             )

+    @log_latency(StatementType.SQL)
+    def _handle_staging_put_stream(
+        self,
+        presigned_url: str,
+        stream: BinaryIO,
+        headers: dict = {},
+    ) -> None:
+        """Handle PUT operation with streaming data.
+
+        Args:
+            presigned_url: The presigned URL for upload
+            stream: Binary stream to upload
+            headers: HTTP headers
+
+        Raises:
+            ProgrammingError: If no input stream is provided
+            OperationalError: If the upload fails
+        """
+
+        if not stream:
+            raise ProgrammingError(
+                "No input stream provided for streaming operation",
+                session_id_hex=self.connection.get_session_id_hex(),
+            )
+
+        http_client = DatabricksHttpClient.get_instance()
+
+        # Stream directly to presigned URL
+        with http_client.execute(
+            method=HttpMethod.PUT,
+            url=presigned_url,
+            data=stream,
+            headers=headers,
+            timeout=300,  # 5 minute timeout
+        ) as response:
+            self._validate_staging_http_response(
+                response, UploadType.STREAM_UPLOAD.value
+            )
+
     @log_latency(StatementType.SQL)
     def _handle_staging_put(
         self, presigned_url: str, local_file: str, headers: Optional[dict] = None

Review comment on lines +766 to +775: "@vikrantpuppala can you plz discuss, on the unified approach being introduced. Thanks" (conversation marked as resolved by sreekanth-db)
@@ -714,27 +795,7 @@ def _handle_staging_put(
         with open(local_file, "rb") as fh:
             r = requests.put(url=presigned_url, data=fh, headers=headers)

-        # fmt: off
-        # Design borrowed from: https://stackoverflow.com/a/2342589/5093960
-
-        OK = requests.codes.ok  # 200
-        CREATED = requests.codes.created  # 201
-        ACCEPTED = requests.codes.accepted  # 202
-        NO_CONTENT = requests.codes.no_content  # 204
-
-        # fmt: on
-
-        if r.status_code not in [OK, CREATED, NO_CONTENT, ACCEPTED]:
-            raise OperationalError(
-                f"Staging operation over HTTP was unsuccessful: {r.status_code}-{r.text}",
-                session_id_hex=self.connection.get_session_id_hex(),
-            )
-
-        if r.status_code == ACCEPTED:
-            logger.debug(
-                f"Response code {ACCEPTED} from server indicates ingestion command was accepted "
-                + "but not yet applied on the server. It's possible this command may fail later."
-            )
+        self._validate_staging_http_response(r, UploadType.FILE_UPLOAD.value)

     @log_latency(StatementType.SQL)
     def _handle_staging_get(
@@ -784,6 +845,7 @@ def execute(
         operation: str,
         parameters: Optional[TParameterCollection] = None,
         enforce_embedded_schema_correctness=False,
+        input_stream: Optional[BinaryIO] = None,
     ) -> "Cursor":
         """
         Execute a query and wait for execution to complete.
@@ -820,7 +882,6 @@ def execute(
         logger.debug(
             "Cursor.execute(operation=%s, parameters=%s)", operation, parameters
         )
-
         param_approach = self._determine_parameter_approach(parameters)
         if param_approach == ParameterApproach.NONE:
             prepared_params = NO_NATIVE_PARAMS
@@ -857,7 +918,8 @@ def execute(

         if self.active_result_set and self.active_result_set.is_staging_operation:
             self._handle_staging_operation(
-                staging_allowed_local_path=self.connection.staging_allowed_local_path
+                staging_allowed_local_path=self.connection.staging_allowed_local_path,
+                input_stream=input_stream,
             )

         return self
@@ -0,0 +1,66 @@
#!/usr/bin/env python3
"""
E2E tests for streaming PUT operations.
"""

import io
import logging
import pytest
from datetime import datetime

logger = logging.getLogger(__name__)


class PySQLStreamingPutTestSuiteMixin:
    """Test suite for streaming PUT operations."""

    def test_streaming_put_basic(self, catalog, schema):
        """Test basic streaming PUT functionality."""

        # Create test data
        test_data = b"Hello, streaming world! This is test data."
        filename = "streaming_put_test.txt"
        file_path = f"/Volumes/{catalog}/{schema}/e2etests/{filename}"

        try:
            with self.connection() as conn:
                with conn.cursor() as cursor:
                    self._cleanup_test_file(file_path)

                    with io.BytesIO(test_data) as stream:
                        cursor.execute(
                            f"PUT '__input_stream__' INTO '{file_path}'",
                            input_stream=stream
                        )

                    # Verify file exists
                    cursor.execute(f"LIST '/Volumes/{catalog}/{schema}/e2etests/'")
                    files = cursor.fetchall()

                    # Check if our file is in the list
                    file_paths = [row[0] for row in files]
                    assert file_path in file_paths, f"File {file_path} not found in {file_paths}"
        finally:
            self._cleanup_test_file(file_path)

    def test_streaming_put_missing_stream(self, catalog, schema):
        """Test that missing stream raises appropriate error."""

        with self.connection() as conn:
            with conn.cursor() as cursor:
                # Test without providing stream
                with pytest.raises(Exception):  # Should fail
                    cursor.execute(
                        f"PUT '__input_stream__' INTO '/Volumes/{catalog}/{schema}/e2etests/test.txt'"
                        # Note: No input_stream parameter
                    )

    def _cleanup_test_file(self, file_path):
        """Clean up a test file if it exists."""
        try:
            with self.connection(extra_params={"staging_allowed_local_path": "/"}) as conn:
                with conn.cursor() as cursor:
                    cursor.execute(f"REMOVE '{file_path}'")
                    logger.info("Successfully cleaned up test file: %s", file_path)
        except Exception as e:
            logger.error("Cleanup failed for %s: %s", file_path, e)