[SPARK-51817][SPARK-49578][CONNECT] Re-introduce ansiConfig fields in messageParameters of CAST_INVALID_INPUT and CAST_OVERFLOW #50604
Closed
Conversation
HyukjinKwon approved these changes on Apr 16, 2025
cloud-fan approved these changes on Apr 23, 2025
there are still test failures in
@cloud-fan thanks. I've fixed these now. Waiting for CI to confirm.
Merged to master and branch-4.0.
HyukjinKwon pushed a commit that referenced this pull request on Apr 27, 2025

… messageParameters of CAST_INVALID_INPUT and CAST_OVERFLOW

Closes #50604 from nija-at/cast-invalid-input.
Authored-by: Niranjan Jayakar <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
(cherry picked from commit 528fe20)
Signed-off-by: Hyukjin Kwon <[email protected]>
LuciferYang added a commit that referenced this pull request on Apr 28, 2025

…at4.sql` and `postgreSQL/int8.sql` for Java 21

### What changes were proposed in this pull request?
Re-generate the golden files of `postgreSQL/float4.sql` and `postgreSQL/int8.sql` for Java 21, which #50604 forgot to regenerate.

### Why are the changes needed?
Restore the Java 21 daily tests:
- master: https://github.com/apache/spark/actions/runs/14699959163/job/41247643789
- branch-4.0: https://github.com/apache/spark/actions/runs/14700610338/job/41249373892

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Pass GitHub Actions
- Local testing with Java 21

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #50740 from LuciferYang/SPARK-51817-FOLLOWUP.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
LuciferYang added a commit that referenced this pull request on Apr 28, 2025

…at4.sql` and `postgreSQL/int8.sql` for Java 21

Closes #50740 from LuciferYang/SPARK-51817-FOLLOWUP.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
(cherry picked from commit 6c37ee4)
Signed-off-by: yangjie01 <[email protected]>
Kimahriman pushed a commit to Kimahriman/spark that referenced this pull request on May 13, 2025

… messageParameters of CAST_INVALID_INPUT and CAST_OVERFLOW

Closes apache#50604 from nija-at/cast-invalid-input.
Authored-by: Niranjan Jayakar <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
Kimahriman pushed a commit to Kimahriman/spark that referenced this pull request on May 13, 2025

…at4.sql` and `postgreSQL/int8.sql` for Java 21

Closes apache#50740 from LuciferYang/SPARK-51817-FOLLOWUP.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
What changes were proposed in this pull request?
In Spark Connect, we guarantee that older clients are compatible with newer
versions of the Spark Connect service.
A previous change - e28c33b - broke this compatibility by removing the
"ansiConfig" field in the message parameters for two error codes -
"CAST_OVERFLOW" and "CAST_INVALID_INPUT".
The Spark Connect client includes GrpcExceptionConverter.scala[1] to
convert error codes from the server into SQL-compliant error codes on the
client. The SQL-compliant error codes and their corresponding error
messages are defined in the error-conditions.json file. Older clients do not
include the change (e28c33b) to this file, so their message templates still
reference the `ansiConfig` parameter. Later versions of the Spark Connect
service no longer return this parameter, resulting in an internal error[2]
because the correct error condition cannot be formulated.
This change reverts the server-side change so that the server continues
producing the "ansiConfig" field, allowing older clients to correctly
reformulate the error class.
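The compatibility break can be sketched with a minimal Python analogue of the client-side template substitution. Spark's error message templates do use `<paramName>` placeholders, but the template text, the `format_error` helper, and the exception type below are simplified illustrations, not Spark's actual Scala implementation in ErrorClassesJSONReader:

```python
import re

class SparkInternalError(Exception):
    """Stands in for the client's internal error when a template cannot be filled."""
    pass

def format_error(template: str, message_parameters: dict) -> str:
    """Substitute <placeholder> tokens with server-provided parameters.

    Raises SparkInternalError when the template references a parameter
    the server did not send -- the failure mode this PR fixes.
    """
    def sub(match):
        name = match.group(1)
        if name not in message_parameters:
            raise SparkInternalError(f"Undefined error message parameter: {name}")
        return message_parameters[name]
    return re.sub(r"<(\w+)>", sub, template)

# An older client's bundled template still references <ansiConfig>.
old_client_template = (
    "The value <expression> of the type <sourceType> cannot be cast to "
    "<targetType>. To bypass, set <ansiConfig> to false.")

params = {
    "expression": "'abc'",
    "sourceType": '"STRING"',
    "targetType": '"INT"',
}

# A newer server that stopped sending ansiConfig -> internal error on the client.
try:
    format_error(old_client_template, params)
except SparkInternalError as e:
    print("client failure:", e)

# With the field re-introduced by this PR, formatting succeeds again.
params["ansiConfig"] = '"spark.sql.ansi.enabled"'
print(format_error(old_client_template, params))
```

This is why the fix must live on the server: the stale template ships inside older client binaries, so only re-sending the parameter keeps them working.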
[1]: spark/sql/connect/common/src/main/scala/org/apache/spark/sql/connect/client/GrpcExceptionConverter.scala, line 184 at commit 2ba1560
[2]: spark/common/utils/src/main/scala/org/apache/spark/ErrorClassesJSONReader.scala, line 58 at commit 2ba1560
Why are the changes needed?
Explained above.
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Manual tests.
Was this patch authored or co-authored using generative AI tooling?
No.