added fallback using reflection for backward-compatibility #1573

Merged · 4 commits · Mar 31, 2025

Changes from all commits:
2 changes: 1 addition & 1 deletion .github/workflows/spark_sql_test.yml
@@ -45,7 +45,7 @@ jobs:
matrix:
os: [ubuntu-24.04]
java-version: [11]
- spark-version: [{short: '3.4', full: '3.4.3'}, {short: '3.5', full: '3.5.5'}]
+ spark-version: [{short: '3.4', full: '3.4.3'}, {short: '3.5', full: '3.5.4'}, {short: '3.5', full: '3.5.5'}]
Contributor:

Since we advanced the minimum supported version to 3.5.5, I am not sure whether we should add the 3.5.4 tests back.
The SQL tests can be run here, but Comet's own tests are no longer run against 3.5.4.

If we decide to support 3.5.4, I would vote for doing it by adding a Maven profile.

Contributor Author:

Adding a Maven profile that uses Spark 3.5.4 will break compilation of the 3.5 shim,
so to support both 3.5.4 and 3.5.5 at compile time we would need either separate shims for each patch version,
or a single 3.5 shim that supports all patch versions through reflection.
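For illustration, here is a minimal sketch of what such a reflection fallback can look like, assuming a hypothetical Spark method `someMethod` that gained an extra `Boolean` parameter between 3.5.4 and 3.5.5; it is not the actual shim code in this PR:

```scala
object Spark35ReflectionShim {
  // Invoke the hypothetical `someMethod`, preferring the newer (3.5.5) signature
  // and falling back to the older (3.5.4) one when it is not present.
  def callSomeMethod(target: AnyRef, arg: String): AnyRef = {
    val clazz = target.getClass
    try {
      // Newer signature: someMethod(String, boolean)
      val m = clazz.getMethod("someMethod", classOf[String], java.lang.Boolean.TYPE)
      m.invoke(target, arg, java.lang.Boolean.TRUE)
    } catch {
      case _: NoSuchMethodException =>
        // Older signature: someMethod(String)
        val m = clazz.getMethod("someMethod", classOf[String])
        m.invoke(target, arg)
    }
  }
}
```

A single 3.5 shim built this way can run against both patch versions, at the cost of a reflective lookup on the affected call sites.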

Contributor:

I would say we should create separate shims for Spark 3.5.4 where necessary. I know that right now we only provide profiles at the 3.4, 3.5, and 4.0 level, but I have observed a couple of API changes between patch versions.

A Comet release for 3.5 does not currently mean that it works with every 3.5.x. The minimum version will be 3.5.5 in the next release, so either we drop 3.5.4 support or we create a new profile.
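For comparison, here is a rough sketch of the separate-shim alternative under discussion; the trait and class names are hypothetical, not Comet's actual shim layer, and in practice each implementation would sit in its own source tree selected at build time by the corresponding Maven profile:

```scala
// Common interface compiled once, independent of the Spark patch version.
trait SparkPatchShim {
  // Stand-in for a Spark API whose signature changed between 3.5.4 and 3.5.5.
  def callChangedApi(arg: String): String
}

// Would live in the source tree enabled by a hypothetical `spark-3.5.5` profile,
// compiled against Spark 3.5.5.
class Spark355Shim extends SparkPatchShim {
  override def callChangedApi(arg: String): String = s"3.5.5 code path for $arg"
}

// Would live in the source tree enabled by a hypothetical `spark-3.5.4` profile,
// compiled against Spark 3.5.4.
class Spark354Shim extends SparkPatchShim {
  override def callChangedApi(arg: String): String = s"3.5.4 code path for $arg"
}
```

The trade-off raised later in this thread is that each profile produces a different build, so 3.5.4 would effectively need its own release artifact.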

Contributor Author:

I went for the reflection approach after my post-merge discussion with @wForget in #1565,
since creating a separate shim for 3.5.4 would also mean a separate release artifact for 3.5.4, which might cause confusion for users.
Since I did the upgrade to 3.5.5 separately, dropping support for 3.5.4 simply means discarding this PR.

Member:

We don't need to add a 3.5.4 CI job, just make sure it compiles successfully. Or how about deleting it after verification in this PR?

Contributor Author:

But it can't be compiled successfully, for the same reason 3.5.5 couldn't be compiled when the 3.5 shims were built against 3.5.4...

module:
- {name: "catalyst", args1: "catalyst/test", args2: ""}
- {name: "sql/core-1", args1: "", args2: sql/testOnly * -- -l org.apache.spark.tags.ExtendedSQLTest -l org.apache.spark.tags.SlowSQLTest}