[SPARK-30416][SQL] Log a warning for deprecated SQL config in set() and unset()
#27092
Conversation
Test build #116099 has finished for PR 27092 at commit

@HyukjinKwon Please have a look at the PR.

@cloud-fan @maropu Please have a look at the PR.
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
Test build #116164 has finished for PR 27092 at commit
  test("log deprecation warnings") {
    val logAppender = new AppenderSkeleton {
nit: The same logAppender class seems to be defined in several places, so can we define a helper method for this test purpose somewhere (e.g., TestUtils)?
$ grep -nr "extends AppenderSkeleton" .
./catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ResolveHintsSuite.scala:36: class MockAppender extends AppenderSkeleton {
./catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CodeGenerationSuite.scala:525: class MockAppender extends AppenderSkeleton {
./catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/OptimizerLoggingSuite.scala:42: class MockAppender extends AppenderSkeleton {
./core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala:1766: class TestAppender extends AppenderSkeleton {
./core/src/test/scala/org/apache/spark/sql/JoinHintSuite.scala:41: class MockAppender extends AppenderSkeleton {
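The shared helper suggested here could be factored out roughly as follows. This is a sketch only: Spark's real helper would extend log4j's AppenderSkeleton, whereas this dependency-free analogue uses java.util.logging, and the names CapturingHandler and withLogCapture are illustrative, not Spark API.

```scala
// Sketch of a reusable log-capturing test helper. Spark's version would
// extend log4j's AppenderSkeleton; this analogue uses java.util.logging
// so it runs without extra dependencies.
import java.util.logging.{Handler, LogRecord, Logger}
import scala.collection.mutable.ArrayBuffer

class CapturingHandler extends Handler {
  val records = ArrayBuffer.empty[LogRecord]
  override def publish(record: LogRecord): Unit = records += record
  override def flush(): Unit = ()
  override def close(): Unit = ()
}

// Runs `body` while capturing every message logged through `logger`,
// then detaches the handler again.
def withLogCapture[T](logger: Logger)(body: => T): (T, Seq[String]) = {
  val handler = new CapturingHandler
  logger.addHandler(handler)
  try {
    val result = body
    (result, handler.records.map(_.getMessage).toSeq)
  } finally {
    logger.removeHandler(handler)
  }
}

val logger = Logger.getLogger("deprecation-test")
logger.setUseParentHandlers(false) // keep captured output off the console
val (_, messages) = withLogCapture(logger) {
  logger.warning("The SQL config 'x' has been deprecated")
}
assert(messages.exists(_.contains("deprecated")))
```

A test then only asserts on the returned messages instead of redefining an appender class per suite, which is the deduplication the comment asks for.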
It is slightly orthogonal to this PR, but if you think it makes sense, I will do that here.
Yeah, I think it's ok as a follow-up.
maropu
left a comment
This looks useful! LGTM. cc: @HyukjinKwon
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
   */
  val deprecatedSQLConfigs: Map[String, DeprecatedConfig] = {
    val configs = Seq(
      DeprecatedConfig(VARIABLE_SUBSTITUTE_DEPTH.key, "2.1",
I haven't found where this config is used. We can remove it, I think.
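The registry being discussed here can be illustrated with a minimal self-contained sketch. The config keys come from the diff, but the DeprecatedConfig body, map construction, and warning wording below are approximations, not Spark's exact code.

```scala
// Illustrative approximation of the deprecation registry added by this PR.
// Keys mirror the diff; the message format is a guess for demonstration.
case class DeprecatedConfig(key: String, version: String, comment: String)

val deprecatedSQLConfigs: Map[String, DeprecatedConfig] = Seq(
  DeprecatedConfig("spark.sql.variable.substitute.depth", "2.1",
    "The config is not used anymore."),
  DeprecatedConfig("spark.sql.hive.verifyPartitionPath", "3.0",
    "Use spark.files.ignoreMissingFiles instead.")
).map(cfg => cfg.key -> cfg).toMap

// The kind of message set()/unset() could log on seeing a deprecated key.
def deprecationWarning(key: String): Option[String] =
  deprecatedSQLConfigs.get(key).map { cfg =>
    s"The SQL config '${cfg.key}' has been deprecated in Spark v${cfg.version} " +
      s"and may be removed in the future. ${cfg.comment}"
  }

assert(deprecationWarning("spark.sql.hive.verifyPartitionPath").isDefined)
assert(deprecationWarning("spark.sql.shuffle.partitions").isEmpty)
```

Keying the map by config name makes the lookup in set()/unset() a single get, so non-deprecated keys pay almost nothing.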
    .doc("When true, check all the partition paths under the table's root directory " +
      "when reading data stored in HDFS. This configuration will be deprecated in the future " +
-     "releases and replaced by spark.files.ignoreMissingFiles.")
+     s"releases and replaced by ${SPARK_IGNORE_MISSING_FILES.key}.")
@cloud-fan Regarding your comment #19868 (comment): spark.sql.hive.verifyPartitionPath can be changed at runtime, but spark.files.ignoreMissingFiles cannot. Is it a fair replacement?
If you're concerned, we can slightly reword it to say that users can use spark.files.ignoreMissingFiles, instead of calling it a replacement. I don't believe this configuration is used commonly enough for this to matter, so it should be fine.
If users find this unreasonable, we can un-deprecate it later given the feedback.
Test build #116292 has finished for PR 27092 at commit

Test build #116299 has finished for PR 27092 at commit

@MaxGekk mind resolving conflicts?
…cated-sql-configs
# Conflicts:
#   sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
Test build #116349 has finished for PR 27092 at commit

jenkins, retest this, please

Test build #116371 has finished for PR 27092 at commit

@viirya, mind asking the R sysadmin about it again? It started to fail again:

Sure, just asked the CRAN admin for help. :)
@HyukjinKwon Although I didn't get a reply, it looks like it was fixed, since I saw a successful build from another PR.

retest this please

Test build #116416 has finished for PR 27092 at commit

Thanks, @viirya. Merged to master.
What changes were proposed in this pull request?
1. Put all deprecated SQL configs into the map SQLConf.deprecatedSQLConfigs, with extra info about when each config was deprecated and additional comments that explain why the config was deprecated and what a user can use instead of it. Here is the list of already deprecated configs:
2. Output a warning in set() and unset() about deprecated SQL configs.

Why are the changes needed?
This should improve UX with Spark SQL and notify users about already deprecated SQL configs.

Does this PR introduce any user-facing change?
Yes, before:

After:

How was this patch tested?
Added a new test which registers a new log appender and captures all logging to check that set() and unset() log a warning.
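The testing strategy described above can be sketched with a toy conf. ToyConf and its in-memory warnings buffer are hypothetical stand-ins: the real test attaches a log appender to the actual SQLConf, but the assertion shape is the same.

```scala
// Hypothetical stand-in for SQLConf: instead of capturing a log appender's
// output, deprecation warnings are collected in a buffer so the
// set()/unset() behavior can be asserted directly.
import scala.collection.mutable

class ToyConf(deprecated: Map[String, String]) {
  val warnings = mutable.ArrayBuffer.empty[String]
  private val settings = mutable.Map.empty[String, String]

  private def warnIfDeprecated(key: String): Unit =
    deprecated.get(key).foreach { comment =>
      warnings += s"The SQL config '$key' has been deprecated. $comment"
    }

  def set(key: String, value: String): Unit = {
    warnIfDeprecated(key)
    settings(key) = value
  }

  def unset(key: String): Unit = {
    warnIfDeprecated(key)
    settings.remove(key)
  }
}

val conf = new ToyConf(Map(
  "spark.sql.hive.verifyPartitionPath" ->
    "Use spark.files.ignoreMissingFiles instead."))
conf.set("spark.sql.hive.verifyPartitionPath", "true")   // warns
conf.unset("spark.sql.hive.verifyPartitionPath")          // warns
conf.set("spark.sql.shuffle.partitions", "10")            // not deprecated: silent
assert(conf.warnings.size == 2)
```

The point the test verifies is that both set() and unset() emit a warning for a deprecated key, while untouched keys stay silent.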