[SPARK-16459][SQL] Prevent dropping current database #14115
dongjoon-hyun wants to merge 6 commits into apache:master from dongjoon-hyun:SPARK-16459
Conversation
Test build #62022 has finished for PR 14115 at commit

I think it's scary to do this. It'd be better to forbid dropping the current database. This is Postgres' behavior:

Sure. No problem!

I'll fix this one first. This one is easier. :)
```diff
-    if (dbName == "default") {
-      throw new AnalysisException(s"Can not drop default database")
+    if (dbName == DEFAULT_DATABASE || dbName == getCurrentDatabase) {
+      throw new AnalysisException(s"Can not drop `${DEFAULT_DATABASE}` or current database")
```
Hm, I think we should break this into two messages rather than a single one.

Now it preserves the previous behavior for dropping the default database, and throws the new exception message only for the new cases.
```diff
-    if (dbName == "default") {
+    if (dbName == DEFAULT_DATABASE) {
       throw new AnalysisException(s"Can not drop default database")
+    } else if (dbName == getCurrentDatabase) {
```
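A sketch of how the split check could read in full. The method signature, the call into the external catalog, and the exact message for the current-database branch are not visible in the diff above, so they are assumptions here; helpers such as formatDatabaseName and getCurrentDatabase belong to the enclosing SessionCatalog.

```scala
// Sketch only: the two-branch check follows the diff above; the signature,
// the current-database message wording, and the externalCatalog call are assumed.
def dropDatabase(db: String, ignoreIfNotExists: Boolean, cascade: Boolean): Unit = {
  val dbName = formatDatabaseName(db)
  if (dbName == DEFAULT_DATABASE) {
    throw new AnalysisException(s"Can not drop default database")
  } else if (dbName == getCurrentDatabase) {
    throw new AnalysisException(s"Can not drop current database `$dbName`") // assumed wording
  }
  externalCatalog.dropDatabase(dbName, ignoreIfNotExists, cascade)
}
```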
Do we need to check case sensitivity?

Oops. formatDatabaseName returns the raw string in the case-sensitive case.
Hm. That is the same situation as with the "default" database, isn't it?
I'll add two test cases for both and fix this.

Oh, I was confused. For the case-sensitive case, we don't need to handle that.
Dropping with a differently-cased name will fail due to a lookup failure.
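A minimal sketch of the normalization being described, assuming the helper lowercases names only when analysis is case-insensitive; the conf.caseSensitiveAnalysis flag name is an assumption here.

```scala
// Sketch only: illustrates the behavior discussed above, not the exact source.
// Case-insensitive analysis: names are lowercased before comparison, so
// "Delete_DB" and "delete_db" refer to the same database and the
// current-database check still fires.
// Case-sensitive analysis: the raw string is kept, so dropping a
// differently-cased name already fails at the catalog lookup and needs
// no extra handling.
protected[this] def formatDatabaseName(name: String): String = {
  if (conf.caseSensitiveAnalysis) name else name.toLowerCase
}
```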
Test build #62032 has finished for PR 14115 at commit

Test build #62034 has finished for PR 14115 at commit

Test build #62037 has finished for PR 14115 at commit
```diff
     hadoopConf: Configuration) extends Logging {
   import CatalogTypes.TablePartitionSpec

+  val DEFAULT_DATABASE = "default"
```
Nit: should this be in an object SessionCatalog?

Oh, you're right. An object is the best place to keep this one. Thank you for the review, @srowen.

Anyway, I'll make that change first.
It's easy to revert one commit.
And, if possible, I prefer to have an object SessionCatalog for the future.
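A minimal sketch of the suggested move, assuming the constant keeps its current name and value; the class's constructor parameters and other members are elided.

```scala
// Sketch only: companion-object placement as suggested above.
object SessionCatalog {
  val DEFAULT_DATABASE = "default"
}

class SessionCatalog(/* existing constructor parameters elided */) {
  import SessionCatalog.DEFAULT_DATABASE
  // DEFAULT_DATABASE now resolves to the companion-object constant, and other
  // callers can reference SessionCatalog.DEFAULT_DATABASE without an instance.
}
```

Keeping the literal in one place also means the check in dropDatabase and the reset() helper discussed below can refer to the same constant.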
Thank you, @srowen. It's done!

Test build #62062 has finished for PR 14115 at commit
Hi, @rxin.

After merging this, I'll add the same prevention for

This looks alright. cc @hvanhovell can you review and merge?

Hi, @hvanhovell.
```diff
    */
   private[sql] def reset(): Unit = synchronized {
-    val default = "default"
+    val default = DEFAULT_DATABASE
```
NIT: Shouldn't we just replace the val default by DEFAULT_DATABASE?
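For illustration, a sketch of the nit applied, using the constant directly rather than aliasing it to a local val; the body shown is abridged and the specific cleanup calls are assumptions.

```scala
// Sketch only: DEFAULT_DATABASE used directly, as suggested above; the real
// reset() also clears tables and functions, which is abridged here.
private[sql] def reset(): Unit = synchronized {
  setCurrentDatabase(DEFAULT_DATABASE)
  listDatabases().filter(_ != DEFAULT_DATABASE).foreach { db =>
    dropDatabase(db, ignoreIfNotExists = false, cascade = true)
  }
  // ... drop the remaining tables and functions in DEFAULT_DATABASE ...
}
```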
Thank you for the review, @hvanhovell.

LGTM - pending Jenkins.

Test build #3177 has finished for PR 14115 at commit

Test build #62086 has finished for PR 14115 at commit
Thank you for the review and merging, @rxin and @hvanhovell.

@hvanhovell please leave a message on the PR next time when you merge one -- and also indicate the branch this was merged into (master in this case).

Should this be in 2.0?

If possible, I hope. :)
This PR prevents dropping the current database to avoid errors like the following.
```scala
scala> sql("create database delete_db")
scala> sql("use delete_db")
scala> sql("drop database delete_db")
scala> sql("create table t as select 1")
org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database `delete_db` not found;
```
Pass the Jenkins tests including an updated testcase.
Author: Dongjoon Hyun <dongjoon@apache.org>
Closes #14115 from dongjoon-hyun/SPARK-16459.
(cherry picked from commit 7ac79da)
Signed-off-by: Herman van Hovell <hvanhovell@databricks.com>
Cherry-picked this into 2.0.
What changes were proposed in this pull request?
This PR prevents dropping the current database to avoid errors like the following.
How was this patch tested?
Pass the Jenkins tests including an updated testcase.
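With this patch, the same sequence is expected to fail fast at the drop step instead of leaving the session pointed at a missing database. A sketch of the resulting interaction; the exact exception message wording is an assumption:

```scala
scala> sql("create database delete_db")
scala> sql("use delete_db")
scala> sql("drop database delete_db")
org.apache.spark.sql.AnalysisException: Can not drop current database `delete_db`;
```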