Commit 6377ada

Authored by viirya (Liang-Chi Hsieh), committed by Andrew Or

[SPARK-3970] Remove duplicate removal of local dirs

The shutdown hook of `DiskBlockManager` already removes `localDirs`, so there is no need to also register them with `Utils.registerShutdownDeleteDir`. Registering them twice causes duplicate removal of these local dirs and the corresponding exceptions.

Author: Liang-Chi Hsieh <[email protected]>

Closes #2826 from viirya/fix_duplicate_localdir_remove and squashes the following commits:

051d4b5 [Liang-Chi Hsieh] Check that the dir exists and return an empty List as the default.
2b91a9c [Liang-Chi Hsieh] Remove duplicate removal of local dirs.

1 parent f4e8c28 commit 6377ada

File tree

2 files changed: +8 / -5 lines changed

core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala

Lines changed: 0 additions & 1 deletion

@@ -149,7 +149,6 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
   }

   private def addShutdownHook() {
-    localDirs.foreach(localDir => Utils.registerShutdownDeleteDir(localDir))
     Runtime.getRuntime.addShutdownHook(new Thread("delete Spark local dirs") {
       override def run(): Unit = Utils.logUncaughtExceptions {
         logDebug("Shutdown hook called")
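The deleted line registered each local dir with `Utils.registerShutdownDeleteDir` even though `addShutdownHook` already installs a JVM shutdown hook that deletes the same dirs. A minimal standalone sketch of the resulting failure mode (hypothetical demo code, not Spark source): when two cleanup paths try to delete the same directory, the second attempt operates on a path that is already gone and fails.

```scala
import java.io.{File, IOException}
import java.nio.file.Files

// Illustrative sketch of the bug fixed by this commit (not Spark code):
// deleting the same local dir from two shutdown paths makes the second fail.
object DuplicateDeleteDemo {
  // Simplified recursive delete, in the spirit of Utils.deleteRecursively.
  def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) f.listFiles().foreach(deleteRecursively)
    if (!f.delete()) throw new IOException("Failed to delete: " + f)
  }

  def main(args: Array[String]): Unit = {
    val dir = Files.createTempDirectory("spark-local").toFile
    deleteRecursively(dir)   // first cleanup path: succeeds
    try {
      deleteRecursively(dir) // second cleanup path on the same dir: throws
    } catch {
      case e: IOException => println("duplicate removal failed: " + e.getMessage)
    }
  }
}
```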

core/src/main/scala/org/apache/spark/util/Utils.scala

Lines changed: 8 additions & 4 deletions

@@ -739,11 +739,15 @@ private[spark] object Utils extends Logging {
   }

   private def listFilesSafely(file: File): Seq[File] = {
-    val files = file.listFiles()
-    if (files == null) {
-      throw new IOException("Failed to list files for dir: " + file)
+    if (file.exists()) {
+      val files = file.listFiles()
+      if (files == null) {
+        throw new IOException("Failed to list files for dir: " + file)
+      }
+      files
+    } else {
+      List()
     }
-    files
   }

   /**
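With this change, `listFilesSafely` tolerates a directory that was already removed (for example, by the `DiskBlockManager` shutdown hook) by returning an empty list, while still treating a `null` result from `listFiles()` on an existing path as an error. A self-contained sketch of the patched behavior (illustrative object name; the method body mirrors the diff above):

```scala
import java.io.{File, IOException}

// Standalone sketch of the patched listFilesSafely (names mirror the diff;
// this is an illustration, not the Spark source file itself).
object ListFilesSafelyDemo {
  def listFilesSafely(file: File): Seq[File] = {
    if (file.exists()) {
      val files = file.listFiles()
      if (files == null) {
        // listFiles() returns null on I/O error (or a race with deletion).
        throw new IOException("Failed to list files for dir: " + file)
      }
      files
    } else {
      // Directory already removed: nothing to list, and not an error.
      List()
    }
  }

  def main(args: Array[String]): Unit = {
    val gone = new File("/definitely/missing/dir")
    assert(listFilesSafely(gone).isEmpty)
  }
}
```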
