core/src/main/scala/org/apache/spark/broadcast/HttpBroadcast.scala
@@ -213,7 +213,7 @@ private[spark] object HttpBroadcast extends Logging {
SparkEnv.get.blockManager.master.removeBroadcast(id, removeFromDriver, blocking)
if (removeFromDriver) {
val file = getFile(id)
- files.remove(file.toString)
+ files.remove(file.getCanonicalPath)
Member (Sean Owen)

Hmm, and also note that the file names are added using the value of getAbsolutePath(). If the absolute path were not canonicalized, this would not work.

Why not just have File objects in the set files? They implement the desired equals() semantics:
http://docs.oracle.com/javase/7/docs/api/java/io/File.html#equals(java.lang.Object)
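
Not from the PR itself: a minimal Scala sketch of the two designs under discussion, using a plain scala.collection.mutable.HashSet in place of Spark's TimeStampedHashSet and a made-up path.

import java.io.File
import scala.collection.mutable

object BroadcastFileSetSketch {
  def main(args: Array[String]): Unit = {
    val f = new File("/tmp/broadcast_0") // hypothetical broadcast file

    // Current design: the set holds path Strings, so add and remove must use
    // exactly the same String form (absolute vs. canonical matters).
    val asStrings = mutable.HashSet[String]()
    asStrings += f.getAbsolutePath
    println(asStrings.remove(f.getCanonicalPath)) // false whenever the two forms differ

    // Suggested design: store File objects; File.equals() compares the
    // abstract pathnames, so no String round-tripping is needed.
    val asFiles = mutable.HashSet[File]()
    asFiles += f
    println(asFiles.remove(f)) // true
  }
}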

Author (Niraj)

I have tried both of the following:

  1. deleteBroadcastFile(new File(file.toString).getCanonicalPath)
  2. deleteBroadcastFile(new File((file.toString).getCanonicalPath))

In both cases the IDE complains!

Niraj

deleteBroadcastFile(file)
}
}
@@ -229,7 +229,7 @@ private[spark] object HttpBroadcast extends Logging {
val (file, time) = (entry.getKey, entry.getValue)
if (time < cleanupTime) {
iterator.remove()
- deleteBroadcastFile(new File(file.toString))
+ deleteBroadcastFile(new File(file.getCanonicalPath))
Contributor (@techaddict)

Shouldn't this be new File(file.toString).getCanonicalPath?

Member (Sean Owen)

(Removed my earlier incorrect comment.) @techaddict is correct, since the value file is a String. Niraj, your two new versions either create a String argument where a File is needed or call File methods on a String. I think the correct invocation is simply deleteBroadcastFile(new File(file)); everything else would be superfluous. (And this too would simplify if the set of values contained Files, not Strings of paths.)
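
For illustration only (not part of the PR): a tiny sketch of why the two attempted calls fail to compile and why the plain wrap-in-a-File call works. deleteBroadcastFile here is a stand-in with the same shape as the method in HttpBroadcast (it takes a File), and the path is made up.

import java.io.File

object InvocationSketch {
  // Stand-in for HttpBroadcast.deleteBroadcastFile, which takes a File.
  private def deleteBroadcastFile(f: File): Unit = println(s"would delete $f")

  def main(args: Array[String]): Unit = {
    val file: String = "/tmp/broadcast_0" // hypothetical entry from the TimeStampedHashSet[String]

    // Attempt 1: new File(file.toString).getCanonicalPath is a String, but a
    // File is expected -- does not compile.
    // deleteBroadcastFile(new File(file.toString).getCanonicalPath)

    // Attempt 2: calls getCanonicalPath on a String, which has no such
    // method -- does not compile either.
    // deleteBroadcastFile(new File((file.toString).getCanonicalPath))

    // Suggested invocation: the entry is already a path String, so wrapping
    // it in a File is all that is needed.
    deleteBroadcastFile(new File(file))
  }
}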

Author (Niraj)

Sure Sean, I agree with you, since the hash set entries are Strings:

private val files = new TimeStampedHashSet[String]

and I am changing it to deleteBroadcastFile(new File(file)) as per your suggestion.

Niraj

}
}
}
@@ -27,13 +27,14 @@ import org.scalatest.{BeforeAndAfterEach, FunSuite}

import org.apache.spark.SparkConf


class DiskBlockManagerSuite extends FunSuite with BeforeAndAfterEach {
private val testConf = new SparkConf(false)
val rootDir0 = Files.createTempDir()
rootDir0.deleteOnExit()
val rootDir1 = Files.createTempDir()
rootDir1.deleteOnExit()
- val rootDirs = rootDir0.getName + "," + rootDir1.getName
+ val rootDirs = rootDir0.getCanonicalPath + "," + rootDir1.getCanonicalPath
println("Created root dirs: " + rootDirs)

// This suite focuses primarily on consolidation features,
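
Not part of the PR: a small sketch of why the test change above matters. File.getName returns only the last path segment of the temp directory, while getCanonicalPath resolves the full path, so a rootDirs string built from bare names would not point at the actual directories. Assumes Guava (com.google.common.io.Files) is on the classpath, as the suite itself uses it.

import com.google.common.io.Files

object RootDirsSketch {
  def main(args: Array[String]): Unit = {
    val rootDir0 = Files.createTempDir()
    rootDir0.deleteOnExit()

    // Bare directory name vs. the full, resolved path.
    println("getName:          " + rootDir0.getName)
    println("getCanonicalPath: " + rootDir0.getCanonicalPath)

    // Only full paths are usable as a comma-separated list of directories
    // for the test to hand to the block manager.
    val rootDirs = rootDir0.getCanonicalPath
    println("rootDirs: " + rootDirs)
  }
}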