
Conversation

@kevinyu98
Contributor

What changes were proposed in this pull request?

Hive supports these CLI commands for managing resources (see the Hive documentation):
ADD/DELETE (FILE(S) <filepath ...> | JAR(S) <jarpath ...>)
LIST (FILE(S) [filepath ...] | JAR(S) [jarpath ...])

but for now Spark only supports two commands:
ADD (FILE <filepath> | JAR <jarpath>)
LIST (FILE(S) [filepath ...] | JAR(S) [jarpath ...])

This PR adds the DELETE FILE command to Spark SQL; I will submit another PR for DELETE JAR(S).

DELETE FILE <filepath>

Example:

DELETE FILE

scala> spark.sql("add file /Users/qianyangyu/myfile.txt")
res0: org.apache.spark.sql.DataFrame = []

scala> spark.sql("add file /Users/qianyangyu/myfile2.txt")
res1: org.apache.spark.sql.DataFrame = []

scala> spark.sql("list file")
res2: org.apache.spark.sql.DataFrame = [Results: string]

scala> spark.sql("list file").show(false)
+----------------------------------+
|Results                           |
+----------------------------------+
|file:/Users/qianyangyu/myfile2.txt|
|file:/Users/qianyangyu/myfile.txt |
+----------------------------------+
scala> spark.sql("delete file /Users/qianyangyu/myfile.txt")
res4: org.apache.spark.sql.DataFrame = []

scala> spark.sql("list file").show(false)
+----------------------------------+
|Results                           |
+----------------------------------+
|file:/Users/qianyangyu/myfile2.txt|
+----------------------------------+


scala> spark.sql("delete file /Users/qianyangyu/myfile2.txt")
res6: org.apache.spark.sql.DataFrame = []

scala> spark.sql("list file").show(false)
+-------+
|Results|
+-------+
+-------+

How was this patch tested?

Added test cases to the Spark SQL, Spark shell, and SparkContext test suites.


@AmplabJenkins

Can one of the admins verify this patch?

* filesystems), or an HTTP, HTTPS or FTP URI. To access the file in Spark jobs,
* use `SparkFiles.get(fileName)` to find its download location.
*
*/
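
For context, a minimal usage sketch of the SparkFiles API referenced in the doc comment above; the file name follows the examples in this thread, and a running SparkContext with the file already added is assumed:

import org.apache.spark.SparkFiles

// After ADD FILE (or sc.addFile), code running in Spark jobs can look up
// the local download location of an added file by its file name:
val localPath: String = SparkFiles.get("myfile.txt")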
Contributor

this is fairly confusing -- i'd assume this is actually deleting the path given.

Contributor Author

Hi Reynold, thanks very much for reviewing the code.
Yes, it deletes the path from the addedFiles hashmap; the path is normalized into a key and stored in the map. addFile uses this logic to generate the key, so in order to find the same entry, DELETE FILE has to generate the key the same way.
For example, for a local file, addFile prepends a 'file' scheme to the path:

spark.sql("add file /Users/qianyangyu/myfile.txt")

scala> spark.sql("list file").show(false)
+----------------------------------+
|Results                           |
+----------------------------------+
|file:/Users/qianyangyu/myfile2.txt|
|file:/Users/qianyangyu/myfile.txt |
+----------------------------------+

But for a file at a remote location, it just keeps the path as given:

scala> spark.sql("add file hdfs://bdavm009.svl.ibm.com:8020/tmp/test.txt")
res17: org.apache.spark.sql.DataFrame = []

scala> spark.sql("list file").show(false)
+---------------------------------------------+
|Results                                      |
+---------------------------------------------+
|file:/Users/qianyangyu/myfile.txt            |
|hdfs://bdavm009.svl.ibm.com:8020/tmp/test.txt|
+---------------------------------------------+

If the command is issued from a worker node to add a local file, the path is registered in the NettyStreamManager's hashmap, and that environment's path is used as the key stored in addedFiles.
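
To illustrate the point, a minimal Scala sketch of that key normalization; the helper name is hypothetical and this is not Spark's actual implementation, it only mirrors the behavior shown in the two listings above:

import java.io.File
import java.net.URI

// Hypothetical helper (not Spark internals): a bare local path gains a
// "file:" scheme, while a URI that already has a scheme (hdfs://, http://,
// ...) is kept as-is, so DELETE FILE can rebuild the exact key that
// ADD FILE stored in the addedFiles map.
def toAddedFileKey(path: String): String =
  if (new URI(path).getScheme == null) new File(path).toURI.toString
  else path

toAddedFileKey("/Users/qianyangyu/myfile.txt")
// "file:/Users/qianyangyu/myfile.txt"
toAddedFileKey("hdfs://bdavm009.svl.ibm.com:8020/tmp/test.txt")
// unchanged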

Contributor Author

I have updated the deleteFile comments to make them clearer. Thanks for reviewing.

@vanzin
Contributor

vanzin commented Aug 4, 2016

@kevinyu98 Could you update the PR and fix merge conflicts? Thanks

@kevinyu98
Contributor Author

@vanzin Hello Marcelo, I am sorry I didn't notice your update. I have fixed the merge conflicts; could you help review it? Thanks.

@gatorsmile
Member

@kevinyu98 Can you please close it? It seems like there is not a lot of interest in adding this functionality natively in Spark. If anybody wants this feature, we can reopen it later.

@kevinyu98
Contributor Author

sure

@gatorsmile
Member

We are closing it due to inactivity. Please do reopen it if you want to push it forward. Thanks!

@asfgit asfgit closed this in b32bd00 Jun 27, 2017
zifeif2 pushed a commit to zifeif2/spark that referenced this pull request Nov 22, 2025
## What changes were proposed in this pull request?

This PR proposes to close stale PRs, mostly the same instances as in apache#18017.

I believe the author in apache#14807 removed his account.

Closes apache#7075
Closes apache#8927
Closes apache#9202
Closes apache#9366
Closes apache#10861
Closes apache#11420
Closes apache#12356
Closes apache#13028
Closes apache#13506
Closes apache#14191
Closes apache#14198
Closes apache#14330
Closes apache#14807
Closes apache#15839
Closes apache#16225
Closes apache#16685
Closes apache#16692
Closes apache#16995
Closes apache#17181
Closes apache#17211
Closes apache#17235
Closes apache#17237
Closes apache#17248
Closes apache#17341
Closes apache#17708
Closes apache#17716
Closes apache#17721
Closes apache#17937

Added:
Closes apache#14739
Closes apache#17139
Closes apache#17445
Closes apache#18042
Closes apache#18359

Added:
Closes apache#16450
Closes apache#16525
Closes apache#17738

Added:
Closes apache#16458
Closes apache#16508
Closes apache#17714

Added:
Closes apache#17830
Closes apache#14742

## How was this patch tested?

N/A

Author: hyukjinkwon <[email protected]>

Closes apache#18417 from HyukjinKwon/close-stale-pr.