
Conversation

@chu11 (Contributor) commented Jun 11, 2014

If SPARK_CONF_DIR environment variable is set, search it for spark-defaults.conf.

@AmplabJenkins

Can one of the admins verify this patch?

@vanzin (Contributor) commented Jun 11, 2014

Very similar to: #997

@mateiz (Contributor) commented Jul 29, 2014

When are you guys running into this, is it when you launch an app through an IDE or something like that?

@chu11 (Contributor, Author) commented Jul 29, 2014

No, just launching normally via command line.

The purpose of this patch is convenience. It is analogous to the HADOOP_CONF_DIR, HBASE_CONF_DIR, etc. environment variables that Hadoop, HBase, etc. already support.
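For illustration, the intended usage might look like this (the paths and config value below are hypothetical, chosen only to mirror the HADOOP_CONF_DIR workflow):

```shell
# Hypothetical example: stage config files in an alternate directory and
# point Spark at it, the same way HADOOP_CONF_DIR / HBASE_CONF_DIR are used.
mkdir -p /tmp/spark-conf
printf 'spark.master local[2]\n' > /tmp/spark-conf/spark-defaults.conf
export SPARK_CONF_DIR=/tmp/spark-conf
# With this patch, spark-submit would read spark-defaults.conf from
# $SPARK_CONF_DIR instead of the default location under SPARK_HOME.
```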

@mateiz (Contributor) commented Jul 30, 2014

Alright, but I'm actually kind of confused then: won't spark-submit find its own conf dir from SPARK_HOME? Is that somehow missing in binary builds of Spark, or something like that? The way we envision this being used is that people install Spark in a location and call spark-submit from that location, maybe adding it to their path. I guess you want one installation of Spark but multiple conf dirs?

@mateiz (Contributor) commented Jul 30, 2014

BTW for that second case, we could support it, but then we'd also need to make the start-cluster scripts support multiple conf dirs. If you'd like to do that, update the JIRA to say this, and we should patch all of them at once.

@chu11 (Contributor, Author) commented Jul 31, 2014

I wouldn't say "multiple" conf directories, but rather an alternate one from the default. In Hadoop I can put all my config files in a /tmp/foo directory, set HADOOP_CONF_DIR, and Hadoop will read all of its configuration files from there instead of from its default location.

spark-env.sh is already searched for in SPARK_CONF_DIR via load-spark-env.sh, so that isn't a problem. However, spark-defaults.conf is not searched for in SPARK_CONF_DIR. So I can't put all the config files in one directory.
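In shell terms, the precedence this patch asks for could be sketched like so (a sketch of the intended lookup order, not the actual spark-submit code; the SPARK_HOME path is hypothetical):

```shell
# Sketch of the intended precedence for locating spark-defaults.conf:
# use SPARK_CONF_DIR when it is set, otherwise fall back to $SPARK_HOME/conf.
SPARK_HOME=/opt/spark   # hypothetical install location
conf_dir="${SPARK_CONF_DIR:-$SPARK_HOME/conf}"
defaults_file="$conf_dir/spark-defaults.conf"
echo "$defaults_file"
```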

I hope that clarifies things?

@mateiz (Contributor) commented Aug 1, 2014

I see, that makes sense. I didn't realize we supported a different conf dir for spark-env.sh through that variable.

Jenkins, test this please

@mateiz (Contributor) commented Aug 1, 2014

test this please

pretty please?

@mateiz (Contributor) commented Aug 1, 2014

Jenkins, test this please

@SparkQA commented Aug 1, 2014

QA tests have started for PR 1059. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17697/consoleFull

@SparkQA commented Aug 1, 2014

QA results for PR 1059:
- This patch FAILED unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17697/consoleFull

@mateiz (Contributor) commented Aug 1, 2014

Jenkins, test this please

@SparkQA commented Aug 1, 2014

QA tests have started for PR 1059. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17704/consoleFull

@SparkQA commented Aug 1, 2014

QA results for PR 1059:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17704/consoleFull

@mateiz (Contributor) commented Aug 2, 2014

Merged this into 1.1. Thanks!

@asfgit asfgit closed this in 0da07da Aug 2, 2014
xiliu82 pushed a commit to xiliu82/spark that referenced this pull request Sep 4, 2014
If SPARK_CONF_DIR environment variable is set, search it for spark-defaults.conf.

Author: Albert Chu <[email protected]>

Closes apache#1059 from chu11/SPARK-2116 and squashes the following commits:

9f3ac94 [Albert Chu] SPARK-2116: If SPARK_CONF_DIR environment variable is set, search it for spark-defaults.conf.