
Conversation

@EugenCepoi
Contributor

Fixes https://issues.apache.org/jira/browse/SPARK-2058

The general idea is to use the contents of SPARK_CONF_DIR instead of SPARK_HOME/conf when it is defined.

  • When SPARK_CONF_DIR is defined, spark-defaults.conf is loaded from there.
  • SPARK_CONF_DIR is added to the classpath, taking precedence over the default conf from SPARK_HOME/conf.
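The lookup order above can be sketched in Scala (the function name and env-map parameter are illustrative, not Spark's actual API; the real scripts do this in shell):

```scala
// Resolve the effective conf directory: SPARK_CONF_DIR wins when set,
// otherwise fall back to SPARK_HOME/conf. The environment is passed in
// as a plain Map so the logic is easy to test.
def resolveConfDir(env: Map[String, String], sparkHome: String): String =
  env.getOrElse("SPARK_CONF_DIR", s"$sparkHome/conf")
```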

@AmplabJenkins

Can one of the admins verify this patch?

Contributor

Isn't this condition inverted?

@vanzin vanzin mentioned this pull request Jun 11, 2014
@chu11
Contributor

chu11 commented Jun 12, 2014

Modification to compute-classpath.sh similar to SPARK-1559, pull request #471

Contributor

Use Files.createTempDir(), and delete it when the test is done (using scalatest's BeforeAndAfter); there are several examples of this in other tests.

Contributor

Just a clarification: I see you're deleting the directory below, so BeforeAndAfter does not apply here. But still I'd prefer to use Files.createTempDir() instead of a hardcoded name. Just for paranoia.
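The suggested shape, sketched with the JDK's java.nio.file.Files.createTempDirectory (the review asks for Guava's Files.createTempDir(), which serves the same purpose here):

```scala
import java.io.File
import java.nio.file.Files

// A uniquely named temp directory instead of a hardcoded path, so
// concurrent test runs cannot collide; the test deletes it when done.
val tmpDir: File = Files.createTempDirectory("spark-conf-test").toFile
```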

@vanzin
Contributor

vanzin commented Jun 12, 2014

LGTM aside from the minor test issues.

Member

While we're here, this can be one call to Utils.deleteRecursively(tmpDir)
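For illustration, a stdlib-only sketch of what such a deleteRecursively helper does (Spark's real Utils.deleteRecursively also guards against symlinks and reports failures):

```scala
import java.io.File

// Depth-first delete: remove the children of a directory before the
// directory itself, since File.delete() refuses non-empty directories.
def deleteRecursively(f: File): Unit = {
  if (f.isDirectory) {
    Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
  }
  f.delete()
}
```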

@EugenCepoi
Contributor Author

Thx for the comments. Updated the PR based on them.

@vanzin
Contributor

vanzin commented Jun 13, 2014

LGTM. Thanks!

@tgravescs
Contributor

We should document SPARK_CONF_DIR in the configuration guide.

@EugenCepoi
Contributor Author

Updated the doc. BTW, the behaviour is actually not symmetric: for example, spark-env is only looked up in SPARK_CONF_DIR when it is defined, but for the configs loaded from the classpath I am enriching the classpath, so if a file is missing from SPARK_CONF_DIR it will still be picked up from SPARK_HOME/conf.

What do you think about that? Should we update the scripts so that SPARK_CONF_DIR replaces the default conf dir instead of being added alongside it?

@andrewor14
Contributor

add to whitelist

Contributor

No need to pass sys.env to SparkSubmitArguments. You can just get it from there directly since it's the same JVM
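Sketching the point (class and field names are toy stand-ins, not the real SparkSubmitArguments): giving the environment a default of sys.env keeps normal call sites clean while still letting tests inject a fake map:

```scala
// Toy stand-in: env defaults to the JVM's real environment, so normal
// callers write `new Args()`, while tests can pass a substitute map.
class Args(env: Map[String, String] = sys.env) {
  val confDir: Option[String] = env.get("SPARK_CONF_DIR")
}
```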

@andrewor14
Contributor

@EugenCepoi Thanks for fixing this. It seems this PR is only for branch-1.0, but it would be good to fix this for branch-1.1 and master as well. The code has changed quite a bit since then, so could you open a new PR against the master branch?

In response to your question, I think the asymmetry can be a source of confusion. If the user sets SPARK_CONF_DIR, then we should ignore SPARK_HOME/conf altogether, otherwise there could be random configuration files that take effect even if the user explicitly specified SPARK_CONF_DIR.
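The two behaviours in question can be sketched side by side (directory names are hypothetical; in the real scripts this is classpath construction in shell):

```scala
// "Enrich": SPARK_CONF_DIR is added in front, but SPARK_HOME/conf stays
// on the classpath, so files missing from SPARK_CONF_DIR still leak in.
def enrich(confDir: Option[String], sparkHome: String): Seq[String] =
  confDir.toSeq :+ s"$sparkHome/conf"

// "Replace" (the agreed behaviour): when set, SPARK_CONF_DIR fully
// shadows SPARK_HOME/conf.
def replace(confDir: Option[String], sparkHome: String): Seq[String] =
  Seq(confDir.getOrElse(s"$sparkHome/conf"))
```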

@andrewor14
Contributor

test this please

@andrewor14 andrewor14 mentioned this pull request Sep 3, 2014
@SparkQA

SparkQA commented Sep 3, 2014

QA tests have started for PR 997 at commit 186c975.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Sep 3, 2014

QA tests have finished for PR 997 at commit 186c975.

  • This patch passes unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@EugenCepoi
Contributor Author

@andrewor14 Agreed, it's too verbose. I will make those changes and reopen it against branch-1.1, but I won't have time to do it right now.

@pwendell
Contributor

Let's close this issue then, pending follow-up from @EugenCepoi.

@EugenCepoi
Contributor Author

@andrewor14 @pwendell I updated it and opened a new PR #2481 against master.

asfgit pushed a commit that referenced this pull request Oct 3, 2014
Update of PR #997.

With this PR, setting SPARK_CONF_DIR overrides SPARK_HOME/conf (not only spark-defaults.conf and spark-env).

Author: EugenCepoi <[email protected]>

Closes #2481 from EugenCepoi/SPARK-2058 and squashes the following commits:

0bb32c2 [EugenCepoi] use orElse orNull and fixing trailing percent in compute-classpath.cmd
77f35d7 [EugenCepoi] SPARK-2058: Overriding SPARK_HOME/conf with SPARK_CONF_DIR
asfgit pushed a commit that referenced this pull request Oct 3, 2014

Same commit as above, cherry picked from commit f0811f9.
Signed-off-by: Andrew Or <[email protected]>