[MINOR][DOCS] Fix a typo for a configuration property of resources allocation #28958
Conversation
The user must configure the Workers to have a set of resources available so that it can assign them out to Executors. The <code>spark.worker.resource.{resourceName}.amount</code> is used to control the amount of each resource the worker has allocated. The user must also specify either <code>spark.worker.resourcesFile</code> or <code>spark.worker.resource.{resourceName}.discoveryScript</code> to specify how the Worker discovers the resources its assigned. See the descriptions above for each of those to see which method works best for your setup.

- The second part is running an application on Spark Standalone. The only special case from the standard Spark resource configs is when you are running the Driver in client mode. For a Driver in client mode, the user can specify the resources it uses via <code>spark.driver.resourcesfile</code> or <code>spark.driver.resource.{resourceName}.discoveryScript</code>. If the Driver is running on the same host as other Drivers, please make sure the resources file or discovery script only returns resources that do not conflict with other Drivers running on the same node.
+ The second part is running an application on Spark Standalone. The only special case from the standard Spark resource configs is when you are running the Driver in client mode. For a Driver in client mode, the user can specify the resources it uses via <code>spark.driver.resourcesFile</code> or <code>spark.driver.resource.{resourceName}.discoveryScript</code>. If the Driver is running on the same host as other Drivers, please make sure the resources file or discovery script only returns resources that do not conflict with other Drivers running on the same node.
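For illustration, a standalone setup using the corrected property names might look like the following sketch (the resource name <code>gpu</code>, the amount, and the file paths are hypothetical, not taken from this PR). The worker advertises how many of the resource it has, and both the worker and a client-mode driver point at a resources file:

```
spark.worker.resource.gpu.amount   2
spark.worker.resourcesFile         /opt/spark/conf/worker-resources.json
spark.driver.resourcesFile         /opt/spark/conf/driver-resources.json
```

The resources file is a JSON array of allocations naming the component, the resource, and its addresses, e.g.:

```json
[{"id": {"componentName": "spark.worker", "resourceName": "gpu"}, "addresses": ["0", "1"]}]
```

Note the camelCase <code>resourcesFile</code> in both property names; the lowercase <code>resourcesfile</code> this PR fixes would not be recognized, since Spark property names are case-sensitive.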
Hmm, this doesn't look like just a typo, does it? So, is this only about changing to camelCase, resourcesfile -> resourcesFile? Is Apache Spark case-sensitive?
I think it's not just about notation. Property names in Apache Spark should be case-sensitive.
Got it. Thanks.
dongjoon-hyun left a comment:
+1, LGTM. Merged to master/3.0.
…location

Closes #28958 from sarutak/fix-resource-typo.

Authored-by: Kousuke Saruta <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 5176707)
Signed-off-by: Dongjoon Hyun <[email protected]>
Test build #124644 has finished for PR 28958 at commit
What changes were proposed in this pull request?
This PR fixes a typo for a configuration property in <code>spark-standalone.md</code>: <code>spark.driver.resourcesfile</code> should be <code>spark.driver.resourcesFile</code>. I looked for similar typos, but this is the only one.
Why are the changes needed?
The property name is wrong.
Does this PR introduce any user-facing change?
Yes. The property name is corrected.
How was this patch tested?
I confirmed that the spelling of the property name matches the property name defined in <code>o.a.s.internal.config.package.scala</code>.