docs/install.md: 4 additions & 3 deletions

@@ -6,7 +6,7 @@ enterprise: 'no'
---

# About Installing Spark on Enterprise DC/OS

-In Enterprise DC/OS `strict` [security mode](https://docs.mesosphere.com/1.8/administration/installing/custom/configuration-parameters/#security), Spark requires a service account. In `permissive`, a service account is optional. Only someone with `superuser` permission can create the service account. Refer to [Provisioning Spark](https://docs.mesosphere.com/1.8/administration/id-and-access-mgt/service-auth/spark-auth/) for instructions.
+In Enterprise DC/OS `strict` [security mode](https://docs.mesosphere.com/1.9/administration/installing/custom/configuration-parameters/#security), Spark requires a service account. In `permissive`, a service account is optional. Only someone with `superuser` permission can create the service account. Refer to [Provisioning Spark](https://docs.mesosphere.com/1.9/administration/id-and-access-mgt/service-auth/spark-auth/) for instructions.

# Default Installation

@@ -17,10 +17,11 @@ server.

    $ dcos package install spark

-Go to the **Services** tab of the DC/OS web interface to monitor the deployment. Once it is
+Go to the **Services** > **Deployments** tab of the DC/OS web interface to monitor the deployment. Once it is
complete, visit Spark at `http://<dcos-url>/service/spark/`.

-You can also [install Spark via the DC/OS web interface](https://docs.mesosphere.com/1.8/usage/webinterface/#universe).
+You can also [install Spark via the DC/OS web interface](https://docs.mesosphere.com/1.9/usage/webinterface/#universe).

**Note:** If you install Spark via the web interface, run the
following command from the DC/OS CLI to install the Spark CLI:
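The install flow above can be sketched as a short shell session. Everything here is illustrative: it assumes a configured `dcos` CLI, and the `--cli` flag for installing only a package's CLI subcommand is an assumption, not shown on this page (the diff truncates before the actual command).

```shell
#!/bin/sh
# Hypothetical sketch of the default Spark install flow; guarded so it
# is inert on machines without the `dcos` CLI.
if command -v dcos >/dev/null 2>&1; then
  # Install the Spark package from Universe with default options.
  dcos package install spark
  # If Spark was installed via the web interface instead, add the CLI
  # subcommand separately (--cli is an assumed `dcos package` flag).
  dcos package install --cli spark
fi

# After the deployment completes, the dispatcher UI is served behind
# the cluster's admin router:
DCOS_URL="https://dcos.example.com"   # hypothetical cluster URL
echo "${DCOS_URL}/service/spark/"
```

The guard keeps the sketch readable as documentation while still being runnable verbatim against a real cluster.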
docs/limitations.md: 3 additions & 14 deletions

@@ -5,19 +5,8 @@ feature_maturity: stable
enterprise: 'no'
---

-* DC/OS Spark only supports submitting jars and Python scripts. It
-  does not support R.
+* Mesosphere does not provide support for Spark app development, such as writing a Python app to process data from Kafka or writing Scala code to process data from HDFS.

-* Mesosphere does not provide support for Spark app development,
-  such as writing a Python app to process data from Kafka or writing
-  Scala code to process data from HDFS.
+* Spark jobs run in Docker containers. The first time you run a Spark job on a node, it might take longer than you expect because of the `docker pull`.

-* Spark jobs run in Docker containers. The first time you run a
-  Spark job on a node, it might take longer than you expect because of
-  the `docker pull`.
-
-* DC/OS Spark only supports running the Spark shell from within a
-  DC/OS cluster. See the Spark Shell section for more information.
-  For interactive analytics, we
-  recommend Zeppelin, which supports visualizations and dynamic
-  dependency management.
+* DC/OS Spark only supports running the Spark shell from within a DC/OS cluster. See the Spark Shell section for more information. For interactive analytics, we recommend Zeppelin, which supports visualizations and dynamic dependency management.
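The `docker pull` latency noted above can be hidden by pre-pulling the Spark image on each agent ahead of the first job. A minimal, opt-in sketch; the image tag is illustrative, so match it to your package's `docker.image` setting:

```shell
#!/bin/sh
# Pre-pull the Spark image so the first job on this agent does not pay
# the `docker pull` cost. Opt-in guard keeps the sketch inert unless
# explicitly enabled.
SPARK_IMAGE="mesosphere/spark:latest"   # illustrative tag; use your package's docker.image
if [ "${PREPULL:-no}" = "yes" ] && command -v docker >/dev/null 2>&1; then
  docker pull "$SPARK_IMAGE"
fi
echo "would pre-pull $SPARK_IMAGE"
```

Running this once per agent (e.g. via your provisioning tooling) moves the pull out of the critical path of the first Spark job.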
docs/security.md: 5 additions & 8 deletions

@@ -3,7 +3,6 @@ post_title: Security
menu_order: 40
enterprise: 'no'
---
-
# Mesos Security

## SSL

@@ -23,13 +22,11 @@ enterprise: 'no'
## Authentication

-When running in [DC/OS strict security mode](https://docs.mesosphere.com/latest/administration/id-and-access-mgt/), both the dispatcher and jobs must authenticate to Mesos using a [DC/OS Service Account](https://docs.mesosphere.com/1.8/administration/id-and-access-mgt/service-auth/).
+When running in [DC/OS strict security mode](https://docs.mesosphere.com/latest/administration/id-and-access-mgt/), both the dispatcher and jobs must authenticate to Mesos using a [DC/OS Service Account](https://docs.mesosphere.com/1.9/administration/id-and-access-mgt/service-auth/).

Follow these instructions to authenticate in strict mode:

1. Create a service account by following the instructions [here](https://docs.mesosphere.com/1.9/administration/id-and-access-mgt/service-auth/universe-service-auth/).

1. Assign Permissions

@@ -47,7 +44,7 @@ Follow these instructions to authenticate in strict mode:
       "$(dcos config show core.dcos_url)/acs/api/v1/acls/dcos:mesos:master:task:user:root/users/${SERVICE_ACCOUNT_NAME}/create"
   ```

-  Now you must allow Spark to register under the desired role. This is the value used for `service.role` when installing Spark (default: `*`):
+  Now, you must allow Spark to register under the desired role. This is the value used for `service.role` when installing Spark (default: `*`):

   ```
   $ export ROLE=<service.role value>

@@ -88,7 +85,7 @@ Follow these instructions to authenticate in strict mode:
1. Submit a Job

-  We've now installed the Spark Dispatcher, which is authenticating itself to the Mesos master. Spark jobs are also frameworks which must authenticate. The dispatcher will pass the secret along to the jobs, so all that's left to do is configure our jobs to use DC/OS authentication:
+  We've now installed the Spark Dispatcher, which is authenticating itself to the Mesos master. Spark jobs are also frameworks that must authenticate. The dispatcher will pass the secret along to the jobs, so all that's left to do is configure our jobs to use DC/OS authentication:
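For the role step, the ACL path takes the same shape as the `task:user:root` grant shown in the curl call above. A sketch that only builds the path, with assumed names; `dcos:mesos:master:framework:role:<role>` is the conventional DC/OS permission for framework registration, so verify it against your cluster's documentation:

```shell
#!/bin/sh
# Build the ACL path for allowing a service account to register a
# framework under a Mesos role. All names here are hypothetical.
SERVICE_ACCOUNT_NAME="spark-principal"   # your service account
ROLE="spark-role"                        # the service.role used at install time
ACL="dcos:mesos:master:framework:role:${ROLE}"
# A real grant would PUT this path via the ACS API, in the same way as
# the task:user:root curl call earlier in this section:
echo "/acs/api/v1/acls/${ACL}/users/${SERVICE_ACCOUNT_NAME}/create"
```

Note that a role containing special characters (such as the default `*`) may need URL-encoding in the ACL path; check the DC/OS IAM API docs for your cluster version.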
docs/spark-shell.md: 10 additions & 1 deletion

@@ -7,7 +7,7 @@ enterprise: 'no'
# Interactive Spark Shell

You can run Spark commands interactively in the Spark shell. The Spark shell is available
-in either Scala or Python.
+in either Scala, Python, or R.

1. SSH into a node in the DC/OS cluster. [Learn how to SSH into your cluster and get the agent node ID](https://dcos.io/docs/latest/administration/access-node/sshcluster/).
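Once you have an SSH session on a node, each language gets its own stock Spark launcher. A hypothetical session; the image tag and master URL are assumptions, not taken from this page:

```shell
#!/bin/sh
# From an SSH session on a DC/OS node, a shell can be started inside
# the Spark image (tag illustrative):
#   docker run -it mesosphere/spark:latest bash
# and, inside the container, launched per language:
#   ./bin/spark-shell --master "$MASTER"   # Scala
#   ./bin/pyspark     --master "$MASTER"   # Python
#   ./bin/sparkR      --master "$MASTER"   # R
MASTER="mesos://leader.mesos:5050"   # assumed leading-master address inside a DC/OS cluster
echo "spark-shell --master $MASTER"
```

The three launchers share the same `--master` plumbing; only the language frontend differs.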