
[FEATURE] Ability to split Spark Master and Workers Nodes and run on different cluster nodes #76

Open
alexeyegorov opened this issue Jan 11, 2021 · 0 comments

alexeyegorov commented Jan 11, 2021

Ability to split Spark Master and Workers Nodes and run on different cluster nodes

Description

This repository is a really nice starting point for running a small Spark cluster on Docker, and it would be great to use the same setup as a basis for distributing the cluster across multiple machines. I tried to split the docker-compose setup into two Docker instances, Spark Master + Node1 on one machine and Node2 on another, but could not get Node2 to connect to the Master. Would you be interested in supporting this?
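For anyone hitting the same wall: the usual culprit is that inside a single docker-compose network the worker reaches the master by its service name, which is not routable from a second host. A minimal sketch of a two-host split, assuming the `bitnami/spark` image (the env var names below are specific to that image; adapt them to whatever image this repo builds) and that the hosts can reach each other on port 7077:

```shell
# On host A: run the master with host networking so port 7077 is reachable
# from other machines, and advertise an address host B can resolve.
# "hostA.example.com" is a placeholder for host A's real address.
docker run -d --name spark-master \
  --network host \
  -e SPARK_MODE=master \
  -e SPARK_MASTER_HOST=hostA.example.com \
  bitnami/spark:3

# On host B: point the worker at the master's advertised address,
# NOT at a compose service name like "spark-master".
docker run -d --name spark-worker \
  --network host \
  -e SPARK_MODE=worker \
  -e SPARK_MASTER_URL=spark://hostA.example.com:7077 \
  bitnami/spark:3
```

This is a deployment sketch, not a tested recipe; an overlay network (e.g. Docker Swarm) would be the other common way to make the compose service names resolvable across hosts.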

@alexeyegorov alexeyegorov changed the title [FEATURE] Splitting Spark Master and Workers [FEATURE] Ability to split Spark Master and Workers Nodes and run on different cluster nodes Jan 11, 2021