Ability to split Spark Master and Workers Nodes and run on different cluster nodes
Description
This repository is a really nice starting point for a small Spark cluster on Docker, and it would be great to use the same setup as a basis for distributing the cluster across multiple machines. I tried splitting the Docker Compose setup into Spark Master + Node1 on one Docker host and Node2 on a second host, but could not get Node2 to connect to the Master. Would this be an interesting feature to support?
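One way such a split might look (a hypothetical sketch using the generic `bitnami/spark` image and made-up file names — this repository's actual image and environment variables may differ) is two Compose files, one per host. The master's cluster port 7077 must be published, and the worker must be pointed at an address of the master host that is routable from the second machine:

```yaml
# docker-compose.master.yml — run on the first host (hypothetical sketch)
services:
  spark-master:
    image: bitnami/spark:3        # assumption: any Spark image with a standalone master mode
    environment:
      - SPARK_MODE=master
    ports:
      - "7077:7077"               # cluster port that workers connect to
      - "8080:8080"               # master web UI
```

```yaml
# docker-compose.worker.yml — run on the second host (hypothetical sketch)
services:
  spark-worker:
    image: bitnami/spark:3
    environment:
      - SPARK_MODE=worker
      # Replace <master-host> with the first host's routable IP or hostname.
      # A Compose service name like "spark-master" only resolves on the same
      # Docker network, which is likely why Node2 cannot reach the Master
      # when the two containers run on different hosts.
      - SPARK_MASTER_URL=spark://<master-host>:7077
```

The likely root cause of the connection failure is that Compose's default bridge network is per-host, so cross-host containers cannot resolve each other's service names; the usual options are publishing the master's port and using a routable address as above, or running both hosts on a shared overlay network (e.g. Docker Swarm).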
alexeyegorov changed the title from "[FEATURE] Splitting Spark Master and Workers" to "[FEATURE] Ability to split Spark Master and Workers Nodes and run on different cluster nodes" on Jan 11, 2021.