# Team30 Project LeCloud
Note: run the commands below from the directory containing the docker-compose.yml file.
- Start all services in detached mode: docker-compose up -d
- Stop the running containers: docker-compose stop
- Restart the stopped containers: docker-compose start
- Remove the stopped containers: docker-compose rm -f
- Scale the Spark workers: docker-compose scale spark-slave=n, where n is the desired number of worker containers.
a. Producer Code: make sure the data file project\kafka\data\aminer_papers_0.txt exists, then run:
python project\kafka\producer.py
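The sketch below is a rough, hypothetical illustration of how such a producer could publish the AMiner records with kafka-python; the broker address localhost:9092 and the topic name papers are assumptions, not taken from the project's producer.py.

```python
# Hypothetical producer sketch; NOT the project's producer.py.
# Assumes kafka-python is installed, a broker listens on localhost:9092,
# and the topic is named "papers"; adjust these to the real setup.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")

# Stream the AMiner dump line by line and publish each record to Kafka.
with open(r"project\kafka\data\aminer_papers_0.txt", encoding="utf-8") as data:
    for line in data:
        producer.send("papers", value=line.encode("utf-8"))

producer.flush()  # block until every buffered record has reached the broker
producer.close()
```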
b. Consumer Code: run the producer just before running the consumer, so that messages are published to the Kafka queue. Hypothetical sketches of both consumers follow the two options below.
- Simple Consumer Test: connect to the spark-master container (e.g. docker exec -it spark-master bash) and run
python /opt/spark/code/consumer.py
- Spark Streaming Consumer:
docker exec spark-master bin/spark-submit --verbose --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.3.1 --master spark://spark-master:7077 /opt/spark/code/consumerSpark.py
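For orientation, the two sketches below show what a minimal consumer for each option might look like. They are hypothetical illustrations rather than the project's consumer.py or consumerSpark.py; the broker address kafka:9092 and the topic name papers are assumptions.

```python
# Hypothetical simple consumer sketch; NOT the project's consumer.py.
# Assumes kafka-python inside the container, a broker reachable at kafka:9092,
# and the same assumed topic name "papers" as in the producer sketch.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "papers",
    bootstrap_servers="kafka:9092",
    auto_offset_reset="earliest",  # start from the oldest available messages
)

for message in consumer:
    print(message.value.decode("utf-8"))
```

```python
# Hypothetical Spark Streaming consumer sketch; NOT the project's consumerSpark.py.
# Uses the 0.8 Kafka integration pulled in by the --packages flag above;
# the broker address kafka:9092 and topic "papers" are assumptions.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="LeCloudStreamingConsumer")
ssc = StreamingContext(sc, batchDuration=10)  # 10-second micro-batches

stream = KafkaUtils.createDirectStream(
    ssc, ["papers"], {"metadata.broker.list": "kafka:9092"}
)

# Each record is a (key, value) pair; count records per batch as a smoke test.
stream.map(lambda kv: kv[1]).count().pprint()

ssc.start()
ssc.awaitTermination()
```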
c. Visualization:
Connect to the Neo4j browser at http://localhost:7474/browser with username neo4j and password password.
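Beyond the browser, a small hypothetical check like the one below can confirm the graph is reachable from Python with the official neo4j driver; it assumes the Bolt port 7687 is also exposed by the Neo4j container, which may require a mapping in docker-compose.yml.

```python
# Hypothetical Neo4j connectivity check; not part of the project code.
# Assumes the Bolt port 7687 is exposed and the credentials match the
# browser login above (neo4j / password).
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Count the stored nodes just to confirm the connection works.
    record = session.run("MATCH (n) RETURN count(n) AS nodes").single()
    print("Nodes in the graph:", record["nodes"])

driver.close()
```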