I created this little project to teach myself about swarm applications. There is a master app, which has full control and manages everything, and a slave app, which contains the worker units. The master decides which site should be crawled and one slave does the work. If that slave finds a link to another site, it informs the master and the new site gets crawled as well. The benefit of having multiple worker units is more processing power (more cores) and that a single network card no longer becomes the bottleneck.
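To make that flow concrete, here is a minimal single-process sketch of the master/slave exchange in Python. The names (`Master`, `slave_crawl`, `LinkExtractor`) and the in-process loop are illustrative assumptions and do not reflect the actual Swarm-Crawler code; in the real swarm each slave runs on its own machine and talks to the master over the network.

```python
# Hypothetical sketch of the master/slave crawl protocol described above.
# Names and structure are assumptions for illustration, not the real code.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects absolute href targets found in an HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


class Master:
    """Keeps the crawl frontier and hands out one URL at a time to a slave."""

    def __init__(self, seed_urls):
        self.frontier = deque(seed_urls)
        self.visited = set()

    def next_url(self):
        # Hand the next unvisited URL to whichever slave asks for work.
        while self.frontier:
            url = self.frontier.popleft()
            if url not in self.visited:
                self.visited.add(url)
                return url
        return None

    def report_links(self, links):
        # A slave reports the links it discovered; unseen ones join the frontier.
        for link in links:
            if link not in self.visited:
                self.frontier.append(link)


def slave_crawl(url):
    """One unit of slave work: download the page and return the links it contains."""
    with urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = LinkExtractor(url)
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    master = Master(["https://example.com"])
    # Single-process demo loop; the real project distributes slave_crawl
    # across multiple machines to get more cores and more network cards.
    for _ in range(3):
        url = master.next_url()
        if url is None:
            break
        print("crawling", url)
        master.report_links(slave_crawl(url))
```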
About
A simple web crawler which works with a swarm of computers to maximize efficiency