
added schedule and items commands #66

Merged (4 commits, Dec 4, 2015)

Conversation

stummjr (Contributor) commented Dec 3, 2015

The shub schedule and shub items commands were added. Here is how they work:

shub schedule

Schedules a spider for execution. Examples:

$ #scheduling 'my_spider' for project 26736
$ shub schedule my_spider -p 26736

$ #scheduling 'my_spider' for the project defined in scrapy.cfg
$ shub schedule my_spider

$ #passing arguments to the spider
$ shub schedule my_spider -p 26736 -a arg1=foo arg2=bar

shub items

Fetches the items scraped by a given job. Example:

$ #getting the items scraped by job 26736/2/71
$ shub items 26736/2/71

@@ -24,6 +24,8 @@ def cli():
"deploy_reqs": [],
"logout": [],
"version": [],
"items": ["hubstorage"],
Contributor

hubstorage should be a required dependency, shipped with the shub tool.

These commands should always work, unlike deploy, which requires scrapy to be present.
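If hubstorage did become a hard dependency, the per-command extras table from the diff above would no longer need to list it. A minimal sketch of what that table could look like (the name command_deps is illustrative, not the actual variable in the source):

```python
# Illustrative sketch: with hubstorage installed unconditionally,
# no command needs to declare it as an optional extra dependency.
command_deps = {
    "deploy_reqs": [],
    "logout": [],
    "version": [],
    "items": [],  # was ["hubstorage"]; now assumed to be a hard dependency
}
```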

@@ -0,0 +1,45 @@
from ConfigParser import NoSectionError, NoOptionError
Contributor

In Python 3 ConfigParser has been renamed to configparser.

Contributor

I'd recommend using configparser from six.

Contributor

When our dependencies add Python 3 support, we may do the same for shub, so I think we should keep new code py3-compliant.
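For illustration, the import in question can be made py3-compliant either via six.moves.configparser (as recommended above) or with a stdlib-only try/except fallback; a sketch of the latter:

```python
# Stdlib-only sketch of a py2/py3-compatible import; six.moves.configparser
# achieves the same via `from six.moves.configparser import ...`.
try:
    # Python 3 module name
    from configparser import NoSectionError, NoOptionError
except ImportError:
    # Python 2 module name
    from ConfigParser import NoSectionError, NoOptionError
```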

@click.command(help='Schedule a spider to run on Scrapy Cloud')
@click.pass_context
@click.argument("spider", type=click.STRING)
Contributor

Just a nitpick, but I think we should be consistent in the use of single or double quotes.

Contributor Author

👍

Contributor

wow, that's a real nitpick. :D
I stopped worrying about single quotes vs double quotes when I stopped doing Perl and PHP. :)
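For context, decorators like those in the hunk above compose into a complete click command along these lines. This is a hypothetical sketch: the function body and the -p option are illustrative, not shub's actual implementation.

```python
import click

@click.command(help='Schedule a spider to run on Scrapy Cloud')
@click.argument('spider', type=click.STRING)
@click.option('-p', '--project-id', default=None, help='Scrapy Cloud project ID')
def schedule(spider, project_id):
    # A real implementation would call the Scrapy Cloud API here;
    # this sketch only echoes what it would schedule.
    click.echo('Scheduling spider %s (project: %s)' % (spider, project_id))
```

One nicety of click is that such a command can be exercised in-process with click.testing.CliRunner, without spawning a subprocess.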

@eliasdorneles
Contributor

Nice work, @stummjr !
I added a bit to the nitpicking, but this looks good, well done! :)

josericardo pushed a commit that referenced this pull request Dec 4, 2015
added schedule and items commands
@josericardo josericardo merged commit 6dcccc4 into scrapinghub:master Dec 4, 2015
@josericardo
Contributor

Thanks @stummjr, well done indeed! ;D

@bertinatto
Contributor

+1

@stummjr
Contributor Author

stummjr commented Dec 4, 2015

Thank you guys for the help!

5 participants