
Pass scrapy arguments to ScrapyProcessProtocol #272

Closed
wants to merge 1 commit

Conversation

dfrank-a

Pass scrapy arguments along to ScrapyProcessProtocol.

This allows users to consume this data in extensions of the minimal admin UI in whatever way they see fit.

pass -a arguments to ScrapyProcessProtocol
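For readers without the diff open, here is a minimal sketch of the kind of change being proposed, assuming the constructor shape from scrapyd's launcher.py; the actual patch may differ in its details.

```python
# Sketch only, not the actual diff: the command line built for the scrapy
# subprocess is also handed to the process protocol, so extensions can read it.
from datetime import datetime

from twisted.internet import defer, protocol


class ScrapyProcessProtocol(protocol.ProcessProtocol):
    def __init__(self, slot, project, spider, job, env, args):
        self.slot = slot
        self.project = project
        self.spider = spider
        self.job = job
        self.env = env
        self.args = args  # new: the command line built in Launcher._spawn_process
        self.start_time = datetime.now()
        self.deferred = defer.Deferred()


# In Launcher._spawn_process (simplified), the args list already exists and
# would simply be forwarded:
#     args = [sys.executable, '-m', self.runner, 'crawl'] + get_crawl_args(msg)
#     pp = ScrapyProcessProtocol(slot, project, spider, job, env, args)
```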
@codecov

codecov bot commented Jan 26, 2018

Codecov Report

Merging #272 into master will decrease coverage by 0.08%.
The diff coverage is 50%.


@@            Coverage Diff             @@
##           master     #272      +/-   ##
==========================================
- Coverage   68.37%   68.29%   -0.09%     
==========================================
  Files          16       16              
  Lines         819      820       +1     
  Branches       96       96              
==========================================
  Hits          560      560              
- Misses        230      231       +1     
  Partials       29       29
Impacted Files Coverage Δ
scrapyd/launcher.py 43.2% <50%> (-0.55%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1713747...c1ddd4e. Read the comment docs.

@Digenis
Member

Digenis commented Feb 6, 2018

Hi @dfrank-a,
welcome to scrapyd.

This is a feature we want to add eventually.
I just hadn't opened an issue for it
because of the worries I express in the following 2 comments on issue #123
(sorry for the mess, I didn't have time to write a proper issue):
#123 (comment)
#123 (comment)
To sum them up, one day we may need to write a new webservice
to replace schedule.json with one that nests spider arguments & settings
in separate objects, so that their names don't collide with special arguments
for the webservice or other scrapyd components.
Will merging this PR make it harder to replace the webservice later?
I really haven't thought much about it and don't have the time now.
What's your opinion on this?
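For illustration, a hypothetical payload shape for the nested webservice described above; no such API exists in scrapyd, this only shows why nesting avoids name collisions.

```python
# Hypothetical request body for a future schedule-like webservice: spider
# arguments and scrapy settings live in their own objects, so a spider
# argument named e.g. "project" can no longer collide with the webservice's
# own "project" parameter. Purely illustrative, not an existing scrapyd API.
payload = {
    "project": "myproject",                # webservice parameters stay at the top level
    "spider": "myspider",
    "jobid": "abc123",
    "settings": {"DOWNLOAD_DELAY": "2"},   # scrapy settings, isolated
    "args": {"category": "books"},         # spider (-a) arguments, isolated
}
```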

@Digenis
Member

Digenis commented Feb 6, 2018

Actually, I just took a look and I think it won't make it harder.
That only holds for us, though.
If users start writing custom components that use the process protocol's args
their components will break when we reorganize the object into something more nested.
Perhaps we should include it as an experimental change
with a warning about the structure of the args attribute.

@dfrank-a
Author

dfrank-a commented Feb 7, 2018

That is something I hadn't considered. How do you feel about declaring a public interface to the data with property methods? This should allow for flexibility in our implementation while maintaining consistency in the external interface.
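A minimal sketch of the property idea, using a hypothetical class and attribute name: the internal storage could later be reorganized without breaking callers that only read the property.

```python
class ProcessInfo:
    """Hypothetical stand-in for the protocol object, to illustrate the idea."""

    def __init__(self, args):
        self._args = list(args)  # internal representation, free to change later

    @property
    def args(self):
        """Public, read-only view of the process command line."""
        return tuple(self._args)
```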

@Digenis
Member

Digenis commented Feb 7, 2018

I see what you mean, and it makes sense,
but I had confused the spider arguments with the scrapy process's args.
Since it's the process's args you are adding to the process protocol,
the compatibility concerns don't apply here.
To avoid confusion in the future (e.g. if we decide to add the spider arguments)
perhaps args should be named cmdline
both in _spawn_process and ProcessProtocol.

Also, the commit message should fit in a single line up to 50 characters.

@jpmckinney
Contributor

Re-done in #474.

@jpmckinney jpmckinney closed this Mar 8, 2023
@jpmckinney jpmckinney added pr: replaced for unmerged PRs that were replaced by a PR or commit and removed type: enhancement labels Jul 17, 2024