Scrapyd w/ docker is not persisting *.db file #251
@ivandir, did you configure a directory for the db files? By "jobs that haven't processed yet," you mean jobs waiting in the pending state, right? And by "restart the scrapyd instance," do you mean restarting the whole container?
Indeed, spider jobs don't persist after restarting the docker container. Here's my config file:
And an example of how I'm deploying it using docker-compose:
A file called …
Any updates on this ticket?
The "db files" store the spider queues. What exactly disappears? If the .db files themselves disappear, and your Docker volume mount point is correctly configured, then it's a bug.
This sounds like #359, not about Docker, so closing. |
Hi,
I thought the *.db file associated with my project was supposed to persist in my local Docker volume, but it seems Scrapyd overwrites the project's SQLite3 db file. I have jobs that haven't been processed yet, and when I restart the Scrapyd instance, those jobs disappear (i.e., no persistence) because a new db file is created.
Is this the intended behavior on docker deployments?
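For context, a minimal sketch of the kind of setup being described: Scrapyd's `dbs_dir` setting (and related `eggs_dir`/`logs_dir`) must point at a path that is actually covered by a Docker volume, otherwise the queue .db files are recreated inside the ephemeral container layer on every restart. The paths, service name, and image name below are illustrative assumptions, not taken from the reporter's actual config:

```yaml
# docker-compose.yml (sketch) -- the named volume covers the directory
# that scrapyd.conf points its state at, e.g.:
#
#   [scrapyd]
#   dbs_dir  = /var/lib/scrapyd/dbs    # spider queue .db files
#   eggs_dir = /var/lib/scrapyd/eggs
#   logs_dir = /var/lib/scrapyd/logs
#
services:
  scrapyd:
    image: my-scrapyd-image       # hypothetical image name
    ports:
      - "6800:6800"
    volumes:
      - scrapyd-data:/var/lib/scrapyd   # persists dbs/eggs/logs across restarts

volumes:
  scrapyd-data:
```

If `dbs_dir` in scrapyd.conf does not match the mounted path, the volume persists but Scrapyd writes its .db files elsewhere, which would produce exactly the symptom described above.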