Documentation Not Clear #78

Closed
adsteel opened this issue Apr 15, 2015 · 7 comments

Comments

adsteel commented Apr 15, 2015

I'm trying to understand what this gem does, but the documentation isn't very clear on a basic level. So we're making jobs unique based on worker and arguments. What does that mean? Is only one unique job allowed in the queue at a time? (Meaning that if I put two different jobs in the queue that each match the unique_args constraint, is the second job never executed?) Or are two unique jobs allowed in the queue, but the latter always executed after the former has concluded?

@mhenrixon (Owner)

@adsteel the uniqueness is based on the arguments you configure in the unique_args method; if you don't define one, we create a unique digest from all the arguments that .perform_async receives. If a job takes no arguments at all, the job itself is unique until just before or just after it has been performed, depending on configuration.

My best suggestion for figuring it out is to check the test suite. It should cover the basics, but you raise a valid point. Maybe we need an examples directory showing all the various ways to configure uniqueness.
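
For anyone reading along, here is a minimal sketch of the configuration described above. The worker name is made up, and the exact option keys (`unique:`, `unique_args:`) have changed between versions of sidekiq-unique-jobs, so treat this as illustrative rather than definitive:

```ruby
# Hypothetical worker; option names follow the style discussed in this
# thread and may differ in newer releases of sidekiq-unique-jobs.
class NotifyUserWorker
  include Sidekiq::Worker

  # The uniqueness digest is built from whatever unique_args returns.
  # Without a unique_args method, the digest is built from all of the
  # arguments passed to .perform_async.
  sidekiq_options unique: :until_executed, unique_args: :unique_args

  # Only the first argument (the user id) counts toward uniqueness, so
  # two jobs for the same user are duplicates even if the message differs.
  def self.unique_args(args)
    [args.first]
  end

  def perform(user_id, message)
    # ... do the actual work ...
  end
end
```

With something like this, `NotifyUserWorker.perform_async(1, "hello")` followed by `NotifyUserWorker.perform_async(1, "goodbye")` should leave only one job in the queue, assuming the default behaviour of rejecting duplicates at push time.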


adsteel commented May 6, 2015

@mhenrixon Thanks. I think I misspoke. I said 'what does that mean?', but I meant 'how is that implemented?'. More specifically, what happens if I put two different jobs in the queue that each match the uniqueness constraint? Does the second job wait until the first has been performed, or is the second never performed?

@Benjamin-Dobell

@mhenrixon @adsteel's question is a good one; I'm confused about the very same thing.

Also, what does this note (https://github.com/mperham/sidekiq/wiki/Related-Projects) mean in relation to this project?

@mperham's note: job uniqueness or locking is impossible to implement safely and efficiently in a distributed system. I recommend using optimistic or pessimistic locking with database transactions instead of using these projects.

@Benjamin-Dobell

Alright, by looking at the specs it seems that the job is just discarded - not the correct behaviour for my use-case - ah well.
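
For anyone else tracing the same behaviour through the specs: with a push-time lock, the second push is simply not enqueued. A rough sketch, reusing the hypothetical NotifyUserWorker from above (the nil return is what Sidekiq's client gives back when middleware halts the push; verify against your version):

```ruby
# Sketch only: assumes the hypothetical NotifyUserWorker above with a
# push-time uniqueness lock and the default "reject duplicates" behaviour.
first  = NotifyUserWorker.perform_async(1, "hello")   # => a job id string
second = NotifyUserWorker.perform_async(1, "goodbye") # => nil, duplicate discarded
```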

@cjbottaro

Agreed. Documentation is unclear. Is this gem for unique queues or unique workers?

The former means (job type, *args) can only exist once per queue, i.e. the queue acts as an ordered set.
Ex: https://github.com/resque/resque-loner

The latter means (job type, *args) can only be worked on by one worker at a time.
Ex: https://github.com/wallace/resque-lonely_job

Which does this gem do?

@mhenrixon (Owner)

@cjbottaro both!

You can make a job unique within a specific queue or across all queues. Within that scope you can also decide when uniqueness is enforced: the job can be considered unique only while it is running, or unique until just before or just after it has been processed.
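
A sketch of the two timings described above, using lock names from the gem (the option key and exact lock names have varied across versions, so double-check against the README for the release you are on):

```ruby
# Unique until the job has finished: a second push with the same digest
# is rejected while the first copy is still queued or running.
class SyncAccountWorker
  include Sidekiq::Worker
  sidekiq_options unique: :until_executed

  def perform(account_id); end
end

# Unique only while executing: duplicates may sit in the queue, but only
# one copy runs at a time (closer to resque-lonely_job semantics).
class ImportFeedWorker
  include Sidekiq::Worker
  sidekiq_options unique: :while_executing

  def perform(feed_id); end
end
```

Whether the digest is scoped to a single queue or shared across all queues is controlled by a separate option (named differently across versions), which is how you get the "specific queue or across all queues" choice.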

@mhenrixon (Owner)

Documentation has been improved, and so have the test suites, naming, etc.
