JobRepositoryTestUtils should work against the JobRepository interface #4070
Comments
Extending DefaultBatchConfigurer in your configuration and overriding createJobRepository() should allow you to set a datasource explicitly when there are two datasources, even while using @SpringBatchTest and @SpringBootTest in combination with each other. Perhaps there is also a way to do this via the application.yml or application.properties file.
Not exactly. The problem is not with JobRepository, which can be configured as you said, but with JdbcTemplate, which is injected with the "primary" datasource, not necessarily the same one used by JobRepository. The JdbcTemplate instance is used for explicit removal of job executions (DELETE statements), since the JobRepository interface offers no methods to do that.
JobRepositoryTestUtils cannot be configured to use a secondary datasource for its JdbcTemplate, even if that datasource is annotated with @BatchDataSource.
In order to DELETE job executions, try running the DELETE statements directly against the Batch metadata tables.
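The code that followed this comment was not captured in the thread. As a minimal sketch of what such explicit cleanup involves: the DELETE statements must hit the Spring Batch metadata tables in a foreign-key-safe order (children before parents). The table names below assume the default BATCH_ prefix; a customized table prefix would change them.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCleanupSql {

    // Child tables first, parent tables last, so no FK constraint is violated.
    static final List<String> TABLES_IN_DELETE_ORDER = List.of(
            "BATCH_STEP_EXECUTION_CONTEXT",
            "BATCH_STEP_EXECUTION",
            "BATCH_JOB_EXECUTION_CONTEXT",
            "BATCH_JOB_EXECUTION_PARAMS",
            "BATCH_JOB_EXECUTION",
            "BATCH_JOB_INSTANCE");

    // Build one DELETE statement per metadata table, in safe order.
    static List<String> deleteStatements() {
        List<String> sql = new ArrayList<>();
        for (String table : TABLES_IN_DELETE_ORDER) {
            sql.add("DELETE FROM " + table);
        }
        return sql;
    }

    public static void main(String[] args) {
        // In a real application each statement would be executed through a
        // JdbcTemplate bound to the *batch* datasource, e.g. jdbcTemplate.update(sql).
        for (String sql : deleteStatements()) {
            System.out.println(sql);
        }
    }
}
```

The ordering is the crux: deleting BATCH_JOB_INSTANCE rows first would fail because BATCH_JOB_EXECUTION still references them.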
There should be a way to set the datasource with the application.yml or application.properties file. Adding the datasource to the application config would make JobRepository more intuitive, annotating with @BatchDataSource should do the same and JobRepository should have context to be able to get the datasource to make statements. |
IMHO, executing my own SQL statements defeats the whole purpose of using JobRepositoryTestUtils altogether. All I'm saying is that JobRepositoryTestUtils.removeJobExecutions() always uses the primary datasource. Moreover, the remaining methods (such as createJobExecutions()) don't necessarily access the same datasource, depending on how JobRepository is configured. This could lead to unintended results.
Anyway, I applied a simple workaround in an auto-configuration class to fix it. But I thought I should report it, and contribute to Spring Batch in some way.
If you ask me, I would move removeJobExecutions() into the JobRepository interface as a new method, encapsulating all repository logic behind it. This would guarantee that the same datasource is used every time. That method could be used not only by JobRepositoryTestUtils, but also by Spring Cloud Data Flow (which has similar DELETE statements).
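The workaround code itself was not captured in the thread. A sketch of the kind of configuration described, assuming Spring Batch 4.x (where JobRepositoryTestUtils still accepts a DataSource) and Spring Boot's @BatchDataSource qualifier; the class and bean names here are illustrative, not the reporter's actual code:

```java
import javax.sql.DataSource;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.test.JobRepositoryTestUtils;
import org.springframework.boot.autoconfigure.batch.BatchDataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchTestUtilsConfiguration {

    // Define the utility bean explicitly so its internal JdbcTemplate is
    // built from the batch datasource rather than the @Primary one.
    @Bean
    public JobRepositoryTestUtils jobRepositoryTestUtils(
            JobRepository jobRepository,
            @BatchDataSource DataSource batchDataSource) {
        return new JobRepositoryTestUtils(jobRepository, batchDataSource);
    }
}
```

With this bean in place, both createJobExecutions() and removeJobExecutions() operate against the same schema as the JobRepository.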
@aritzbastida Thank you for reporting this issue! I agree with you. The datasource is an implementation detail of the JDBC-based job repository.
So yes, there should be methods in JobRepository to delete job executions. This enhancement will supersede #4178. I will try to include it in the upcoming release.
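A hypothetical shape for that enhancement, sketched as a fragment of the JobRepository interface (method names and signatures are guesses at the proposal, not the actual API at the time of this issue):

```java
public interface JobRepository {

    // ... existing methods (createJobExecution, update, etc.) ...

    // Hypothetical: remove a job execution together with its step
    // executions, execution contexts, and parameters, using whatever
    // store backs this repository (JDBC, MongoDB, in-memory, ...).
    void deleteJobExecution(JobExecution jobExecution);

    // Hypothetical: remove a job instance once all of its executions
    // have been deleted.
    void deleteJobInstance(JobInstance jobInstance);
}
```

Because the store is encapsulated behind the interface, callers such as JobRepositoryTestUtils or Spring Cloud Data Flow would no longer need a DataSource or JdbcTemplate of their own.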
Thank you for considering implementing this issue. :)
@fmbenhassine @aritzbastida What if we use MongoDB? We don't have a datasource there.
As mentioned in my previous comment, we should be able to use the utility against any JobRepository implementation.
The utility class JobRepositoryTestUtils (injected via the @SpringBatchTest annotation) uses an incorrect datasource when there is more than one available in the application context and the one defined as @Primary does not refer to the Spring Batch schema.
Sure, this is already documented in the Javadoc for the @SpringBatchTest annotation, but it happens to be a limitation in the infrastructure, and it may also lead to unexpected behavior: the JdbcTemplate cannot be injected with the @BatchDataSource qualifier, or any qualifier, for that matter. In my scenario, the job is launched fine but then cannot be removed, because the DELETE statements are executed against another database schema.
Ideally, JdbcTemplate should not be needed at all: the DataSource should be inferred from JobRepository, extending it with removal operations. If that's not possible, it should at least be possible to configure the datasource dependency, so that the one annotated with @BatchDataSource is used.