Description
Nicolas FABRE opened SPR-6334 and commented
Using org.springframework.jdbc.core.JdbcTemplate.batchUpdate(String sql, BatchPreparedStatementSetter pss), if the number of rows I want to insert (update or delete) is greater than the batch size I define in the BatchPreparedStatementSetter, the rows beyond the batch size limit are not inserted. However, if I have a huge number of rows to insert (for example 100000), I don't think it is a good idea to set the batch size to 100000. The database could be overwhelmed if it receives 100000 rows to process in a single call. Am I wrong?
So would it be possible to use batchUpdate with a number of updates that is not equal to the batch size? May I request the creation of the following method to manage this problem?
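To illustrate the current situation, here is a minimal sketch (the table my_table, the names list, and the SingleBatchDao class are hypothetical stand-ins, not part of any real DAO): with the existing API, getBatchSize() has to return the total row count, so all 100000 rows are queued and sent to the database in one executeBatch() call.

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;

public class SingleBatchDao {

	private final JdbcTemplate jdbcTemplate;
	private final List<String> names; // the 100000 rows to insert

	public SingleBatchDao(JdbcTemplate jdbcTemplate, List<String> names) {
		this.jdbcTemplate = jdbcTemplate;
		this.names = names;
	}

	public void insertAll() {
		jdbcTemplate.batchUpdate("INSERT INTO my_table (name) VALUES (?)",
				new BatchPreparedStatementSetter() {
					public void setValues(PreparedStatement ps, int i) throws SQLException {
						ps.setString(1, names.get(i));
					}
					public int getBatchSize() {
						// Must be the total number of rows: every row is queued
						// and sent in a single executeBatch() call.
						return names.size();
					}
				});
	}
}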
// The batch size is set in the BatchPreparedStatementSetter; the total number
// of rows to process is given by the nbUpdates parameter.
public int[] batchUpdate(String sql, final long nbUpdates, final BatchPreparedStatementSetter pss)
		throws DataAccessException {
	if (logger.isDebugEnabled()) {
		logger.debug("Executing SQL batch update [" + sql + "]");
	}
	return (int[]) execute(sql, new PreparedStatementCallback() {
		public Object doInPreparedStatement(PreparedStatement ps) throws SQLException {
			try {
				int batchSize = pss.getBatchSize();
				InterruptibleBatchPreparedStatementSetter ipss =
						(pss instanceof InterruptibleBatchPreparedStatementSetter ?
								(InterruptibleBatchPreparedStatementSetter) pss : null);
				if (JdbcUtils.supportsBatchUpdates(ps.getConnection())) {
					List<Integer> rowsAffected = new ArrayList<Integer>();
					for (int i = 1; i <= nbUpdates; i++) {
						pss.setValues(ps, i - 1);
						if (ipss != null && ipss.isBatchExhausted(i - 1)) {
							// The setter signalled the end of the data: flush the
							// statements accumulated since the last executeBatch().
							if (logger.isDebugEnabled()) {
								int batchIdx = (i % batchSize == 0) ? i / batchSize : (i / batchSize) + 1;
								logger.debug("Batch exhausted - Sending last SQL batch update #" + batchIdx);
							}
							int[] res = ps.executeBatch();
							for (int j = 0; j < res.length; j++) {
								rowsAffected.add(res[j]);
							}
							break;
						}
						ps.addBatch();
						// Flush every batchSize statements, and once more for the
						// final (possibly partial) batch.
						if (i % batchSize == 0 || i == nbUpdates) {
							if (logger.isDebugEnabled()) {
								int batchIdx = (i % batchSize == 0) ? i / batchSize : (i / batchSize) + 1;
								logger.debug("Sending SQL batch update #" + batchIdx);
							}
							int[] res = ps.executeBatch();
							for (int j = 0; j < res.length; j++) {
								rowsAffected.add(res[j]);
							}
						}
					}
					int[] result = new int[rowsAffected.size()];
					for (int i = 0; i < result.length; i++) {
						result[i] = rowsAffected.get(i).intValue();
					}
					return result;
				}
				else {
					// JDBC driver without batch support: fall back to one
					// executeUpdate() per row.
					List<Integer> rowsAffected = new ArrayList<Integer>();
					for (int i = 0; i < nbUpdates; i++) {
						pss.setValues(ps, i);
						if (ipss != null && ipss.isBatchExhausted(i)) {
							break;
						}
						rowsAffected.add(ps.executeUpdate());
					}
					int[] rowsAffectedArray = new int[rowsAffected.size()];
					for (int i = 0; i < rowsAffectedArray.length; i++) {
						rowsAffectedArray[i] = rowsAffected.get(i);
					}
					return rowsAffectedArray;
				}
			}
			finally {
				if (pss instanceof ParameterDisposer) {
					((ParameterDisposer) pss).cleanupParameters();
				}
			}
		}
	});
}
Thanks to this method I can process my 100000 rows with a batchSize of 5000, for example. It avoids having to split the data into sub-collections in the DAO method just to invoke batchUpdate with 5000 rows at a time.
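A sketch of what the call would look like with the proposed overload, reusing the hypothetical my_table and names from the example above: nbUpdates is the total number of rows, and getBatchSize() becomes the flush interval.

// A batch is sent to the database every 5000 statements,
// i.e. 20 batches for 100000 rows.
int[] updateCounts = jdbcTemplate.batchUpdate(
		"INSERT INTO my_table (name) VALUES (?)", 100000,
		new BatchPreparedStatementSetter() {
			public void setValues(PreparedStatement ps, int i) throws SQLException {
				ps.setString(1, names.get(i));
			}
			public int getBatchSize() {
				return 5000;
			}
		});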
Thanks in advance for the attention given to this issue.
Nicolas
Affects: 3.0 RC1
Attachments:
- spring-contribution-JIRA-SPR-6334.zip (13.07 kB)
Issue Links:
- Extend JdbcTemplate to allow larger batch updates [SPR-7290] #11949 ("is duplicated by")
Referenced from: commits 0adcb2a
2 votes, 3 watchers