When processing multiple small PDF files using `setSourceData` to supply raw PDF data, `uniqid()` can return the same value on subsequent calls, resulting in the same PDF being processed more than once. There are a couple of options, which require further investigation:
- Pass `true` as the second parameter (`$more_entropy`) to `uniqid()`, which appears to result in sufficiently unique values, at least in most cases
- Replace `uniqid()` with a hash of the input data, which will also allow parser reuse when passing the same data to `setSourceData` multiple times
The first option is a quick fix, but the second may be more satisfactory, given a hash algorithm with a suitable tradeoff between resource consumption and collision probability.
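For illustration, here is a minimal sketch of both options; the helper name `parser_key_for_data` and the commented cache structure are hypothetical, not the library's actual internals.

```php
<?php

// Option 1 (quick fix): ask uniqid() for more entropy, making back-to-back
// calls very unlikely to return the same key.
$key = uniqid('', true);

// Option 2: derive the key from the data itself. Identical data then maps to
// the same key, so an existing parser can be reused instead of duplicated.
// sha256 is one candidate; a cheaper algorithm could be substituted if its
// collision probability is acceptable.
function parser_key_for_data($pdfdata) {
    return 'data-' . hash('sha256', $pdfdata);
}

// Hypothetical use inside setSourceData() (cache structure assumed):
// $key = parser_key_for_data($pdfdata);
// if (!isset($this->parsers[$key])) {
//     $this->parsers[$key] = /* construct a new parser for $pdfdata */;
// }
```

A data-derived key also means that a caller supplying the same data twice gets a cache hit rather than a second parser, which is the reuse behaviour described in the second option.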
This issue was discovered while investigating a weird problem which @gismofx reported by email: attempts to concatenate two single-page PDFs would frequently (but not always) result in the first PDF being doubled up (and the second being skipped). With a bit of investigation, we were able to determine that the `uniqid()` calls were both returning the same value, so the first PDF's parser was being reused instead of a new one being initialised.
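As a standalone check, independent of the library, the snippet below simply compares consecutive `uniqid()` calls; whether the plain calls collide is platform- and timing-dependent, but the `$more_entropy` variant is expected to differ.

```php
<?php

// Two back-to-back calls without extra entropy: these are the calls that
// were observed to collide in the report above (platform/timing dependent).
$a = uniqid();
$b = uniqid();
var_dump($a === $b);

// With $more_entropy enabled, the values include extra randomness and are
// expected to differ.
$c = uniqid('', true);
$d = uniqid('', true);
var_dump($c === $d); // bool(false) expected
```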