
Commit 4a4dd4f

10110346 authored and srowen committed
[SPARK-23391][CORE] It may lead to overflow for some integer multiplication
## What changes were proposed in this pull request?

In `getBlockData`, `blockId.reduceId` is of type `Int`; when it is greater than 2^28, `blockId.reduceId * 8` overflows.

In `decompress0`, `len` and `unitSize` are also `Int`, so `len * unitSize` may overflow.

## How was this patch tested?

N/A

Author: liuxian <[email protected]>

Closes #20581 from 10110346/overflow2.
1 parent: 0e2c266
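For context, a minimal REPL-style Scala sketch of the wrap-around the patch guards against (the value of `reduceId` is illustrative, not taken from Spark):

```scala
// Any reduceId of 2^28 or more makes reduceId * 8 exceed Int.MaxValue.
val reduceId: Int = 1 << 28      // 268435456
println(reduceId * 8)            // Int * Int wraps around: prints -2147483648
println(reduceId * 8L)           // widening to Long before multiplying: prints 2147483648
```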

2 files changed (+3, −3)

core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala

Lines changed: 2 additions & 2 deletions
@@ -202,13 +202,13 @@ private[spark] class IndexShuffleBlockResolver(
     // class of issue from re-occurring in the future which is why they are left here even though
     // SPARK-22982 is fixed.
     val channel = Files.newByteChannel(indexFile.toPath)
-    channel.position(blockId.reduceId * 8)
+    channel.position(blockId.reduceId * 8L)
     val in = new DataInputStream(Channels.newInputStream(channel))
     try {
       val offset = in.readLong()
       val nextOffset = in.readLong()
       val actualPosition = channel.position()
-      val expectedPosition = blockId.reduceId * 8 + 16
+      val expectedPosition = blockId.reduceId * 8L + 16
       if (actualPosition != expectedPosition) {
         throw new Exception(s"SPARK-22982: Incorrect channel position after index file reads: " +
           s"expected $expectedPosition but actual position was $actualPosition.")

sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/compression/compressionSchemes.scala

Lines changed: 1 addition & 1 deletion
@@ -116,7 +116,7 @@ private[columnar] case object PassThrough extends CompressionScheme {
       while (pos < capacity) {
         if (pos != nextNullIndex) {
           val len = nextNullIndex - pos
-          assert(len * unitSize < Int.MaxValue)
+          assert(len * unitSize.toLong < Int.MaxValue)
           putFunction(columnVector, pos, bufferPos, len)
           bufferPos += len * unitSize
           pos += len
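One reason the original assert was ineffective: when `len * unitSize` overflows, the product wraps to a (possibly negative) `Int` that still satisfies `< Int.MaxValue`, so the assertion passes silently. Widening one operand promotes the multiplication to `Long` and makes the check meaningful. A REPL-style sketch with illustrative values (not taken from Spark):

```scala
// 300000000 * 8 = 2.4e9, which does not fit in an Int.
val len = 300000000
val unitSize = 8

println(len * unitSize)                        // wraps to -1894967296
println(len * unitSize < Int.MaxValue)         // true  -> the old assert would pass silently
println(len * unitSize.toLong < Int.MaxValue)  // false -> the widened check catches the overflow
```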
