Closed
Labels: bug (Something isn't working)
Description
Describe the bug
I initially started with a 3 TB dataset, which I then scaled down to 200 GB. This is the driver and executor config on my end:
--conf spark.driver.memory=10g \
--conf spark.executor.cores=4 \
--conf spark.executor.memory=12g \
--conf spark.driver.memoryOverhead=3000 \
--conf spark.executor.memoryOverhead=4G \
Java options:
--conf spark.driver.defaultJavaOptions="-XX:OnOutOfMemoryError='kill -9 %p' -XX:+UseParallelGC -XX:InitiatingHeapOccupancyPercent=70" \
--conf spark.executor.defaultJavaOptions="-verbose:gc -XX:+UseParallelGC -XX:InitiatingHeapOccupancyPercent=70" \
Comet configurations are as described in the benchmark section of the website.
Running this with 40 executors, I observe some OOMs, which is intriguing because the dataset is small.
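For reference, rough memory-budget arithmetic for the flags quoted above (a sketch; sizes are taken directly from the spark-submit configuration in this report, and the note about off-heap charging reflects general Spark container accounting, not a Comet-specific diagnosis):

```python
# Memory budget implied by the spark-submit flags above, in MiB.
GIB = 1024  # MiB per GiB

driver_heap_mib = 10 * GIB        # spark.driver.memory=10g
driver_overhead_mib = 3000        # spark.driver.memoryOverhead=3000 (plain number = MiB)
executor_heap_mib = 12 * GIB      # spark.executor.memory=12g
executor_overhead_mib = 4 * GIB   # spark.executor.memoryOverhead=4G
num_executors = 40

# Each executor container must fit heap + overhead. Native/off-heap
# allocations (e.g. by a native engine) are charged against the overhead
# unless off-heap memory is explicitly sized via spark.memory.offHeap.size.
executor_container_mib = executor_heap_mib + executor_overhead_mib
total_executor_mib = executor_container_mib * num_executors
driver_container_mib = driver_heap_mib + driver_overhead_mib

print(executor_container_mib)     # 16384 MiB (16 GiB) per executor
print(total_executor_mib // GIB)  # 640 GiB across 40 executors
print(driver_container_mib)       # 13240 MiB (~12.9 GiB) for the driver
```

So the cluster has roughly 640 GiB of executor memory for a 200 GB dataset, which is why the OOMs are surprising.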
Steps to reproduce
No response
Expected behavior
No OOM
Additional context
