Conversation

@dongjoon-hyun
Member

@dongjoon-hyun dongjoon-hyun commented Nov 14, 2025

What changes were proposed in this pull request?

This PR aims to make `spark.connect.session.planCompression.defaultAlgorithm` additionally support `NONE`.

Why are the changes needed?

BEFORE

```
$ bin/spark-connect-shell -c spark.connect.session.planCompression.defaultAlgorithm=NONE
...
scala> spark.range(1).count()
...
Caused by: org.apache.spark.SparkIllegalArgumentException:
[INVALID_CONF_VALUE.OUT_OF_RANGE_OF_OPTIONS]
The value 'NONE' in the config "spark.connect.session.planCompression.defaultAlgorithm" is invalid.
It should be one of 'ZSTD'. SQLSTATE: 22022
```

AFTER

```
$ bin/spark-connect-shell -c spark.connect.session.planCompression.defaultAlgorithm=NONE
...
scala> spark.range(1).count()
val res0: Long = 1
```
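For readers curious about the allowed-values check behind the error above, here is a minimal, hypothetical Python model of that validation. This is an illustrative sketch only, not Spark's actual implementation (the real check lives in Spark's Scala config framework), and the function and constant names are invented:

```python
# Hypothetical model of an allowed-values config check, mirroring the
# INVALID_CONF_VALUE error shown above. Not Spark's real code; names
# are illustrative only.
ALLOWED_ALGORITHMS = {"ZSTD", "NONE"}  # "NONE" is the value this PR adds

def validate_algorithm(value: str) -> str:
    """Return the normalized value, or raise if it is not an allowed option."""
    normalized = value.upper()
    if normalized not in ALLOWED_ALGORITHMS:
        raise ValueError(
            f"The value '{value}' in the config "
            f'"spark.connect.session.planCompression.defaultAlgorithm" is invalid. '
            f"It should be one of {sorted(ALLOWED_ALGORITHMS)}."
        )
    return normalized
```

With `NONE` in the allowed set, the shell invocation in the AFTER block passes validation instead of failing at startup.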

Does this PR introduce any user-facing change?

No behavior change because this is a new option for a new feature.

How was this patch tested?

Pass the CIs.

Was this patch authored or co-authored using generative AI tooling?

No.

@dongjoon-hyun
Member Author

Could you review this PR, please, @xi-db and @hvanhovell ?

@dongjoon-hyun
Member Author

Could you review this when you have some time, @sryza ?

@dongjoon-hyun
Member Author

Could you review this, @dtenedor ?

@dongjoon-hyun
Member Author

Could you review this when you have some time, @cloud-fan ?

@dongjoon-hyun
Member Author

Could you review this, @gengliangwang ?

@gengliangwang
Member

@dongjoon-hyun I’m not familiar with the related changes, so I’ll defer to other committers.

@dongjoon-hyun
Member Author

No problem. Thank you for spending your time here, @gengliangwang .

@sarutak
Member

@sarutak sarutak left a comment
I confirmed this works and NONE seems consistent with other similar configs.

@dongjoon-hyun
Member Author

Thank you so much, @sarutak . Merged to master/4.1 for Apache Spark 4.1.0 to provide an escape hatch if we hit a regression.

dongjoon-hyun added a commit that referenced this pull request Nov 15, 2025
…faultAlgorithm` to support `NONE`

### What changes were proposed in this pull request?

This PR aims to make `spark.connect.session.planCompression.defaultAlgorithm` additionally support `NONE`.

### Why are the changes needed?

**BEFORE**

```
$ bin/spark-connect-shell -c spark.connect.session.planCompression.defaultAlgorithm=NONE
...
scala> spark.range(1).count()
...
Caused by: org.apache.spark.SparkIllegalArgumentException:
[INVALID_CONF_VALUE.OUT_OF_RANGE_OF_OPTIONS]
The value 'NONE' in the config "spark.connect.session.planCompression.defaultAlgorithm" is invalid.
It should be one of 'ZSTD'. SQLSTATE: 22022
```

**AFTER**

```
$ bin/spark-connect-shell -c spark.connect.session.planCompression.defaultAlgorithm=NONE
...
scala> spark.range(1).count()
val res0: Long = 1
```

### Does this PR introduce _any_ user-facing change?

No behavior change because this is a new option for a new feature.

### How was this patch tested?

Pass the CIs.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #53068 from dongjoon-hyun/SPARK-54355.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit e3e9863)
Signed-off-by: Dongjoon Hyun <[email protected]>
@dongjoon-hyun dongjoon-hyun deleted the SPARK-54355 branch November 16, 2025 16:44
@xi-db
Contributor

xi-db commented Nov 17, 2025

Thanks @dongjoon-hyun for the PR! Late LGTM

@dongjoon-hyun
Member Author

Thank you, @xi-db !

huangxiaopingRD pushed a commit to huangxiaopingRD/spark that referenced this pull request Nov 25, 2025
…faultAlgorithm` to support `NONE`

Closes apache#53068 from dongjoon-hyun/SPARK-54355.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>