
Conversation

@amaliujia
Contributor

What changes were proposed in this pull request?

  1. Add `types.py` and move the to/from type-proto conversions into this file.
  2. Improve the `Cast` proto to accept either a `DataType` or a type string.
  3. Support casting to the following types in Python:
    ByteType,
    ShortType,
    IntegerType,
    FloatType,
    DayTimeIntervalType,
    MapType,
    StringType,
    DoubleType,
    LongType,
    DecimalType,
    BinaryType,
    BooleanType
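For context, a minimal Python sketch of the dual API surface this adds. The names here (`SimpleDataType`, `build_cast`) are hypothetical stand-ins, not the actual `pyspark.sql.connect` classes; the point is how a client can accept either a `DataType` instance or a type string and fill the Cast proto's oneof accordingly.

```python
# Hedged sketch: SimpleDataType and build_cast are illustrative stand-ins,
# not the real pyspark.sql.connect implementation.
class SimpleDataType:
    def __init__(self, name: str):
        self.name = name

StringType = SimpleDataType("string")

def build_cast(expr: str, to) -> dict:
    """Mirror the Cast proto: the oneof carries either a structured type
    or a raw string for the server's Catalyst parser."""
    if isinstance(to, SimpleDataType):
        return {"expr": expr, "type": to.name}
    if isinstance(to, str):
        return {"expr": expr, "type_str": to}
    raise TypeError(f"unsupported cast target: {to!r}")

print(build_cast("id", StringType))  # structured DataType branch
print(build_cast("id", "string"))    # string branch, parsed server-side
```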

Why are the changes needed?

API coverage

Does this PR introduce any user-facing change?

NO

How was this patch tested?

UT

@amaliujia
Copy link
Contributor Author

cc @zhengruifeng

```
oneof cast_to_type {
  DataType type = 2;
  // If this is set, Server will use Catalyst parser to parse this string to DataType.
  string type_str = 3;
}
```
Contributor Author


I added this to follow the design principle we have used so far: move logic that every client would otherwise duplicate to the server side, to reduce redundant client work.

Otherwise each client would need to implement its own string-to-DataType conversion, and every client would be doing the same thing.
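To illustrate the redundancy argument: without server-side parsing, every Connect client would need to ship a type-string parser of its own. This toy parser (purely illustrative, and far simpler than Catalyst's real DDL parser) hints at the work that would be duplicated per client:

```python
def parse_type_string(s: str) -> dict:
    """Toy string-to-type parser; each Connect client would need
    something like this (but complete) if the server did not parse."""
    s = s.strip().lower()
    atomic = {"string", "int", "bigint", "double", "boolean", "binary"}
    if s in atomic:
        return {"kind": s}
    if s.startswith("array<") and s.endswith(">"):
        return {"kind": "array", "element": parse_type_string(s[6:-1])}
    raise ValueError(f"cannot parse type string: {s!r}")

print(parse_type_string("Array<Int>"))  # nested type, parsed recursively
```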

Contributor


This comes back to the discussion I had in my draft PR about whether the second argument to the cast() function should be modeled as an expression. Right now it takes a DataType value, but it could be modeled as a string as well.


```
// (Required) the data type that the expr to be casted to.
DataType cast_to_type = 2;
oneof cast_to_type {
```
Contributor


FYI: this is a breaking change in the proto message. Ideally, we would use

```
reserved 2;
oneof cast_to_type {
  DataType type = 3;
  string type_str = 4;
}
```

Contributor Author

@amaliujia amaliujia Dec 8, 2022


Ah, thanks. I didn't know about this way to evolve the proto.

Given that Spark Connect is still an alpha component, we aren't forced to maintain backwards compatibility for now.

But once we are ready to leave alpha, we should follow this approach.

Contributor


I think it's fine, since this message hasn't actually been used yet?

Contributor Author


I think we are OK now. But later there is indeed a need to build a process/good practices for how to evolve the proto without breaking older versions (if possible).

Contributor


We're ok now. We can use Buf to check for breaking changes.
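As a sketch, a `buf.yaml` along these lines would enable Buf's breaking-change checks (the layout is illustrative; the actual Spark repo configuration may differ):

```yaml
# Hypothetical buf.yaml; FILE-level rules flag removed or renumbered
# fields, such as replacing field 2 without reserving it.
version: v1
breaking:
  use:
    - FILE
```

In CI this would be checked with something like `buf breaking --against '.git#branch=master'`.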


Comment on lines +527 to +529
```
Cast(
  transformExpression(cast.getExpr),
  session.sessionState.sqlParser.parseDataType(cast.getTypeStr))
```
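A hedged Python sketch of what the quoted Scala does on the server side: dispatch on the oneof branch and parse the string variant with the (here stubbed) Catalyst parser. `transform_cast` and the dict-shaped message are illustrative stand-ins.

```python
def transform_cast(cast_msg: dict, parse_data_type):
    """Stand-in for the server-side planner logic; cast_msg models the
    proto message as a dict and parse_data_type stubs Catalyst's parser."""
    if "type" in cast_msg:
        target = cast_msg["type"]                       # structured DataType
    elif "type_str" in cast_msg:
        target = parse_data_type(cast_msg["type_str"])  # Catalyst-parsed
    else:
        raise ValueError("Cast must set one of type/type_str")
    return ("Cast", cast_msg["expr"], target)

print(transform_cast({"expr": "id", "type_str": "binary"}, str.upper))
```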
Contributor


Wouldn't it be nice if we could just add a second string argument to the Cast expression so it could resolve the expression automatically? Would this make the design easier? Then you wouldn't even need a custom expression for cast.

Contributor Author

@amaliujia amaliujia Dec 8, 2022


I'm not following. Can you give an example proto or some sample code?

Are you saying we should not use the oneof but instead add a third string field?

Contributor


When I looked at cast, it struck me as a unicorn among expressions: it's one of the few where not all argument types resolve to expressions. I was wondering if we could simplify the approach by making the type argument of cast an expression that can resolve to a string; then we could do the matching of the expression in the analyzer.

This is very similar to why you added the oneof to the proto message.

@grundprinzip
Contributor

My discussion was not meant to block the PR; it was more of a general observation.

LGTM, thanks!

cc @HyukjinKwon @zhengruifeng

@zhengruifeng
Contributor

merged into master

HyukjinKwon added a commit that referenced this pull request Dec 12, 2022
…e on

### What changes were proposed in this pull request?

This PR is a followup of #38970 which makes the test pass with ANSI mode on.

### Why are the changes needed?

To recover the build with ANSI mode on. Currently it's broken, as follows:

```
======================================================================
ERROR [2.651s]: test_cast (pyspark.sql.tests.connect.test_connect_column.SparkConnectTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_connect_column.py", line 119, in test_cast
    df.select(df.id.cast(x)).toPandas(), df2.select(df2.id.cast(x)).toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1466, in toPandas
    return self._session.client._to_pandas(query)
  File "/__w/spark/spark/python/pyspark/sql/connect/client.py", line 333, in _to_pandas
    return self._execute_and_fetch(req)
  File "/__w/spark/spark/python/pyspark/sql/connect/client.py", line 418, in _execute_and_fetch
    for b in self._stub.ExecutePlan(req, metadata=self._builder.metadata()):
  File "/usr/local/lib/python3.9/dist-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "/usr/local/lib/python3.9/dist-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNKNOWN
	details = "[DATATYPE_MISMATCH.CAST_WITH_CONF_SUGGESTION] Cannot resolve "id" due to data type mismatch: cannot cast "BIGINT" to "BINARY" with ANSI mode on.
If you have to cast "BIGINT" to "BINARY", you can set "spark.sql.ansi.enabled" as 'false'.;
'Project [unresolvedalias(cast(id#31L as binary), None)]
+- SubqueryAlias spark_catalog.default.test_connect_basic_table_1
   +- Relation spark_catalog.default.test_connect_basic_table_1[id#31L,name#32] parquet
"
	debug_error_string = "UNKNOWN:Error received from peer ipv4:127.0.0.1:15002 {created_time:"2022-12-09T01:54:45.378316841+00:00", grpc_status:2, grpc_message:"[DATATYPE_MISMATCH.CAST_WITH_CONF_SUGGESTION] Cannot resolve \"id\" due to data type mismatch: cannot cast \"BIGINT\" to \"BINARY\" with ANSI mode on.\nIf you have to cast \"BIGINT\" to \"BINARY\", you can set \"spark.sql.ansi.enabled\" as \'false\'.;\n\'Project [unresolvedalias(cast(id#31L as binary), None)]\n+- SubqueryAlias spark_catalog.default.test_connect_basic_table_1\n   +- Relation spark_catalog.default.test_connect_basic_table_1[id#31L,name#32] parquet\n"}"
>
```
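The failure above is specific to ANSI mode: the analyzer rejects casting an integral type such as `BIGINT` to `BINARY`. A toy predicate (illustrative only; the real rules live in Spark's Cast expression) captures the shape of the check:

```python
def can_cast_to_binary(src: str, ansi_enabled: bool) -> bool:
    """Toy version of the check behind the error above: with ANSI mode on,
    casting integral types such as BIGINT to BINARY is rejected."""
    if ansi_enabled and src in {"tinyint", "smallint", "int", "bigint"}:
        return False
    return src in {"string", "binary", "tinyint", "smallint", "int", "bigint"}

print(can_cast_to_binary("bigint", ansi_enabled=True))   # rejected under ANSI
print(can_cast_to_binary("bigint", ansi_enabled=False))  # allowed otherwise
```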

https://github.com/apache/spark/actions/runs/3671813752

### Does this PR introduce _any_ user-facing change?

No, test-only.

### How was this patch tested?

This PR fixes the unit test to make it pass. I manually tested.

Closes #39034 from HyukjinKwon/SPARK-41412-followup.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
beliefer pushed a commit to beliefer/spark that referenced this pull request Dec 18, 2022
Closes apache#38970 from amaliujia/add_cast.

Authored-by: Rui Wang <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
beliefer pushed a commit to beliefer/spark that referenced this pull request Dec 18, 2022
Closes apache#39034 from HyukjinKwon/SPARK-41412-followup.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>