Closed
Changes from 2 commits
22 changes: 16 additions & 6 deletions python/pyspark/sql/functions/builtin.py
@@ -15303,7 +15303,7 @@ def shuffle(col: "ColumnOrName") -> Column:
@_try_remote_functions
def reverse(col: "ColumnOrName") -> Column:
"""
Collection function: returns a reversed string or an array with reverse order of elements.
Collection function: returns a reversed string or an array with elements in reverse order.

.. versionadded:: 1.5.0

@@ -15313,18 +15313,23 @@ def reverse(col: "ColumnOrName") -> Column:
Parameters
----------
col : :class:`~pyspark.sql.Column` or str
name of column or expression
The name of the column or an expression that represents the element to be reversed.

Returns
-------
:class:`~pyspark.sql.Column`
array of elements in reverse order.
A new column that contains a reversed string or an array with elements in reverse order.

Examples
--------
Example 1: Reverse a string

Comment by @LuciferYang (Contributor), Jan 2, 2024:

Let's standardize the format:

  1. import functions as sf and import the other necessary modules
  2. try not to use .alias
  3. perhaps show is clearer than collect when displaying results.

Reply by the author:

Done

>>> df = spark.createDataFrame([('Spark SQL',)], ['data'])
>>> df.select(reverse(df.data).alias('s')).collect()
[Row(s='LQS krapS')]
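For intuition, Spark's reverse on a string matches Python's slice reversal. A minimal pure-Python check of the doctest value above (this is illustrative plain Python, not the Spark API; `reverse_like_spark` is a hypothetical helper name):

```python
def reverse_like_spark(value):
    # Spark's reverse() flips a string character-by-character;
    # Python's [::-1] slice does the same thing.
    return value[::-1]

print(reverse_like_spark('Spark SQL'))  # 'LQS krapS'
```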

Example 2: Reverse an array

>>> df = spark.createDataFrame([([2, 1, 3],), ([1],), ([],)], ['data'])
Comment by a contributor:

nit, is this line duplicated?

Reply by the author:

To keep each example self-contained, the data is repeated in each example.

Comment by @zhengruifeng (Contributor), Jan 5, 2024:

I mean df is defined twice in this example

Reply by the author:

Oh, I made a mistake. It was defined twice in this example; let me remove it.
Done.

>>> df.select(reverse(df.data).alias('r')).collect()
[Row(r=[3, 1, 2]), Row(r=[1]), Row(r=[])]
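The array case follows the same rule, including single-element and empty arrays. A pure-Python mirror of the three doctest rows above (illustrative only; `reverse_like_spark` is a hypothetical stand-in for the Spark function):

```python
def reverse_like_spark(arr):
    # Element order flips; empty and single-element arrays come back unchanged.
    return arr[::-1]

for data in ([2, 1, 3], [1], []):
    print(reverse_like_spark(data))
# [3, 1, 2]
# [1]
# []
```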
@@ -15406,7 +15411,7 @@ def flatten(col: "ColumnOrName") -> Column:
@_try_remote_functions
def map_contains_key(col: "ColumnOrName", value: Any) -> Column:
"""
Returns true if the map contains the key.
Map function: Returns true if the map contains the key.

.. versionadded:: 3.4.0

@@ -15416,9 +15421,9 @@ def map_contains_key(col: "ColumnOrName", value: Any) -> Column:
Parameters
----------
col : :class:`~pyspark.sql.Column` or str
name of column or expression
The name of the column or an expression that represents the map.
value :
a literal value
A literal value.

Returns
-------
@@ -15427,6 +15432,8 @@ def map_contains_key(col: "ColumnOrName", value: Any) -> Column:

Examples
--------
Example 1: The key is in the map

>>> from pyspark.sql.functions import map_contains_key
>>> df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
>>> df.select(map_contains_key("data", 1)).show()
@@ -15435,6 +15442,9 @@ def map_contains_key(col: "ColumnOrName", value: Any) -> Column:
+-------------------------+
| true|
+-------------------------+

Example 2: The key is not in the map

>>> df.select(map_contains_key("data", -1)).show()
+--------------------------+
|map_contains_key(data, -1)|
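The semantics here are plain key membership. A pure-Python sketch of both examples, using a dict in place of the Spark map column (illustrative only; `map_contains_key_py` is a hypothetical helper, not part of PySpark):

```python
def map_contains_key_py(m, key):
    # True when the key is present in the map,
    # mirroring Spark's map_contains_key.
    return key in m

data = {1: 'a', 2: 'b'}
print(map_contains_key_py(data, 1))   # True
print(map_contains_key_py(data, -1))  # False
```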