Commit

Include fixed links in PR.
rodrigogiraoserrao committed Oct 4, 2024
1 parent aabbc7f commit 92bffab
Showing 4 changed files with 5 additions and 5 deletions.
2 changes: 1 addition & 1 deletion docs/source/user-guide/expressions/casting.md
@@ -1,6 +1,6 @@
# Casting

- Casting converts the underlying [`DataType`](../concepts/data-types/overview.md) of a column to a new one. Polars uses Arrow to manage the data in memory and relies on the compute kernels in the [Rust implementation](https://github.com/jorgecarleitao/arrow2) to do the conversion. Casting is available with the `cast()` method.
+ Casting converts the [underlying `DataType` of a column](../concepts/data-types-and-structures.md) to a new one. Polars uses Arrow to manage the data in memory and relies on the compute kernels in the [Rust implementation](https://github.com/jorgecarleitao/arrow2) to do the conversion. Casting is available with the `cast()` method.

The `cast` method includes a `strict` parameter that determines how Polars behaves when it encounters a value that can't be converted from the source `DataType` to the target `DataType`. By default, `strict=True`, which means that Polars will throw an error to notify the user of the failed conversion and provide details on the values that couldn't be cast. On the other hand, if `strict=False`, any values that can't be converted to the target `DataType` will be quietly converted to `null`.

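A minimal sketch of the `strict` behaviour described in that hunk, using an invented column of strings rather than one of the documentation's own snippets:

```python
import polars as pl

df = pl.DataFrame({"value": ["10", "20", "oops"]})

# With strict=False, values that cannot be parsed as Int64 become null.
print(df.select(pl.col("value").cast(pl.Int64, strict=False)))

# With the default strict=True, the same cast raises an error that
# reports the offending value(s). The exact exception class has varied
# across Polars versions, so we catch broadly here.
try:
    df.select(pl.col("value").cast(pl.Int64))
except Exception as exc:
    print(exc)
```
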
2 changes: 1 addition & 1 deletion docs/source/user-guide/io/csv.md
@@ -18,4 +18,4 @@ file and instead returns a lazy computation holder called a `LazyFrame`.
{{code_block('user-guide/io/csv','scan',['scan_csv'])}}

If you want to know why this is desirable, you can read more about these Polars
- optimizations [here](../concepts/lazy-vs-eager.md).
+ optimizations [here](../concepts/lazy-api.md).
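
For context, a hedged sketch of the kind of code the `scan_csv` code block above stands for; the file path and column name are placeholders:

```python
import polars as pl

# scan_csv does not parse the file here; it returns a LazyFrame that
# only records the query plan.
lf = pl.scan_csv("path/to/file.csv")  # placeholder path

# The file is read only when the query is collected, which lets the
# optimizer push the filter and the column selection into the scan.
result = (
    lf.filter(pl.col("some_column") > 0)  # hypothetical column
    .select("some_column")
    .collect()
)
print(result)
```
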
2 changes: 1 addition & 1 deletion docs/source/user-guide/io/parquet.md
Expand Up @@ -20,6 +20,6 @@ Polars allows you to _scan_ a `Parquet` input. Scanning delays the actual parsin

{{code_block('user-guide/io/parquet','scan',['scan_parquet'])}}

- If you want to know why this is desirable, you can read more about those Polars optimizations [here](../concepts/lazy-vs-eager.md).
+ If you want to know why this is desirable, you can read more about those Polars optimizations [here](../concepts/lazy-api.md).

When we scan a `Parquet` file stored in the cloud, we can also apply predicate and projection pushdowns. This can significantly reduce the amount of data that needs to be downloaded. For scanning a Parquet file in the cloud, see [Cloud storage](cloud-storage.md/#scanning-from-cloud-storage-with-query-optimisation).
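
As a rough illustration of the predicate and projection pushdowns mentioned above (the cloud URL and column names are invented; real cloud access may also require `storage_options` credentials):

```python
import polars as pl

# Lazily scan a Parquet file in object storage; nothing is downloaded yet.
lf = pl.scan_parquet("s3://example-bucket/data.parquet")  # placeholder URL

# The filter (predicate pushdown) and the column selection (projection
# pushdown) are applied while reading, so only the matching row groups
# and the requested columns need to be fetched.
result = (
    lf.filter(pl.col("year") == 2023)  # hypothetical column
    .select(["year", "amount"])        # hypothetical columns
    .collect()
)
print(result)
```
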
4 changes: 2 additions & 2 deletions docs/source/user-guide/sql/intro.md
@@ -1,13 +1,13 @@
# Introduction

While Polars supports interaction with SQL, it's recommended that users familiarize themselves with
- the [expression syntax](../concepts/expressions.md) to produce more readable and expressive code. As the DataFrame
+ the [expression syntax](../concepts/expressions-and-contexts.md#expressions) to produce more readable and expressive code. As the DataFrame
interface is primary, new features are typically added to the expression API first. However, if you already have an
existing SQL codebase or prefer the use of SQL, Polars does offer support for this.

!!! note Execution

- There is no separate SQL engine because Polars translates SQL queries into [expressions](../concepts/expressions.md), which are then executed using its own engine. This approach ensures that Polars maintains its performance and scalability advantages as a native DataFrame library, while still providing users with the ability to work with SQL.
+ There is no separate SQL engine because Polars translates SQL queries into [expressions](../concepts/expressions-and-contexts.md#expressions), which are then executed using its own engine. This approach ensures that Polars maintains its performance and scalability advantages as a native DataFrame library, while still providing users with the ability to work with SQL.
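
A minimal sketch of that SQL-to-expressions translation in practice, with an invented frame and table name:

```python
import polars as pl

df = pl.DataFrame({"a": [1, 2, 3], "b": ["x", "y", "z"]})

# Register the DataFrame under a table name and run SQL against it.
# The query is translated into Polars expressions and executed by the
# regular Polars engine.
ctx = pl.SQLContext(my_table=df)
result = ctx.execute("SELECT a, b FROM my_table WHERE a > 1", eager=True)
print(result)
```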

## Context

