From 92bffabe48c6c33a9ec5bc003d8683e59c97158c Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Rodrigo=20Gir=C3=A3o=20Serr=C3=A3o?=
 <5621605+rodrigogiraoserrao@users.noreply.github.com>
Date: Fri, 4 Oct 2024 15:21:17 +0100
Subject: [PATCH] Include fixed links in PR.

---
 docs/source/user-guide/expressions/casting.md | 2 +-
 docs/source/user-guide/io/csv.md              | 2 +-
 docs/source/user-guide/io/parquet.md          | 2 +-
 docs/source/user-guide/sql/intro.md           | 4 ++--
 4 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/source/user-guide/expressions/casting.md b/docs/source/user-guide/expressions/casting.md
index 6deddaecb684e..3fe9b174e9008 100644
--- a/docs/source/user-guide/expressions/casting.md
+++ b/docs/source/user-guide/expressions/casting.md
@@ -1,6 +1,6 @@
 # Casting

-Casting converts the underlying [`DataType`](../concepts/data-types/overview.md) of a column to a new one. Polars uses Arrow to manage the data in memory and relies on the compute kernels in the [Rust implementation](https://github.com/jorgecarleitao/arrow2) to do the conversion. Casting is available with the `cast()` method.
+Casting converts the [underlying `DataType` of a column](../concepts/data-types-and-structures.md) to a new one. Polars uses Arrow to manage the data in memory and relies on the compute kernels in the [Rust implementation](https://github.com/jorgecarleitao/arrow2) to do the conversion. Casting is available with the `cast()` method.

 The `cast` method includes a `strict` parameter that determines how Polars behaves when it encounters a value that can't be converted from the source `DataType` to the target `DataType`. By default, `strict=True`, which means that Polars will throw an error to notify the user of the failed conversion and provide details on the values that couldn't be cast. On the other hand, if `strict=False`, any values that can't be converted to the target `DataType` will be quietly converted to `null`.

diff --git a/docs/source/user-guide/io/csv.md b/docs/source/user-guide/io/csv.md
index 2d7772b45f1f9..f654d970ac817 100644
--- a/docs/source/user-guide/io/csv.md
+++ b/docs/source/user-guide/io/csv.md
@@ -18,4 +18,4 @@ file and instead returns a lazy computation holder called a `LazyFrame`.
 {{code_block('user-guide/io/csv','scan',['scan_csv'])}}

 If you want to know why this is desirable, you can read more about these Polars
-optimizations [here](../concepts/lazy-vs-eager.md).
+optimizations [here](../concepts/lazy-api.md).
diff --git a/docs/source/user-guide/io/parquet.md b/docs/source/user-guide/io/parquet.md
index da35ee96f476e..e04c2bdde2e7e 100644
--- a/docs/source/user-guide/io/parquet.md
+++ b/docs/source/user-guide/io/parquet.md
@@ -20,6 +20,6 @@ Polars allows you to _scan_ a `Parquet` input. Scanning delays the actual parsin

 {{code_block('user-guide/io/parquet','scan',['scan_parquet'])}}

-If you want to know why this is desirable, you can read more about those Polars optimizations [here](../concepts/lazy-vs-eager.md).
+If you want to know why this is desirable, you can read more about those Polars optimizations [here](../concepts/lazy-api.md).

 When we scan a `Parquet` file stored in the cloud, we can also apply predicate and projection pushdowns. This can significantly reduce the amount of data that needs to be downloaded. For scanning a Parquet file in the cloud, see [Cloud storage](cloud-storage.md/#scanning-from-cloud-storage-with-query-optimisation).
diff --git a/docs/source/user-guide/sql/intro.md b/docs/source/user-guide/sql/intro.md
index 08918e4e6404c..0b762f16cdd92 100644
--- a/docs/source/user-guide/sql/intro.md
+++ b/docs/source/user-guide/sql/intro.md
@@ -1,13 +1,13 @@
 # Introduction

 While Polars supports interaction with SQL, it's recommended that users familiarize themselves with
-the [expression syntax](../concepts/expressions.md) to produce more readable and expressive code. As the DataFrame
+the [expression syntax](../concepts/expressions-and-contexts.md#expressions) to produce more readable and expressive code. As the DataFrame
 interface is primary, new features are typically added to the expression API first. However, if you
 already have an existing SQL codebase or prefer the use of SQL, Polars does offers support for this.

 !!! note Execution

-    There is no separate SQL engine because Polars translates SQL queries into [expressions](../concepts/expressions.md), which are then executed using its own engine. This approach ensures that Polars maintains its performance and scalability advantages as a native DataFrame library, while still providing users with the ability to work with SQL.
+    There is no separate SQL engine because Polars translates SQL queries into [expressions](../concepts/expressions-and-contexts.md#expressions), which are then executed using its own engine. This approach ensures that Polars maintains its performance and scalability advantages as a native DataFrame library, while still providing users with the ability to work with SQL.

 ## Context