diff --git a/docs/rdd-programming-guide.md b/docs/rdd-programming-guide.md
index a1adcc2f6eb0..400f8a512e7a 100644
--- a/docs/rdd-programming-guide.md
+++ b/docs/rdd-programming-guide.md
@@ -39,7 +39,7 @@ along with if you launch Spark's interactive shell -- either `bin/spark-shell` f
-Spark {{site.SPARK_VERSION}} works with Python 3.8+. It can use the standard CPython interpreter,
+Spark {{site.SPARK_VERSION}} works with Python 3.9+. It can use the standard CPython interpreter,
 so C libraries like NumPy can be used. It also works with PyPy 7.3.6+. Spark applications in Python
 can either be run with the `bin/spark-submit` script which includes Spark at runtime, or by including
 it in your setup.py as: