From d383efba12c66addb17006dea107bb0421d50bc3 Mon Sep 17 00:00:00 2001
From: 郭小龙 10207633
Date: Fri, 31 Mar 2017 21:57:09 +0800
Subject: [PATCH 1/4] [SPARK-20177] Document about compression way has some little detail changes.

---
 docs/configuration.md | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index a9753925407d7..156997b539e65 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -639,6 +639,7 @@ Apart from these, the following properties are also available, and may be useful
   false
   Whether to compress logged events, if spark.eventLog.enabled is true.
+  Compression will use spark.io.compression.codec.
@@ -773,14 +774,15 @@ Apart from these, the following properties are also available, and may be useful
   true
   Whether to compress broadcast variables before sending them. Generally a good idea.
+  Compression will use spark.io.compression.codec.

 spark.io.compression.codec
 lz4
-  The codec used to compress internal data such as RDD partitions, broadcast variables and
-  shuffle outputs. By default, Spark provides three codecs: lz4, lzf,
+  The codec used to compress internal data such as RDD partitions,event log, broadcast variables
+  and shuffle outputs. By default, Spark provides three codecs: lz4, lzf,
   and snappy. You can also use fully qualified class names to specify the codec,
   e.g.
   org.apache.spark.io.LZ4CompressionCodec,
@@ -881,6 +883,7 @@ Apart from these, the following properties are also available, and may be useful
   StorageLevel.MEMORY_ONLY_SER in Java and Scala or
   StorageLevel.MEMORY_ONLY in Python).
   Can save substantial space at the cost of some extra CPU time.
+  Compression will use spark.io.compression.codec.
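The three properties this patch documents can be sketched as a spark-defaults.conf fragment. The values below are illustrative, not Spark's defaults; the point, per the patched docs, is that all three compression switches reuse the codec chosen by spark.io.compression.codec:

```properties
# Illustrative spark-defaults.conf fragment (values are examples, not defaults).
# Each compression setting below uses the codec from spark.io.compression.codec.
spark.eventLog.enabled       true
spark.eventLog.compress      true
spark.broadcast.compress     true
spark.rdd.compress           true
spark.io.compression.codec   lz4
```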
From 3059013e9d2aec76def14eb314b6761bea0e7ca0 Mon Sep 17 00:00:00 2001
From: 郭小龙 10207633
Date: Sat, 1 Apr 2017 09:38:02 +0800
Subject: [PATCH 2/4] [SPARK-20177] event log add a space

---
 docs/configuration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index 156997b539e65..2687f542b8bd3 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -781,7 +781,7 @@ Apart from these, the following properties are also available, and may be useful
 spark.io.compression.codec
 lz4
-  The codec used to compress internal data such as RDD partitions,event log, broadcast variables
+  The codec used to compress internal data such as RDD partitions, event log, broadcast variables
   and shuffle outputs. By default, Spark provides three codecs: lz4, lzf,
   and snappy. You can also use fully qualified class names to specify the codec,
   e.g.

From 0efb0dd9e404229cce638fe3fb0c966276784df7 Mon Sep 17 00:00:00 2001
From: 郭小龙 10207633
Date: Wed, 5 Apr 2017 11:47:53 +0800
Subject: [PATCH 3/4] [SPARK-20218] '/applications/[app-id]/stages' in REST API, add description.

---
 docs/monitoring.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index 4d0617d253b80..d180d77e2cd4d 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -299,6 +299,7 @@ can be identified by their `[attempt-id]`. In the API listed below, when running
   /applications/[app-id]/stages
   A list of all stages for a given application.
+  ?status=[active|complete|pending|failed] list only stages in the state.

   /applications/[app-id]/stages/[stage-id]

From 0e37fdeee28e31fc97436dabd001d3c85c5a7794 Mon Sep 17 00:00:00 2001
From: 郭小龙 10207633
Date: Wed, 5 Apr 2017 13:22:54 +0800
Subject: [PATCH 4/4] [SPARK-20218] '/applications/[app-id]/stages/[stage-id]' in REST API, remove redundant description.

---
 docs/monitoring.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index d180d77e2cd4d..da954385dc452 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -305,7 +305,6 @@ can be identified by their `[attempt-id]`. In the API listed below, when running
   /applications/[app-id]/stages/[stage-id]
   A list of all attempts for the given stage.
-  ?status=[active|complete|pending|failed] list only stages in the state.
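The `?status=` filter documented in patches 3 and 4 can be sketched as a URL built against a history server. The host, port, and application id below are hypothetical; only the endpoint path and the status values come from the documented API:

```python
# Build the REST URL for listing stages filtered by state.
# localhost:18080 and the app id are assumptions for illustration;
# the /applications/[app-id]/stages path and the status values
# (active|complete|pending|failed) are from the documented endpoint.
base = "http://localhost:18080/api/v1"
app_id = "app-20170405-0001"  # hypothetical application id
status = "pending"            # one of: active, complete, pending, failed

url = f"{base}/applications/{app_id}/stages?status={status}"
print(url)
```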