4 changes: 2 additions & 2 deletions .github/workflows/benchmark.yml
@@ -101,8 +101,8 @@ jobs:
       run: |
         ./build/sbt -Pyarn -Pmesos -Pkubernetes -Phive -Phive-thriftserver -Phadoop-cloud -Pkinesis-asl -Pspark-ganglia-lgpl test:package
         # Make less noisy
-        cp conf/log4j.properties.template conf/log4j.properties
-        sed -i 's/log4j.rootCategory=INFO, console/log4j.rootCategory=WARN, console/g' conf/log4j.properties
+        cp conf/log4j2.properties.template conf/log4j2.properties
+        sed -i 's/rootLogger.level = info/rootLogger.level = warn/g' conf/log4j2.properties
         # In benchmark, we use local as master so set driver memory only. Note that GitHub Actions has 7 GB memory limit.
         bin/spark-submit \
           --driver-memory 6g --class org.apache.spark.benchmark.Benchmarks \
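The `sed` rewrite above can be sanity-checked in isolation; this sketch pipes a single template line through the same substitution the workflow applies to `conf/log4j2.properties`.

```shell
# Feed one Log4j 2 template line through the workflow's substitution;
# the root logger level comes out lowered from info to warn.
printf 'rootLogger.level = info\n' \
  | sed 's/rootLogger.level = info/rootLogger.level = warn/g'
# prints: rootLogger.level = warn
```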
2 changes: 1 addition & 1 deletion conf/log4j2.properties.template
@@ -17,7 +17,7 @@

 # Set everything to be logged to the console
 rootLogger.level = info
-rootLogger.appenderRef.file.ref = console
+rootLogger.appenderRef.stdout.ref = console

 appender.console.type = Console
 appender.console.name = console
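The renamed key is largely cosmetic: in Log4j 2's properties syntax, the token after `appenderRef.` (here `stdout`, previously `file`) is a local label, while the `.ref` value is what must match an appender's `name`. A minimal sketch of the pairing:

```properties
# "stdout" below is only a label; the actual binding is ".ref = console"
# matching "appender.console.name = console".
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
```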
2 changes: 1 addition & 1 deletion docs/configuration.md
@@ -3081,7 +3081,7 @@ Note: When running Spark on YARN in `cluster` mode, environment variables need t

 Spark uses [log4j](http://logging.apache.org/log4j/) for logging. You can configure it by adding a
 `log4j.properties` file in the `conf` directory. One way to start is to copy the existing
-`log4j.properties.template` located there.
+`log4j2.properties.template` located there.

 By default, Spark adds 1 record to the MDC (Mapped Diagnostic Context): `mdc.taskName`, which shows something
 like `task 1.0 in stage 0.0`. You can add `%X{mdc.taskName}` to your patternLayout in
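To make the MDC note concrete, here is a sketch of a pattern line that pulls in the task name; the appender keys follow the shipped template, and the placement of `%X{mdc.taskName}` within the pattern is only illustrative.

```properties
# %X{mdc.taskName} expands to something like "task 1.0 in stage 0.0"
# when logging happens inside a task; it is empty otherwise.
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %X{mdc.taskName} %c: %m%n
```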
2 changes: 1 addition & 1 deletion external/kafka-0-10-assembly/pom.xml
@@ -165,7 +165,7 @@
             <resource>reference.conf</resource>
           </transformer>
           <transformer implementation="org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer">
-            <resource>log4j.properties</resource>
+            <resource>log4j2.properties</resource>
           </transformer>
           <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer"/>
           <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheNoticeResourceTransformer"/>
2 changes: 1 addition & 1 deletion external/kinesis-asl-assembly/pom.xml
@@ -205,7 +205,7 @@
             <resource>reference.conf</resource>
           </transformer>
           <transformer implementation="org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer">
-            <resource>log4j.properties</resource>
+            <resource>log4j2.properties</resource>
           </transformer>
           <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer"/>
           <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheNoticeResourceTransformer"/>
2 changes: 1 addition & 1 deletion project/SparkBuild.scala
@@ -871,7 +871,7 @@ object Assembly {
         => MergeStrategy.discard
       case m if m.toLowerCase(Locale.ROOT).matches("meta-inf.*\\.sf$")
         => MergeStrategy.discard
-      case "log4j.properties" => MergeStrategy.discard
+      case "log4j2.properties" => MergeStrategy.discard
       case m if m.toLowerCase(Locale.ROOT).startsWith("meta-inf/services/")
         => MergeStrategy.filterDistinctLines
       case "reference.conf" => MergeStrategy.concat
2 changes: 1 addition & 1 deletion resource-managers/kubernetes/integration-tests/pom.xml
@@ -145,7 +145,7 @@
           <argLine>-ea -Xmx4g -XX:ReservedCodeCacheSize=1g ${extraScalaTestArgs}</argLine>
           <stderr/>
           <systemProperties>
-            <log4j.configuration>file:src/test/resources/log4j.properties</log4j.configuration>
+            <log4j.configurationFile>file:src/test/resources/log4j2.properties</log4j.configurationFile>
             <java.awt.headless>true</java.awt.headless>
             <spark.kubernetes.test.imageTagFile>${spark.kubernetes.test.imageTagFile}</spark.kubernetes.test.imageTagFile>
             <spark.kubernetes.test.unpackSparkDir>${spark.kubernetes.test.unpackSparkDir}</spark.kubernetes.test.unpackSparkDir>
@@ -16,8 +16,11 @@
 #

 # This log4j config file is for integration test SparkConfPropagateSuite.
-log4j.rootCategory=DEBUG, console
-log4j.appender.console=org.apache.log4j.ConsoleAppender
-log4j.appender.console.target=System.err
-log4j.appender.console.layout=org.apache.log4j.PatternLayout
-log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
+rootLogger.level = debug
+rootLogger.appenderRef.stdout.ref = console
+
+appender.console.type = Console
+appender.console.name = console
+appender.console.target = SYSTEM_ERR
+appender.console.layout.type = PatternLayout
+appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c: %m%n
@@ -37,8 +37,8 @@ private[spark] trait SparkConfPropagateSuite { k8sSuite: KubernetesSuite =>
     try {
       Files.write(new File(logConfFilePath).toPath, content.getBytes)

-      sparkAppConf.set("spark.driver.extraJavaOptions", "-Dlog4j.debug")
-      sparkAppConf.set("spark.executor.extraJavaOptions", "-Dlog4j.debug")
+      sparkAppConf.set("spark.driver.extraJavaOptions", "-Dlog4j2.debug")
+      sparkAppConf.set("spark.executor.extraJavaOptions", "-Dlog4j2.debug")
       sparkAppConf.set("spark.kubernetes.executor.deleteOnTermination", "false")

       val log4jExpectedLog =
@@ -47,15 +47,21 @@ abstract class BaseYarnClusterSuite
   // log4j configuration for the YARN containers, so that their output is collected
   // by YARN instead of trying to overwrite unit-tests.log.
   protected val LOG4J_CONF = """
-    |log4j.rootCategory=DEBUG, console
-    |log4j.appender.console=org.apache.log4j.ConsoleAppender
-    |log4j.appender.console.target=System.err
-    |log4j.appender.console.layout=org.apache.log4j.PatternLayout
-    |log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
-    |log4j.logger.org.apache.hadoop=WARN
-    |log4j.logger.org.eclipse.jetty=WARN
-    |log4j.logger.org.mortbay=WARN
-    |log4j.logger.org.sparkproject.jetty=WARN
+    |rootLogger.level = debug
+    |rootLogger.appenderRef.stdout.ref = console
+    |appender.console.type = Console
+    |appender.console.name = console
+    |appender.console.target = SYSTEM_ERR
+    |appender.console.layout.type = PatternLayout
+    |appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
+    |logger.jetty.name = org.sparkproject.jetty
+    |logger.jetty.level = warn
+    |logger.eclipse.name = org.eclipse.jetty
+    |logger.eclipse.level = warn
+    |logger.hadoop.name = org.apache.hadoop
+    |logger.hadoop.level = warn
+    |logger.mortbay.name = org.mortbay
+    |logger.mortbay.level = warn
     """.stripMargin

   private var yarnCluster: MiniYARNCluster = _
@@ -295,10 +295,12 @@ class YarnClusterSuite extends BaseYarnClusterSuite {
     val log4jConf = new File(tempDir, "log4j.properties")
     val logOutFile = new File(tempDir, "logs")
     Files.write(
-      s"""log4j.rootCategory=DEBUG,file
-         |log4j.appender.file=org.apache.log4j.FileAppender
-         |log4j.appender.file.file=$logOutFile
-         |log4j.appender.file.layout=org.apache.log4j.PatternLayout
+      s"""rootLogger.level = debug
+         |rootLogger.appenderRef.file.ref = file
+         |appender.file.type = File
+         |appender.file.name = file
+         |appender.file.fileName = $logOutFile
+         |appender.file.layout.type = PatternLayout
          |""".stripMargin,
       log4jConf, StandardCharsets.UTF_8)
     // Since this test is trying to extract log output from the SparkSubmit process itself,
@@ -307,7 +309,8 @@ class YarnClusterSuite extends BaseYarnClusterSuite {
     val confDir = new File(tempDir, "conf")
     confDir.mkdir()
     val javaOptsFile = new File(confDir, "java-opts")
-    Files.write(s"-Dlog4j.configuration=file://$log4jConf\n", javaOptsFile, StandardCharsets.UTF_8)
+    Files.write(s"-Dlog4j.configurationFile=file://$log4jConf\n", javaOptsFile,
+      StandardCharsets.UTF_8)

     val result = File.createTempFile("result", null, tempDir)
     val finalState = runSpark(clientMode = false,
@@ -1220,11 +1220,13 @@ abstract class HiveThriftServer2TestBase extends SparkFunSuite with BeforeAndAft
       val tempLog4jConf = Utils.createTempDir().getCanonicalPath

       Files.write(
-        """log4j.rootCategory=INFO, console
-          |log4j.appender.console=org.apache.log4j.ConsoleAppender
-          |log4j.appender.console.target=System.err
-          |log4j.appender.console.layout=org.apache.log4j.PatternLayout
-          |log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
+        """rootLogger.level = info
+          |rootLogger.appenderRef.file.ref = console
+          |appender.console.type = Console
+          |appender.console.name = console
+          |appender.console.target = SYSTEM_ERR
+          |appender.console.layout.type = PatternLayout
+          |appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
         """.stripMargin,
         new File(s"$tempLog4jConf/log4j.properties"),
Inline comment thread on the line above:

@LuciferYang (Contributor), Dec 21, 2021:
change new File(s"$tempLog4jConf/log4j.properties") to new File(s"$tempLog4jConf/log4j2.properties"), then HiveThriftBinaryServerSuite no longer hangs.

Contributor:
However, it seems that all logs are printed to the console, which is different from the previous behavior.

Author (Member):
SBT cannot reproduce it. But we should use log4j2.properties.

Author (Member):
I corrected this together in #34965.
StandardCharsets.UTF_8)