
After trying every approach shown in this question: How to suppress parquet log messages in Spark?, none works with Spark 2.1 - except the blunt-instrument approach of disabling all logging below the WARN level:

log4j.rootCategory=WARN, console

That is not an acceptable approach (our app writes INFO messages for a reason).

Note that the first approach I took was to add

log4j.logger.parquet=ERROR
log4j.logger.org.apache.spark.sql.execution.datasources.parquet=ERROR
log4j.logger.org.apache.spark.sql.execution.datasources.FileScanRDD=ERROR
log4j.logger.org.apache.hadoop.io.compress.CodecPool=ERROR

to the log4j.properties. These had no effect. The other approach I tried, using java.util.logging, is also included in my attempts:

org.apache.parquet.handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=SEVERE

with the following added to the JVM options:

 -Dspark.driver.extraJavaOptions="-Djava.util.logging.config.file=/tmp/parquet.logging.properties"
 -Dspark.executor.extraJavaOptions="-Djava.util.logging.config.file=/tmp/parquet.logging.properties"

Likewise no change.
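For completeness, these options are more conventionally passed to spark-submit via --conf rather than as -D system properties; a sketch of the equivalent invocation (the properties-file path is taken from above, the application jar is a placeholder):

    spark-submit \
      --conf "spark.driver.extraJavaOptions=-Djava.util.logging.config.file=/tmp/parquet.logging.properties" \
      --conf "spark.executor.extraJavaOptions=-Djava.util.logging.config.file=/tmp/parquet.logging.properties" \
      my-app.jar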

If anyone has found a magic "quiet down, Parquet" button, please chime in.

WestCoastProjects

1 Answer


Add:

log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

to your log4j.properties file.
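For context, a minimal log4j.properties sketch showing where these lines sit relative to the root logger, so app-level INFO logging is preserved (the console appender configuration here is an assumption based on the appender name in the question; adjust to your existing file):

    # Keep our own INFO messages
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Silence Parquet only; both logger names are needed because
    # different Parquet versions log under different names
    log4j.logger.org.apache.parquet=ERROR
    log4j.logger.parquet=ERROR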

wllmtrng