Spark – Stop INFO & DEBUG message logging to console?

Problem: When I run a Spark or PySpark program on a cluster or on my local machine, I see a lot of DEBUG and INFO messages in the console. How do I stop/disable/turn off INFO and DEBUG message logging to the Spark console?

Solution: By default, the Spark log level is set to INFO, hence when you run a Spark or PySpark application locally or on a cluster you see a lot of Spark INFO messages in the console or in a log file.

With default INFO logging, you will see Spark logging messages like the ones below.


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/03/29 17:38:24 INFO SparkContext: Running Spark version 2.4.4
20/03/29 17:38:24 INFO SparkContext: Submitted application: SparkByExamples.com
20/03/29 17:38:24 INFO SecurityManager: Changing view acls to: nnk
20/03/29 17:38:24 INFO SecurityManager: Changing modify acls to: nnk
20/03/29 17:38:24 INFO SecurityManager: Changing view acls groups to: 
20/03/29 17:38:24 INFO SecurityManager: Changing modify acls groups to: 
---
---
---
20/03/29 17:38:25 INFO BlockManagerMasterEndpoint: Registering block manager DELL-ESUHAO2KAJ:63708 with 1989.6 MB RAM, BlockManagerId(driver, DELL-ESUHAO2KAJ, 63708, None)
20/03/29 17:38:25 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DELL-ESUHAO2KAJ, 63708, None)
20/03/29 17:38:25 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, DELL-ESUHAO2KAJ, 63708, None)
20/03/29 17:38:28 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/apps/sparkbyexamples/src/spark-scala-examples-new/spark-warehouse/').
20/03/29 17:38:28 INFO SharedState: Warehouse path is 'file:/C:/apps/sparkbyexamples/src/spark-scala-examples-new/spark-warehouse/'.
20/03/29 17:38:28 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
20/03/29 17:38:29 INFO CodeGenerator: Code generated in 305.738284 ms
20/03/29 17:38:29 INFO CodeGenerator: Code generated in 16.662746 ms
20/03/29 17:38:29 INFO CodeGenerator: Code generated in 14.111423 ms

In DEV and QA environments it is okay to keep the log4j log level at INFO or DEBUG. But for UAT, live, or production applications we should change the log level to WARN or ERROR, as we do not want verbose logging in these environments.
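One way to change the level for the whole installation is through Spark's log4j configuration file. Here is a minimal sketch, assuming Spark 2.x with log4j 1.x (as the log4j-defaults.properties line in the output above indicates): copy conf/log4j.properties.template to conf/log4j.properties and raise the root category.

  # conf/log4j.properties (copied from conf/log4j.properties.template)
  # Raise the root logger from INFO to WARN so INFO and DEBUG messages are suppressed
  log4j.rootCategory=WARN, console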

Now, let's see how to stop/disable/turn off logging of DEBUG and INFO messages to the console or to a log file from within the application.

Using the sparkContext.setLogLevel() method, you can change the log level to the desired level. Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.

In order to stop DEBUG and INFO messages, change the log level to WARN, ERROR, or FATAL. For example, below it is changed to ERROR.


  import org.apache.spark.sql.SparkSession

  val spark: SparkSession = SparkSession.builder()
    .master("local[1]")
    .appName("SparkByExamples.com")
    .getOrCreate()

  // Change the log level to ERROR; DEBUG and INFO messages are no longer logged
  spark.sparkContext.setLogLevel("ERROR")

With the last statement from the above example, DEBUG and INFO messages are stopped/disabled on the console, and you will see only ERROR messages along with the output of println() and of DataFrame methods such as show() and printSchema(). Note that setLogLevel() takes effect only after the SparkSession has been created, so startup messages logged before that point (like the ones shown above) still appear; to suppress those as well, use the log4j.properties approach described earlier.
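Alternatively, if you only want to silence specific noisy packages rather than raising the level globally, you can set levels on individual loggers through the log4j 1.x API. This is a sketch under the same Spark 2.x assumption; the package names are just examples.

  import org.apache.log4j.{Level, Logger}

  // Raise the level only for Spark's and Hadoop's own loggers,
  // leaving the application's logging untouched
  Logger.getLogger("org.apache.spark").setLevel(Level.ERROR)
  Logger.getLogger("org.apache.hadoop").setLevel(Level.ERROR)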

Happy learning !!

Naveen Nelamali

Naveen Nelamali (NNK) is a Data Engineer with 20+ years of experience in transforming data into actionable insights. Over the years, he has honed his expertise in designing, implementing, and maintaining data pipelines with frameworks like Apache Spark, PySpark, Pandas, R, Hive, and Machine Learning. Naveen's journey in the field of data engineering has been one of continuous learning, innovation, and a strong commitment to data integrity. In this blog, he shares his experiences with data as he comes across them. Follow Naveen @ LinkedIn and Medium

This Post Has 3 Comments

  1. Jon

    What if getOrCreate() is outputting warnings we don’t want to see?

  2. Brad

    Awesome Reference. I was able to create my spark session and setLogLevel to ‘Warn’

    def create_spark_session():
        spark = SparkSession \
            .builder \
            .config("spark.jars.packages", "org.apache.hadoop:hadoop-aws:2.7.0") \
            .getOrCreate()
        spark.sparkContext.setLogLevel('WARN')
        return spark

  3. Bogdan

    Excellent, and thank you very much not only for this but also for the other useful information on this page.
