How to Check Spark Version


We often need to check which version of Apache Spark is installed in our environment. Depending on the OS (Mac, Linux, Windows, CentOS), Spark installs in different locations, which can make it challenging to find the installed version.

In this article, I will quickly cover different ways to check the installed Spark version from the command line and at runtime. You can use the options explained here to find the Spark version when you are using Hadoop (CDH), AWS Glue, Anaconda, Jupyter Notebook, etc.

1. Spark Version Check from Command Line

Like most other tools and languages, Spark supports a version flag: use the --version option with spark-submit, spark-shell, and spark-sql to find the version.


spark-submit --version
spark-shell --version
spark-sql --version

The spark-submit, spark-shell, and spark-sql commands above all return output similar to the following, where you can find the installed Spark version.

[Image: output of spark-submit --version showing the Spark version]

As you can see, it displays the Spark version along with the Scala version (2.12.10) and the Java version. I am using OpenJDK, hence it shows the Java version as OpenJDK 64-Bit Server VM, 11.0.13.
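If you need just the version number in a shell script, you can extract it from this banner. This is a minimal sketch; it assumes the banner contains a line of the form "version x.y.z" (the exact format may vary between Spark releases, and the banner is typically written to stderr, hence the redirect):

spark-submit --version 2>&1 | grep -o 'version [0-9][0-9.]*' | head -1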

2. Version Check From Spark Shell

Additionally, if you are in spark-shell and want to find out the Spark version without exiting the shell, you can use sc.version. sc is a SparkContext variable that exists by default in spark-shell. Use the below steps to find the Spark version.

  1. cd to $SPARK_HOME/bin
  2. Launch the spark-shell command
  3. Enter sc.version or spark.version
[Image: spark-shell displaying the version with sc.version]

sc.version returns the version as a String. When you use spark.version from the shell, it returns the same output.
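For example, launching spark-shell and typing either command at the prompt looks like this (the 3.2.1 shown here is just a sample value; your output will reflect your installation):

scala> sc.version
res0: String = 3.2.1

scala> spark.version
res1: String = 3.2.1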

3. Find Version from IntelliJ or any IDE

Imagine you are writing a Spark application and want to find the Spark version at runtime; you can get it by accessing the version property of the SparkSession object, which returns a String.


import org.apache.spark.sql.SparkSession

// Create a SparkSession (or reuse an existing one)
val spark = SparkSession.builder()
      .master("local[1]")
      .appName("SparkByExamples.com")
      .getOrCreate()

// Both statements print the Spark version as a String
println("Apache Spark Version :" + spark.version)
println("Apache Spark Version :" + spark.sparkContext.version)
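
The returned String can also drive conditional logic in your application. Below is a minimal sketch, assuming the version follows the usual major.minor.patch format:

// Extract the major version to branch on Spark behavior at runtime
val majorVersion = spark.version.split("\\.")(0).toInt
if (majorVersion >= 3) println("Running Spark 3.x or later")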

In this simple article, you have learned how to find the Spark version from the command line, from spark-shell, and at runtime. You can use these options with Hadoop (CDH), AWS Glue, Anaconda, Jupyter Notebook, etc.

Happy Learning !!
