When submitting a Spark or PySpark application using spark-submit, you often need to include multiple third-party jars on the classpath. Spark supports several ways to add dependency jars to the classpath.
1. Creating uber or assembly jar
Create an assembly (uber) jar that includes your application classes and all third-party dependencies. You can do this using the Maven Shade plugin or the equivalent sbt-assembly plugin; for PySpark, create a zip or egg file.
By doing this, you don't have to worry about adding jars to the classpath, as all dependencies are already part of your uber jar.
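For example, a minimal sketch of the build commands, assuming the Maven Shade plugin or sbt-assembly is already configured in your project (the PySpark module name below is just a placeholder):
# Maven: produces an uber jar when the Shade plugin is configured in pom.xml
mvn clean package
# sbt: produces an assembly jar when the sbt-assembly plugin is enabled
sbt assembly
# PySpark: package your Python modules into a zip to pass with --py-files
zip -r dependencies.zip mypackage/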
2. Adding individual jars to a classpath
You can add multiple third-party jars to the classpath using spark-submit options, the spark-defaults.conf file, and SparkConf properties. Before using these options, you need to understand their precedence. Below is the order in which they apply.
- Properties set directly on the SparkConf take the highest precedence.
- The second precedence goes to spark-submit options.
- Finally, properties specified in the spark-defaults.conf file take the lowest precedence.
When you set jars in different places, remember this precedence. Use spark-submit with the --verbose option to get more details about which jars Spark has used.
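For example, re-running a submit command with --verbose prints the resolved jar list (jar paths are placeholders):
spark-submit --verbose \
--jars /path/first.jar,/path/second.jar \
your-application.jar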
2.1 Adding jars to the classpath
You can also add jars using the spark-submit option --jars. With this option you can add a single jar or multiple jars separated by commas.
spark-submit --master yarn \
--class com.sparkbyexamples.WordCountExample \
--jars /path/first.jar,/path/second.jar,/path/third.jar \
your-application.jar
Alternatively, you can also use SparkContext.addJar() to add a jar at runtime.
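addJar() is a method on the Scala/Java SparkContext; PySpark does not expose it directly, so one workaround is to call it through the underlying JavaSparkContext. A minimal sketch (the _jsc attribute is an internal bridge, and the jar path is a placeholder):
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("SparkByExamples.com").getOrCreate()
# addJar() lives on the JVM SparkContext; _jsc bridges to it from PySpark
spark.sparkContext._jsc.addJar("/path/first.jar")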
2.2 Adding all jars from a folder to classpath
If you have many jars, imagine listing all of them in a comma-separated string; when you have to update the jar versions, maintaining that list becomes a nightmare.
You can use the below snippet to add all jars from a folder automatically. The $(echo /path/*.jar | tr ' ' ',') expression creates a comma-separated string from all the jar names in the folder.
spark-submit --class com.sparkbyexamples.WordCountExample \
--jars $(echo /path/*.jar | tr ' ' ',') \
your-application.jar
2.3 Adding jars with spark-defaults.conf
You can also specify jars in $SPARK_HOME/conf/spark-defaults.conf, but this is not the preferred option, and any libraries you specify here take the lowest precedence.
#Add jars to driver classpath
spark.driver.extraClassPath /path/first.jar:/path/second.jar
#Add jars to executor classpath
spark.executor.extraClassPath /path/first.jar:/path/second.jar
On Windows, the jar file paths should be separated with a semicolon (;) instead of a colon (:)
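For example, the same properties on Windows (drive and folder names are placeholders):
spark.driver.extraClassPath C:\spark\jars\first.jar;C:\spark\jars\second.jar
spark.executor.extraClassPath C:\spark\jars\first.jar;C:\spark\jars\second.jar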
2.4 Using SparkConf properties
This takes the highest priority among the other configs.
spark = SparkSession \
.builder \
.appName("SparkByExamples.com") \
.config("spark.yarn.dist.jars", "/path/first.jar,/path/second.jar") \
.getOrCreate()
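If you are not running on YARN, the generic spark.jars property can be set the same way (jar paths are placeholders):
spark = SparkSession \
.builder \
.appName("SparkByExamples.com") \
.config("spark.jars", "/path/first.jar,/path/second.jar") \
.getOrCreate()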
3. Adding jars to Spark Driver
Sometimes you may need to add a jar only to the Spark driver; you can do this by using --driver-class-path or --conf spark.driver.extraClassPath
spark-submit --class com.sparkbyexamples.WordCountExample \
--jars $(echo /path/jars/*.jar | tr ' ' ',') \
--driver-class-path jar-driver.jar \
your-application.jar
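Equivalently, a sketch using the spark.driver.extraClassPath property instead of the option (the jar name is a placeholder):
spark-submit --class com.sparkbyexamples.WordCountExample \
--conf spark.driver.extraClassPath=jar-driver.jar \
your-application.jar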
4. Adding jars to spark-shell
Options on spark-shell are similar to spark-submit, hence you can use the options specified above to add one or multiple jars to the spark-shell classpath.
spark-shell --driver-class-path /path/to/example.jar:/path/to/another.jar
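You can also use --jars with spark-shell, the same as with spark-submit (jar paths are placeholders):
spark-shell --jars /path/first.jar,/path/second.jar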
5. Other options
You can also set an extra library path for the driver JVM (used to locate native libraries), either with a config property or the equivalent spark-submit option:
--conf spark.driver.extraLibraryPath=/path/
# or use below, both do the same
--driver-library-path /path/
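For example (the library directory is a placeholder):
spark-submit --class com.sparkbyexamples.WordCountExample \
--driver-library-path /path/to/native/libs \
your-application.jar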
Happy Learning !!
In YARN mode, it is important that the Spark jar files are available throughout the Spark cluster. I have spent a fair bit of time on this, and I recommend that you follow this procedure to make sure the spark-submit job runs OK. Use the spark.yarn.archive configuration option and set it to the location of an archive (which you create on HDFS) containing all the JARs from the $SPARK_HOME/jars/ folder, at the root level of the archive. For example:
1) Create the archive:
jar cv0f spark-libs.jar -C $SPARK_HOME/jars/ .
2) Create a directory on HDFS for the jars, accessible to the application:
hdfs dfs -mkdir /jars
3) Upload to HDFS:
hdfs dfs -put spark-libs.jar /jars
4) For a large cluster, increase the replication count of the Spark archive so that you reduce the number of times a NodeManager performs a remote copy:
hdfs dfs -setrep -w 10 hdfs:///jars/spark-libs.jar
(Change the number of replicas in proportion to the total number of NodeManagers.)
5) In the $SPARK_HOME/conf/spark-defaults.conf file, set spark.yarn.archive to hdfs://rhes75:9000/jars/spark-libs.jar, similar to below:
spark.yarn.archive=hdfs://rhes75:9000/jars/spark-libs.jar
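Alternatively, a sketch of passing the same setting directly on spark-submit instead of spark-defaults.conf (using the example host and port from above):
spark-submit --master yarn \
--conf spark.yarn.archive=hdfs://rhes75:9000/jars/spark-libs.jar \
--class com.sparkbyexamples.WordCountExample \
your-application.jar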