SOLVED: Can’t assign requested address: Service ‘sparkDriver’

How do you resolve the java.net.BindException: Can’t assign requested address: Service ‘sparkDriver’ error while running a Spark or PySpark application?


The full error message is “Exception in thread “main” java.net.BindException: Can’t assign requested address: Service ‘sparkDriver’ failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service ‘sparkDriver’ (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.”

Solutions to the Can’t assign requested address: Service ‘sparkDriver’ error

There are multiple ways to resolve the java.net.BindException: Can’t assign requested address: Service ‘sparkDriver’ error. Sometimes the connection is interrupted by a VPN; disconnecting the VPN and trying again may work.

However, disconnecting the VPN takes you off your company network, so this is not the right long-term solution.

1. Add an Environment Variable

You can resolve this error by exporting the environment variable SPARK_LOCAL_IP with the value 127.0.0.1 before starting your application.


# Add an environment variable
export SPARK_LOCAL_IP="127.0.0.1"

Alternatively, you can set this environment variable in the load-spark-env.sh file located in the Spark bin directory.
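
If you launch PySpark from a Python script, you can apply the same idea from within the script by setting the variable before the SparkSession (and its underlying JVM) is created. This is a minimal sketch based on that assumption, not code from the original article:


# Set SPARK_LOCAL_IP from a PySpark script (sketch; must run before the SparkSession starts)
import os
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("SparkByExamples.com") \
    .master("local[*]") \
    .getOrCreate()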

2. Set with SparkConf

You can also set this while creating the SparkSession. This is not the preferred option, since it hard-codes the address into your application, but it is covered here because it does solve the problem.


// Set with SparkConf
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder()
    .appName("SparkByExamples.com")
    .master("local[*]")
    .config("spark.driver.bindAddress", "127.0.0.1")
    .getOrCreate()
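
For PySpark, the equivalent is a minimal sketch along the same lines, assuming the standard pyspark package:


# Set spark.driver.bindAddress with the PySpark builder
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("SparkByExamples.com") \
    .master("local[*]") \
    .config("spark.driver.bindAddress", "127.0.0.1") \
    .getOrCreate()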

3. Using hostname on the Server

Running the command below also resolves the problem.


# Using hostname on the server
sudo hostname -s 127.0.0.1
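
This helps because the driver tries to bind to the address that the machine’s hostname resolves to; when that resolution points to a stale or unreachable address, binding fails. As a quick diagnostic (an illustrative addition, not part of the original fix), you can check what your hostname currently resolves to:


# Check what the local hostname resolves to (diagnostic only)
import socket

hostname = socket.gethostname()
print("hostname:", hostname)
print("resolves to:", socket.gethostbyname(hostname))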

Conclusion

In this article, you have learned how to resolve the java.net.BindException: Can’t assign requested address: Service ‘sparkDriver’ error in Spark and PySpark. Hopefully, one of the solutions explained here worked for you.
