Spark | Hadoop – Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

When you run a Spark program on the Windows OS, you often get the exception "Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z".

Full stack trace of the error:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

Solution: To run Spark programs on Windows, you need the Hadoop winutils.exe file. Hadoop does not ship its native I/O binaries for Windows, and winutils (together with hadoop.dll) provides the native wrapper that Hadoop's file operations depend on.

If you don't have winutils.exe installed, download the winutils.exe and hadoop.dll files for the Hadoop version you are using, as winutils binaries are specific to Hadoop versions.

Copy the downloaded files to a folder on your system; for example, if you copied them to c:/hadoop/bin, set the environment variable below.

set HADOOP_HOME=c:/hadoop
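Besides HADOOP_HOME, the JVM also has to be able to locate hadoop.dll at runtime, so it is common to append the bin folder to PATH as well. A sketch of the commands, assuming the example c:\hadoop folder from above (`set` affects only the current prompt; `setx` persists the value for new command prompts):

```
set HADOOP_HOME=c:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%

:: Optionally persist it for future sessions (takes effect in new prompts only):
setx HADOOP_HOME "c:\hadoop"
```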

Close and reopen the command line or terminal to initialize these variables. Sometimes you may also need to put the hadoop.dll file into the C:/Windows/System32 folder.
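If you prefer not to change system-wide settings, you can also set these variables from Python before PySpark starts the JVM, since the JVM inherits the environment of the process that launches it. A minimal sketch, assuming the example c:/hadoop location from above (the SparkSession lines are commented out because they require a local PySpark install):

```python
import os

# Point Hadoop at the folder whose bin/ contains winutils.exe and hadoop.dll.
# c:/hadoop is the example path used above -- adjust for your machine.
os.environ["HADOOP_HOME"] = "c:/hadoop"

# hadoop.dll must be on PATH (or in System32) for the JVM to load it.
hadoop_bin = os.path.join(os.environ["HADOOP_HOME"], "bin")
os.environ["PATH"] = hadoop_bin + os.pathsep + os.environ.get("PATH", "")

# Import pyspark only AFTER the environment is set, so the JVM picks it up:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.master("local[*]").appName("winutils-check").getOrCreate()
```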

Now run your Spark program, and the "Windows.access0(Ljava/lang/String;I)Z" issue should disappear.
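Before running Spark, you can also confirm the binaries are in place by invoking winutils directly; if the setup is correct, `winutils.exe ls` prints a permissions listing instead of a load error (paths follow the example above):

```
%HADOOP_HOME%\bin\winutils.exe ls C:\
```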

However, if it still doesn't work, try restarting your system, as some of the above settings take effect only after a restart.

Happy Learning !!

Naveen (NNK)

I am Naveen (NNK), working as a Principal Engineer. I am a seasoned Apache Spark Engineer with a passion for harnessing the power of big data and distributed computing to drive innovation and deliver data-driven insights. I love designing, optimizing, and managing Apache Spark-based solutions that transform raw data into actionable intelligence. I am also passionate about sharing my knowledge of Apache Spark, Hive, PySpark, R, etc.


This Post Has 12 Comments

  1. diassy_devops

Thank you very much, the method works well.

    1. NNK

      Thank you.

  2. Elena

    Thank you!

  3. Anonymous

    copying dll file to Windows\System32 folder helped me!

    1. NNK

      Glad it worked for you. Thanks for sharing.

  4. Raghu

    Thank you. It worked for me on Windows.

  5. Wing

Thanks!! Previously I had only winutils.exe downloaded and was still getting the error. Adding the hadoop.dll helped 🙂

  6. Aditya

    This solution worked fine.
    Thank you!

  7. Corey

Does this mean Hadoop needs to be installed on Windows? Or, as long as I have installed PySpark and the dll/exe, would I be good?

    1. NNK

Technically you don't need Hadoop to run PySpark on Windows. You just need the winutils dll/exe files in the right path.

  8. Anonymous

    For me winutils.exe + hadoop.dll worked, thank you!

  9. Anonymous

    A life saver