Spark | Hadoop – Exception in thread “main” java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z


When you run a Spark program on the Windows OS, you often get the exception “Exception in thread “main” java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z”.

Full stack trace of the error:


Exception in thread “main” java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

Solution: In order to run Spark programs on Windows, you need the Hadoop winutils.exe file, because Windows does not natively support the HDFS file operations Hadoop expects, and winutils provides a wrapper around them.

If you don’t have winutils.exe installed, download the winutils.exe and hadoop.dll files from https://github.com/steveloughran/winutils (select the Hadoop version you are using, as winutils builds are specific to Hadoop versions).

Copy the downloaded files to a folder on your system. For example, if you have copied these files to c:\hadoop\bin, set the environment variables below.


set HADOOP_HOME=c:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

Close and reopen the command line or terminal so these variables take effect. Sometimes you may also need to copy the hadoop.dll file into the C:\Windows\System32 folder.
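If you launch PySpark from a script or notebook, you can also set these variables programmatically before the Spark JVM starts, which avoids reopening the terminal. This is a minimal sketch assuming you copied the files to c:\hadoop\bin as above; the SparkSession lines are left commented since they require PySpark to be installed.

```python
import os

# Point Hadoop's native IO at the winutils folder before the JVM starts.
# The c:\hadoop path is the example location used above; adjust as needed.
os.environ["HADOOP_HOME"] = r"c:\hadoop"
os.environ["PATH"] = os.environ["HADOOP_HOME"] + r"\bin;" + os.environ["PATH"]

# With the variables in place, start Spark as usual, e.g.:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.master("local[*]").appName("demo").getOrCreate()
```

Note that this only works if the variables are set before the first Spark call in the process, because the JVM reads them at startup.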

Now run your Spark program; the “Windows.access0(Ljava/lang/String;I)Z” error should disappear.

However, if it still doesn’t work, try restarting your system, as some of the above settings take effect only after a restart.
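Before restarting, a quick way to confirm that the files and variables from the steps above are in place is a small stdlib-only check like the one below. This is a sketch, not part of Spark or Hadoop; the function name check_winutils_setup is our own.

```python
import os
from pathlib import Path

def check_winutils_setup(hadoop_home: str) -> list[str]:
    """Return a list of problems found with the winutils installation."""
    problems = []
    bin_dir = Path(hadoop_home) / "bin"
    if not bin_dir.is_dir():
        # Without the bin folder there is nothing else to check.
        problems.append(f"missing folder: {bin_dir}")
        return problems
    # Both files are needed, per the comments on this post.
    for name in ("winutils.exe", "hadoop.dll"):
        if not (bin_dir / name).is_file():
            problems.append(f"missing file: {bin_dir / name}")
    # HADOOP_HOME must point at the parent of bin, not at bin itself.
    if os.environ.get("HADOOP_HOME", "") != str(Path(hadoop_home)):
        problems.append(f"HADOOP_HOME is not set to {hadoop_home}")
    return problems

if __name__ == "__main__":
    issues = check_winutils_setup(r"c:\hadoop")
    if issues:
        print("Setup problems:", *issues, sep="\n  ")
    else:
        print("winutils setup looks OK")
```

If the script reports no problems but the error persists, the remaining suspects are a winutils build for the wrong Hadoop version or a missing hadoop.dll in C:\Windows\System32.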

Happy Learning !!

NNK

SparkByExamples.com is a Big Data and Spark examples community page; all examples are simple, easy to understand, and well tested in our development environment.


This Post Has 11 Comments

  1. ABHISHEK RAJBHANU

    For me it was winutils.exe + hadoop.dll. Thank you!

  2. Corey

    Does this mean Hadoop needs to be installed on Windows? Or, as long as I have installed PySpark and the dll/exe, would I be good?

    1. NNK

      Technically you don’t need Hadoop to run PySpark on Windows; you just need the winutils dll/exe files in the right path.

  3. Aditya

    This solution worked fine.
    Thank you!

  4. Wing

    Thanks!! Previously I just had winutils.exe downloaded and was still getting the error. Adding the hadoop.dll helped 🙂

  5. Raghu

    Thank you. It worked for me on Windows.

  6. Anonymous

    Copying the dll file to the Windows\System32 folder helped me!

    1. NNK

      Glad it worked for you. Thanks for sharing.

  7. Elena

    Thank you!

  8. diassy_devops

    Thank you very much, the method works well.

    1. NNK

      Thank you.