In this article, I will explain how to run the Apache Spark application examples covered in this blog on Windows using Scala and Maven from IntelliJ IDEA. Since the articles in this tutorial use Apache Maven as the build system, we will use Maven to build the project.
Make sure you have the following before you proceed.
1. Clone Spark Examples GitHub Project into IntelliJ
Let’s clone the Spark By Examples GitHub project into IntelliJ by using the Version Control option.
- Open IntelliJ IDEA
- Create a new project by selecting File > New > Project from Version Control.

Using this option, we import the project directly from the GitHub repository.

- In the Get from Version Control window, set Version control to Git, enter the GitHub URL below in the URL field, and choose the directory where you want to clone the project.
https://github.com/spark-examples/spark-scala-examples.git
- If you don’t have Git installed, select the “Download and Install” option from the same window.
- After installing Git, click Clone to clone the project into the folder you specified.
- IntelliJ creates a new project and starts cloning.
- Wait a few minutes for the clone to complete and for the project to be imported into the workspace.
Once the cloning completes, you will see the following project workspace structure in IntelliJ.

2. Run Maven build
Now run the Maven build. First, open the Maven tool window on the right-hand side, expand Lifecycle, right-click install, and select Run Maven Build.

This downloads all dependencies mentioned in the pom.xml file and compiles all examples in this tutorial. It also takes a few minutes to complete, and you should see the message below after a successful build.

3. Run Spark Program From IntelliJ
After a successful Maven build, run the src/main/scala/com.sparkbyexamples.spark.SparkSessionTest
example from IntelliJ.
If you still get errors while running the Spark application, restart the IntelliJ IDE and run the application again. You should now see the message below in the console.
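For reference, a minimal Spark program of the kind SparkSessionTest represents looks like the sketch below. This is an illustrative sketch, not the exact repository source; it assumes the spark-sql dependency from the project's pom.xml is on the classpath.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative sketch of a minimal Spark program, similar in spirit to
// the repository's SparkSessionTest (not the exact source).
object SparkSessionTest {
  def main(args: Array[String]): Unit = {
    // "local[1]" runs Spark in-process with one thread; no cluster needed.
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("SparkByExamples.com")
      .getOrCreate()

    // Print the Spark version to confirm the session started correctly.
    println("Spark version: " + spark.version)

    spark.stop()
  }
}
```

Running this from IntelliJ (right-click the file and choose Run) should print the Spark version to the console, confirming your build and dependencies are set up correctly.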

Once you have run the Spark sample in IntelliJ, continue by reading about SparkSession, SparkContext, Spark RDDs, RDD actions, and RDD transformations.
Related Articles
- Apache Spark Installation on Windows
- Spark Start History Server
- How to Run Spark Hello World Example in IntelliJ
- Spark SQL Create a Table
- What is Apache Spark Driver?
- Spark createOrReplaceTempView() Explained
- Install Apache Spark Latest Version on Mac
- Apache Spark 3.x Install on Mac
- How to Check Spark Version
Happy Learning !!
Note: the pom.xml from the Git repository needs to be updated with the new location of the Maven plugins in order to compile. The flattened snippet above refers to the scala-maven-plugin coordinates, which in pom.xml form look like:

```xml
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.4.0</version>
</plugin>
```