There are multiple ways to install PySpark on a Mac and run it from Jupyter Notebook. Below, I explain step by step how to install PySpark and Jupyter on macOS using Homebrew.
Steps to install PySpark & Jupyter on Mac OS
- Step 1 – Install Homebrew
- Step 2 – Install Java
- Step 3 – Install Scala (Optional)
- Step 4 – Install Python
- Step 5 – Install PySpark
- Step 6 – Install Jupyter
- Step 7 – Run an Example in Jupyter
Related: PySpark installation on Windows
Step 1. Install Homebrew on macOS
Homebrew is the "missing package manager" for macOS (and Linux), used to install third-party packages such as Java and PySpark. First, install it with the command below.
# Install Homebrew
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
This prompts for your password: the installer uses sudo, so on a personal laptop this is the same administrator password you enter when you log into your Mac. If you don't have admin access, contact your system administrator. After a successful Homebrew installation, you should see output similar to the below.

Post-installation, you may need to run the commands below to add brew to your $PATH.
# Add brew to your PATH
echo 'eval "$(/opt/homebrew/bin/brew shellenv)"' >> ~/.zprofile
eval "$(/opt/homebrew/bin/brew shellenv)"
If the above commands give you trouble, you can find the latest instructions in the official Homebrew documentation.
Step 2. Install Java
PySpark runs on the JVM, so you need Java installed on your Mac. Since Java is a third-party package, you can install it with the brew command. Because Oracle Java is no longer open source, I am using OpenJDK version 11. Run the command below in a terminal to install it.
# Install OpenJDK 11
brew install openjdk@11
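Homebrew installs openjdk@11 as keg-only, meaning it is not linked onto your PATH automatically. A minimal follow-up sketch, assuming the default Apple Silicon Homebrew prefix /opt/homebrew (Intel Macs use /usr/local; check `brew info openjdk@11` for your machine's exact caveats):

```shell
# openjdk@11 is keg-only: Homebrew does not link it into the default bin directory.
# The /opt/homebrew prefix is an assumption (Apple Silicon default); adjust for your machine.
export JAVA_HOME="/opt/homebrew/opt/openjdk@11"
export PATH="$JAVA_HOME/bin:$PATH"
```

Add these lines to your ~/.zprofile to make them persistent; you can then confirm the install with `java -version`.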
Step 3. Install Scala (Optional)
Since Spark itself is written in Scala, you need Scala to write Spark programs in Scala; for running PySpark, however, this step is optional.
# Install Scala (optional)
brew install scala
Step 4. Install Python
As you would know, PySpark is used to run Spark jobs from Python, so you also need Python installed on macOS. Let's install it using Homebrew. If you already have Python 3 installed (recent PySpark releases require Python 3, not Python 2.7), you can skip this step.
# Install Python
brew install python
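You can quickly confirm which Python 3 interpreter is active after installation; a small check along these lines:

```shell
# Print the active Python 3 version; PySpark requires Python 3.
python3 --version
# Show where python3 resolves from (Homebrew installs typically live under the brew prefix).
which python3
```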
Step 5. Install PySpark on Mac
PySpark is a Spark library that lets you run Python applications using Apache Spark's capabilities. Spark was originally written in Scala; as industry adoption grew, the PySpark API was released for Python, built on Py4J. Py4J is a Java library integrated into PySpark that allows Python to dynamically interface with JVM objects; hence, to run PySpark you need Java installed along with Python and Apache Spark. So let's install Apache Spark on the Mac.
# Install Apache Spark
brew install apache-spark
This installs the latest version of Apache Spark, which includes PySpark.

After Apache Spark is installed successfully, run pyspark from the command line to launch the PySpark shell.

Note that the shell prints the Spark and Python versions to the terminal.
Step 6. Install Jupyter
In real-world data analysis or machine learning work, you will often need to run PySpark applications in a Jupyter notebook, so let's install Jupyter next and learn how to use it.
brew install jupyter
This installs JupyterLab on your Mac.

Now let's start Jupyter Notebook and run a PySpark example. The command below opens Jupyter in your default web browser.
jupyter notebook
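If your notebook cannot find PySpark, one common approach is to make the pyspark command itself launch Jupyter as its driver via environment variables. A sketch, assuming the Homebrew install location for apache-spark (verify the actual path on your machine with `brew info apache-spark`):

```shell
# Assumed Homebrew install location (Apple Silicon prefix); verify with `brew info apache-spark`.
export SPARK_HOME="/opt/homebrew/opt/apache-spark/libexec"
# Make the `pyspark` command start Jupyter Notebook as its driver instead of the plain REPL.
export PYSPARK_DRIVER_PYTHON="jupyter"
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
```

With these set, running pyspark launches Jupyter with the PySpark libraries on the Python path, and you can create a SparkSession directly in a cell.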
Step 7. Run PySpark Example in Jupyter Notebook
Now select New -> Python 3 (the kernel name may vary with your Python version), enter the code below, and select Run. In Jupyter, each cell runs independently, so you can execute cells in any order as long as they have no dependencies on previous cells.

Conclusion
In this PySpark installation article, you learned how to install PySpark and Jupyter on macOS step by step: installing Java, Scala, Python, PySpark, and Jupyter using Homebrew.
Happy Learning !!
Related Articles
- Apache Spark Setup with Scala and IntelliJ
- Apache Spark Installation on Windows
- Spark Installation on Linux Ubuntu
- Spark Hello World Example in IntelliJ IDEA
- Spark Word Count Explained with Example
- Spark Setup on Hadoop Cluster with Yarn
- Spark Start History Server
- How to Check Spark Version
- Install PySpark on Ubuntu running on Linux
- Install PySpark in Anaconda & Jupyter Notebook
- How to Install PySpark on Mac
- How to Install PySpark on Windows
- Install PySpark using pip or conda
- Dynamic way of doing ETL through Pyspark
- How to Find PySpark Version?
- PySpark Shell Command Usage with Examples
- Install Anaconda & Run pandas on Jupyter Notebook
- Ways to Install Jupyter Notebook on Mac OS
- Update Jupyter Notebook or JupyterLab