Spark – Extract DataFrame Column as List

  • Post category: Apache Spark
  • Post last modified: November 7, 2022

Let’s see how to convert/extract a Spark DataFrame column as a List (Scala/Java collection). There are multiple ways to do this, and I will explain most of them with examples. Remember that when you use DataFrame collect() you get Array[Row], not List[String], hence you need to use a map() transformation to extract the first column from each Row before converting it to a Scala/Java collection List.

I will also cover how to extract the Spark DataFrame column as a List without duplicates.
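Before diving into the examples, here is a minimal, self-contained sketch of the collect()-then-map() pattern described above. The local[1] master, app name, and sample data are illustrative, not part of the article's dataset:

```scala
import org.apache.spark.sql.{Row, SparkSession}

val spark = SparkSession.builder()
  .master("local[1]")          // illustrative local master
  .appName("ColumnToList")
  .getOrCreate()
import spark.implicits._

val df = Seq(("James", "CA"), ("Maria", "FL")).toDF("name", "state")

// collect() returns Array[Row], not List[String]
val rows: Array[Row] = df.select("state").collect()

// extract the first column from each Row, then convert to a Scala List
val states: List[String] = => r.getString(0)).toList
//List(CA, FL)
```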

Let’s Create a Spark DataFrame

val data = Seq(("James","Smith","USA","CA"),("Michael","Rose","USA","NY"),
  ("Robert","Williams","USA","CA"),("Maria","Jones","USA","FL"))
val columns = Seq("firstname","lastname","country","state")
import spark.implicits._
val df = data.toDF(columns:_*)

//+---------+--------+-------+-----+
//|firstname|lastname|country|state|
//+---------+--------+-------+-----+
//|    James|   Smith|    USA|   CA|
//|  Michael|    Rose|    USA|   NY|
//|   Robert|Williams|    USA|   CA|
//|    Maria|   Jones|    USA|   FL|
//+---------+--------+-------+-----+

From the above data, I will extract the state values as a List.

Example 1 – Spark Convert DataFrame Column to List

In order to convert a Spark DataFrame column to a List, first select() the column you want, next use the Spark map() transformation to convert each Row to a String, and finally collect() the data to the driver, which returns an Array[String]; calling toList on it gives a List.

Among all the examples explained here, this is the best approach and performs well with both small and large datasets.

val listValues = df.select("state").map(f=>f.getString(0))
  .collect().toList
println(listValues)
//List(CA, NY, CA, FL)

The above example extracts all values from the DataFrame column as a List, including duplicate values. If you want to remove the duplicates, use distinct on the resulting List.

println(listValues.distinct)
//List(CA, NY, FL)

A better option is to run distinct() on the Spark DataFrame before collecting it as a List or Array. If the column has many duplicate values, this performs better, since the duplicates are removed in a distributed fashion on the executors before the data is sent to the driver.

val distinctValues = df.select("state").distinct()
  .map(f=>f.getString(0)).collect().toList
println(distinctValues)
//List(CA, NY, FL)

Example 2 – Using Typed Dataset to Extract Column List

If you are using a Dataset, use the below approach. Since we use the typed String encoder, we don’t have to use a map() transformation.

val listFromDS = df.select("state").as[String].collect().toList
println(listFromDS)
//List(CA, NY, CA, FL)

Example 3 – Using RDD to Get Column List

In this example, I use the underlying RDD and the RDD map() transformation to extract the column we want. Note that RDD collect() returns Array[Any] here, since row(0) is untyped. This performs well and is the preferred approach if you are working with RDDs; the same pattern also applies to a PySpark DataFrame.

val listFromRDD = df.select("state").rdd.map(row => row(0)).collect().toList
println(listFromRDD)
//List(CA, NY, CA, FL)

Example 4 – Use collectAsList() to Get Column List

Spark also provides the collectAsList() action; called on a DataFrame it returns the rows as a java.util.List[Row], and called on a typed Dataset it returns a java.util.List of the element type. If you are using Java, this is the way to go.

val listFromDF = df.select("state").map(f=>f.getString(0)).collectAsList()
//List(CA, NY, CA, FL)

Example 5 – Other Alternatives to Convert Column to List

This does not perform as well as the earlier examples. Here, we first collect() the DataFrame to the driver and then extract the first column from each Row on the driver, without utilizing the Spark cluster for the extraction.

val listFromArray ="state").collect().map(f=>f.getString(0)).toList
//List(CA, NY, CA, FL)


In this article, I have provided many examples of how to extract/convert a Spark DataFrame column to a List, with or without duplicates.

Happy Learning !!

NNK is a Big Data and Spark examples community page; all examples are simple, easy to understand, and well tested in our development environment.


This Post Has One Comment

  1. John J

    don’t forget ` import spark.implicits._`
