Spark SQL “case when” and “when otherwise”

Like the SQL "case when" statement and the "switch" or "if then else" statements from popular programming languages, the Spark SQL DataFrame API supports similar syntax using "when otherwise", or we can use a "case when" expression. So let's see an example of how to check for multiple conditions and replicate the SQL CASE statement.

First, let's do the needed imports and create a SparkSession and DataFrame.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{when, col, expr}

val spark: SparkSession = SparkSession.builder()
    .master("local[1]")
    .appName("when-otherwise-example")
    .getOrCreate()

import spark.sqlContext.implicits._

val data = List(("James","","Smith","36636","M",60000),
    ("Maria","Anne","Jones","39192","F",500000),   // extra illustrative rows so every
    ("Robert","","Williams","42114","",400000))    // branch below is exercised

val cols = Seq("first_name","middle_name","last_name","dob","gender","salary")
val df = spark.createDataFrame(data).toDF(cols:_*)

1. Using “when otherwise” on Spark DataFrame.

when is a Spark function, so to use it we first need to import it with import org.apache.spark.sql.functions.when. The code snippet below replaces the value of gender with a new derived value; when a value does not match any condition, we assign "Unknown".

val df2 = df.withColumn("new_gender", when(col("gender") === "M","Male")
      .when(col("gender") === "F","Female")
      .otherwise("Unknown"))

when can also be used in a Spark DataFrame select statement.

val df4 = df.select(col("*"), when(col("gender") === "M","Male")
      .when(col("gender") === "F","Female")
      .otherwise("Unknown").alias("new_gender"))

2. Using “case when” on Spark DataFrame.

Similar to SQL syntax, we can use "case when" inside the expr() function.

val df3 = df.withColumn("new_gender", 
      expr("case when gender = 'M' then 'Male' " +
                       "when gender = 'F' then 'Female' " +
                       "else 'Unknown' end"))

The same expression can be used within a select.

val df4 = df.select(col("*"),
      expr("case when gender = 'M' then 'Male' " +
                       "when gender = 'F' then 'Female' " +
                       "else 'Unknown' end").alias("new_gender"))
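
The same CASE WHEN logic can also be run as a raw SQL query. A minimal sketch, assuming the df and spark values created above (the view name temp_table is arbitrary):

```scala
// Register the DataFrame as a temporary view so it can be queried with SQL
df.createOrReplaceTempView("temp_table")

// Run the same CASE WHEN logic as a plain SQL statement
val df5 = spark.sql(
  """select *,
     case when gender = 'M' then 'Male'
          when gender = 'F' then 'Female'
          else 'Unknown' end as new_gender
     from temp_table""")
```

This produces the same result as the expr() version above; which form to use is mostly a matter of whether the rest of your pipeline is DataFrame-based or SQL-based.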

3. Using && and || operator

We can also use the and (&&) and or (||) operators within the when function. To explain this, I will use a new, simpler set of data.

val dataDF = Seq(
      (66, "a", "4"), (67, "a", "0"), (70, "b", "4"), (71, "d", "4")
      ).toDF("id", "code", "amt")
dataDF.withColumn("new_column",
       when(col("code") === "a" || col("code") === "d", "A")
      .when(col("code") === "b" && col("amt") === "4", "B")
      .otherwise("Unknown"))
      .show()


+---+----+---+----------+
| id|code|amt|new_column|
+---+----+---+----------+
| 66|   a|  4|         A|
| 67|   a|  0|         A|
| 70|   b|  4|         B|
| 71|   d|  4|         A|
+---+----+---+----------+
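
The && and || conditions above can also be written as a plain SQL string with expr(), using the SQL keywords and and or instead of the Scala operators. A minimal sketch, assuming the dataDF created above:

```scala
// Same logic as the when()/otherwise() version, expressed as a SQL string
val df6 = dataDF.withColumn("new_column",
  expr("case when code = 'a' or code = 'd' then 'A' " +
       "when code = 'b' and amt = '4' then 'B' " +
       "else 'Unknown' end"))
```

Note that amt was created as a string column, so it is compared against the string literal '4' here.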


In this article, we have learned how to replicate the SQL CASE statement on a Spark DataFrame using the expr() function and the "when otherwise" functions. We have also learned how to use these functions with the && and || logical operators. I hope you like this article.

Happy Learning !!

Naveen (NNK)

Naveen (NNK) is a Data Engineer with 20+ years of experience in transforming data into actionable insights. Over the years, he has honed his expertise in designing, implementing, and maintaining data pipelines with frameworks like Apache Spark, PySpark, Pandas, R, Hive and Machine Learning. Naveen's journey in the field of data engineering has been one of continuous learning, innovation, and a strong commitment to data integrity. In this blog, he shares his experiences with data as he comes across them. Follow Naveen @ LinkedIn


This Post Has 9 Comments

  1. Anonymous

    just Awesome man!

  2. Vrunda

    Hi! When to use expr and which all functions can be used inside expr? Also, all the built in functions can be used on dataframes?

  3. nausheen aafreen

    I really love the contents. They are self explanatory. I am very happy that you have shared so much to learn ,for people like me who have no idea about spark.
    You are amazing <3

    1. NNK

      Hi Nausheen, Thanks for your comments. I am happy it serving the purpose.

  4. Anonymous

    good article for experienced folks as well

  5. Anonymous

    how to use 3. Using && and || operator in using string expressions

  6. Naveen V

    Very useful information for beginners to practice.
    Thanks for your information.

  7. Maha

    Very nice article.

    1. NNK

      Thanks for reading.