Spark SQL defines built-in standard math functions in the DataFrame API; these logarithmic functions come in handy when we need to perform calculations on numeric columns. In this article, we will learn the usage of some of these functions with Scala examples. You can access the standard functions using the following import statement.
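The import referenced above is the standard one for all of Spark's built-in column functions, including the log family:

```scala
// All built-in DataFrame functions, including log(), log10(), log1p() and log2(),
// live in the org.apache.spark.sql.functions object
import org.apache.spark.sql.functions._
```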
When possible, try to leverage the standard library functions: they offer more compile-time safety, handle nulls, and perform better than user-defined functions. If your application is performance-critical, avoid custom UDFs where you can, as Spark treats them as opaque and cannot optimize around them, so their performance is not guaranteed.
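As a minimal sketch of that trade-off (the session setup, data, and names here are illustrative, not from the original article), the built-in log() propagates nulls automatically and stays visible to the Catalyst optimizer, while the equivalent UDF forces us to handle null ourselves:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object BuiltinVsUdf extends App {
  val spark = SparkSession.builder()
    .master("local[1]")
    .appName("builtin-vs-udf")
    .getOrCreate()
  import spark.implicits._

  // A nullable numeric column
  val df = Seq(Some(1.0), Some(math.E), None).toDF("value")

  // Built-in function: nulls propagate as null, and Catalyst can optimize the plan
  df.select(log($"value").alias("ln")).show()

  // Equivalent UDF: we must model nullability ourselves (via Option),
  // and Spark cannot optimize through this black box
  val naturalLog = udf((v: Option[Double]) => v.map(math.log))
  df.select(naturalLog($"value").alias("ln")).show()

  spark.stop()
}
```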
Spark SQL Log Math Functions:
Click on each link in the table below for a detailed explanation and working examples of each function with Scala code.
| Spark SQL Log Math Functions Signature | Spark Functions Description |
|----------------------------------------|-----------------------------|
| log(columnName: String): Column | Computes the natural logarithm of the given column. |
| log(base: Double, a: Column): Column <br/> log(base: Double, columnName: String): Column | Returns the first argument-base logarithm of the second argument. |
| log10(e: Column): Column <br/> log10(columnName: String): Column | Computes the logarithm of the given value in base 10. |
| log1p(e: Column): Column <br/> log1p(columnName: String): Column | Computes the natural logarithm of the given value plus one. |
| log2(expr: Column): Column <br/> log2(columnName: String): Column | Computes the logarithm of the given column in base 2. |
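The signatures above can be exercised together in one small example (the session setup, column name, and aliases are illustrative):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object LogFunctionsExample extends App {
  val spark = SparkSession.builder()
    .master("local[1]")
    .appName("log-functions")
    .getOrCreate()
  import spark.implicits._

  val df = Seq(1.0, 10.0, 100.0).toDF("value")

  df.select(
    $"value",
    log($"value").alias("ln"),              // natural logarithm
    log(2.0, $"value").alias("log_base2"),  // logarithm in an arbitrary base
    log10($"value").alias("log10"),         // base-10 logarithm
    log1p($"value").alias("ln_plus1"),      // natural log of (value + 1)
    log2($"value").alias("log2")            // base-2 logarithm
  ).show()

  spark.stop()
}
```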
In this post, I’ve consolidated the list of Spark SQL log math functions, with a description of each and examples of the commonly used ones.
Happy Learning !!